Hi guys, I found these beasts
http://sklep.xtreem.pl/7-laptopy-dl...karta_graficzna-nvidia_gtx970m-nvidia_gtx980m
-
too bad they're not giving out any specs on the 6700HQ yet
-
Meaker@Sager Company Representative
I'd treat any pre-release information as unreliable at the moment.
-
Da eff? So either that site is lying, or G-Sync is somehow working with Optimus now, or the new P67xSx/P65xSx is MXM.
-
They are quite reliable, and Gamenab said that G-Sync should work on Optimus too (more work needed)...
-
Nvidia ordered a hit on Gamenab. He ded, bro.
-
I know this, but at least we know there is a way to make G-crap work on Optimus.
-
Nice, so then we'll have double-crap machines with Craptimus and crap-sync?
Sent from my Nexus 5 using Tapatalk -
Just because Gamenab said so? He jumped the gun and said a lot of things in the beginning that were not true.
What's so bad about G-Sync? VRR tech is inherently a good thing, Nvidia is just being Nvidia and fleecing everyone for it. On another note, Intel will be supporting Adaptive-Sync in the future, which leaves Nvidia the odd man out. But I doubt Nvidia cares when it has 80% share of the dGPU market. -
I was just referring to legion343's post.
I myself have no definitive opinion on G-Sync yet, but it wouldn't surprise me if this tech is really overhyped...
Sent from my Nexus 5 using Tapatalk -
In reality, it's the Intel GPU that would need to support Adaptive Sync, which should be possible, and it would surprise me if they didn't try, given their focus on the IGP lately. The entire point of the PSR (Panel Self Refresh) feature was to save power, which Intel has been pushing really hard for the past few generations.
It's a highly misunderstood tech. Fact is, it's incredible. But it's one of those things you don't really notice until it's gone. -
On an iGPU-only notebook, sure, but Intel support alone is not enough for Optimus G-Sync seeing as Nvidia is intent on making money on G-Sync licenses whether or not it's contributing anything beyond a sticker and extra branding. All the extra DRM mobile G-Sync is shackled with is enough proof of that.
-
I'm working from the fact that Adaptive Sync and PSR (which is all mobile G-Sync is) are part of the eDP implementation, which happens entirely within the realm of the IGP in an Optimus laptop. I can't see how Nvidia could possibly get around this if the NV chip is only writing out to the Intel framebuffer. It has no control over refresh rates or the display.
Not that there isn't also some work required by NV, though. Assuming you could get the Intel side to do adaptive sync, the NV chip now has to write to the IGP framebuffer and somehow trigger refreshes through the IGP; I can't even begin to see how that's possible. That said, if the IGP in the 6700K is anything to go by, the Skylake HQ processors already support eDP 1.3, which is pretty much the barrier to entry for adaptive sync, so it's entirely possible. -
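To make the hand-off described above concrete, here's a toy Python model (purely illustrative; the class and method names are made up, not a real driver or eDP API) of an Optimus pipeline where the dGPU can only deliver finished frames into the IGP's framebuffer, while the IGP alone controls when the panel refreshes, fixed-rate or adaptive:

```python
class IntelIGP:
    """Owns the framebuffer and all display timing (the eDP link)."""

    def __init__(self, adaptive_sync=False, fixed_hz=60):
        self.adaptive_sync = adaptive_sync
        self.fixed_hz = fixed_hz  # nominal refresh for the non-adaptive case
        self.framebuffer = None
        self.refreshes = []       # record of what actually got scanned out

    def receive_frame(self, frame):
        # The dGPU can only hand over complete frames; it never
        # talks to the panel directly.
        self.framebuffer = frame
        if self.adaptive_sync:
            # Adaptive Sync: scan out as soon as a new frame lands.
            self.refreshes.append(frame)

    def vblank_tick(self):
        # Fixed refresh: the panel scans out whatever is in the
        # framebuffer at each vblank, new frame or not.
        if not self.adaptive_sync:
            self.refreshes.append(self.framebuffer)


class NvidiaDGPU:
    """Renders frames; has no control over the panel whatsoever."""

    def __init__(self, igp):
        self.igp = igp

    def render(self, n):
        self.igp.receive_frame(f"frame{n}")


igp = IntelIGP(adaptive_sync=True)
gpu = NvidiaDGPU(igp)
for n in range(3):
    gpu.render(n)
print(igp.refreshes)  # ['frame0', 'frame1', 'frame2']
```

The point of the sketch is just that the NV chip's only verb is "write a frame"; whether those frames drive refreshes is entirely the IGP's call, which is why Intel-side Adaptive Sync support is the gating factor.
-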
Meaker@Sager Company Representative
If the IGP supports it then it should not matter too much in optimus mode as the dGPU sends complete frames to the IGP, the display timing is totally up to the IGP.
-
I'm not saying how they did it. Maybe they managed to do it through Optimus, or they permanently disabled the iGPU, or they used a manual switch like in the latest Gigabyte and MSI GT72...
-
I don't think it needs MXM... just no Optimus XD.
Don't the ASUS machines have soldered cards but a G-Sync model? -
Without Optimus the battery life will be quite bad; maybe this has something to do with Intel's just-announced support for adaptive sync?
I think all ASUS laptops for sale ATM have soldered cards, but I could be mistaken. -
Well, battery life could be bad, sure, but you don't have to buy the G-Sync model. It's not like it changes the specs of the laptop.
G-Sync model = ~2 hours battery life
Regular model = ~4 hours battery life -
I know, I just assumed 6GB 970M and 8GB 980M were MXM since the BGA versions have half the VRAM
-
@octiceps - there are 6GB 970M and 8GB 980M integrated versions; the Gigabyte chassis among others have this option. It's just that the current Clevo integrated versions are 3GB/4GB.
Last edited: Aug 21, 2015 -
Typos? But yeah I meant Clevo.
-
Yup, sorry, I realised just after I clicked Post Reply, but my interweb connection dropped out so I couldn't immediately edit!
-
As XMG said, there are higher-vRAM integrated versions, but even weirder, there are low-vRAM MXM versions. MSI's GT72 has been selling with 3GB 970Ms in pre-built configs in places like Amazon etc.
-
Meaker@Sager Company Representative
There are 4GB 980M MXM models around too.
-
Given that Intel confirmed Adaptive Sync is coming "after" Skylake, I'm wondering if this is just some poor marketing or a mistake (they don't specify desktop or mobile, though, so it's still possible mobile may get it simply due to the eDP benefits).
Alternatively, it may be something like 1/2 of the mini-DP ports being connected directly to the NV GPU, allowing you to use an external G-Sync panel. -
Not overhyped, overpriced... it's really a treat to see; just the artificial requirements imposed by Ngreedia are a bit troublesome.
-
@XMG: Will there be a non-G-Sync version available, given that G-Sync seems to limit the display to 1080p? Am I correct in assuming that the P506/P706 will be the G-Sync versions with Full HD screens, and the H506/H706 the non-G-Sync ones with higher-resolution options?
Last edited: Sep 23, 2015 -
No, G-Sync doesn't limit it to 1080p; that's just the screen selected for this model. It can be any resolution, it just needs to be "Nvidia approved". Although I don't know if there are any 4K 17" LCDs yet.
-
Meaker@Sager Company Representative
Yes, some very new panels.
-
That's what we have done with the Skylake "XMG U" and "Schenker W" series, so I guess you could assume that the same logic applies to our new P and H series - I couldn't possibly confirm anything until they launch though!
Customer feedback strongly favours the opinion that gamers want G-Sync but mainly prefer FHD, which fits our gaming-orientated XMG sub-brand, whereas owners with a more professional usage leaning are more inclined to want 4K but have no need for G-Sync - so the [W]ork series has these options. -
eDP 1.2a will have Adaptive Sync applied directly to the screen... all of your screens (LP156WFXX, B156HANXXX) are eDP 1.2 panels... when Adaptive Sync comes, G-Sync will be an overpriced, useless thing...
but yes, we'll have to wait a while for that... standards take a lot of time to become... standards, as manufacturers need to deplete their previous-standard stocks first... -
Except the 15" Clevo gets a 4k G-sync panel. I'd so much rather have a 3K or 2560x1440 LCD. But all you get is either 1080p or 4k. Nothing in between.
-
Yes, I agree, a 3K or 2.5K panel with G-Sync would be the perfect balance between high-res gaming and playable frame rates. FHD isn't asking too much of today's hardware, and 4K pushes it way too hard if you don't have an SLI setup.
-
*points at sig*
Sent from my Nexus 5 using Tapatalk -
*points at missing g-sync*
-
*laughs it off and keeps pointing happily at sig*
Sent from my Nexus 5 using Tapatalk -
I still don't see the hype about G-Sync... I've never seen tearing in-game, and actually, it's perfectly fine for me without it...
-
Well yeah, especially if you have high framerates, but if your framerates are close to 30, it's gonna be better with G-Sync. So if it's a poorly optimized game and/or a very heavy one, I guess it's a nice little added bonus to have.
Never hurts to have more extras.
-
Because you have to see it first *ba-dum tsss*
-
Support.1@XOTIC PC Company Representative
Yeah, it helps out most between 30 and 60 fps. So it might not be all that important on newer systems that can run games well and don't have much tearing, but it might be a nice thing to have as a system ages.
-
I had it for a little while, and I personally didn't see a whole lot of difference, but I see it as a way to future-proof your machine. With G-SYNC, not only will you get 6-12 months extra out of your machine, it will make even current games look more fluid. It essentially negates the drawbacks we currently face (V-SYNC off and screen tearing, or V-SYNC on and lag/latency).
Most people who've had it now swear by it.
One way to think of it is like so: Imagine you have enough money for either a GTX 980M, or a GTX 970M + G-SYNC. While the 980M will pump out more FPS, the 970M+G-SYNC combo will look better/smoother, even with less FPS.
Either way, currently looks like Clevo machines will cost about $100 more to include G-SYNC, and at that price, I'm thinking there's no reason NOT to get it. You can turn it off if you want. -
Meaker@Sager Company Representative
It works really well if you overclock your monitor too, since it allows you to take advantage of scenes where you go over 60fps but does not punish you if you can't.
-
Support.1@XOTIC PC Company Representative
Even if the screen has more than 60Hz (overclocked or stock refresh), it would still look better and more fluid than just 60Hz. So there would be some benefit if it is 75Hz by default, whether the G-Sync is used or not.
-
Meaker@Sager Company Representative
Yes, but say you are sensitive to tearing and your GPU is putting out 65fps. With V-Sync at 60Hz you get 60fps, but at, say, 80Hz you would only get 40fps (half of 80Hz). With G-Sync you would get your full 65fps (and Hz).
-
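The V-Sync arithmetic above generalizes nicely. Here's a small Python sketch (illustrative only; the function names are mine) of effective framerate under classic double-buffered V-Sync versus a variable-refresh (G-Sync-style) panel:

```python
import math

def vsync_fps(refresh_hz, gpu_fps):
    """Effective framerate with classic double-buffered V-Sync:
    each frame is held for a whole number of refresh cycles, so
    output drops to refresh / ceil(refresh / gpu_fps)."""
    if gpu_fps >= refresh_hz:
        return refresh_hz
    return refresh_hz / math.ceil(refresh_hz / gpu_fps)

def gsync_fps(refresh_hz, gpu_fps):
    """A variable-refresh panel just follows the GPU,
    up to the panel's maximum refresh rate."""
    return min(gpu_fps, refresh_hz)

# The 65fps GPU from the post above:
print(vsync_fps(60, 65))  # 60  -- capped at the refresh rate
print(vsync_fps(80, 65))  # 40.0 -- half of 80Hz, as described
print(gsync_fps(80, 65))  # 65  -- full GPU output, no tearing
```

Nothing deep going on, but it shows why a fixed 80Hz refresh can be actively worse than 60Hz for a 65fps GPU under V-Sync, while G-Sync simply tracks whatever the GPU manages.
-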
I'm pretty sure all the "75Hz" panels are just 60Hz panels that are overclocked by the OEMs to 75Hz anyway.
-
They indeed are, since there aren't any panels that run 75Hz stock per their specs.
Sent from my Nexus 5 using Tapatalk -
Meaker@Sager Company Representative
Define overclocked? If a part is binned and quality tested at higher frequencies then that's not overclocking. That's shipping it at a higher frequency.
-
semantics
Sent from my Nexus 5 using Tapatalk
Thin clevo G-sync laptops
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Legion343, Aug 20, 2015.