-
yrekabakery Notebook Virtuoso
-
Well, let's take it as the trade-off for not running the dGPU when it's not needed? That is the entire point of Optimus.
It just sucks that most OEMs won't include the option to turn it off, probably due to the difficulty of designing a separate path from the dGPU to the display. -
yrekabakery Notebook Virtuoso
Yeah, but that difference is startling at times. 256 FPS vs. 138 FPS in CS:GO is insane, and maybe for someone who plays it competitively that could be a real dealbreaker. With Optimus you can't even get the full utility out of the 144Hz panel.
And this seems to be a recent development too. Optimus vs. non-Optimus didn't have this kind of regression prior to Turing laptops. -
It might just be something about the implementation on this laptop.
-
As explained in the YouTube video, Optimus uses the dGPU for rendering but still passes the finished frames through the integrated (CPU) graphics for output to the monitor, introducing overhead on the CPU. The more CPU-dependent a game's FPS is, the larger the difference between Optimus and G-Sync (dGPU-direct) FPS will be. On one end is CS:GO, which is notoriously CPU-bottlenecked, so it sees a large gain when released from that overhead. On the other end, games which are totally GPU-bottlenecked and minimally CPU-dependent lose very little FPS to the Optimus overhead. Most games sit somewhere in the middle, with roughly a 7-20% FPS gain from bypassing Optimus.
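To make that relationship concrete, here is a rough back-of-the-envelope model (my own illustrative numbers, not measurements from the video): treat the Optimus handoff as a roughly fixed extra cost added to every frame. The same fixed cost barely matters at low framerates but becomes a big cut at high framerates.
```python
# Illustrative model of the Optimus penalty: a roughly fixed per-frame cost
# (frame copy to the iGPU plus CPU-side handoff) added to the render time.
# The overhead value below is an assumption for illustration, not a measurement.

def fps_with_overhead(native_fps: float, overhead_ms: float) -> float:
    """FPS after adding a fixed per-frame overhead to the frame time."""
    frame_time_ms = 1000.0 / native_fps
    return 1000.0 / (frame_time_ms + overhead_ms)

overhead_ms = 1.5  # hypothetical per-frame Optimus cost

for native in (60, 144, 256):
    capped = fps_with_overhead(native, overhead_ms)
    print(f"{native:>3} FPS native -> {capped:5.1f} FPS with Optimus "
          f"({(1 - capped / native) * 100:4.1f}% loss)")

# Approximate output: 60 -> 55 (-8%), 144 -> 118 (-18%), 256 -> 185 (-28%)
```
In reality the overhead isn't perfectly fixed (it also scales with CPU load and PCIe traffic), but it illustrates why high-framerate, CPU-bound titles like CS:GO take the biggest hit.
-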
I'm not bothered by Optimus. Asus's bundled software lets you activate discrete graphics only and that's that, no more Optimus. I just had to do it once and it runs in G-Sync mode all the time.
-
yrekabakery Notebook Virtuoso
Perhaps, although it should be noted that the video creator tested both the Asus GX701 and Dell G5 and observed the same issue.
I’m not sure if that’s the reason. AC Odyssey and Watch Dogs 2, games which are more CPU-intensive (he showed how WD2 used 100% of the i7-8750H with or without Optimus) saw less of a performance drop than DOTA 2 and CS:GO, which do not fully utilize the CPU. Ghost Recon: Wildlands should be completely GPU-bound at those settings on that setup, yet still saw a performance drop. -
Mastermind5200 Notebook Virtuoso
I personally don't see why vendors don't go for dGPU-only/iGPU-only switching, it's the best solution IMO
-
iGPU-only means less opportunity to clean out your wallet. Add in more features, then charge you more.
-
Mastermind5200 Notebook Virtuoso
iGPU-only is a feature I can see many people wanting: heavily increased battery life for when you don't need the dGPU active. -
Which in the end would lower the manufacturers', OEMs', and resellers' earnings. Not in their favor. Why should they bother with what consumers want?
-
Does this happen with Turing only?
-
yrekabakery Notebook Virtuoso
Optimus had a small FPS impact in previous GPU generations as well, but to nowhere near this degree. -
I actually commented on the YT video explaining the performance differentials, but I'll chuck it in here:
TL;DR: Optimus has two different bottlenecks: one is CPU utilization (CPU-capped games) and the other is PCIe bus utilization (high framerates).
If you happen to trigger both of these conditions you'll see a massive difference, as in CS:GO. If you trigger neither of them, performance will be almost the same, as in The Witcher 3.
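For the PCIe side, here is a quick sanity check of the numbers (my own rough estimate, not from the comment above): under Optimus every displayed frame has to be copied from the dGPU to the iGPU's framebuffer over PCIe, and that traffic grows linearly with framerate.
```python
# Rough estimate of the extra PCIe traffic generated by Optimus copying each
# rendered frame to the iGPU framebuffer. Assumes an uncompressed 1080p frame
# at 4 bytes per pixel (an assumption; the actual transfer format may differ).

frame_bytes = 1920 * 1080 * 4          # ~8.3 MB per frame
pcie3_x16_gbs = 16 * 0.985             # ~15.8 GB/s usable (128b/130b encoding)

for fps in (60, 144, 256):
    traffic_gbs = frame_bytes * fps / 1e9
    share = 100 * traffic_gbs / pcie3_x16_gbs
    print(f"{fps:>3} FPS: {traffic_gbs:4.2f} GB/s of copy traffic "
          f"(~{share:4.1f}% of a PCIe 3.0 x16 link)")

# Approximate output: 60 -> 0.50 GB/s (3%), 144 -> 1.19 GB/s (8%), 256 -> 2.12 GB/s (13%)
```
The copy alone doesn't saturate the link, but it competes with the game's own asset uploads and command traffic, which is presumably where the high-framerate penalty comes from.
-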
Mastermind5200 Notebook Virtuoso
What's worse is that eGPUs have an x4/x2 PCH link, not a direct CPU link. Hoping a manufacturer comes up with an x16 link like the GS30, or that Thunderbolt 4 is an x8 3.0 or x4 4.0 connection and, depending on the vendor, maybe routed through the CPU.
-
The Thunderbolt 3 standard has been released and will likely be integrated fully into the USB 4.0 spec. It will presumably be merged with PCIe 4.0.
The key difference will be how it's linked. As you say, current laptops are all linked via the PCH, which is an Intel limitation (Intel mobile CPUs only allow direct PCIe connections for graphics devices). Some desktop boards have add-in cards which allow Thunderbolt 3 to be connected directly to the CPU, and the bottlenecking is significantly reduced. However, TB3 for eGPU applications is entirely pointless on desktop hardware.
Ryzen-based mobile CPUs could theoretically be connected in that manner, as they have an x8 link for the GPU and an x4 for NVMe. You could take the NVMe lanes and connect them to a Titan Ridge controller.
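For context, rough usable bandwidth of the link options being discussed (PCIe 3.0 works out to about 0.985 GB/s per lane after 128b/130b encoding; the TB3 figure is the commonly quoted practical limit of its PCIe tunnel, so treat it as an approximation):
```python
# Ballpark comparison of eGPU/dGPU link bandwidth. The TB3 PCIe-tunnel figure
# (~22 Gbps) is a commonly quoted practical limit, not an exact spec value.

PER_LANE_GBS = 0.985  # PCIe 3.0 usable GB/s per lane

links = {
    "TB3 eGPU (PCIe tunnel)":           22 / 8,
    "PCIe 3.0 x4 (PCH / NVMe lanes)":   4 * PER_LANE_GBS,
    "PCIe 3.0 x8 (mobile dGPU link)":   8 * PER_LANE_GBS,
    "PCIe 3.0 x16 (desktop GPU slot)":  16 * PER_LANE_GBS,
}

for name, gbs in links.items():
    print(f"{name:<33} ~{gbs:4.1f} GB/s")
```
-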
Mastermind5200 Notebook Virtuoso
I doubt any vendor would implement a Ryzen CPU and an Intel TB controller, unless, like you said, it's somehow integrated into PCIe 4/USB 4.
-
This has always been the case since Optimus was first introduced. Optimus was always rubbish for a workstation or gaming laptop.
Don't buy BGA-equipped workstation/gaming laptops, simple. -
Even gangsters like me find this ludicrous
-
The best compromise is a MUX chip as used in ASUS and Clevo units. Having the choice is always better.
The fact is the Nvidia GPU's idle power consumption is still far too high to be competitive, and there is a real market for business/gaming hybrid laptops. I'd certainly rather use an Optimus laptop and deal with some performance deficit than be stuck with the Intel iGPU only.
AFAIK, part of the release of the Thunderbolt 3 standard to the USB-IF means that third parties can begin manufacturing their own Thunderbolt 3 controllers.
The main thing stopping AMD+TB3 integration was Intel requiring any device carrying Thunderbolt 3 to be certified through Intel. Theoretically that restriction has also been lifted, so in the short term we could see Ryzen + Titan Ridge pairings until said third parties come to the table with their own implementations. -
It makes next to no difference whether it's direct on the CPU or not, unless you're for some reason hammering the DMI bandwidth while using the eGPU.
-
Almost everything important is funneled through the DMI link, so it wouldn't be all that hard to come up with ways to create issues.
Keep in mind that you don't necessarily have to saturate the full x4 DMI link to cause an impact. You would only need to create latency conditions. Off the top of my head, texture streaming could do this.
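For reference, the rough numbers involved (my own ballpark figures: DMI 3.0 is essentially a PCIe 3.0 x4 link, and the peak rates below are typical values, not measurements from any particular setup):
```python
# Ballpark look at how quickly a DMI 3.0 link (~ PCIe 3.0 x4) can get crowded
# when an eGPU and other PCH devices burst at the same time. Peak figures are
# typical values used for illustration only.

dmi3_gbs = 4 * 0.985                 # ~3.9 GB/s total DMI 3.0 budget

consumers_gbs = {
    "TB3 eGPU traffic (peak)": 2.75,
    "NVMe SSD read burst":     3.0,
    "Gigabit LAN + USB misc.": 0.2,
}

total = sum(consumers_gbs.values())
print(f"DMI 3.0 budget: ~{dmi3_gbs:.1f} GB/s, combined peaks: ~{total:.1f} GB/s")
# Even short overlapping bursts exceed the link, which is enough to add latency
# without average utilization ever looking saturated.
```
-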
Thankfully it's not that simple; texture streaming in fact doesn't cause issues, at least in the latest games. I've had no issues. Yes, almost everything is funneled through it, but c'mon, what uses 40Gb/s, or 3200MB a second, besides an SSD or a TB3 GPU?
-
thegreatsquare Notebook Deity
My MSI GT72 w/ 980M doesn't have Optimus, and whatever extra that cost me was worth it. I'm sure I'll be making a request thread for the short list of gaming laptops with manual iGPU/dGPU switching approximately two years from today. -
Thanks to yrekabakery for bringing this issue to our attention.
I Googled for more info.
In Windows, go to "Display Settings" or "Display Properties".
It will show which display is connected to which graphics processor.
-
Note that on Linux (using an Optimus setup) you can turn the dGPU off entirely with bbswitch, so it consumes 0W (instead of about 8W at idle) and idle temps drop as well (about -10°C on my Quadro P600). I do not think this is possible on Windows.
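A minimal sketch of how that toggle works, using the proc interface bbswitch exposes (requires root, and the nvidia kernel module has to be unloaded before the card can actually be powered off):
```python
# Toggle the dGPU through bbswitch's /proc/acpi/bbswitch interface.
# Run as root; unload the nvidia driver first or the power-off is refused.

BBSWITCH = "/proc/acpi/bbswitch"

def dgpu_state() -> str:
    """Returns something like '0000:01:00.0 OFF'."""
    with open(BBSWITCH) as f:
        return f.read().strip()

def set_dgpu(on: bool) -> None:
    with open(BBSWITCH, "w") as f:
        f.write("ON" if on else "OFF")

if __name__ == "__main__":
    print("Before:", dgpu_state())
    set_dgpu(False)  # cut power to the card entirely (~0W)
    print("After: ", dgpu_state())
```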
-
yrekabakery Notebook Virtuoso
In an Optimus setup the dGPU definitely does not consume 8W at idle. That’s what a power-hungry GPU like the GTX 1080 consumes without Optimus. -
I did more tests and you might be right: running Xorg with the proprietary nvidia driver, I notice about +3W at idle vs. using the iGPU with the dGPU turned off via bbswitch.
However, running Xorg on the iGPU without turning the dGPU off, I observe +8W. And whenever the dGPU is on (idle or not), the increase in temperature is significant. -
Windows shuts off the Nvidia GPU automatically based on the driver rules. That being said, many users unknowingly use sub-optimal settings (from a power-saving perspective) which result in the Nvidia GPU being enabled and used for basic apps like browsers. E.g., lots of people immediately change the "Preferred Graphics Processor" from Auto to High Performance, which forces any GPU-accelerated app, no matter how mundane, to fire up the Nvidia GPU.
Furthermore, bbswitch cannot work around situations where multiple monitors are in use. Because the external display ports are almost always connected directly to the Nvidia GPU (in the case of Clevo), this will cause the Nvidia GPU to activate regardless.
The Nvidia proprietary driver ships with the nvidia-prime tool to switch the entire display stack between the iGPU and the dGPU. By default it is in dGPU mode, so you will see increased idle power.
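For what it's worth, a small sketch of querying and switching that profile with prime-select (assuming an Ubuntu-style nvidia-prime setup; switching needs root and usually a logout or reboot to take effect):
```python
# Query and switch the PRIME profile via Ubuntu's prime-select tool.
# Assumes the nvidia-prime package is installed; adjust for other distros.
import subprocess

def prime_query() -> str:
    """Returns the active profile, e.g. 'intel' or 'nvidia'."""
    out = subprocess.run(["prime-select", "query"],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

def prime_set(profile: str) -> None:
    """profile: 'intel' (iGPU only) or 'nvidia' (dGPU)."""
    subprocess.run(["sudo", "prime-select", profile], check=True)

if __name__ == "__main__":
    print("Current PRIME profile:", prime_query())
```
-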
I'm not certain the Windows driver can entirely switch off the dGPU so it uses 0W (which is what bbswitch does). In my experience, with the dGPU idle, total wattage is always higher on Windows than on Linux with the dGPU switched off via bbswitch. On that Linux setup (iGPU only, dGPU off) it is as low as 2-3W total laptop power consumption at idle (everything else being quite optimized in that regard).
Yes, that's also the case on my P72: external ports are wired to the dGPU, and that's the only reason I use the dGPU. If Lenovo had offered a config without a dGPU, I would have taken it.
In my observations, the dGPU draws 8W at idle unless either the Xorg "nvidia" driver is also loaded (in which case it drops to about 3W) or it is switched off with bbswitch (0W).
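If you want to check those idle figures yourself, nvidia-smi can report the card's power draw while the driver is loaded (a quick sketch; once the card is powered off via bbswitch there is of course nothing left to query):
```python
# Read the dGPU's reported power draw and temperature via nvidia-smi.
# Only works while the NVIDIA driver is loaded and the card is powered on;
# some mobile GPUs report "[N/A]" for power.draw.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw,temperature.gpu",
     "--format=csv,noheader"],
    capture_output=True, text=True)

print(result.stdout.strip() or
      "No NVIDIA GPU visible (driver unloaded or card powered off)")
```
-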
This looks off; in my old AW15 R3 with the GTX 1070 the differences weren't nearly as big.
-
yrekabakery Notebook Virtuoso
Did that unit even have Optimus? -
If you had the non-G-Sync version, then yes, with a switch to disable it.
-
yrekabakery Notebook Virtuoso
I didn’t see as much of an Optimus performance hit with Maxwell either, so I guess Turing is impacted the most. -
Might just be a driver glitch on his system. Even the 2070 Max-Q in my new laptop with Optimus is giving quite the performance bump over a full-power, overclocked GTX 1070 in my tests in low-CPU-usage games. In high-CPU-usage games the 2070 Max-Q runs away even more, but that's because it also has a 6-core CPU vs. a quad-core.
-
yrekabakery Notebook Virtuoso
He's noted this regression on several different Optimus systems, as have members of this forum and Notebookcheck. Have you compared the most egregious examples like CS:GO and DOTA 2? -
Only CS:GO, but I am well above the framerates mentioned in his video. I'll check the video to see which driver versions he used.
-
yrekabakery Notebook Virtuoso
Have you directly compared Optimus vs. non-Optimus on the same system in CS:GO? -
No, because my current system (and most others) doesn't have a MUX switch available. But the performance is way higher than what Jarrod portrays. Also, there is no reason why Turing would suddenly be affected by Optimus while Pascal and Maxwell weren't.
-
yrekabakery Notebook Virtuoso
Did you use the ulletical benchmark in CS:GO? Also, even without a MUX you can test Optimus vs. non-Optimus by using an external monitor. Hardware architecture differences could explain the higher Optimus overhead with Turing; however, I've seen similarly large performance drops in CS:GO due to Optimus reported on Pascal systems as well. -
Not all laptops route the external monitor the same way. Not sure how it is routed on my current Razer Blade, to be honest. I do know that the Intel HD drivers can have huge issues with certain driver combinations, resulting in microstutters. It's especially prevalent when using an external monitor and switching between duplicate, extended, and second-screen-only modes, and you can only fix it by disabling the Intel HD driver and re-enabling it. I wouldn't rule out that Jarrod was running into a similar issue.
I did run the benchmark, yes. -
yrekabakery Notebook Virtuoso
Very easy to tell. If the external monitor shows up in Nvidia control panel, then it’s routed to the dGPU. -
Not always true. Even though my monitor shows up, G-Sync is working, etc., it is still affected by the Intel drivers. When I disable those, the screen is automatically mirrored, and the stutter bug also appears when the Intel drivers are loaded, even on the external screen. In short, it still affects the system quite a bit. On my Alienware this was not the case: when an external monitor was in use, the iGPU drivers wouldn't even show up.
-
yrekabakery Notebook Virtuoso
The iGPU still drives the internal display if it is enabled. To get rid of the stuttering, set the display output to the external monitor only so that the internal display is disabled. -
Yeah, that is what I mean. I use secondary-only, but the Intel drivers still affect it.
Of the three recent laptops I've owned (Triton 500, Alienware 15 R3, and RB15), only the AW15 R3 allowed me to use a DisplayPort output fully outside of the iGPU. -
This happens on any system with two GPUs running simultaneously. It happens on desktops as well, and it's Windows which handles that behavior.
The stutter is often caused by "reverse Optimus", which is where something on the NV GPU's output is being accelerated by the iGPU and copied over, which is a MUCH slower process. -
Bought a Lenovo Y540 with a GTX 1650 last week. The laptop has a BIOS option to disable the iGPU. There is a huge difference in CS:GO: I'm now getting around 250-300 FPS instead of 100-180.
-
Sounds like it has a MUX chip in it. An easy way to tell is whether the Display/Panel options move from the Intel control panel to the Nvidia control panel.
It's odd though, as having a MUX chip would usually allow them to offer G-Sync support as well.
On Optimus machines it's impossible to take the Intel GPU offline. -
Hmmm, I wonder if I should give this a test as well to see whether I get similar results.
On my M18x R2 using a 1060 in Optimus mode, the hit was like 5-7%. -
yrekabakery Notebook Virtuoso
I believe the G-Sync option is reserved for the more premium Y740, although it's nice to hear the lower-end model still allows you to disable Optimus.