The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
 Next page →

    Nvidia Optimus reduces gaming performance

    Discussion in 'Gaming (Software and Graphics Cards)' started by yrekabakery, May 8, 2019.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
  2. pipyakas

    pipyakas Notebook Guru

    Reputations:
    14
    Messages:
    56
    Likes Received:
    40
    Trophy Points:
    26
    Well, let's take it as the trade-off for not running the dGPU when it's not needed? Which is the entire point of Optimus.
    It just sucks that most OEMs won't include the option to turn it off, probably due to the difficulty of designing another path from the dGPU to the display.
     
    Zymphad and Vasudev like this.
  3. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Yeah, but that difference is startling at times. 256 FPS vs. 138 FPS in CS:GO is insane, and maybe for someone who plays it competitively that could be a real dealbreaker. With Optimus, the performance can't even make full use of the 144Hz panel.

    And this seems to be a recent development too. Optimus vs. non-Optimus didn't have this kind of regression prior to Turing laptops.
     
  4. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    It might just be something about the implementation on this laptop.
     
  5. Ragib Zaman

    Ragib Zaman Notebook Enthusiast

    Reputations:
    0
    Messages:
    46
    Likes Received:
    6
    Trophy Points:
    16

    As explained in the YouTube video, Optimus uses the dGPU for rendering the graphics, but then still passes the frames through the integrated graphics for output to the monitor, introducing an overhead on the CPU. The more a game's FPS is CPU dependent, the larger the FPS difference between Optimus and G-Sync (dGPU-direct) mode will be. On one end is CS:GO, notorious for being CPU bottlenecked, so it sees a large gain from being released from that overhead. On the other end, games which are totally GPU bottlenecked and minimally CPU dependent will experience minimal FPS loss from the Optimus overhead. Most games will typically be somewhere in the middle, with between a 7% and 20% FPS gain.
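    A rough back-of-the-envelope sketch of that idea (mine, not from the video), treating the Optimus copy as a fixed extra per-frame cost on the CPU/display path; the millisecond figures are purely illustrative:

    def fps(cpu_ms, gpu_ms, optimus_overhead_ms=0.0):
        # Frame time is limited by whichever side is slower; the Optimus
        # frame copy is modeled as extra per-frame work on the CPU side.
        frame_ms = max(cpu_ms + optimus_overhead_ms, gpu_ms)
        return 1000.0 / frame_ms

    for name, cpu_ms, gpu_ms in [
        ("CPU-bound (CS:GO-like)", 3.9, 2.0),    # ~256 FPS when uncapped
        ("GPU-bound (heavy AAA)",  5.0, 16.7),   # ~60 FPS, GPU is the limit
    ]:
        native = fps(cpu_ms, gpu_ms)
        optimus = fps(cpu_ms, gpu_ms, optimus_overhead_ms=1.5)
        print(f"{name}: {native:.0f} FPS direct vs {optimus:.0f} FPS via Optimus")

    With these toy numbers the CPU-bound case drops from ~256 to ~185 FPS, while the GPU-bound case stays at ~60 FPS, matching the pattern described above.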
     
    slimmolG likes this.
  6. cucubits

    cucubits Notebook Deity

    Reputations:
    207
    Messages:
    761
    Likes Received:
    655
    Trophy Points:
    106
    I'm not bothered by Optimus. Asus's bundled software lets you activate discrete graphics only and that's that, no more Optimus. I just had to do this once and it now runs in G-Sync mode all the time.
     
  7. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Perhaps, although it should be noted that the video creator tested both the Asus GX701 and Dell G5 and observed the same issue.

    I’m not sure if that’s the reason. AC Odyssey and Watch Dogs 2, games which are more CPU-intensive (he showed how WD2 used 100% of the i7-8750H with or without Optimus) saw less of a performance drop than DOTA 2 and CS:GO, which do not fully utilize the CPU. Ghost Recon: Wildlands should be completely GPU-bound at those settings on that setup, yet still saw a performance drop.
     
  8. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    I personally don't see why vendors don't go for switchable dGPU only/iGPU only; it's the best solution IMO.
     
    c69k and Prototime like this.
  9. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    iGPU only means less possibility to Clean out Your Wallet. Add in more features then charge you more :D
     
    Last edited: May 13, 2019
    Dennismungai and TBoneSan like this.
  10. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    iGPU only is a feature I can see many people wanting, heavily increased battery life for when you don't need your dGPU active.
     
    Prototime likes this.
  11. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Which in the end will lower the manufacturers', OEMs', and resellers' earnings. Not in their favor. Why should they bother with what consumers want? :D
     
  12. Temp1234453

    Temp1234453 Notebook Consultant

    Reputations:
    15
    Messages:
    214
    Likes Received:
    33
    Trophy Points:
    41
    Does this happen with Turing only?
     
  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Optimus had a small FPS impact in previous GPU generations as well, but to nowhere near this degree.
     
  14. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    I actually commented on the YT video explaining the performance differentials, but I'll chuck it in here:

    TL;DR: Optimus has two different bottlenecks: one is CPU utilization (CPU-capped games) and the other is PCIe bus utilization (high framerates).

    If you happen to trigger both of these conditions you'll see a massive difference, as in CS:GO. If you trigger neither of them, performance will be almost the same, as in Witcher 3.
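    A quick sanity check on the second bottleneck (my own numbers, not Stooj's), assuming an uncompressed 32-bit framebuffer copied across the bus once per frame:

    def copy_bandwidth_gb_s(width, height, fps, bytes_per_pixel=4):
        # Bytes per frame times frames per second, reported in GB/s.
        return width * height * bytes_per_pixel * fps / 1e9

    for label, w, h, rate in [
        ("1080p @ 144 FPS", 1920, 1080, 144),
        ("1080p @ 256 FPS", 1920, 1080, 256),
        ("1440p @ 165 FPS", 2560, 1440, 165),
    ]:
        print(f"{label}: ~{copy_bandwidth_gb_s(w, h, rate):.1f} GB/s of copy traffic")

    # For scale, PCIe 3.0 x8 offers roughly 8 GB/s each way, so a 250+ FPS
    # copy stream is a noticeable slice of the link on top of normal draw
    # and texture traffic.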
     
    Last edited: May 14, 2019
  15. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    What's worse is that eGPUs have an x4/x2 PCH link, not a direct CPU link. I'm hoping a manufacturer comes up with an x16 link like the GS30, or that Thunderbolt 4 is an x8 PCIe 3.0 or x4 PCIe 4.0 connection and, maybe depending on the vendor, routed through the CPU.
     
    hfm likes this.
  16. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    The Thunderbolt 3 standard has been released and will likely be integrated fully into the USB4.0 spec. It will presumably be merged with PCIE 4.0.

    The key difference will be how it's linked. As you say, current laptops are all linked via PCH which is an Intel limitation (Intel mobile CPUs only allow direct PCIE connections for graphics devices). Some desktop boards have add-in cards which allow Thunderbolt 3 to be connected directly to the CPU and the bottlenecking is significantly reduced. However, TB3 for eGPU applications is entirely pointless on desktop hardware.

    Ryzen-based mobile CPUs could theoretically be connected in that manner, as they have an x8 link for the GPU and an x4 link for NVMe. You could take the NVMe lanes and connect them to a Titan Ridge controller.
     
  17. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    I doubt any vendor would implement a Ryzen CPU and an Intel TB controller, unless, like you said, it's somehow integrated into PCIe 4/USB 4.
     
  18. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    This has always been the case since Optimus was first introduced. Optimus was always rubbish for a workstation or gaming laptop.

    Don't buy BGA equipped workstation/gaming laptops, simple.
     
  19. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Even gangsters like me find this ludicrous
     
  20. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    The best compromise is a MUX chip as used in ASUS and Clevo units. Having the choice is always better.

    The fact is the Nvidia GPU's idle power consumption is still far too high to be competitive, and there is a real market for business/gaming hybrid laptops. I'd certainly rather use an Optimus laptop and deal with some performance deficit than be stuck with the Intel iGPU only.

    AFAIK, part of releasing the Thunderbolt 3 standard to the USB-IF is that third parties could begin manufacturing their own Thunderbolt 3 controllers.
    The main thing stopping AMD+TB3 integration was Intel requiring any device carrying Thunderbolt 3 to be certified through Intel. Theoretically that restriction has also been lifted, so in the short term we could see Ryzen + Titan Ridge pairings until said third parties come to the table with their own implementations.
     
  21. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    It makes next to no difference whether it's direct to the CPU or not, unless you're for some reason hammering the DMI bandwidth while using the eGPU.
     
  22. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Almost everything important is funneled through the DMI link, so it wouldn't be all that hard to come up with scenarios that create issues.

    Keep in mind that you don't necessarily have to saturate the full x4 DMI link to cause an impact. You only need to create latency conditions. Off the top of my head, texture streaming could do this.
     
  23. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Thankfully it's not that simple; texture streaming in fact doesn't cause issues, at least in the latest games. I've had no issues. Yes, almost everything is funneled, but c'mon, what uses 40Gb/s, or 3200MB a second, besides an SSD or a TB3 GPU?
     
  24. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    My MSI GT72 w/ 980M doesn't have Optimus, and whatever extra that cost me was worth it. I'm sure I'll be making a request thread for the short list of gaming laptops with manual iGPU/dGPU switching in approximately two years from today.
     
  25. heretofore

    heretofore Notebook Consultant

    Reputations:
    10
    Messages:
    109
    Likes Received:
    37
    Trophy Points:
    41
    Thanks to yrekabakery for bringing this issue to our attention.
    I searched Google for more info.

    In Windows, go to "Display Settings" or "Display Properties".
    It will show which display is connected to which graphics processor.

    windows_display_settings.jpg
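    As a hedged companion to the GUI steps above (my own sketch, not from the post), you can also list the GPUs Windows currently exposes; this only confirms which adapters are active, while the Display Settings page shows the per-display routing:

    import subprocess

    # Query WMI via the built-in wmic tool for the installed display adapters.
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout)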
     
    intruder16 and yrekabakery like this.
  26. bobbie424242

    bobbie424242 Notebook Geek

    Reputations:
    2
    Messages:
    79
    Likes Received:
    39
    Trophy Points:
    26
    Note that on Linux (using an Optimus setup) you can turn off the dGPU entirely with bbswitch, so it consumes 0W (instead of about 8W at idle) and reduces idle temps as well (about -10 degrees Celsius on my Quadro P600). I do not think this is possible on Windows.
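    For reference, a minimal sketch of how bbswitch is driven (assumes the bbswitch kernel module is loaded and the script runs as root; the proc path is bbswitch's standard interface):

    BBSWITCH = "/proc/acpi/bbswitch"

    def dgpu_state():
        # Reading the proc file reports the dGPU's PCI address and power
        # state, e.g. "0000:01:00.0 OFF".
        with open(BBSWITCH) as f:
            return f.read().strip()

    def set_dgpu(on):
        # Writing "ON"/"OFF" powers the dGPU up or down completely.
        with open(BBSWITCH, "w") as f:
            f.write("ON" if on else "OFF")

    print("Before:", dgpu_state())
    set_dgpu(False)   # cut power to the dGPU (~0W, as described above)
    print("After: ", dgpu_state())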
     
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    In an Optimus setup the dGPU definitely does not consume 8W at idle. That’s what a power-hungry GPU like the GTX 1080 consumes without Optimus.
     
  28. bobbie424242

    bobbie424242 Notebook Geek

    Reputations:
    2
    Messages:
    79
    Likes Received:
    39
    Trophy Points:
    26
    I did more tests and you might be right: running Xorg using the proprietary NVIDIA driver, I notice about +3W consumption at idle vs. using the iGPU with the dGPU turned off via bbswitch.
    However, if running Xorg on the iGPU without turning the dGPU off, I observe +8W. And whenever the dGPU is on (idle or not), the increase in temperature is significant.
     
    yrekabakery likes this.
  29. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Windows shuts off the Nvidia GPU automatically based on the driver rules. That being said, many users unknowingly use sub-optimal settings (from a power-saving perspective) which result in the Nvidia GPU being enabled and used for basic apps like browsers. E.g., lots of people immediately change the "Preferred Graphics Processor" from Auto -> High Performance, which forces any GPU-accelerated app, no matter how mundane, to fire up the Nvidia GPU.

    Furthermore, bbswitch cannot work around situations where multiple monitors are in use. Because the external display ports are almost always connected directly to the Nvidia GPU (in the case of Clevo), this will cause the Nvidia GPU to activate regardless.

    The Nvidia proprietary driver ships with the nvidia-prime app to switch the entire desktop session to the iGPU or dGPU. By default it is in dGPU mode, so you will see increased idle power.
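    On Ubuntu-style installs the nvidia-prime package exposes a prime-select helper for exactly that session-wide switch; a small hedged sketch (command names assume that package, other distros differ):

    import subprocess

    # Show which GPU the session is currently set to render on.
    mode = subprocess.run(["prime-select", "query"],
                          capture_output=True, text=True).stdout.strip()
    print("Current PRIME mode:", mode)

    # Switching requires root and takes effect on the next login/X restart:
    #   sudo prime-select intel    # render the desktop on the iGPU
    #   sudo prime-select nvidia   # render the desktop on the dGPU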
     
  30. bobbie424242

    bobbie424242 Notebook Geek

    Reputations:
    2
    Messages:
    79
    Likes Received:
    39
    Trophy Points:
    26
    I'm not certain the Windows driver can entirely switch off the dGPU so that it uses 0W (which is what bbswitch does). In my experience, with the dGPU idle, total wattage is always higher on Windows than on Linux with the dGPU switched off with bbswitch. On that Linux setup (iGPU only, dGPU off) it is as low as 2-3W total laptop power consumption at idle (everything else being quite optimized in that regard).

    Yes, that's also the case on my P72: external ports are wired to the dGPU, and that's the only reason I use the dGPU. If Lenovo had offered a config without a dGPU, I would have taken it.

    In my observations, the dGPU takes 8W idle power unless either the Xorg "nvidia" driver is also loaded (in which case it is reduced to about 3W) or it is switched off with bbswitch (0W).
     
  31. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    This looks off; in my old AW15 R3 with the GTX 1070 the differences weren't even nearly as big.
     
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Did that unit even have Optimus?
     
  33. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    If you had the non-G-Sync version then yes, with a switch to disable it.
     
  34. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I didn’t see as much of an Optimus performance hit with Maxwell either, so I guess Turing is impacted the most.
     
    Vasudev likes this.
  35. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Might just be a driver glitch on his system. Even the 2070 Max-Q on my new laptop with Optimus is giving quite the performance bump over a full-power, overclocked GTX 1070 in my tests in low-CPU-usage games. In high-CPU-usage games the 2070 Max-Q runs away even more, but that is because it also has a 6-core CPU vs. a quad-core.
     
  36. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    He's noted this regression on several different Optimus systems, as have members of this forum and Notebookcheck. Have you compared the most egregious examples like CS:GO and DOTA 2?
     
  37. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Only CS:GO, but I am well above the framerates mentioned in his video. I'll check the video to see which driver versions he used.
     
  38. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Have you directly compared Optimus vs. non-Optimus on the same system in CS:GO?
     
  39. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    No, because my current system (and most others) doesn't have a MUX switch available. But the performance is way higher than what Jarrod portrays. Also, there is no reason why Turing would suddenly be affected by Optimus while Pascal and Maxwell weren't.
     
  40. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Did you use the ulletical benchmark in CS:GO? Also, even without a MUX you can test Optimus vs. non-Optimus by using an external monitor. Hardware architecture differences could explain the higher Optimus overhead with Turing; however, I've seen similarly large performance drops in CS:GO due to Optimus reported on Pascal systems as well.
     
  41. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Not all laptops route the external monitor the same way. Not sure how it is routed in my current Razer Blade, to be honest. I do know that Intel HD drivers can have huge issues with certain driver combinations, resulting in microstutters. It's especially prevalent when using an external monitor and switching between duplicate, extended, and second-screen-only modes, which you can only fix by disabling the Intel HD driver and re-enabling it. I wouldn't rule out that Jarrod was running into a similar issue.

    I did run the benchmark, yes.
     
  42. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Very easy to tell. If the external monitor shows up in Nvidia control panel, then it’s routed to the dGPU.
     
  43. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Not always true. Even though my monitor shows up, G-Sync is working, etc., it is still affected by the Intel drivers. When I disable those, the screen is automatically mirrored, and the stutter bug also appears when the Intel drivers are loaded, even on the external screen. In short, it still affects the system quite a bit. On my Alienware this was not the case; when an external monitor was in use, the iGPU drivers wouldn't even show up.
     
  44. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The iGPU still drives the internal display if it is enabled. To get rid of the stuttering, set the display output to the external monitor only so that the internal display is disabled.
     
  45. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Yeah, that is what I mean. I use secondary only, but the Intel drivers still affect it.

    Of the 3 recent laptops that I owned, the Triton 500, Alienware 15 R3, and RB15, only the AW15 R3 allowed me to use a DisplayPort fully outside of the iGPU.
     
  46. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    This happens on any system with 2 GPUs running simultaneously. It happens on desktops as well, and it's Windows which handles that behavior.
    The stutter is often caused by "reverse Optimus", which is where something on the NV GPU's output is being accelerated by the iGPU and copied over, which is a MUCH slower process.
     
  47. Dantei

    Dantei Notebook Consultant

    Reputations:
    12
    Messages:
    271
    Likes Received:
    30
    Trophy Points:
    41
    Bought a Lenovo Y540 with a GeForce GTX 1650 last week. The laptop has a BIOS function to disable the iGPU. There is a huge difference in CS:GO. I'm now getting around 250-300 fps instead of 100-180.
     
    4W4K3 and yrekabakery like this.
  48. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Sounds like it has a MUX chip in it. An easy way to tell is if the Display/Panel options change from appearing in the Intel control panel to the Nvidia control panel.
    It's odd though, as usually having a MUX chip would allow them to offer G-Sync support as well.

    On Optimus machines it's impossible to take the Intel GPU offline.
     
  49. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Hmmmm wonder if I should give this a test as well to see if I get similar results.

    On my M18xR2 using 1060 in Optimus mode, hit was like 5-7%.
     
  50. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I believe the G-Sync option is reserved for the more premium Y740, although it’s nice to hear the lower end model still allows you to disable Optimus.
     
 Next page →