The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Optimus, why does something so good have to be so bad?

    Discussion in 'Gaming (Software and Graphics Cards)' started by jeffmd, Nov 30, 2016.

  1. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    So I went with an Optimus-based gaming laptop instead of a G-Sync one because I didn't want to absolutely destroy my battery life. I had been using adaptive sync on my desktop with success and figured I could just use that. Well, now it turns out adaptive sync is not supported when Optimus is used, and that isn't just limited to dGPUs but eGPUs as well. Seems like Optimus has as many serious faults as it does perks. :/
     
    i_pk_pjers_i and hmscott like this.
  2. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    adaptive sync is not freesync/adaptive refresh rate


    optimus has always been cancer, and will stay that way. you can buy non-optimus (dGPU only) laptops to alleviate this concern, or take a look at MSi laptops which have a hard toggle between full iGPU mode and full dGPU mode, or Alienware with an Optimus / full dGPU mode if you'd like
     
    jaug1337, TBoneSan, TomJGX and 2 others like this.
  3. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,604
    Messages:
    23,562
    Likes Received:
    36,865
    Trophy Points:
    931
    The last time I used Optimus was on an ASUS ROG G750JX, and I swore never to buy a system with Optimus after that. As nice as it is to have better battery life, I had constant headaches trying to force apps like some games or video players to use only the dGPU for better performance. I got tired of manually going into each app in the global settings to ensure it was using the dGPU. Then I thought, heck, why did I even get Optimus if I find myself preferring to run everything as it should be run, which is using the dGPU for better performance / rendering, be it a browser, a video player, or a game.

    Just my 2 cents

    Edit: My bad, it's not the G750JX which had Optimus; I can't remember which ASUS ROG laptop it was. Thanks to @hmscott for correcting me :eek: :rolleyes:
     
    Last edited: Dec 1, 2016
  4. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    the last time i had a system with optimus was the alienware 17 (2013 model). first thing i did was switch it to dedicated mode, and when i got a 120Hz panel to replace the 60Hz stock panel it disabled optimus by default.

    in other words: optimus is a joke.
     
    hmscott and saturnotaku like this.
  5. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,705
    Trophy Points:
    431
    It's amazing how, even 5+ years later (a virtual eternity in the tech industry), NVIDIA still hasn't come within 100 miles of perfecting Optimus, especially given the explosive growth of the portable computer market during that time. My old MacBook Pro had automatic GPU switching. It was only supported within OS X but almost always worked correctly. In the few instances it didn't, someone wrote a program that allowed you to override it and set whatever GPU you wanted to use via the menu bar. Switching didn't work at all in Windows when running Boot Camp, which was actually a major factor when I bought the system, so I wouldn't have to mess around with it like I did with a previous Optimus-enabled Clevo laptop (which I ended up returning).

    It seems that several manufacturers have finally wised up to Optimus' awfulness and now allow the user to decide what they want, either through keyboard shortcuts or the BIOS, as is the case with my Clevo P670RS. Most folks who buy a gaming notebook are likely to be enthusiasts anyway so they'll know their way around to tailor their settings how they want.
     
    Last edited: Dec 1, 2016
    hmscott likes this.
  6. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    Nope.. but it IS better than vsync, and I haven't had ANY issues with visual tearing. Seems great for something that is essentially "free", as in you don't need any additional hardware; it is simply a step up from ancient vsync.

    Oh, my new laptop is an MSI. I will need to look into the full dGPU toggle; I haven't heard any mention of it. I did notice the power button toggles colors to tell you which mode it is in, so there is no question. EDIT: Looks like only GT models that feature Optimus will feature the toggle.

    I've played with Optimus on my ASUS Q550LF for many years now and didn't find the swap between GPU modes to be anywhere near as annoying or problematic. The only game that had issues with it was GTA5 on release, and that was patched within a couple of weeks. I think I would have found losing 50% of my battery life far, far more problematic. I mean, why CAN'T nvidia figure out how to sip power like an iGPU when not gaming?

    It also seems like you wanted your nvidia on for stuff the Intel iGPU already excels at, i.e. browser and video playback. Unless you were playing Doom in a browser, you most likely didn't need to.
     
    Last edited: Dec 1, 2016
    hmscott likes this.
  7. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    Ouch, that was a bad choice. Not sure if adaptive vsync is actually blocked under Optimus, though.
     
    hmscott likes this.
  8. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    I don't know that yet; I think I would still prefer to deal with vsync than have hours of my battery life chopped off. And it seems to be an Optimus thing: when I searched for the issue, I was finding posts by Alienware users who had the same problem, and it even carried over to the Alienware blaster.
     
    hmscott and Prototime like this.
  9. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    The MSi GE/GP/GS series have the HDMI and mDP routed to dGPU only, so you can output from the GPU directly to an external monitor if that's needed.

    Only the GT73VR and GT83VR for Pascal generation have the iGPU/dGPU toggle.
     
    hmscott and sasuke256 like this.
  10. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    Mobius, ok, so that's a little better then. I'll have this thing hooked to a bigger screen when at home. It is replacing my desktop; I just haven't done the full switch yet.
     
    hmscott likes this.
  11. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    I don't know about those "hours of battery life chopped", considering that an Nvidia GPU in its low-performance state draws substantially less power than an iGPU running at high performance, and there are situations where a low-end iGPU has to work flat out while the Nvidia GPU would still sit in its low-performance state without breaking a sweat.
    So an hour, maybe. Hours and hours - nah.
    P.S. Try Nvidia Inspector and choose Adaptive vsync there. Will it work?
     
    hmscott likes this.
  12. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    Sorry, yea... it is about an hour, more or less, of casual browsing and movie playback. I tried out the app and fiddled with the vsync options. It did seem to do something, but all for the worse. BTW, I guess it was Fast Sync I was using on the desktop, not Adaptive. Trying to force Fast Sync just introduced a constant micro stutter. So stuck with vsync I am.

    Also found out that Chrome reacts badly on an Optimus system while using HDMI out. It's related to Chrome's hardware acceleration: I found a strange pause when scrolling pages, and HTML5 video won't show up in full screen. It's getting confused by the 2 GPUs.

    Seems there is no impact if I just turn Chrome's hardware acceleration off.
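    For anyone who wants to script this instead of flipping the setting in Chrome's UI: hardware acceleration can also be turned off at launch with Chromium's documented --disable-gpu switch. A minimal sketch (the chrome_path default is a placeholder, not a guaranteed binary location):

```python
def chrome_command(profile_dir, disable_gpu=True, chrome_path="chrome"):
    """Build a Chrome launch command as a list of arguments.

    --disable-gpu is a documented Chromium switch that turns off GPU
    hardware acceleration, sidestepping the Optimus iGPU/dGPU confusion.
    """
    cmd = [chrome_path, f"--user-data-dir={profile_dir}"]
    if disable_gpu:
        cmd.append("--disable-gpu")
    return cmd

# Dry run: inspect the command instead of launching it;
# to actually launch, pass it to subprocess.Popen(...).
print(chrome_command("/tmp/profile"))
```

    A command built this way only affects the one launch, so it's easy to compare scrolling and HTML5 playback with and without acceleration.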
     
    Last edited: Dec 1, 2016
    hmscott likes this.
  13. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You don't need Optimus to use the iGPU to get better battery life, you can get a laptop with a MUX switch to switch between dGPU (Nvidia) and iGPU (Intel).

    With iGPU only the dGPU isn't even powered up, and isn't visible in the Device Hardware list. It's pure power saving with iGPU only.

    With Optimus on battery, the dGPU can be available, enabled, and powered up even when it's not really being used. Even though everything points to the iGPU for GPU access, the dGPU is still drawing power, so it's not optimal battery savings.

    There is no need for Optimus :)
     
  14. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31

    Well there's your problem! I can count the number of laptops I have seen with these switches on zero fingers. It comes down to Optimus, or G-sync dedicated.

    And the amount of power the dGPU draws when the iGPU is in use is so insignificant, you can't even access its sensors because THOSE are off.
     
    hmscott likes this.
  15. Prototime

    Prototime Notebook Evangelist

    Reputations:
    201
    Messages:
    639
    Likes Received:
    883
    Trophy Points:
    106
    I've seen Clevo laptops with a MUX switch that allows toggling between a "dGPU-only mode" and an "MS-Hybrid mode" (the latter of which is basically Optimus). Are there also laptops out there that allow toggling between a "dGPU-only mode" and an "iGPU-only mode"? That would be better than what I've seen.
     
    jaug1337 and hmscott like this.
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You haven't been looking hard enough :)

    G-sync isn't required for dGPU, and in fact you don't need a G-sync panel for dGPU work.

    With AW you have to be careful, though: if you don't have a G-sync panel, AW will automatically give you an Optimus setup.

    AW used to be switchable between iGPU/Optimus and dGPU/no-Optimus, but I haven't seen anyone confirm it's changeable in the new 2016 models; it looks like you either get dGPU with G-sync and no Optimus, or you get iGPU with a non-G-sync panel.

    That might be where the confusion about G-sync = dGPU comes from - when in fact that's new and only applies to AW's newest laptops.

    MSI has 2 laptops with MUX switch with no Optimus in any mode.

    The GT73VR and GT83VR, and you get dGPU with or without G-sync, and iGPU non-G-sync. The GT83VR 18.4" panel is ancient and never came as G-sync.

    The GT72S used to come with a MUX switch with no Optimus in any mode, and dGPU with or without G-sync and iGPU non-G-sync, but this year the GT72VR is dGPU only - no switch.

    Other makes have other options, keep looking for an iGPU / dGPU switchable, it's awesome.

    Optimus has so many drawbacks you haven't even begun to imagine, but I know you know everything and will need to learn it for yourself, and for that I am sorry for your wasted time. :)
     
  17. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Optimus is really, really, really dumb, full stop. The idea is great: use the on-die iGPU for low-intensity work, and switch to the dGPU for more resource-intensive work. But the implementation is absolutely terrible. The best approach would be to let the OS control a multiplexer and power control, where the running application requests a certain amount of GPU power from the OS, and the OS decides which GPU to use.

    This is my idea of how Optimus should work: by default, the iGPU runs the desktop, its framebuffer output being passed to the mux and then on to the internal display/output ports. When the OS sends a request for the dGPU to start, the dGPU is powered on and clocks up if necessary, its framebuffer syncing exactly with that of the iGPU (as the OS sends the same signal to both GPUs, keeping both framebuffer outputs the same). Then the mux discards any output from the iGPU, and the dGPU takes charge, providing entire control over the display. The iGPU may or may not be switched off at this point. I suspect this is how it's done on OS X.
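    The handoff described above can be sketched as a toy state machine. To be clear, this is purely illustrative: the class and method names are invented for this post, not a real driver API.

```python
class MuxedGraphics:
    """Toy model of the mux-based iGPU/dGPU handoff described above."""

    def __init__(self):
        self.dgpu_powered = False
        self.active_output = "iGPU"  # mux currently passes the iGPU framebuffer

    def request_dgpu(self):
        # 1. Power the dGPU and clock it up.
        self.dgpu_powered = True
        # 2. Sync framebuffers: the OS sends the same scene to both GPUs,
        #    so the mux can flip without a visible glitch.
        framebuffers_in_sync = True
        # 3. Flip the mux: discard iGPU output, let the dGPU drive the panel.
        if framebuffers_in_sync:
            self.active_output = "dGPU"

    def release_dgpu(self):
        # Hand the panel back to the iGPU and power the dGPU down.
        self.active_output = "iGPU"
        self.dgpu_powered = False

mux = MuxedGraphics()
mux.request_dgpu()
print(mux.active_output)  # dGPU
```

    The key design point is that the mux flip only happens after both framebuffers carry the same image, so the user never sees the switch.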

    Let me say this again: Apple has got three things right on their MacBooks. Their display aspect ratio (screw 16:9, 16:10 is the best), the displays themselves (I still think MacBook displays have the best calibration and pixel density, viewing angles and general quality), and their dGPU switching, which appears to be manufacturer-agnostic, given that Apple alternates between AMD and nVidia every couple of years or so.
     
    steberg and hmscott like this.
  18. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    No... not confused. However, the current crop of 1060 laptops doesn't seem to feature any switches, and the only ones that are NOT using Optimus are the ones with G-sync panels, because Optimus can't be used with them.
     
    hmscott likes this.
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    At that price range you usually don't see a MUX switch, you need to get into the top range of models before you see it.

    At the low end you get differentiation via one mode or the other. dGPU only or iGPU with Optimus only.

    You can actually have a G-sync panel with an iGPU / dGPU switchable setup, and some switchable iGPU modes have also supported Optimus in the past. In iGPU mode G-sync is unavailable.

    So having a G-sync panel doesn't necessarily mean no Optimus, even though it does in the models you are seeing. Same goes for Asus GL502 series.

    I didn't say you were confused; I was saying the obfuscation of configurations is confusing - making it difficult to assign clear-cut rules for Optimus / no Optimus vs G-sync :cool:
     
  20. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    I see... yea, I pretty much didn't even bother looking at any of the models above 2k. I figured it was pretty much just 4K displays and 1TB SSDs making them cost so much.
     
    hmscott likes this.
  21. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    376.09 is acting up for me again. First abnormally low performance, and now no dGPU support at all. The dGPU icon flashes, and I either get an error message about no usable video modes or it still runs on the Intel. :mad:

    I think it's the exact opposite: Optimus is trying to play smart (but not smart enough) when it should be dumb. Different GPUs (or even driver versions) have different feature sets. There is no way you can reliably migrate a DX/OGL context from one device to another. And assigning a new context to one device using some undocumented software voodoo isn't working well, as many of us in this thread have found.

    Mac does it by leaving a simple attribute in the software's packaging (put in place by the human developer, who is usually informed) to control where generated OGL contexts should go, not by playing smart and making guesses at runtime, which is why it works reliably. A similar arrangement is made by Bumblebee (unofficial Optimus support on Linux), which only exposes one OGL device to the application. The app either runs on the NV GPU as expected or crashes loudly. You're never left on some middle ground scratching your head.
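    The explicit-assignment idea (the Mac packaging attribute, or Bumblebee's one-exposed-device contract) can be sketched as a routing table that fails loudly instead of guessing. Everything here is a hypothetical illustration, not a real driver interface:

```python
class GPURouter:
    """Toy model of explicit per-app GPU assignment: routing is declared
    ahead of time by the developer/user, never guessed at runtime."""

    def __init__(self, assignments):
        self.assignments = assignments  # app name -> "iGPU" or "dGPU"

    def context_for(self, app):
        try:
            return self.assignments[app]
        except KeyError:
            # Fail loudly instead of silently picking a GPU and leaving
            # the user on some middle ground scratching their head.
            raise LookupError(f"no GPU declared for {app!r}")

router = GPURouter({"doom": "dGPU", "vlc": "iGPU"})
print(router.context_for("doom"))  # dGPU
```

    The contrast with Optimus is that an unknown app produces an immediate, diagnosable error rather than a runtime heuristic that may or may not pick the right device.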
     
    Last edited: Dec 7, 2016
    jaug1337 likes this.
  22. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    And here I am dealing with that piece of sh*** Optimus on my MSI GT70 laptop... even with some of the newer games, it just decides NOT to use my GPU, and I have to manually set it to do so.

    .. That is even after all the unnecessary BIOS, vBIOS and Windows tweaks.


    It has been 4 years and it still manages to kick me when I'm lying down, more than my girlfriend does.

    edit: important grammar fixes
     
  23. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    If you can get it to run on the right GPU manually, that's already great. Sometimes manual selection doesn't work.
     
  24. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    My god. I can't even imagine the horror.

    I am doing just fine atm.
     
  25. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681

    does the gt70 actually use optimus? I remember it having a GPU switch on the touch bar portion.
     
  26. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Correction.

    I have the MSI 1762, GT70 chassis, w/ a 680M, so I am rolling in the deep.
     
  27. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    See my post above. #21

    I'd rather do it manually all the time (in reality, all I would need to do is change the command line on the game's shortcut once) than deal with this horror.