The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    4 Years since Nvidia Optimus Started... How is it?

    Discussion in 'Gaming (Software and Graphics Cards)' started by cowteriyaki, Nov 4, 2014.

  1. cowteriyaki

    cowteriyaki Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    So after getting sick of my crappy 5-year-old laptop, I've been looking for a new GTX 900M laptop to buy.

    Trying to get as much info as possible I've been browsing this forum for a while and a few other ones in Korean.

    And I can't help but notice how badly the folks in the Korean community loathe the fact that Optimus is unavoidable in low-to-mid priced gaming laptops.

    From what I've gathered, besides the occasional uncalled-for use of integrated graphics in GPU-intensive applications, Optimus forces the dedicated GPU's output to pass through the iGPU unless the hardware has a MUX design to bypass it completely.

    I read posts complaining about how the above aspect of Optimus causes unnecessary CPU heating and creates FPS stutters and delays in everyday gaming (compared to non-Optimus hardware with the same specs and everything), etc...

    But because I don't see many of these complaints over here (and because I've never used any Optimus laptops before), I'm getting curious about just how good/bad Optimus is. Is the difference in performance and whatnot (compared to the same H/W specs without Optimus) really noticeable?

    P.S. I'm not really sure where the proper place to post this is, but considering how much relevance the GPU has to gaming, I've decided to put it in the gaming section.
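
    On the pass-through point: one way to actually see the split on a given laptop is to enumerate the graphics adapters and which displays are attached to each. Below is a minimal sketch using the standard DXGI enumeration API (purely illustrative, not from any of the forums mentioned); on a typical muxless Optimus machine the internal panel usually shows up attached to the Intel adapter, while the NVIDIA GPU reports no outputs of its own.

    Code:
    // Sketch: list GPUs and the displays attached to each one via DXGI.
    // On a muxless Optimus laptop the internal panel typically appears
    // under the Intel adapter even though games render on the NVIDIA GPU.
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            wprintf(L"Adapter %u: %s\n", i, desc.Description);

            IDXGIOutput* output = nullptr;
            for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j) {
                DXGI_OUTPUT_DESC odesc;
                output->GetDesc(&odesc);
                wprintf(L"  Output: %s\n", odesc.DeviceName);
                output->Release();
            }
            adapter->Release();
        }
        factory->Release();
        return 0;
    }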
     
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Optimus systems always seem to get the short end of the stick. Adaptive V-Sync, refresh rate overclocking, 120Hz, 3D, Hackintosh (?), etc. are all unsupported along with annoyances like games/apps using the wrong GPU. It's inherently gonna be more buggy since it's an additional layer of complexity compared to non-Optimus notebooks or desktops. That's why I'm really glad I have SLI so I don't have to deal with any Optimus limitations and idiosyncrasies. I'd suggest staying away from Optimus if you can, unless you absolutely need non-gaming battery life longer than a few hours.
     
    maxheap likes this.
  3. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    The Enduro and Optimus features seem to be more annoying than helpful.

    I'd prefer to simply have the integrated graphics run while on battery, or some kind of specific manual choice to turn on or off the dedicated graphics, rather than an automatic feature that is supposedly "intelligently" extending battery life. When running demanding applications or games, I will always be using AC power, so I wouldn't need battery life, I'd need only the best possible performance.
     
  4. cowteriyaki

    cowteriyaki Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    Damn, it sounds like Optimus really ain't good at all. If I really need long hours of use on battery, I'd just get a tablet. I mean, the city I live in has power outlets practically everywhere haha. And it's not like you'd expect long battery hours under heavy load anyway.

    Why aren't there any cheap-ish options for non-Optimus laptops with a moderate GPU anyway? Are MUX configurations that hard to design? (I practically know nothing about electrical engineering... all I know is that multiplexers are a combination of logic gates lol)

    In my country at least, most (if not all) non-Optimus laptops are either SLI beasts or Alienwares, both of which are at least $2400+, sadly out of my price range.
     
  5. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Can't you simply disable the Optimus features, or somehow force it to use the dedicated GPU when you are on AC power?
     
  6. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    Well.. looking back, it's hard to escape the thought that Optimus has been an especially consumer-unfriendly ploy by Intel to claw themselves into both... the "integrated graphics and embedded systems" market, as well as the major laptop market. Neither of which they really should have any right to dominate, for the reason that in the smaller embedded device market there have been multiple, massively better solutions employed for many years now. These were either scuttled by Intel lawsuits (anyone remember Nvidia ION, and the design that wasn't paired with 2 kg of Intel motherboard pieces?) or simply marketed very badly; either way, the result was that Optimus allowed Intel to force themselves into the mini-PC and then later the tablet market without offering anything of value. Indeed, it's not until now and the latest version of Bay Trail that Intel is actually even close to matching what was demonstrated as early as 2005 by the competitors.

    In the larger laptop market, it really did take Intel well over 10 years to finally have one chip with an integrated graphics controller that - during its lifetime of one and a half years - had a fully functional graphics driver by the time the successor was rolled out. And it wasn't until last summer, when Windows finished the transition to its DWM, that Optimus actually worked properly out of the box. In the same way, the Nvidia and Intel partnership had to go all the way to Maxwell/Haswell before finally managing to create an extra bridge that doesn't break the entire system when one component turns off.

    Meanwhile, the adaptive power scheme on mobile Nvidia cards since Kepler would actually let most dedicated Nvidia cards draw less power than any of the Intel iGPs when just displaying the desktop. There's also the entire driver issue with the hardware commands - a situation created by Intel and Nvidia on purpose, of course - that has complicated matters considerably when trying to get this crap to work on Linux. So, speaking from the perspective of an irate tech-geek, Optimus is Satan, definitely.

    It is, however, necessary to have Satan in your laptop messenger bag in this day and age. So let's all celebrate that at least Satan works with some efficiency nowadays... since last summer and the unofficial-official bridging between the Intel graphics driver and Microsoft's DWM. In the sense that an Optimus system now, in 2014, actually works towards what was advertised in terms of power saving when Optimus was presented back in 2010. And that the slowest Haswell systems are now finally as power-efficient - when you're turning the screen off and leaving the computer alone - as an AMD APU from four years ago.

    It's also time to celebrate that nvidia graphics cards have become small enough, and Intel processors efficient enough, to fit a decent amount of graphics processing power in a 40w package. Meaning that, for the first time ever, it's actually possible to have some decent portable gaming - on battery - on an intel/nvidia setup. That's good.

    But seriously, though.. if you want a laptop for gaming that you just carry with you and plug in wherever you're actually going to sit and play, then Optimus has no purpose whatsoever, of course. Arguably, it had no right to exist until very recently. And as explained, if it did not exist, there would have been other compact SoC designs similar to the Tegra design that would have taken over the smaller mobile/laptop market long ago. In all probability, without Intel's tour de force in marketing and in the courtroom, your current laptop would have been something like this today: as powerful as before, but running on an actual operating system, on battery all day, while being approximately as thick as the keyboard travel plus a sheet of lithium-polymer battery, and weighing about as much as a 15-inch sheet of transparent plexiglass. Also, it would probably cost a fraction of the price of a current laptop. And you could get one in a black metal finish.

    ..ok. The black metal finish may be pushing it a little bit.
     
    tareyza likes this.
  7. cowteriyaki

    cowteriyaki Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    That's where my question arises.. Apparently, from what I've heard, unless the manufacturers design it otherwise (i.e. with a MUX), even the dedicated GPU's output must go through the integrated GPU in order to reach the screen. So even if I exclusively use the dedicated GPU on AC, in a way I'm still depending on the internal GPU to pass it through. And that's the part where I see so many rants about Optimus in Korean communities. I've been told it can't be disabled from the BIOS for this reason.

    I'm still not very knowledgeable about this.. I'd really love it if someone said otherwise and proved me wrong so I could happily choose an Optimus laptop without worrying about this stuff lol.
     
  8. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I've only very rarely had trouble with Optimus. Most of the time, if there is a problem, it's just a matter of going into the NVIDIA Control Panel and manually setting the game to use the dedicated GPU myself. The game I have had the most trouble with was Sonic Generations, which required a different workaround using .ini files to force the game to utilise the dedicated GPU. Other than that, it has been pretty smooth sailing.
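
    For what it's worth, there is also a documented way for games themselves to request the NVIDIA GPU on Optimus systems, which is why well-behaved titles never need the Control Panel or .ini workarounds. A minimal sketch of that export (the exact selection behaviour still depends on the driver version):

    Code:
    #include <windows.h>

    // Exporting this symbol from the application's .exe asks the Optimus
    // driver to prefer the high-performance (NVIDIA) GPU for this process.
    // Documented in NVIDIA's Optimus rendering-policy guidelines.
    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }

    Games that ship without this export (and without a driver profile) are the ones that end up defaulting to the iGPU and needing the manual override.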
     
    moviemarketing likes this.
  9. MrDJ

    MrDJ Notebook Nobel Laureate

    Reputations:
    2,594
    Messages:
    10,832
    Likes Received:
    363
    Trophy Points:
    501
    As far as I can remember, the first graphics cards using Optimus were the 525 and 555, and they were hit and miss as to whether they auto-switched or not.
    The only problem I've had is that a couple of new AAA games would not use my 680, so I had to manually set the game exe in the Nvidia Control Panel.
    Otherwise I've not had any other problems.
     
  10. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    I heard that some of the Alienware systems possess a "MUX" switch that enables the laptop to bypass the Intel graphics. If other members can highlight which models/revisions have this feature, it would be appreciated.
     
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The GT72 has a manual switch between the iGP and the dedicated GPU. This is my preferred method for larger machines.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    All the Alienware 17" and 18" models should have a MUX (Sandy Bridge CPUs and beyond, though), and SLI machines cannot use Optimus. Also to this effect, the overpriced GT72 from MSI has a MUX switch and therefore doesn't use Optimus.

    The Clevo SLI models lack Optimus too, but they also lack a MUX switch.

    The benefit of Optimus is not only low power, low heat, and low noise when using undemanding programs, but also that you have access to the iGPU in a way similar to how desktops do, which you don't get on MUX systems that swap the display. But most people wouldn't bother with, or benefit from, the things that would use the iGPU, such as Quick Sync. Who the heck is using Quick Sync while gaming? Unless you're livestreaming using that encoder in OBS, but the compression is so bad that you would need to tack on way too much bitrate for it to be useful. If you're editing video and want to render H.264 with Quick Sync, that's still non-gaming use as well. There's no point other than not needing a restart (which MUX systems do) to swap the display adapter.
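
    To make the Quick Sync point concrete: whether the Intel encoder is even visible to software depends on the iGPU being exposed to Windows, which Optimus keeps and a MUX switched to dGPU-only may not. A rough sketch (standard Media Foundation calls, purely illustrative) that lists the hardware H.264 encoders the OS can currently see:

    Code:
    #include <windows.h>
    #include <mfapi.h>
    #include <mfidl.h>
    #include <mftransform.h>
    #include <cstdio>
    #pragma comment(lib, "mfplat.lib")
    #pragma comment(lib, "mfuuid.lib")
    #pragma comment(lib, "ole32.lib")

    int main() {
        if (FAILED(MFStartup(MF_VERSION))) return 1;

        // Ask for hardware MFTs that output H.264 video.
        MFT_REGISTER_TYPE_INFO outType = { MFMediaType_Video, MFVideoFormat_H264 };
        IMFActivate** activates = nullptr;
        UINT32 count = 0;

        if (SUCCEEDED(MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER,
                                MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER,
                                nullptr, &outType, &activates, &count))) {
            for (UINT32 i = 0; i < count; ++i) {
                LPWSTR name = nullptr;
                UINT32 len = 0;
                // The Intel Quick Sync encoder only shows up when the iGPU is active.
                if (SUCCEEDED(activates[i]->GetAllocatedString(
                        MFT_FRIENDLY_NAME_Attribute, &name, &len))) {
                    wprintf(L"Hardware H.264 encoder: %s\n", name);
                    CoTaskMemFree(name);
                }
                activates[i]->Release();
            }
            CoTaskMemFree(activates);
        }

        MFShutdown();
        return 0;
    }

    On an Optimus notebook that list normally includes the Intel Quick Sync H.264 encoder; on the same hardware with the iGPU muxed out or disabled, it generally won't.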

    If you want, you can find some SLI or MUX systems if you look decently hard, but you might end up moving away from what you actually want or spending a lot.

    I personally see no reason for Optimus, and can understand why people have problems, because most display options are removed from nVidia Control Panel and placed in the Intel menu. Lots of general users don't know this and thus can't find options they need, such as aspect ratio scaling.
     
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Optimus is not that bad guys, sheesh. It's stable and works as intended for the most part. If you're looking to use external displays a lot and use more complex or "non-standard" features, then sure, Optimus can hinder it. But in general if you use your laptop with the built-in display primarily it's not an issue. If you want those other features then buy a laptop without Optimus. And honestly any restrictions are due more to how the laptop was designed and Intel's GPU hardware and drivers since it's the pass-thru GPU.

    I did like the manual switching on the GT72 though. That was definitely a nice feature.
     
    be77solo likes this.
  14. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    I agree. I tinker much less with my current laptop, a not-bleeding-edge Clevo P170EM with a GTX 680M; I barely even notice Optimus is there, and I have no issue with an external screen and extended desktop. Only when I first got it, when I would have a monitoring program like an Afterburner graph open on the desktop, it would cause a microstutter as it woke up the GPU to poll it for stats. That seems to be fixed now. Also, some old or non-release games (gzdoom) had no profile in the driver and would default to the iGPU - a really minor hassle to add one, in exchange for the extra hour+ of battery life over the previous-gen P150HM with a GTX 580M and no Optimus.

    Sure it was bad for its first few years but not now. Enduro seems to be at the tail end of that phase now but I have NFI. I don't need to save the few quid and buy AMD and their driver headaches.

    TL;DR: people still want to whinge about it more than is justified. Unless 9xxM/new features throw up new problems (which I'm sure you will find on Google), I wouldn't worry about it at all.
     
    HTWingNut and be77solo like this.
  15. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    You can disable the iGPU entirely from the BIOS.

     
  16. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Depends on the laptop. If the dedicated GPU routes through the Intel IGP to the external and internal ports, then no, you can't. If there is a MUX that allows switching, then yes, you can.
     
  17. landsome

    landsome Notebook Evangelist

    Reputations:
    245
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    By the way, non-gaming but game-able notebooks such as Dell's and HP's 17" workstations also allow disabling Optimus.
     
  18. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is because the nVidia control panel options and settings are much more necessary for a workstation.

    It also applies to higher-end users; I would HAPPILY give up battery life again with another Clevo SLI model over taking an Optimus machine, because I do a decent bit with NCP that does not work with Optimus.
     
  19. Seanwhat

    Seanwhat Notebook Evangelist

    Reputations:
    42
    Messages:
    327
    Likes Received:
    41
    Trophy Points:
    41
    What's everyone's problem with optimus? It works perfectly fine for me and always has.
     
    be77solo likes this.
  20. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    The Intel IGP is limited in what it can output, and if you want to run Optimus, you have to pass the video through the IGP. The example that comes right to mind is that the IGP can't drive 10-bit displays like the HP DreamColor panels.

    There is always the rare issue where Optimus just refuses to switch to the dedicated GPU; that hasn't happened often lately compared to when it first launched, as far as I can tell, but it's still a possibility. Other missing features have been mentioned on the first page of this thread. Granted, they won't affect everyone, far from it, but there are still those who do care, and they are more than likely to chime in on this thread.

    As mentioned in the OP, unless the hardware for it is actually there, you'll still have the video output go through the IGP. Notable exceptions: the Alienware 17 and 18, the Dell Precision M4x00 and M6x00, and a few others that were mentioned.
     
  21. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..does disabling it have any practical implications at all? Going by how the drivers are set up - we're really talking about using one memory area for the frontbuffer output over another. And as far as I know, there are no latency problems that turn up because of that..
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Disabling Optimus worsens battery life and blocks access to the iGPU if you needed it for Quick Sync. There is NO other downside.
     
  23. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Pretty much this.

    Even with a Kepler K5000M, there is roughly a 3-hour difference in battery life between Optimus enabled and disabled: 5 hours for an M6700 with no switchable graphics and 8 hours for one with switchable graphics.
     
  24. landsome

    landsome Notebook Evangelist

    Reputations:
    245
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    Which is huge - a 60% increase in battery life, that is. The question is, how much better are the mid-range Kepler cards (K2100M, 750M, etc.) at conserving power? The K5000M can be pretty hungry...