So after getting sick of my 5-year-old laptop, I've been looking for a new GTX 900M laptop to buy.
Trying to get as much info as possible, I've been browsing this forum for a while, along with a few others in Korean.
And I can't help but notice how much the folks in the Korean community loathe the fact that Optimus is unavoidable in low-to-mid priced gaming laptops.
From what I've gathered, besides the occasional uncalled-for use of the integrated graphics in GPU-intensive applications, Optimus forces the dedicated GPU's output to pass through the iGPU unless the hardware has a MUX design to bypass it completely.
I've read posts complaining that this aspect of Optimus causes unnecessary CPU heating, FPS stutter, and delays in everyday gaming (compared to non-Optimus hardware with identical specs), etc.
But because I don't see many of these complaints over here (and because I've never used an Optimus laptop before), I'm curious just how good or bad Optimus really is. Is the difference in performance (compared to the same hardware specs without Optimus) really noticeable?
P.S. I'm not really sure where the proper place to post this is, but considering how relevant the GPU is to gaming, I've decided to put it in the gaming section.
-
Optimus systems always seem to get the short end of the stick. Adaptive V-Sync, refresh rate overclocking, 120Hz, 3D, Hackintosh (?), etc. are all unsupported along with annoyances like games/apps using the wrong GPU. It's inherently gonna be more buggy since it's an additional layer of complexity compared to non-Optimus notebooks or desktops. That's why I'm really glad I have SLI so I don't have to deal with any Optimus limitations and idiosyncrasies. I'd suggest staying away from Optimus if you can, unless you absolutely need non-gaming battery life longer than a few hours.
-
moviemarketing Milk Drinker
The Enduro and Optimus features seem to be more annoying than helpful.
I'd prefer to simply have the integrated graphics run while on battery, or some kind of explicit manual choice to turn the dedicated graphics on or off, rather than an automatic feature that supposedly extends battery life "intelligently". When running demanding applications or games I will always be on AC power, so I wouldn't need battery life; I'd only need the best possible performance. -
Damn, it sounds like Optimus really ain't good at all. If I really needed long hours on battery, I'd just get a tablet. I mean, the city I live in has power outlets practically everywhere haha. And it's not like you'd expect long battery life under heavy load anyway.
Why aren't there any cheap-ish options for non-Optimus laptops with moderate GPUs anyway? Are MUX configurations that hard to design? (I know practically nothing about electrical engineering.. all I know is that multiplexers are a combination of logic gates lol.)
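Just to check my own (probably shaky) understanding, I picture the display mux as nothing more than a 2-to-1 selector that routes either the iGPU's or the dGPU's output to the panel. A toy sketch of that assumption, in Python:

```python
# Toy sketch of what a display MUX conceptually does (my guess, not real
# firmware): one select signal decides which GPU's output reaches the panel.

def display_mux(igpu_frame, dgpu_frame, select_dgpu):
    """Return whichever frame gets routed to the laptop panel."""
    # select_dgpu plays the role of the BIOS/EC switch on MUX-equipped laptops:
    # flip it and the dedicated GPU drives the screen directly, with the
    # integrated GPU cut out of the display path entirely.
    return dgpu_frame if select_dgpu else igpu_frame

# Example: with the switch set, the panel shows what the dedicated GPU rendered.
print(display_mux("frame from Intel iGPU", "frame from GTX 970M", select_dgpu=True))
```

So conceptually it seems simple; whether the actual wiring and firmware work is expensive is exactly the part I don't know.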
In my country at least, most (if not all) non-Optimus laptops are either SLI beasts or Alienwares, both of which run at least $2400+, sadly out of my price range. -
moviemarketing Milk Drinker
Can't you simply disable the Optimus features, or somehow force it to use the dedicated GPU when you are on AC power?
-
Well.. looking back, it's hard to escape the thought that Optimus has been an especially consumer-unfriendly ploy by Intel to claw their way into both the integrated-graphics/embedded-systems market and the mainstream laptop market, neither of which they really had any right to dominate. In the smaller embedded-device market there have been multiple, massively better solutions for many years now, which were either scuttled by Intel lawsuits (anyone remember Nvidia ION, and the design that wasn't paired with 2 kg of Intel motherboard pieces?) or simply marketed very badly. Either way, the result was that Optimus allowed Intel to force themselves into the mini-PC and later the tablet market without offering anything of value. Indeed, it isn't until now, with the latest version of Bay Trail, that Intel is even close to matching what competitors were demonstrating as early as 2005.
In the larger laptop market, it really did take Intel well over 10 years to finally ship one chip with an integrated graphics controller that, during its lifetime of about a year and a half, had a fully functional graphics driver by the time its successor rolled out. And it wasn't until last summer, when Windows completed the final transition to its DWM, that Optimus actually worked properly out of the box. In the same way, the Nvidia and Intel partnership had to go all the way to Maxwell/Haswell before finally managing to create an extra bridge that doesn't break the entire system when one component turns off.
Meanwhile, the adaptive power scheme on mobile Nvidia cards since Kepler would actually let most dedicated Nvidia cards draw less power than any of the Intel iGPs when just displaying the desktop. There's also the entire driver issue with the hardware commands (a situation created by Intel and Nvidia on purpose, of course) that has complicated matters considerably when trying to get this crap to work on Linux. So speaking from the perspective of an irate tech-geek, Optimus is Satan, definitely.
It is, however, necessary to have Satan in your laptop messenger bag in this day and age. So let's all celebrate that Satan at least works with some efficiency nowadays... since last summer and the unofficial-official bridging of the Intel graphics driver and Microsoft's DWM. In the sense that an Optimus system now, in 2014, actually delivers something like the power-saving that was advertised when Optimus was presented back in 2010. And that the slowest Haswell systems are now finally as power-efficient, when you turn the screen off and leave the computer alone, as an AMD APU from four years ago.
It's also time to celebrate that Nvidia graphics cards have become small enough, and Intel processors efficient enough, to fit a decent amount of graphics processing power into a 40W package. Meaning that, for the first time ever, it's actually possible to have some decent portable gaming, on battery, on an Intel/Nvidia setup. That's good.
But seriously, though.. if you want a laptop for gaming that you just carry with you and plug in wherever you're actually going to sit and play, then Optimus has no purpose whatsoever, of course. Arguably, it had no right to exist until very recently. And as explained, if it did not exist, other compact SoC designs similar to Tegra would have taken over the smaller mobile/laptop market long ago. In all probability, without Intel's tour de force in marketing and in the courtroom, your current laptop would look something like this today: as powerful as before, but running an actual operating system, on battery all day, roughly as thick as the keyboard assembly plus a sheet of lithium-polymer battery, and weighing about as much as a 15-inch sheet of transparent plexiglass. Also, it would probably cost a fraction of the price of a current laptop. And you could get one in a black metal finish.
..ok. The black metal finish may be pushing it a little bit. -
I'm still not very knowledgeable about this.. I'd really love it if someone said otherwise and proved me wrong, so I could happily choose an Optimus laptop without worrying about this stuff lol. -
I've only very rarely had trouble with Optimus. Most of the time, if there is a problem, it's just a matter of going to the NVIDIA Control Panel to manually set the game to use the dedicated GPU myself. The game I have had the most trouble with was Sonic Generations, which required a different workaround using .ini files to force the game to utilise the dedicated GPU. Other than that, it has been pretty smooth sailing.
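If you ever want to sanity-check whether a game really landed on the dedicated GPU, one rough approach (assuming nvidia-smi is available and supported by your mobile driver, which isn't a given) is to poll the card while the game runs, e.g. with a small Python wrapper like this:

```python
# Rough sketch (assumes nvidia-smi is on the PATH and works with your notebook
# driver): poll the dedicated GPU while a game is running. If utilisation stays
# near 0%, the game is probably being rendered on the Intel iGPU instead.
import subprocess
import time

def dgpu_usage():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=utilization.gpu,memory.used",
        "--format=csv,noheader",
    ])
    return out.decode().strip()

for _ in range(10):
    print(dgpu_usage())  # e.g. "87 %, 1450 MiB" when the GeForce is doing the work
    time.sleep(2)
```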
-
As far as I can remember, the first graphics cards using Optimus were the 525M and 555M, and they were hit and miss as to whether they auto-switched or not.
The only problem I've had is that a couple of new AAA games would not use my 680, so I had to manually set the game exe in NVIDIA Control Panel.
Otherwise I've not had any other problems. -
I heard that some of the Alienware systems possess a "mux" switch that enables the laptop to bypass the Intel graphics. If other members can highlight which models/revisions have this feature, it would be appreciated.
-
Meaker@Sager Company Representative
The GT72 has a manual switch between the iGP and the dedicated GPU. This is my preferred method for larger machines.
-
All the Alienware 17" and 18" models should have a MUX (Sandy Bridge CPUs and beyond, though), and SLI machines cannot use Optimus. Also to this effect, the overpriced GT72 from MSI has a MUX switch and therefore doesn't use Optimus.
The Clevo SLI models lack Optimus too, but they also lack a MUX switch.
The benefit of Optimus is not only low power, low heat and low noise when running undemanding programs, but also that you have access to the iGPU in a way similar to how desktops do, which you don't get in MUX systems that swap the display instead. But most people wouldn't bother with, or benefit from, the things that use the iGPU, such as QuickSync. Who the heck is using QuickSync while gaming? Unless you're livestreaming using that encoder in OBS, but the compression is so bad that you'd need to tack on way too much bitrate for it to be useful. If you're editing video and want to QuickSync-render x264, that's still non-gaming use as well. No point, other than not needing a restart (which MUX systems do) to swap the display adapter.
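For what it's worth, if someone did want to put the otherwise-idle iGPU to work on an Optimus machine, the textbook case is a Quick Sync encode while the GeForce handles the game. A rough sketch, assuming an ffmpeg build with QSV support on the PATH (the file names are made up):

```python
# Minimal sketch: hand an H.264 encode to the Intel iGPU's Quick Sync block via
# ffmpeg's h264_qsv encoder, leaving the GeForce free for the game. Assumes an
# ffmpeg build compiled with QSV support; file names are purely illustrative.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "gameplay_capture.mkv",  # hypothetical recording to re-encode
    "-c:v", "h264_qsv",            # Intel Quick Sync H.264 encoder
    "-b:v", "6M",                  # bitrate; QSV needs more of it than x264 for similar quality
    "gameplay_qsv.mp4",
], check=True)
```

Whether that's worth the hassle compared to just encoding on the CPU afterwards is another question, as noted above.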
If you want, you can find some SLI or MUX systems if you look decently hard, but you might end up moving away from what you actually want, or spending a lot.
I personally see no reason for Optimus, and can understand why people have problems, because most display options are removed from nVidia Control Panel and placed in the Intel menu. Lots of general users don't know this and thus can't find options they need, such as aspect ratio scaling. -
Optimus is not that bad guys, sheesh. It's stable and works as intended for the most part. If you're looking to use external displays a lot and use more complex or "non-standard" features, then sure, Optimus can hinder it. But in general if you use your laptop with the built-in display primarily it's not an issue. If you want those other features then buy a laptop without Optimus. And honestly any restrictions are due more to how the laptop was designed and Intel's GPU hardware and drivers since it's the pass-thru GPU.
I did like the manual switching on the GT72 though. That was definitely a nice feature. -
I agree. I tinker much less with my current laptop, a not-bleeding-edge Clevo P170EM with a GTX 680M; I barely even notice Optimus is there, and I have no issues with an external screen and extended desktop. Only when I first got it, having a monitoring program like an Afterburner graph open on the desktop would cause a microstutter as it woke up the GPU to poll it for stats. That seems to be fixed now. Also, some old or non-release games (GZDoom) had no profile in the driver and would default to the iGPU - a really minor hassle to add one, for the extra hour-plus of battery life over the previous-gen P150HM with a GTX 580M without Optimus.
Sure, it was bad for its first few years, but not now. Enduro seems to be at the tail end of that phase now, but I have NFI. I don't need to save the few quid by buying AMD and taking on their driver headaches.
TL;DR: people still want to whinge about it more than is justified. Unless the 9xxM series or new features throw up new problems (which I'm sure you'd find on Google), I wouldn't worry about it at all. -
You can disable the iGPU entirely from the BIOS.
-
It also applies to higher-end users; I would HAPPILY give up battery life again with another Clevo SLI model over taking an Optimus machine, because I do a decent bit with NCP (NVIDIA Control Panel) that does not work with Optimus. -
What's everyone's problem with Optimus? It works perfectly fine for me and always has.
-
There's always the rare issue where Optimus just refuses to switch to the dedicated GPU. That hasn't happened often lately as far as I can tell, compared to when it first launched, but it's still a possibility. I've seen the other missing features mentioned on the first page of this thread. Granted, they won't affect everyone, far from it, but there are still those who do care, and they're more than likely to chime in on this thread.
-
..does disabling it have any practical implications at all? Going by how the drivers are set up, we're really just talking about using one memory area for the frontbuffer output over another. And as far as I know, no latency problems turn up because of that..
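The way I picture it (just a mental model, not actual driver code): the GeForce renders into its own VRAM, the finished frame is copied into a surface the Intel display engine owns, and the iGPU scans that out. Roughly:

```python
# Very loose mental model of the Optimus display path (my assumption, not real
# driver logic): the dGPU renders, the frame is copied into a system-memory
# surface owned by the iGPU, and the Intel display engine scans it out.

def optimus_present(dgpu_render, copy_to_igpu_surface, igpu_scanout):
    frame = dgpu_render()                  # GeForce does the actual 3D work
    shared = copy_to_igpu_surface(frame)   # the extra per-frame copy Optimus adds
    igpu_scanout(shared)                   # Intel display engine drives the panel

# With a MUX (or the iGPU disabled in the BIOS) that middle copy disappears and
# the dGPU's own frontbuffer is scanned out directly, which is exactly why people
# ask whether the copy adds any measurable latency.
```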
-
Even with a Kepler K5000M, there is roughly a 3-hour difference in battery life between Optimus enabled and disabled: 5 hours for an M6700 with no switchable graphics, and 8 hours for one with switchable graphics. -
4 Years since Nvidia Optimus Started... How is it?
Discussion in 'Gaming (Software and Graphics Cards)' started by cowteriyaki, Nov 4, 2014.