So I went with an Optimus-based gaming laptop instead of a G-Sync one because I didn't want to absolutely destroy my battery life. I had been using adaptive sync on my desktop with success and figured I could just use that. Well, now it turns out adaptive sync is not supported when Optimus is used, and that limitation applies to eGPUs as well as the internal dGPU. Seems like Optimus has as many serious faults as it does perks. :/
-
Optimus has always been cancer, and will stay that way. You can buy non-Optimus (dGPU-only) laptops to alleviate this concern, or take a look at MSI laptops, which have a hard toggle between full iGPU mode and full dGPU mode, or Alienware with an Optimus / full dGPU mode toggle if you'd like -
Spartan@HIDevolution Company Representative
Just my 2 cents
Edit: My bad, it's not the G750JX which had Optimus; I can't remember which ASUS ROG laptop it was. Thanks to @hmscott for correcting me. -
The last system I had with Optimus was the Alienware 17 (2013 model). The first thing I did was switch it to dedicated mode, and when I got a 120Hz panel to replace the 60Hz stock panel, it disabled Optimus by default.
In other words: Optimus is a joke. -
saturnotaku Notebook Nobel Laureate
It's amazing how even 5+ years (a virtual eternity in the tech industry) later NVIDIA still hasn't come within 100 miles of perfecting Optimus, especially given the explosive growth of the portable computer market during that time. My old MacBook Pro had automatic GPU switching. It was only supported within OS X but almost always worked correctly. In the few instances it didn't, someone wrote a program that allowed you to override it and set whatever GPU you wanted to use via the menu bar. Switching didn't work at all in Windows when running Boot Camp, which was actually a major factor when I bought the system, so I wouldn't have to mess around with it like I did with an Optimus-enabled Clevo laptop I previously owned (and ended up returning).
It seems that several manufacturers have finally wised up to Optimus' awfulness and now allow the user to decide what they want, either through keyboard shortcuts or the BIOS, as is the case with my Clevo P670RS. Most folks who buy a gaming notebook are likely to be enthusiasts anyway, so they'll know their way around well enough to tailor their settings how they want. -
It also seems like you wanted your NVIDIA GPU on for stuff the Intel iGPU already excels at, i.e. browsing and video playback. Unless you were playing Doom in a browser, you most likely didn't need to. -
Only the GT73VR and GT83VR from the Pascal generation have the iGPU/dGPU toggle. -
Mobius, OK, so that's a little better then. I'll have this thing hooked to a bigger screen when at home. It is replacing my desktop, I just haven't done the full switch yet.
-
So an hour maybe. Hours and hours - nah.
P.S. Try NVIDIA Inspector and choose adaptive vsync there. Will it work? -
Sorry, yeah... it is about an hour more or less when on casual browsing and movie playback. I tried out the app and fiddled with the vsync options. It did seem to do something, but all for the worse. BTW, I guess it was Fast Sync I was using on the desktop, not adaptive. Trying to force Fast Sync just introduced a constant micro-stutter. So stuck with vsync I am.
Also found out that Chrome reacts badly on an Optimus system while using HDMI out. It's related to its hardware acceleration: I found a strange pause when scrolling pages, and HTML5 video won't show up full screen. It's getting confused by the two GPUs.
Seems there is no impact if I just turn Chrome's hardware acceleration off. -
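For anyone who wants that workaround without clicking through Chrome's settings every time, the same thing can be done from the launch shortcut with a standard Chromium switch (the install path below is just illustrative):

```
REM Shortcut target on Windows; --disable-gpu turns off Chrome's
REM hardware acceleration for that session, sidestepping the
REM Optimus confusion over which GPU does the compositing.
"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --disable-gpu
```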
With iGPU only, the dGPU isn't even powered up, and isn't visible in the Device Manager hardware list. It's pure power saving with iGPU only.
With Optimus on battery, the dGPU can be available, enabled, and powered up. Even if it isn't really being used - with everything pointing to the iGPU for GPU access - the dGPU is still drawing power, so it's not optimal battery savings.
There is no need for Optimus. -
Well there's your problem! I can count on zero fingers the number of laptops I have seen with these switches. It comes down to Optimus, or G-Sync with a dedicated GPU.
And the amount of power the dGPU draws when the iGPU is in use is so insignificant, you can't even access its sensors because THOSE are off. -
I've seen Clevo laptops with a MUX switch that allows toggling between a "dGPU-only mode" and an "MS-Hybrid mode" (the latter of which is basically Optimus). Are there also laptops out there that allow toggling between a "dGPU-only mode" and an "iGPU-only mode"? That would be better than what I've seen.
-
G-Sync isn't required for dGPU mode, and in fact you don't need a G-Sync panel for the dGPU to drive the display.
With AW you have to be careful though: if you don't have a G-Sync panel, AW will automatically give you an Optimus setup.
AW used to be switchable between iGPU/Optimus and dGPU/no-Optimus, but I haven't seen anyone confirm it's changeable in the new 2016 models; it looks like you either get a dGPU with G-Sync and no Optimus, or you get an iGPU with a non-G-Sync panel.
That might be where the confusion about G-Sync = dGPU comes from - when in fact that pairing is new and only on AW's newest laptops.
MSI has two laptops with a MUX switch and no Optimus in any mode:
The GT73VR and GT83VR, where you get the dGPU with or without G-Sync, and the iGPU without G-Sync. The GT83VR's 18.4" panel is ancient and never came as G-Sync.
The GT72S used to come with a MUX switch with no Optimus in any mode, and the dGPU with or without G-Sync and the iGPU without G-Sync, but this year's GT72VR is dGPU-only - no switch.
Other makers have other options; keep looking for an iGPU/dGPU-switchable model, it's awesome.
Optimus has so many drawbacks you haven't even begun to imagine, but I know you know everything and will need to learn it for yourself, and for that I am sorry for your wasted time. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Optimus is really, really, really dumb, full stop. The idea is great: use the on-die iGPU for low-intensity work, switch to the dGPU for more resource-intensive work. But the implementation is absolutely terrible. The best approach would be to let the OS control a multiplexer and power control, where the running application requests a certain amount of GPU power from the OS, and the OS decides which GPU to use.
This is my idea of how Optimus would work: by default, the iGPU would run on the desktop, its framebuffer output being passed to the mux and then on to the internal display/output ports. When the OS sends a request for the dGPU to start, the dGPU is powered on, clocks up if necessary, its framebuffer syncing exactly with that of the iGPU (as the OS sends the same signal to both GPUs, keeping both framebuffer outputs the same). Then, the mux discards any output from the iGPU and the dGPU takes charge, providing entire control over the display. The iGPU may or may not be switched off at this point. I suspect this is how it's done on OS X.
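Purely as a sketch of that handoff (every name below is invented for illustration; no real driver exposes such an API), it might look something like this:

```python
# Illustrative sketch of the proposed mux handoff. Every class and
# method here is hypothetical; no real driver exposes this API.

class Gpu:
    """Stand-in for a GPU with a framebuffer the mux can route."""
    def __init__(self, name):
        self.name = name
        self.powered = False
        self.frame = None       # stand-in for framebuffer contents

    def power_on(self):  self.powered = True
    def power_off(self): self.powered = False
    def render_frame(self, scene):
        self.frame = scene      # "draw" whatever the OS last sent

class GpuMux:
    """OS-controlled multiplexer between iGPU and dGPU outputs."""
    def __init__(self, igpu, dgpu):
        self.igpu, self.dgpu = igpu, dgpu
        igpu.power_on()
        self.active = igpu      # default: iGPU drives the panel

    def switch_to(self, target, scene):
        source = self.active
        target.power_on()
        # The OS sends the same scene to both GPUs until their
        # framebuffers match, so the flip is invisible on screen.
        while target.frame != source.frame:
            target.render_frame(scene)
        self.active = target    # the mux flips here
        source.power_off()      # optional, as noted above

igpu, dgpu = Gpu("iGPU"), Gpu("dGPU")
mux = GpuMux(igpu, dgpu)
igpu.render_frame("desktop")
mux.switch_to(dgpu, "desktop")  # game launches: handoff to dGPU
print(mux.active.name)          # -> dGPU
```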
Let me say this again: Apple has got three things right on their MacBooks. Their display aspect ratio (screw 16:9, 16:10 is the best), the displays themselves (I still think MacBook displays have the best calibration, pixel density, viewing angles, and general quality), and their dGPU switching, which appears to be manufacturer-agnostic, given that Apple alternates between AMD and NVIDIA every couple of years or so. -
At the low end you get differentiation via one mode or the other: dGPU-only, or iGPU with Optimus only.
You can actually have a G-Sync panel with iGPU/dGPU switching, and some switchable iGPU modes have also supported Optimus in the past. In iGPU mode G-Sync is unavailable.
So having a G-Sync panel doesn't necessarily mean no Optimus, even though it does in the models you are seeing. The same goes for the Asus GL502 series.
I didn't say you were confused, I was saying the obfuscation of configuration was confusing - making it difficult to assign clear-cut rules for Optimus / no Optimus vs G-Sync -
I see... yeah, I pretty much didn't even bother looking at any of the models above $2k. I figured it was pretty much just 4K displays and 1TB SSDs making them cost so much.
-
Driver 376.09 is acting up for me again. First abnormally low performance, and now no dGPU support at all. The dGPU icon flashes, and I either get an error message about no usable video modes or everything still runs on the Intel GPU.
The Mac does it by leaving a simple attribute in the software's packaging (which is put in place by the human developer, who is usually informed) to control where generated OpenGL contexts should go, not playing smart and making guesses at runtime, which is why it works reliably. A similar arrangement exists for Bumblebee (unofficial Optimus support on Linux), which only exposes one OpenGL device to the application. The app either runs on the NVIDIA GPU as expected or crashes loudly. You're never left in some middle ground scratching your head. -
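If it helps make the Mac side concrete: the packaging attribute being referred to is, as far as I know, the NSSupportsAutomaticGraphicsSwitching key in an app bundle's Info.plist. A minimal fragment would look like this:

```xml
<!-- Fragment of an app bundle's Info.plist. Declaring this key as
     true tells macOS the app is fine running on the iGPU; without
     it, creating an OpenGL context pulls in the dGPU on dual-GPU
     MacBook Pros. -->
<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>
```

On the Bumblebee side the same explicitness shows up as a command prefix: something launched as `optirun <app>` goes to the NVIDIA GPU, and anything launched normally stays on the iGPU.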
And here I am dealing with that piece of sh*** Optimus on my MSI GT70 laptop... even with some of the newer games it just decides NOT to use my GPU and I have to manually set it to do so.
... That is even after all the unnecessary BIOS, vBIOS, and Windows tweaks.
It has been 4 years and it still manages to kick me, when I'm lying down, more than my girlfriend.
edit: important grammar fixes -
If you can get it to run on the right GPU manually, that's already great. Sometimes manual selection doesn't work.
-
I am doing just fine atm. -
Does the GT70 actually use Optimus? I remember it having a GPU switch on the touch bar portion. -
I have the MSI 1762, GT70 chassis, w/ a 680M, so I am rolling in the deep. -
I'd rather do it manually all the time (in reality, all I would need to do is change the command line on the game's shortcut once) than deal with this horror.
Optimus, why does something so good have to be so bad?