This is a work in progress. Please PM me if you have more information!
TongFang
- MAX 15/17 Covert Line 2020
- Mech G3

CLEVO
- P65x (e.g. P650, P651) systems = 15"
- P67x = 17"
- PA71 = 17"
- PB51RD-G (RTX 2060)
- PB51RF-G (RTX 2070)
- PB70EF-G
- PB71EF-G
- PB50EF-G
- PB51EF-G

LENOVO
- Y540
- Y740
- Y545

ALIENWARE
- Alienware 15R3/R4
- Alienware 17R4/R5
- Alienware 13R3 (except OLED)

DELL
- Dell Precision 7530
- Dell Precision 7730

MSI
- MSI WT75
- GS66

ASUS
- GX501
- ROG Zephyrus S GX502GW
- ROG Zephyrus S GX701

ACER
- Acer Predator Triton 500
-
As far as Clevo-based systems go:
- P65x (e.g. P650, P651) systems = 15"
- P67x = 17"
- PA71 = 17"
- ?
Clevo uses MSHybrid.
-
The MSI Titan I used a few months back had this feature too.
There has to be more than just a handful, right?
It would seem like these would be the go-to laptops for so many people for this feature alone. Likely the main reason they're not is simply that the majority don't realize the feature exists.
So with that said, what is stopping manufacturers from implementing this in all laptops with dedicated GPUs and Intel CPUs? Is it expensive? Are there licensing issues?
Truth be told, my last Clevo was the P650, and I stumbled across this feature in the BIOS (factory-set with MSHybrid off) and had no idea it was an option.
After recently using a machine with Optimus, I realize how great it is to have both: dynamic sync for gaming, and Optimus or MSHybrid for the times when your laptop is being used as a laptop. -
Alienware M17x R3
Alienware M17x R4
Alienware 17
Alienware M18x R1
Alienware M18x R2
Alienware 18
Optimus mode -> BIOS GFX set to SG
Discrete mode -> BIOS GFX set to PEG
Intel only mode -> BIOS GFX set to IGFX
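If you want to sanity-check which adapter Windows actually sees after flipping the setting, here's a quick sketch (assuming a Windows install that still ships the legacy WMIC tool; purely illustrative, nothing Alienware-specific):

Code:
import subprocess

# List every video adapter Windows currently enumerates.
# In IGFX mode only the Intel GPU should appear; in PEG mode only the Nvidia one.
result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True,
)
print(result.stdout)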
None support G-Sync.
-
I can only think of two reasons. Pascal can use Fast Sync, but many of the laptops you mentioned date back to previous GPU generations, which begs the question: why?
The other reason is that Optimus at times has some stuttering and hitching, seemingly at random and driver-dependent.
What else?
Thanks for participating in this random thread, folks. -
Having a switch, on the other hand, doesn't do any damage or cause any issue. At least in theory. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
@ole!!! of course I do. But personal experience is not the point here, is it? The point is, excluding a feature that is now frequently included even in lower-end machines, let alone workstations, definitely won't improve sales.
-
Why are people arguing that MUX chips shouldn't be in a laptop?
Making a gaming laptop able to last a respectable amount of time unplugged is a plus to me, and it entices more people to buy said laptop due to its multipurpose nature.
When done correctly (iGPU, dGPU and MSHybrid/Optimus), a MUXed laptop is far more useful than an unMUXed one.
Thanks to the MUX, I was able to take my P71 as both a DTR and a travel device during the holidays; sadly, I can't say the same for the KM1.
On topic.
The ThinkPad P70/P71 (MXM) has a MUX to switch between Optimus and dGPU-only mode (iGPU turned off completely).
Not sure how it is with the P50/P51 and other models that have discrete graphics built in.
-
Just curious: now that the Clevo P650 series has been retired (no longer manufactured by Clevo, and no longer stocked by many boutique sellers of higher-end laptops), is there a newer replacement model from Clevo that still incorporates a MUX switch?
Also, are there any other newer higher-end laptops in general (as of this writing, since Computex & E3 2018) that will incorporate a MUX switch capability? Any info & input appreciated, thanks all!
yrekabakery Notebook Virtuoso
Also, 8 ms of input lag is not the same as ditching half of the GPU's power... -
-
I never had to do that, but I'd rather have a MUX than forced Optimus every time.
-
And yes, it reduces the maximum performance to half of what the GPU can do.
Because if the presentation delay is 8 ms, anything that needs to be presented to the monitor has to be done within 8 ms instead of 16.
16 ms is the time the GPU has to process each frame in order to present a fluid 60 frames per second.
The math works, and you can see it in practice on Acer, Asus and all Optimus laptops.
It is an abomination meant for power saving, not for a power user who needs that power; you literally pay extra for a GPU you can't fully use. And yes, it still happens: the laptop of mine that had this issue was two years old, and it's especially ugly on midrange GPUs like the 950, 1050, 1060, etc.
MUX switches avoid that issue entirely, because they force the iGPU off. Otherwise, the added delay comes from the mere fact that the data needs to pass through the iGPU before it reaches the display; it is the time it takes the data to go from the dGPU to the iGPU and then to the display, which I measured very carefully: 8 ms.
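Here is the back-of-the-envelope arithmetic behind my claim (the present times are the values madVR reported on my machine, not official figures):

Code:
# Back-of-the-envelope version of the budget argument above.
# The 8 ms / 0.23 ms present times are my own madVR readings.
REFRESH_HZ = 60
frame_interval_ms = 1000 / REFRESH_HZ        # ~16.7 ms per frame at 60 Hz

for label, present_ms in [("MUX (dGPU direct)", 0.23), ("Optimus", 8.0)]:
    render_budget_ms = frame_interval_ms - present_ms
    print(f"{label}: {render_budget_ms:.1f} ms left to render each frame")
-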
Also, having Optimus doesn't mean a laptop cannot have a MUX switch. My Optimus laptop has one, and the difference in input lag is not perceivable even by a high-speed camera (Intel HD 620 + GeForce 1070), tested on a 165 Hz 1440p high-refresh-rate screen. The framerates are exactly the same with Optimus, and so are the frametimes. By your logic, the 90 fps I get in BF1 at 1440p would really be 45 fps; that's simply not the case.
Please don't spread misinformation. If what you say were the case, it would already have been picked up by the mass media and the feature forced off the market.
-
To be more precise, I don't give a darn about it anymore; I bit the bullet then, and I wouldn't purchase another Optimus laptop ever again.
You have a switch; that is why you don't see the lost time. It isn't input lag, it is presentation lag.
This has been acknowledged by industry experts earlier in this thread and also publicly. MUX switches avoid this issue by not having the signal go through the internal GPU again; you can ask industry professionals if you think I'm wrong.
Again, I simply couldn't care less. But when I bought a laptop, I didn't have this information readily available and I made a mistake; I just want to raise awareness.
Of course Nvidia is very aware of it, but then again, we're talking about laptops; most people don't mind some added lag in exchange for much longer battery life.
Just don't go saying I am wrong; I never stated that a MUX switch would still have the issue, only that Optimus without a MUX has it.
Try this: install madVR, start a video and crank up the settings.
You'll see a presentation time; that is how long the frame takes to reach the display after it is already rendered. I really wish I had known this before I bought a laptop back then. -
Furthermore,
1. Your reported framerate is correct; the GPU really is processing at that speed. Whether there is a delay between you clicking and the action happening is another story. That delay can also be compensated for by smart tricks in the engine; that is a developer's job. This information is very important for new developers who don't know about it and would otherwise search for hours to find it.
2. Some engines can and will compensate for this; other software cannot.
3. This is very much power-user stuff; if you're a gamer, ignore it. It will skew certain statistics and debugging output, and you need to be aware of it for very specific tasks.
4. I'm fairly sure that on a typical Optimus laptop you can see this delay on both an external monitor and the internal one, and that the external output often doesn't even allow high framerates; the more expensive laptops come with a MUX precisely because of those Optimus limitations.
5. Optimus adds battery life by forcing the laptop to process everything through the iGPU. This means the dedicated card is not connected to anything externally, only to the iGPU, which adds a lot of overhead for the video data, which is very time-sensitive.
6. The typical data flow looks like this:
CPU -> GPU -> // Present
The Optimus data flow looks like this:
CPU -> dGPU -> iGPU -> // Present
The iGPU is very limited, and adds a lot of presentation delay.
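As a toy model of the two paths (every number here is a made-up placeholder except the 8 ms hop I keep mentioning; only the structure, one extra hop, reflects my point):

Code:
# Toy latency model of the two display paths described above.
RENDER_MS = 8.0       # dGPU render time for one frame (example value)
IGPU_HOP_MS = 8.0     # claimed cost of handing the frame to the iGPU

def present_latency_ms(optimus: bool) -> float:
    """Render-to-display latency in ms for one frame."""
    return RENDER_MS + (IGPU_HOP_MS if optimus else 0.0)

print("MUX/direct:", present_latency_ms(False), "ms")
print("Optimus:   ", present_latency_ms(True), "ms")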
Input delay adds up as well; that is the amount of delay the monitor itself has, i.e. how long it takes to present a frame after receiving it.
It is important to know these things, especially as a developer, because otherwise you hunt for the reasons why the presentation delay exists. These are choices made to extend battery life, among other things. It isn't all bad; you just need to know and understand it well.
Then again, if you have a MUX switch, you don't care; the MUX simply turns the iGPU off, so there is no loss in frames. -
I tested BOTH the MUX switch solution AND Optimus. Both! Not just the MUX switch.
And besides, that is not how input lag works! You are unaware of the false information you are spreading; you simply had a bad experience and don't know why!
Input lag is a delay, nothing more, nothing less. If you have 8 ms of extra input lag, all frames are pushed 8 ms later than what is happening in game, but the frametimes themselves remain the same. It's just like gaming on a TV with higher input lag vs. a PC monitor with almost zero input lag: it doesn't suddenly turn 60 fps into half of that. I can't even imagine how you came to this conclusion.
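To make it concrete, here's a toy sketch (purely illustrative numbers) showing that a fixed delay shifts every frame equally without changing frametimes:

Code:
# A constant 8 ms delay shifts every frame's arrival time,
# but the spacing between frames (frametime, hence fps) is unchanged.
FRAMETIME_MS = 1000 / 60          # 60 fps -> ~16.7 ms between frames
EXTRA_LAG_MS = 8.0

direct = [i * FRAMETIME_MS for i in range(5)]
delayed = [t + EXTRA_LAG_MS for t in direct]

deltas = [b - a for a, b in zip(delayed, delayed[1:])]
print("frametimes with 8 ms lag:", [f"{d:.1f}" for d in deltas])  # still ~16.7 ms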
The reason I tested with a high-speed camera (Sony RX1 high-framerate mode) was to measure the input lag of my then-new G-Sync monitor, to see whether there was extra input lag with and without G-Sync enabled. I tested it with Optimus as well, and with current Optimus laptops there isn't a steady increase in input lag. If something that severe were the case, YouTube would be filled with videos, but there are simply none.
madVR is not a valid measurement of input lag or latencies; you can only measure input lag with external tools.
Regarding the use of external monitors, that all depends on the port routing, port version, etc. DisplayPort outputs on gaming laptops always support higher framerates and are in most cases routed directly to the discrete GPU. HDMI depends on the port version and is often routed only to the iGPU, or to both, in which case Optimus is present there too. So using that as a measurement isn't valid either; besides, there can be a huge input lag difference between panel types as well.
Using the Intel GPU only means reading the Nvidia GPU's framebuffer and nothing more. It hardly takes any bandwidth or CPU calls; it's basically a direct hardware stream. Maybe you should read the Optimus whitepaper on how it works... -
Optimus still has many serious downsides... simply Google "Optimus issues" and you will see many (but not all). That in and of itself should alert many people that Optimus (software) is not a perfect solution and comes with various potential compromises. True MUX switching (via the BIOS), on the other hand, is a standalone, verifiable, convenient solution for those who simply want to completely shut off the damn dGPU and use just the iGPU (for whatever reason). Frankly, based on the title of this thread, I thought it was specifically about identifying current (and past) laptops that have a true MUX switch, and not about the merits or downsides of a potentially problematic software solution (MSHybrid, Optimus, etc.) that may or may not work for everyone.
-
It's not half the performance. Don't know where you are getting that from.
My 1060 in my notebook vs a random desktop 1060 I found on google:
https://www.3dmark.com/compare/fs/15564093/fs/13477987
Didn't realize I'm getting only 50% of the performance.
Although I prefer MUX switching over Optimus.
iGPU-only or dGPU-only modes are the best way to go. Wish newer laptops still did this. -
Not all engines can overcome it, and it isn't freaking input lag / input delay. I am talking about the extra added presentation time.
madVR is the perfect example of this. It needs to get frames ready to push within its time budget. If it knows it has that presentation time, it subtracts it from the total work time allowed. And madVR rarely presents in 16 ms (i.e. 60 fps); it usually works at 24 fps for video, and those 8 ms of added time are literally processing power lost to a solution that programmers need to take into account, because Nvidia decided to go with what can only be called a very compromised solution.
You do not fully understand the issue; the issue is that these are real problems for developers. As a gamer, you don't see all this struggle because you game from your comfortable home, after someone has made an engine aware of these things and solved those issues. You really need to understand the difference; I am a developer, I know what I'm talking about. Your software knows it has an 8 ms presentation time; that is reported by the computer, it is a real lag in presentation, so the software does its best to compensate.
With Optimus -> 8 ms present time
Without Optimus -> 0.23 ms present time
Of course you can compensate for this, if you know it exists and understand what it is; exactly that is the issue. It isn't input lag, which is something else entirely; this is presentation time.
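As a rough sketch of the madVR case, using the two present times above as assumed fixed costs:

Code:
# At 24 fps video there is ~41.7 ms per frame, and madVR subtracts
# the reported present time from the work time it allows itself.
fps = 24
frame_interval_ms = 1000 / fps

for label, present_ms in [("Optimus", 8.0), ("no Optimus", 0.23)]:
    work_budget_ms = frame_interval_ms - present_ms
    print(f"{label}: {work_budget_ms:.1f} ms of processing budget per frame")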
@Meaker@Sager - Meaker knows about it, and he's an industry professional. This issue has been acknowledged multiple times across multiple forums. Most people will be more than fine with Optimus; once again, most things I mention on these forums are very specific, very power-user, and very much things you don't care about. For example, CPU power and the number of threads: gamers are mostly still fine using a 6700K or even lower, while for a workload like video rendering an 8700K makes a rather big difference, and the same goes for processing batches of databases in a well-programmed environment. In fact, as some industry pros have pointed out, most gamers are better off using i5 CPUs for most games rather than i7s...
I may be becoming obsolete in my talks because I really am not a typical gamer; I rarely game, and most of my time is spent in Google Docs and Word documents anyway, lol. -
Clevo P650/P670 series = actual MUX Switch
Clevo PA71EP6-G/ES-G = actual MUX Switch (?)
Alienware = actual MUX Switch (?)
Lenovo ThinkPad (some models) = BIOS MUX switching
-
If you're happy with Optimus, good for you. I will not take back what I said: it is and will remain an abomination built from improvement upon improvement, not a real solution to anything.
When I ran into the Optimus abomination I was working with low-level routing and noticed that everything was reaching the display late, resulting in a loss of processing power. You can tell me whatever you want; I sold that laptop, and every laptop since then. You can't do real work without the latest PC tech anyway.
In case I ever decide to buy a laptop again, I also need this thread to stay alive, and for people to report which laptops have that switch and which are Optimus-only.
Thank you. -
Meaker@Sager Company Representative
Optimus has been optimised over the years. You do have the dGPU passing completed frames to the IGP to output, and this is the part that has been optimised. You can measure the drop, and I am sure some edge cases likely remain, but it's now pretty darn close. The MUX is more to allow easy display overclocking and configuration, G-Sync, and those programs that dislike Optimus (these tend to be professional apps).
The last time I really played with it, I tested a Maxwell Titan X overclocked to 1500 MHz core, connected via a 16x link to a 4870HQ, and I did not notice a drop in 3DMark that was outside of run variance. It's quite impressive. -
Alienware 15R3/R4
Alienware 17R4/R5
Alienware 13R3
-
Some people are more sensitive to input lag than others. -
The real downside of Optimus is that it is incompatible with VR headsets, because the headset can't access the required port modes through Optimus (and besides, older Optimus systems can't go above 60 Hz display modes). -
Not iGPU & dGPU only like my M18x. -
I always hear the justification for Optimus is that it saves battery life. But wouldn't having a MUX switch and turning it to "iGPU only" mode (thus completely turning off the dGPU) save even more battery life than Optimus, with its constant switching between the iGPU and dGPU?
Asking because I'm absolutely sick of Optimus turning on and off my fracking dGPU every time I open a program, close a program, plug in my computer, etc... a notebook with a MUX switch and decent battery life is starting to sound more and more attractive.
-
TL;DR: Optimus is a workaround for Windows. Sucks, but Windows controls the PC gaming market.
Optimus saves battery life over dGPU-only, and that's about it. Optimus' ONLY advantage over the MUX (aside from saving board traces and a redriver IC) is that Windows sucks.
That's really it. Windows freaks out if you change GPUs. It used to require a reboot and a careful bout of HW abstraction in the pre-Vista era. Vista brought the ability to have it *mostly* not bluescreen if you changed GPUs while the system was running. Mostly. Actually, rarely, and only with specific driver combinations. Bluescreens were still to be expected and common. But it was still more convenient than having to reboot every time one wanted to switch GPUs. A common workaround was to have the desktop manager close during the transition from one GPU to the other. This often required a logout/login cycle. Still more convenient than a full reboot. Windows 10 is probably better, but I am never taking that risk ever again.
Optimus sidesteps all of that by running the IGP all the time (so no GPU switching actually occurs) and since the dGPU connects to the Intel IGP via a predetermined API, Nvidia can take control of the driver aspect. This is a HUGE deal, especially for those who have dealt with how useless mobile GPU driver support was in the past (especially for ATi/AMD). The drivers for the old MUX systems used to need a lot of vendor support (from Sony, Dell, etc; not Nvidia, ATi/AMD, Intel, S3, etc), since a lot of the implementations were a bit unique. They were custom Intel+Nvidia/AMD driver hodgepodges that had unique, critical binaries which only worked with specific driver versions.
I had an old Sony with such a setup: a Radeon GPU MUXed with the Intel IGP. Only 4 drivers were ever released, spanning Windows Vista and 7 (three releases for Vista, one for 7). Modifying newer drivers to work quickly became impossible within a few months of the last release; the newer drivers were just incompatible and would bluescreen. Starcraft 2 never got full support, and a number of game-breaking bugs remained (all of which were resolved in newer drivers). I faced a choice of running the Radeon GPU all the time and living with almost half of the battery life (of a paltry 3.5 hours to begin with), or running the Intel IGP and getting useful battery life out of it. I ran the Intel IGP. When I got a chance, I ditched that setup in favor of Optimus and never, ever looked back.
I've had three Optimus laptops (I actually got the Radeon-powered Vaio to replace an Alienware M11x with Optimus - that M11x had a serious hinge design flaw that Dell was forced to service out of warranty), and I've generally had no problems with Optimus. Using the GPU tray icon to monitor dGPU activity (and setting the Nvidia Control Panel to defer to the IGP for most apps), I've not had a problem with parasitic GPU usage. The biggest offender used to be media players - I guess some of them default to the Nvidia decoder instead of the Intel one? A lot of applications have embedded media players for no good reason.
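For anyone who wants to check for parasitic dGPU usage themselves, a minimal sketch, assuming the Nvidia driver's nvidia-smi tool is on the PATH (the process list at the bottom of its output shows what is keeping the dGPU awake):

Code:
import subprocess

# Print the Nvidia GPU status table; an empty process list means
# Optimus should be able to power the dGPU down.
print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)
-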
Clevo is usually open about what has a MUX and what has Optimus.
-
Hello there,
This is for those with (mostly Clevo) laptops that have a BIOS-based MUX switch (between Optimus and discrete-graphics-only) and are running Linux: are you able to access the backlight controls on Linux after switching to the discrete GPU?
Perhaps @Vasudev et al. can assist with this. Your Alienware laptop has a similar mechanism.
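For anyone willing to test, a minimal check, assuming the standard Linux sysfs backlight interface:

Code:
import os

# Standard Linux backlight location; after switching the MUX to dGPU-only,
# see whether an intel_backlight / acpi_video0 entry still exists and works.
for dev in os.listdir("/sys/class/backlight"):
    path = f"/sys/class/backlight/{dev}/brightness"
    with open(path) as f:
        print(dev, "brightness =", f.read().strip())
-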
Ionising_Radiation ?v = ve*ln(m0/m1)
A couple more notebooks to add to the list:
Dell Precision 7530
Dell Precision 7730
BIOS → Video → Enable Switchable Graphics.
The good (or bad?) thing is that the Precisions use the new (ahem, proprietary) Dell Graphics Form Factor, which has the output ports directly on the card, unlike MXM. So there's another option in the BIOS to directly output video from the discrete GPU to the DisplayPort/HDMI outputs.
A definitely bad thing is that Optimus has no measurable effect on the battery life: I still get only around 6 hours from the 97 Whr battery, when I should be getting in excess of 12. Compare the Precision 5530, which nets some 8 hours of battery life on Optimus...
No G-Sync, though. Although I need to check if G-Sync is enabled on the DP/HDMI ports. -
-
win32asmguy Moderator
The MSI WT75 has a MUX which switches between Hybrid Graphics and Discrete Graphics mode. It can be controlled via the BIOS by using the Ctrl-Shift-Alt-F2 trick to expose advanced settings. All display outputs are routed through the discrete graphics card regardless of being in hybrid or discrete mode.
-
So, with the discrete GPU engaged, do you have any form of backlight control?