So far, I have come up with the NVIDIA GeForce GT 425M as being the best Optimus-enabled GPU.
Check out this bad boy:
Newegg.com - ASUS N43JF-A1 Notebook Intel Core i5 460M(2.53GHz) 14" 4GB Memory DDR3 1066 500GB HDD 7200rpm DVD Super Multi NVIDIA GeForce GT 425M
OH MY GOD. What a sexy piece of machinery that is. It's seriously making me reconsider buying a ThinkPad, what with that beastly performance and nice warranty. $100 too! Not bad.
Back on-topic, can any of you show me a better graphics card with Optimus?
(Note: I'm talking NVIDIA Optimus only, not ATI's lame solution.)
-
The_Observer
9262 is the best :)
Have you thought of a Clevo with 425m?
Sager NP5135 (Built on Clevo B5130M) -
For the keyboard alone, I would not consider the Clevo. Three rows for the number pad, and the idiotic Pg Up/Dn/Home and other keys requiring the Fn key... And the touchpad/buttons probably don't fare much better.
The ASUS does look like a nice deal: a decent GPU and battery life, plus a very slick appearance and the best warranty in the industry.
The one issue I have with it is the screen; I'd want a higher-res panel with that package. -
For the purpose of the thread though; is the GT 425M the best Optimus GPU?
In reply to the previous two posters: the ASUS has a smaller footprint than that Clevo does. I like the design and the ASUS brand better, too. Subjective, of course.
Anywho, FHD for the 425M might stress it too much. 1366x768 is much more playable with a weaker GPU than 1920x1080 is. I'd be worried about that. -
Toshiba Qosmio® X505-Q893 Laptop
And a lot of people have been buying it at $899 over the past couple of days through Amazon's lightning deal.
I got the Q892, which has the same specs minus the backlit keyboard, on Toshiba Direct's Black Friday/Cyber Monday deal for $979. The only problem right now is that the high framerates drop in-game when you unplug the laptop from the wall outlet, and I'm assuming it has to do with NVIDIA's Optimus feature. So I'm thinking that NVIDIA's Optimus isn't all it's cracked up to be and doesn't choose the correct GPU to use. Since the new NVIDIA Verde driver 260.99 is incompatible with the X505, I can't use the Optimus user interface to force games to use the NVIDIA GPU when unplugged, instead of the integrated GPU on the i5.
Take a Look: New Optimus Interface NVIDIA
It's not just this laptop; Alienware's M11x, for example, also can't game well unplugged because of the Optimus feature. The M11x does, however, have the Optimus user interface in the NVIDIA control panel to override Optimus and force the use of the discrete GPU.
YouTube - Nvidia Optimus Feature on the Alienware M11x- R2
So you might want to think about this before buying an Optimus-enabled laptop. -
The alternative to it would be ATI, whose switchable graphics is a bit messier.
And that laptop is 10 freakin' pounds!!!
I think NVIDIA wouldn't let that happen anyway. That'd defeat the purpose of... well Optimus.
Also, that laptop there doesn't have Optimus. -
H.A.L. 9000 Occam's Chainsaw
I'm simply not a fan of Optimus _yet_. Also, that Qosmio has a weird resolution. It's 18.4" but 1680x945? WTH? Just give it a proper 1920x1080 panel and be done with it, Toshiba!
I believe the OP is correct in assuming the GT 425m is about the best Optimus combo for performance AND battery. For what it's worth. -
The Toshiba Qosmio does not have Optimus. The GPU is not supported, and Optimus needs the Intel i5 CPU.
Optimus SEEMS to be entirely a function of software.
To the best of my knowledge, I can disable my GTX 260 GPU. This will turn it off and prolong battery life, running video from software.
Then I enable it when I'm plugged in and want to play a game.
This I will call "stamarptimus." Because the truth is, if you're on battery you don't need the Intel GPU either. This is far superior to Optimus because it works with all known CPU and GPU combinations, and all known operating systems.
In the recent past they had a little switch you could turn your discrete GPU off with, and making one would be a fairly simple task... but even this is unnecessary, as you can just turn it off with software.
Buying fancier software that turns it off based on battery state or whatnot is not a good reason to buy a laptop, in my opinion. -
-
And how the H-E-double-toothpicks are you going to do ANYTHING without a display adapter?!?! I mean, that is taking "notebook" to an extreme: pen-and-paper processing.
Anywho, for Windows 7, I like Optimus. -
Disable your display adapter and it runs just fine from software.
A GPU is entirely unnecessary.
You will run from a Microsoft driver.
Go do it right now. You have stamarptimus installed in your system right now.
With your GPU turned off you will get much better battery life. BABOOM, it's magical. Beats the snot out of that silly Optimus stuff.
Edit: although I'm not positive stamarptimus works with Linux, I pretty much assume it does. Linux was written for, and still runs today on, computers with no video adapters... I mean non-laptops, really old computers. -
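A minimal sketch of the "stamarptimus" approach stamar describes: list the display adapters, then disable the discrete one from an elevated Windows prompt. This assumes the devcon.exe utility from the Windows Driver Kit is installed and on PATH (it is not present by default), the *VEN_10DE* pattern assumes an NVIDIA adapter, and whether disabling the adapter actually powers the chip down is disputed later in the thread.

import subprocess

# List display adapters and their device instance IDs via the built-in WMI tool.
print(subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,PNPDeviceID"],
    text=True))

# Hypothetical example: toggle the NVIDIA adapter (PCI vendor ID 10DE) with
# devcon from the Windows Driver Kit. Run from an elevated prompt, and confirm
# the hardware ID printed above before uncommenting.
# subprocess.check_call(["devcon", "disable", "*VEN_10DE*"])  # "stamarptimus" off
# subprocess.check_call(["devcon", "enable", "*VEN_10DE*"])   # dGPU back on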
double post
-
H.A.L. 9000 Occam's Chainsaw
Either way... disabling the GPU will just make the system use a software frame buffer, and in Linux a VESA framebuffer. Optimus is all software, but it doesn't work and isn't slated to ever be supported in Linux. The way X works in Linux, the whole framework would have to be rebuilt to accommodate proper Optimus switching. If you'll be using Linux, don't touch Optimus with a ten-foot pole.
-
Right you are.
Actually, the OP confused me with his question.
The entire 4-series of GPUs supports Optimus.
So the top GPU that supports Optimus is actually the GTX 480 SLI setup you can get in the Clevo X7200 notebooks.
If you have an i5 CPU, you can switch to the Intel GPU on the fly with the magical Optimus software.
This is also compatible with stamarptimus.
Optimus is way better than ATI's version, which doesn't surprise me, because ATI's version is limited to just one mobile GPU and CPU.
Again, I will point out that this is sort of like... disabling one GPU and enabling another...
You could do the exact same thing with a hardware profile.
(Well, not have it detect a loss of AC power and switch automatically. I'm not claiming Optimus does nothing; it's just not really that cool at all.) -
H.A.L. 9000 Occam's Chainsaw
Also, I thought gaming notebooks were supposed to have standardized resolutions. I know 1680x945 is either 16:10 or 16:9 (I'm on Android right now and I don't feel like switching to the calculator), but I've never seen a game natively support such an odd resolution. Mostly it's just the standard resolutions that are supported natively. -
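For what it's worth, a two-line check settles the aspect-ratio question: 1680x945 reduces exactly to 16:9.

from math import gcd

w, h = 1680, 945
d = gcd(w, h)                          # greatest common divisor is 105
print(f"{w}x{h} reduces to {w // d}:{h // d}")  # -> 16:9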
I'm not sure what games make of that. They'd probably offer 1600x900, and it might look bad. -
Stamar, that won't disable or turn off your GPU; it simply runs a generic Microsoft driver. That decreases battery life and decreases performance.
-
Hmmm.
I don't feel like doing an experiment. I used to be able to add about a half hour of runtime on a Sony laptop by disabling the GPU.
I've never bothered to try it on this particular one, or on about three laptops since.
However, disabling hardware in Device Manager does seem to disable it.
Without a GPU running, it falls back to the Microsoft driver.
You are no longer using a GPU, just running the display from software.
You can experiment with it on your own time; it's sort of old school for me.
It will definitely decrease performance... but increase battery life.
It turns your GPU off. That's all Optimus does, so if turning off the GPU doesn't increase battery life, then you've got nowhere to improve. -
H.A.L. 9000 Occam's Chainsaw
-
Guys, what I mean is that the NVIDIA GeForce GT 425M is the best Optimus-enabled GPU. I've been unable to find a faster GPU that actually ships with Optimus, regardless of whether it technically supports it.
-
Whatever you're trying to say makes absolutely no sense. If you had properly "disabled" your GPU, you would have no output on the screen, and by all rights, your computer shouldn't even turn on. -
Either
1) The CPU is processing the display. This means a lot of inefficient power usage, as CPUs are much less efficient at driving a display than a GPU. EVEN IF the GPU were turned OFF completely (which doesn't happen with Optimus, AFAICT), unless you're talking huge chips, the CPU would use more extra power than you'd save from the GPU idling.
2) The GPU is still processing the display, but in compatibility mode. You're still using the GPU; the only difference is that all the bloatware, crapware, and gadgetware running in the background isn't allowed to do fancy graphics work that eats GPU cycles to the same extent. On the other hand, the lack of drivers means the GPU runs at stock frequency, with no PowerMizer downclocking to save power.
Either way, battery life would not improve enough to warrant thinking about it. I reckon it'd be worse in either case.
Back in the old days, when 2D GPUs were separate from 3D GPUs, you might have had an argument... but that's decades ago now. -
DaneGRClose Notebook Virtuoso
Another thing I'm not seeing mentioned here is power usage: the 425M is going to be on roughly the same level in overall performance as the 335M; the big difference is power efficiency and battery life. If you look around, most of the 4xxM Optimus computers are only getting 2-4 hours of battery life, while the 3xxM units are getting 4-8 hours. The numbers I found on TDP (wattage/power consumption) show the 335M at a 23 W TDP and the 425M at 35 W, so there's going to be a decent difference in power consumption for not much gain in performance. I would say it comes down to deciding whether DX11 support is worth losing a decent amount of battery life, which is exactly what Optimus is designed to preserve.
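A rough back-of-the-envelope illustration of that TDP gap, assuming a hypothetical 56 Wh battery and about 15 W for the rest of the system under a gaming load. TDP is a worst-case ceiling rather than measured draw, so treat these numbers as illustrative only.

battery_wh = 56.0        # assumed battery capacity in watt-hours
rest_of_system_w = 15.0  # assumed CPU/screen/chipset draw under a gaming load

for gpu, tdp_w in [("GT 335M", 23.0), ("GT 425M", 35.0)]:
    hours = battery_wh / (tdp_w + rest_of_system_w)
    print(f"{gpu}: roughly {hours:.1f} h at full GPU load")

# Roughly 1.5 h vs 1.1 h under constant load; the gap matters much less once
# Optimus idles the dGPU for light work like web browsing.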
-
Well, I'm not as concerned with the battery life of the machine whilst using the discrete card. What is great about Optimus is that if you want to browse the web with 8 hours of battery life, you can. And then, when it comes down to gaming, I can plug my system in and have a great time. Of course, a lot of people do care about battery life whilst using the discrete GPU, so DaneGRClose's information proves very valuable. Thanks for that.
The XPS line is strange, though; only the GT 420M (14 & 15) and the GT 435M (17) support Optimus. You can upgrade them, but then you lose Optimus (and are forced to go with the i7). Even though the 435M is technically the most powerful Optimus-implemented GPU I could find thus far, its usefulness is completely negated by the huge screen, resolution, and weight of the XPS 17, especially considering that the 435M in the XPS is only a slightly higher-clocked 425M. Therefore, I still come to the conclusion that the ASUS N43 is the best Optimus system at this point... -
Not only does disabling the discrete GPU extend battery life;
disabling integrated GPUs like the Intel HD also extends battery life.
The days of 2D graphics are today, lol. You do not need a GPU to drive a display. Yes, it extends battery life.
I have no idea what sort of thinking leads to the idea that it would use more battery.
Is it a weird stretch to think that if you were trying to display something that took the CPU a lot of time to render, you would be using more time or CPU power?
Possible, but in real life it's not doable or doesn't matter. If you can't display it with the CPU, then obviously don't turn the GPU off...
Turning off the GPU extends battery life by a lot. The GPU is not necessary to run your display.
In desktops the GPU card sometimes provides the external display port, so perhaps that's where the confusion is coming from.
There's just so much confusion coming from a few sources.
If you turn your GPU off, it will use much less power. Your battery will last longer. You will not be able to display 3D graphics, and rendering will be slower. Do not use this for games.
What you all need is a utility that shows the watt usage of all the peripherals in your computer. For the life of me, it's been a while and I don't remember what it's called.
It's a slam dunk when you see the watts consumed by the GPU when idling versus the watts consumed when it is turned off (none; it's that simple).
Power consumption before and after, battery life increase. This is something I learned here on this forum a few years ago; I don't have the utility anymore and can't remember its name. It was a few machines ago for me. -
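The closest thing to that utility on NVIDIA hardware today is nvidia-smi, which ships with the driver. A minimal polling sketch along those lines; note that power.draw is not reported on many mobile parts, in which case the column reads "[Not Supported]", and the five samples at two-second intervals are arbitrary choices.

import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=name,power.draw,clocks.gr,utilization.gpu",
         "--format=csv,noheader"]

for _ in range(5):
    try:
        print(subprocess.check_output(QUERY, text=True).strip())
    except (OSError, subprocess.CalledProcessError):
        # On an Optimus machine with the dGPU powered down, the call can fail
        # outright instead of reporting 0 W.
        print("nvidia-smi unavailable (dGPU off or driver too old)")
    time.sleep(2)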
DaneGRClose Notebook Virtuoso
-
Well, four hours is more useful than one!
As for me, I may just get a ThinkPad with an Intel IGP loaded with Linux, then buy an Xbox 360 and play that when I get the gaming itch. Best of both worlds, eh? -
-
Just so you know, AMD's answer to Optimus, Dynamic Switchable Graphics, should be hitting laptops very soon, and two of the known cards that will support it, the HD 6550M (rebrand of the Mobility HD 5650, which beats the GT 425M and trades blows with the GT 435M) and the HD 6570M (rebrand of the Mobility HD 5770, which should beat the GT 435M in most cases, especially if it has GDDR5), will offer great performance and better overclockability than the midrange Optimus GPUs available.
-
After reading this thread, I'm horrified to see how much false information is being posted regarding Optimus and how it functions.
Here's a quick breakdown of some points I've read on this thread:
You can't play games on the GPU when running on battery.
- This is 110% false. The Optimus profile for a game shows you whether the IGP or GPU will be used to run the game. This is the same regardless of whether you're plugged in or not.
Those who have never used Optimus are likely thinking of some of the more recent Switchable Graphics implementations where the system "automatically" disables the GPU when the plug is pulled. This is not Optimus by any stretch of the imagination.
Disabling the IGP would save more battery life.
- When Optimus is using the GPU, the only portion of the IGP which is not disabled or running at the lowest (barely running) power state is the display controller. The amount of power this consumes would be hard to even measure.
Changing the Optimus profile (what some call the "Whitelist") doesn't work all the time.
- Again, this is false. Optimus has several fallbacks should a driver issue cause any incorrect behavior. If the Optimus profile for an application isn't triggering the GPU, you can modify the profile to force the GPU. In addition, you can also enable a driver option that lets you right-click on the application's .exe and choose whether it should run on the IGP or the GPU.
You'll see awful battery life from an Optimus system when browsing the internet or surfing the web.
- This one had me nearly speechless, as it is painfully wrong. Web browsing is actually one of the key reasons we created Optimus. When you're browsing, Optimus will disable the GPU and use the IGP, since browsing is not intensive, to achieve the best battery life. The only exceptions are when you use Flash 10.1 or Flash 10.2 and browse to YouTube or another site that can benefit from the hardware acceleration the GPU provides. The moment you browse away from that site, Optimus disables the GPU again.
IE9 and browsers supporting HTML5 and WebGL will also cause Optimus to enable the GPU to handle that content. Again, once you browse away from something that benefits from the GPU, Optimus will disable the GPU and move back to the IGP for the best battery life.
With thanks,
Sean
Sean Pelletier
Senior Technical Marketing Manager - Notebooks
NVIDIA -
-
DaneGRClose Notebook Virtuoso
Sean
- The whitelist is crap sometimes; driver support for a lot of the machines running Optimus is horrible at best. For instance, do you want to tell me why you still don't have an answer for PunkBuster's multiple DX9 errors when a couple of weekend warriors have already developed fixes? The whitelist on some notebooks may work great, and the theory may be great, but the majority of systems equipped with it still don't fully work. The whitelist also seems to bug the two GPUs into FREQUENT switching, killing possible battery life by what I would guess to be 10-20%. Sure, a power user can change it so it works decently, but the average person who buys an Optimus-configured computer doesn't know how to change things like driver .inf files and configurations that are over the heads of a lot of consumers.
- Battery life when surfing the web is decent, but again, the buggy frequent switching, even while idling on the desktop, makes battery life decent at best on most machines. Honestly, I love my M11x R2, but I'd throw Optimus out the window for good manual switching at this point.
- What you are saying in your post is the way NVIDIA tries to convey and sell Optimus, the way it looks on paper. Optimus is an amazing idea, and if NVIDIA actually steps up to the plate and gets it working properly, with proper driver support, it will be amazing. But the fact remains that it is still a buggy feature that has irritated more people than you can imagine.
What I would like to see is NVIDIA step up and admit that it doesn't work correctly on a lot of machines. Don't play dumb with your consumers and promise fixes for 4+ months with absolutely nothing actually seeing the light of day. The 265-series PunkBuster fix is just the tip of the iceberg. NVIDIA also needs to step up and admit (even if not publicly) that you are getting beaten in almost every sector of the market by ATI/AMD; the only thing NVIDIA seems to be winning on is that only NVIDIA has CUDA and PhysX support on the GPU. I love NVIDIA products, but I've been a bit disappointed by what I've seen in the last year. Please don't take this as bashing; I only intend it as a report on what real end users are seeing with the products you make, and recommendations for what I would like to see in the future. -
-
And also, to other users: you can NOT disable the IGP, as machines with Optimus use the IGP for final display output. Hence the PunkBuster issues. -
DaneGRClose Notebook Virtuoso
Yup, the NVIDIA GPU actually outputs through the IGP, so both run at the same time, at all times. Technically the NVIDIA GPU is always on to some extent; it just isn't always being utilized as the actual display adapter.
-
I've tried everything I know, from changing the power plan to High Performance, to tweaking the advanced power options, to even changing registry settings to expose hidden power options, yet nothing seems to work.
I had this sort of problem with my previous gaming laptop, and the solution was to use the very first NVIDIA Verde drivers when they came out, which gave me maximum gaming performance even while unplugged.
Anyway, I've tried to install the most recent Verde driver, 260.99, but it seems to be incompatible with the Toshiba X505-Q892 and won't install. I'm guessing that since this notebook is fairly new it uses state-of-the-art technology, such as the NVIDIA GTX 460M and Intel's i5-460M (not to be confused with the GTX 460M GPU), and since I can't game at full performance even after fiddling with the power options, I'm assuming it is also using NVIDIA's Optimus technology.
I did some research and came across the video of the new "Optimus User Interface", Take a Look: New Optimus Interface NVIDIA, which also led me to the M11x video showing the Optimus interface in action, YouTube - Nvidia Optimus Feature on the Alienware M11x- R2. I can't find this user interface, nor the option to choose a GPU, in my notebook's NVIDIA control panel.
These are my questions. First, is the Toshiba X505-Q892 using Optimus technology? If it is, could that be the cause of the drop in performance when gaming unplugged? Also, will NVIDIA support this particular notebook with future releases of the Verde drivers? In the meantime, I was thinking about manually installing the 260.99 driver using the "have disk" method, by downloading the driver from Laptopvideo2go.com. Would the new Verde driver(s) give me the Optimus user interface in my NVIDIA control panel, so I could force it to use the GTX 460M?
I'm in desperate need of answers, and I'd appreciate it if you could help me, Sean. Thanks in advance for your time, sir.
Jacob -
When Optimus doesn't see a need for the GPU to be enabled, it disables the GPU, and the GPU consumes 0 W. This is the magic of Optimus. There's a reason ATI didn't have an immediate answer for this. The technology here is anything but trivial. Doing all of this without multiplexers is not a simple thing to accomplish.
In contrast, the IGP is always on to some degree. When Optimus has the GPU enabled, only the display controller on the IGP is active. Everything else is either shut down or running at the lowest possible power state.
When Optimus has the GPU disabled, the IGP is fully on.
In each case, the IGP display controller is active.
Great system, and I'm sure you'll be impressed with the gaming chops of the GTX 460M! Unfortunately, Toshiba did not choose to implement Optimus on that notebook. Regardless, you still should not be seeing a performance hit if you have the Windows power options set. I don't have that system handy, but I vaguely remember there being some sort of Toshiba power application or management software. I'd disable that, as it is likely what's throttling CPU and GPU clock speeds and hurting performance.
As for Verde driver support: your X505 is a very recent system, so it likely wasn't done going through qualification and validation in our labs. You'll likely have support in the next Verde driver, which we should be posting in the immediate future (certainly within 30 days).
Enjoy the system! -
DaneGRClose Notebook Virtuoso
Would you like to explain to me how the NVIDIA GPU can be completely shut off while also running at a 135 MHz core clock at 0.8 V and still output a temperature and other sensor information? The answer is that it's impossible; it would be like saying you can unplug a TV and still have it give you information. The GPU may go into a dormant, extremely low-power-draw state, but on the M11x R2 it never shuts off 100%. I hope you guys get the driver right this time; I understand the process is long, but the amount of time one driver fix has taken is getting a little ridiculous. You're also turning a lot of people away from Optimus, and possibly even NVIDIA as a whole, because of the mess-ups that have happened in multiple sectors of NVIDIA's business plan. The 3xxM/3xx-series rebranding was pure crap; the 4xx series, in my opinion, has been decent but also a major flop considering how hyped up it was; and the Optimus issues have been another major flop, not because there are issues, but because of how long it has taken NVIDIA to respond. Again, I love NVIDIA as a company, but NVIDIA's recent products and support, on the other hand, are strongly making me consider going to ATI. I really do hope that your company can start to show a stronger market presence. Thanks for the info and response.
Dane -
-
This is pathetic. -
DaneGRClose Notebook Virtuoso
Richteralan, I don't care what that video shows. I can't say for certain that some rigs don't completely shut off the discrete GPU, as I haven't used all of them, but I can tell you for certain that it would be impossible for the GPU in the M11x R2 to be giving readouts and showing 135/270/135 without being powered to some extent, meaning it IS DRAWING POWER. You explain to me how a non-powered device can show readouts of temperature, clocks, usage, etc. without being powered. And I can tell you that after a ton of time using this rig, with constant monitoring at all times, it has never once stopped giving readouts, meaning it has never fully shut off. If you'd like, I'll even go to the next step of proving it.
-
Although I can appreciate your skepticism, I can honestly tell you the dGPU is electrically off and not consuming any power. When exiting a game, Optimus turns the GPU off immediately. The only "exception" to this is using Flash 10.1/10.2 where exiting a Flash video will drop the GPU down into the lowest power state for a second or two before turning electrically off. (Optimus does this just in case you're about to launch another video.)
When the GPU is off (you can verify this using the Optimus taskbar tool in the Verde drivers), the GPU is consuming 0 W and running at a clock frequency of 0 MHz.
What tool are you using to report clock speeds? Several things could account for the odd reporting you're seeing. The application you're using could be mapped to the wrong sensors, the polling interval could be too long, etc.
Remember, there are no multiplexers here eating up power, and the GPU is not running in some low-power state. Instead, it is literally electrically off and measured to consume 0 W. More details can be found in the Optimus whitepaper I wrote.
Note: This was written when Optimus originally launched so some of the applications being mentioned might be dated. Regardless, everything technical still applies.
Have a good one guys!
Sean
Sean Pelletier
Senior Technical Marketing Manager - Notebooks
NVIDIA -
DaneGRClose Notebook Virtuoso
Sean,
I've used MSI Afterburner, EVGA Precision, sidebar gadgets, and GPU-Z, and every single one of them reports the exact same 135/270/135 @ 0.80 V reading, so one of a few things is happening here:
1 - It doesn't actually shut off.
2 - Optimus has an issue and isn't working as it should, which also means #1 applies.
3 - Alienware has somehow changed the Optimus programming to prevent a full shutoff.
If someone wants to give me another explanation that makes sense of how something electronic can draw no power and still give readouts, I'm all ears. There's no way I see 6+ programs giving a false reading, so I highly doubt that's the issue. I appreciate you coming on here, Sean; it's nice to see someone from NVIDIA stick their head out without having to be called.
Dane -
Hello Sean,
I recently bought an XPS 15 with a GT 420M and a Core i3.
I also noticed that only the drivers provided directly by Dell seem to work with Optimus (the others don't work at all). Considering the advertising on the NVIDIA site about this laptop, may we expect an official NVIDIA release that supports the device ID of our card? My worry is about how long the OEMs take to update their drivers.