The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Best GPU & Optimus combination?

    Discussion in 'Gaming (Software and Graphics Cards)' started by lupusarcanus, Dec 10, 2010.

  1. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    So far, I have come up with the NVIDIA GeForce GT 425M as being the best Optimus-enabled GPU.

    Check out this bad boy:
    Newegg.com - ASUS N43JF-A1 Notebook Intel Core i5 460M(2.53GHz) 14" 4GB Memory DDR3 1066 500GB HDD 7200rpm DVD Super Multi NVIDIA GeForce GT 425M

    OH MY GOD. What a sexy piece of machinery that is. It is seriously making me doubt buying a ThinkPad, with that beastly performance and nice warranty. A hundred bucks, too! Not bad.

    Back on-topic, can any of you show me a better graphics card with Optimus?

    (Note: I'm talking NVIDIA Optimus only, not ATI's lame solution.)
     
  2. The_Observer

    The_Observer 9262 is the best:)

    Reputations:
    385
    Messages:
    2,662
    Likes Received:
    0
    Trophy Points:
    55
  3. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    For the keyboard alone, I would not consider the Clevo: three rows for the number pad, and the idiotic Pg Up/Pg Dn/Home and other keys requiring the Fn key... and the touchpad and its buttons probably don't fare much better.

    The ASUS does look like a nice deal: a decent GPU and battery life, plus a very slick appearance and the best warranty in the industry.

    The one issue I have with it is the screen; I'd want a higher-res panel with that package.
     
  4. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    For the purpose of the thread, though: is the GT 425M the best Optimus GPU?

    In reply to the previous two posters: the ASUS has a smaller footprint than that Clevo does. I like the design and the ASUS brand better too. Subjective, of course.

    Anywho, FHD for the 425M might stress it too much. 1366x768 is much more playable on a weaker GPU than 1920x1080 is. I'd be worried about that.
     
  5. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Sexy? Check this sexy beast!

    Toshiba Qosmio® X505-Q893 Laptop

    And a lot of people have been buying it at $899 for the past couple of days on Amazon as one of their Lightning Deals.

    I got the Q892, which has the same specs minus the backlit keyboard, in Toshiba Direct's Black Friday/Cyber Monday deal for $979. The only problem right now is that the high in-game framerates drop when you unplug the laptop from the wall outlet, and I'm assuming it has to do with NVIDIA's Optimus feature. So I'm thinking that Optimus isn't all it's cracked up to be and doesn't choose the correct GPU to use. Since the new NVIDIA Verde driver 260.99 is incompatible with the X505, I can't use the Optimus user interface to force games to use the NVIDIA GPU instead of the integrated GPU on the i5 when unplugged.

    Take a Look: New Optimus Interface NVIDIA

    It's not just this laptop, an example would be Alienware's M11x that can't game unplugged because of the optimus feature also. Although the M11x does have the optimus user interface found in the Nvidia user panel to override optimus and choose to force the use of the discrete GPU.

    YouTube - Nvidia Optimus Feature on the Alienware M11x- R2

    So you might want to think about this before buying a optimus enabled laptop.
     
  6. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    The alternative to it would be ATI, whose switchable graphics is a bit messier.

    And that laptop is 10 freakin' pounds!!!

    I think NVIDIA wouldn't let that happen anyway. That'd defeat the purpose of... well Optimus.

    Also, that laptop there doesn't have Optimus.
     
  7. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    I'm simply not a fan of Optimus _yet_. Also, that Qosmio has a weird resolution. It's 18.4" but 1680x945? WTH? Just give it a proper 1920x1080 panel and be done with it, Toshiba!

    I believe the OP is correct in assuming the GT 425m is about the best Optimus combo for performance AND battery. For what it's worth.
     
  8. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    The Toshiba Qosmio does not have Optimus. The GPU is not supported, and it needs an Intel i5 CPU.

    Optimus SEEMS to be entirely a function of software.

    To the best of my knowledge, I can disable my GTX 260 GPU. This will turn it off and prolong battery life, running video from software.

    Then I enable it when I'm plugged in and want to play a game.

    This I will call stamarptimus. Because the truth is, if you're on battery you don't need the Intel GPU either. This is far superior to Optimus because it works with all known CPU and GPU combinations, and all known operating systems.

    In the recent past they had a little switch you could turn your discrete GPU off with, and making one would be a fairly simple task... but even this is unnecessary, since you can just turn it off in software.

    Buying fancier software that turns it off based on battery or whatnot is not a good reason to buy a laptop, in my opinion.
     
  9. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    LOL. Talk about trying to squeeze dollars...
     
  10. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    You would change your opinion drastically upon playing with Linux a bit...

    And how the H-E-double-toothpicks are you going to do ANYTHING without a display adapter?!?! I mean, that's taking "notebook" to an extreme: pen-and-paper processing.

    Anywho, for Windows 7, I like Optimus.
     
  11. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Disable your display adapter and it runs just fine from software.

    A GPU is entirely unnecessary.

    You will run from a Microsoft driver.

    Go do it right now. You have stamarptimus installed in your system right now.

    With your GPU turned off you will get much better battery life. BABOOM, it's magical. Beats the snot out of that silly Optimus stuff.

    Edit: although I'm not positive stamarptimus works with Linux, I pretty much assume it does. Linux was written, and runs today, on computers with no video adapters... I mean non-laptops, really old computers.
     
  12. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    double post
     
  13. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    Either way... disabling the GPU will just make the system use a software frame buffer, and in Linux a VESA frame buffer. Optimus is all software, but it doesn't work in Linux and isn't slated to ever be supported there. The way X works in Linux, the whole framework would have to be rebuilt to accommodate proper Optimus switching. If you'll be using Linux, don't touch Optimus with a ten-foot pole.
     
  14. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    What do you mean the Qosmio doesn't have Optimus? The GTX 460M does support Optimus, and the i5 does have an integrated GPU. Also, 1680x945 is perfect, since this is a gaming laptop built for high framerates; going with 1920x1080 would just bog down the GPU with maxed-out graphics settings in modern 3D games, especially first-person shooters.
     
  15. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    At least in the Qosmio link there isn't any mention of that i5 with an integrated GPU. This may be a confusion: the i5 chipset can support an integrated GPU, but the laptop producer may choose not to include it (which the majority of laptop producers do choose).
     
  16. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Right you are.

    Actually, the OP confused me with his question.

    The entire 400-series of GPUs supports Optimus, so the top Optimus-capable GPU is actually the GTX 480M SLI you can get in the Clevo X7200 notebooks.

    If you have an i5 CPU, you can switch to the Intel GPU on the fly with the magical Optimus software.

    This is also compatible with stamarptimus.

    Optimus is way better than ATI's version, which doesn't surprise me, because ATI's version is limited to just one mobile GPU and CPU.

    Again, I will point out this is sort of like... disabling one GPU and enabling another...

    You could do the exact same thing with a hardware profile.

    (Not and have it detect a loss of AC power and switch automatically; I'm not claiming it does nothing, it's just not really that cool at all.)
     
  17. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    I never said it didn't support Optimus, just that I'm not a fan of it, and that if the OP wants any kind of respectable battery life plus a decent gaming GPU... the GT 425M fits that bill quite nicely.

    Also, I thought gaming notebooks were supposed to have standardized resolutions. I know 1680x945 is either 16:10 or 16:9 (I'm on Android right now and don't feel like switching to the calculator), but I've never seen a game with native support for such an odd resolution. Mostly it's just the standard resolutions supported natively.
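    [Archive note: the ratio is quick to check without a calculator. Reducing width and height by their greatest common divisor shows 1680x945 is exactly 16:9. A minimal sketch in Python, with a couple of the resolutions from this thread for comparison:]

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1680, 945))   # gcd is 105, so this is exactly 16:9
print(aspect_ratio(1920, 1080))  # the standard FHD panel, also 16:9
print(aspect_ratio(1680, 1050))  # prints 8:5, i.e. the 16:10 family
```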
     
  18. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    You're definitely right, that's a non-standard resolution.

    I'm not sure what games make of it. They probably offer 1600x900, and it might look bad.
     
  19. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Stamar, that won't disable or turn off your GPU, it simply runs a generic Microsoft driver. That decreases battery life and decreases performance.
     
  20. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Hmmm.

    I don't feel like doing an experiment. I used to be able to add about half an hour of runtime on a Sony laptop by disabling the GPU.

    I've never bothered to try it on this particular one, or on the three or so laptops since.

    However, disabling hardware in Device Manager does seem to disable it.

    Without a GPU running, it falls back to a Microsoft driver: you are no longer using the GPU, just running the display from software.

    You can experiment with it on your own time; it's a bit old-school for me. It will definitely decrease performance... but increase battery life.

    It turns your GPU off. That's all Optimus does, so if turning off the GPU doesn't increase battery life, then you've got nowhere to improve.
     
  21. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    That would be my guess, but it would be a nice tidbit to know for sure. :)
     
  22. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    The integrated GPU is included on the CPU package of all dual-core i3/i5/i7 CPUs; it is not on the chipset. The producer may not support Optimus, but they don't have the option to 'not include the integrated GPU'.

    I would consider the ATI implementation superior. With ATI, you switch between GPUs with a physical switch, so instead of having to go into Device Manager and disable the NVIDIA GPU, you just flip a switch on the case. The ATI switchable graphics also works with Linux, because it's a physical hardware switch and doesn't depend on Windows Vista/7 or software drivers.
     
  23. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    It depends on what you're looking for as 'best'. I'd say that a more powerful 4xx series GPU would be better, because the 425 is a low-end dedicated GPU, and even with a more powerful GPU you still have Optimus and the integrated Intel GPU to switch to on battery power.
     
  24. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Guys, what I mean is that the NVIDIA GeForce GT 425M is the best Optimus-enabled GPU: I've been unable to find a better GPU that actually ships with Optimus enabled, whether or not the chip itself supports it.
     
  25. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    The Dell XPS 17 has an Optimus-enabled 435. I think there's a few ASUSes with an Optimus-enabled 335, which should be (slightly) faster than the 425. I'm sure that there has to be others out there.
     
  26. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Well, all my games have an option for 1680x945, from Battlefield: Bad Company 2 and Modern Warfare all the way back to Falcon 4.0: Allied Force. They all support this "non-standard" resolution you speak of, and I'll tell you it looks awesome, especially running with maxed-out graphics settings at 60+ fps on the "better" GTX 460M as compared to that GT 425M. Again, my only gripe is not being able to disable the Optimus feature that kills framerates while on battery.
     
  27. Amnesiac

    Amnesiac 404

    Reputations:
    1,312
    Messages:
    3,433
    Likes Received:
    20
    Trophy Points:
    106
    wut...

    Whatever you're trying to say makes absolutely no sense. If you had properly "disabled" your GPU, you would have no output on the screen, and by all rights your computer shouldn't even turn on.
     
  28. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    "Software" is just instructions. Instructions are processed on hardware. Hardware consumes power. Something is executing the "software"... it's either the GPU or CPU...

    Either:
    1) the CPU is processing the display. This means a lot of inefficient power usage, as CPUs are much less efficient at driving a display than a GPU. EVEN IF the GPU is turned OFF completely (which doesn't happen with Optimus, AFAICT), unless you're talking huge chips the CPU would use more extra power than you'd save from the GPU idling.
    2) the GPU is still processing the display but is in a compatibility mode. You're still using the GPU; the only difference is that all the bloatware and crapware and gadgetware running in the background isn't allowed to do fancy gfx stuff and eat up GPU cycles to the same extent. On the other hand, the lack of drivers means the GPU runs at stock frequency, with no PowerMizer downclocking to save power.

    Either way battery life would not be better enough to warrant thinking about it. I reckon it'd be worse in either case.

    Back in the old days when 2D GPUs were separate from 3D GPUs you may have had an argument... but that's decades ago now.
     
  29. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Another thing I'm not seeing mentioned here is power usage. The 425M is going to be on roughly the same level as the 335M in overall performance; the large difference is going to be power efficiency and battery life. If you look around, most of the 4xxM Optimus computers are only getting 2-4 hours of battery life, while the 3xxM units are getting 4-8 hours. The numbers I found on TDP (wattage/power consumption) showed the 335M running at a 23 W TDP and the 425M at 35 W, so there's going to be a decent difference in power consumption for not much difference in performance. What it comes down to is deciding whether DX11 support is worth losing a decent amount of the battery life that Optimus is designed to deliver.
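    [Archive note: the TDP figures above (23 W for the 335M, 35 W for the 425M) can be turned into a rough runtime bound by dividing battery watt-hours by total draw. The battery capacity and the platform baseline draw below are illustrative assumptions, not measurements from any of these machines; real consumption sits well below TDP outside of full load.]

```python
def runtime_hours(battery_wh: float, platform_w: float, gpu_tdp_w: float) -> float:
    """Crude battery-life estimate: capacity divided by total draw.
    TDP is a worst-case figure, so treat this as a lower bound on
    runtime under full GPU load, not a typical-use number."""
    return battery_wh / (platform_w + gpu_tdp_w)

BATTERY_WH = 56.0   # hypothetical 56 Wh battery pack
PLATFORM_W = 15.0   # assumed CPU + screen + rest-of-system draw

# TDP values taken from the post above
for name, tdp in [("GT 335M", 23.0), ("GT 425M", 35.0)]:
    print(f"{name}: ~{runtime_hours(BATTERY_WH, PLATFORM_W, tdp):.1f} h under load")
```

    [Under these assumptions the 335M rig lasts roughly half an hour longer under sustained load, which is consistent with the direction, if not the magnitude, of the real-world gap described above.]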
     
  30. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Well, I'm not as concerned with the battery life of the machine whilst using the discrete card. What is great about Optimus is that if you want to browse the web with 8 hours of battery life, you can; and then, when it comes down to gaming, I can plug my system in and have a great time. Of course, a lot of people do care about battery life whilst using the discrete GPU, so DaneGRClose's information proves very valuable. Thanks for that.

    The XPS line is strange, though; only the 420M (14 & 15) or the 435M (17) support Optimus. You can upgrade them, but then you lose Optimus (and are forced to go with the i7). Even though the 435M is technically the most powerful Optimus-implemented GPU I could find thus far, its usefulness is completely negated by the huge screen, resolution, and weight of the XPS 17, especially considering that the 435M in the XPS is only a slightly higher-clocked 425M. Therefore, I still come to the conclusion that the ASUS N43 is the best Optimus system at this point...
     
  31. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Not only does disabling the GPU extend battery life; disabling integrated GPUs like the Intel HD also extends battery life.

    The days of 2D graphics are today, lol. You do not need a GPU to drive a display. Yes, it extends battery life. I have no idea what sort of thinking goes into claiming it uses more battery.

    It is a weird stretch to think that if you were trying to display something that took the CPU a lot of time to render, you would be using more time or CPU power? Possible, but in real life it's not doable or doesn't matter. If you can't display it with the CPU, then obviously don't turn the GPU off...

    Turning off the GPU extends battery life by a lot. The GPU is not necessary to run your display.

    In desktops the GPU card sometimes comes with its own external display port, so perhaps that's where the confusion is coming from. There's just so much confusion coming from a few sources.

    If you turn your GPU off, it will be using much less power. Your battery will last longer. You will not be able to display 3D graphics, and rendering will be slower. Do not use this for games.

    What you all need is a utility that shows the watt usage of all the peripherals in your computer. For the life of me, it's been a while and I don't remember what it is called.

    It's a slam dunk when you see the watts consumed by the GPU when idling versus when it is turned off (none; it's that simple).

    Power consumption before and after, battery life increase. This is something I learned here on this forum a few years ago; I don't have the utility anymore and can't remember its name. It was a few machines ago for me.
     
  32. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    The only thing I will tell you is that the rigs running the 425M are not going to get 8 hours; they're not even going to get 6 surfing the web. Optimus really is an amazing idea, and if NVIDIA continues to develop it and steps up to the plate with some actual progress it will be amazing, but I can tell you that in real life it's not all it looks like on paper. Optimus is buggy at best right now; you don't really have control over anything at all in it. Sure, you can change your whitelist in the NVIDIA control panel, but not even that is 100%. If you go with an Optimus rig with a true mobile CPU (not an ultra-low-voltage UM/SU/SL) and a 4xxM GPU, the best I've seen in real life is about 4 hours surfing the web, and realistically even 4 hours will be tough to hit if you're actually using it. Another thing I'd recommend is to wait a little longer on the XPS series: it's brand new, and Dell has a track record of failing on the XPS line with overheating, throttling, and a whole slew of other issues. There's a reason a lot of people skip the XPS line and pay a bit more for an Alienware, or even go to another brand. Whatever you end up with, I hope it works for you and you enjoy it! ;)
     
  33. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Well, four hours is more useful than one! :)

    As for me, I may just get a ThinkPad with an Intel IGP loaded with Linux, then buy an Xbox 360 and play that when I get the gaming itch. Best of both worlds, eh?
     
  34. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    I'm pretty sure you have no idea what you're talking about.
     
  35. lupusarcanus

    lupusarcanus Notebook Consultant

    Reputations:
    244
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    LOL, that's exactly what I thought.
     
  36. City Pig

    City Pig Notebook Virtuoso

    Reputations:
    483
    Messages:
    2,322
    Likes Received:
    0
    Trophy Points:
    55
    Just so you know, AMD's answer to Optimus, Dynamic Switchable Graphics, should be hitting laptops very soon, and two of the known cards that will support it, the HD 6550M (rebrand of the Mobility HD 5650, which beats the GT 425M and trades blows with the GT 435M) and the HD 6570M (rebrand of the Mobility HD 5770, which should beat the GT 435M in most cases, especially if it has GDDR5), will offer great performance and better overclockability than the midrange Optimus GPUs available.
     
  37. PellyNV

    PellyNV Notebook Enthusiast

    Reputations:
    106
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    After reading this thread, I'm horrified to see how much false information is being posted regarding Optimus and how it functions.

    Here's a quick breakdown of some points I've read on this thread:

    You can play games on the GPU when running on battery
    • This is 110% false. The Optimus profile for a game shows you whether the IGP or GPU will be used to run the game. This is the same regardless of whether you're plugged in or not.

      Those who have never used Optimus are likely thinking of some of the more recent Switchable Graphics implementations where the system "automatically" disables the GPU when the plug is pulled. This is not Optimus by any stretch of the imagination.

    Disabling the IGP would save more battery life.
    • When Optimus is using the GPU, the only portion of the IGP which is not disabled or running at the lowest (barely running) power state is the display controller. The amount of power this consumes would be hard to even measure.

    Changing the Optimus profile (what some call the "Whitelist") doesn't work all the time.
    • Again, this is false. Optimus has several fallbacks should there be a driver issue causing any incorrect behavior. If the Optimus profile for an application isn't triggering the GPU, you can modify the profile to force the GPU. In addition, you can also enable a driver option that lets you right-click on the application's .exe and choose whether it should run on the IGP or the GPU.

    You'll see awful battery life from an Optimus system when browsing the internet.
    • This one had me nearly speechless, as it is painfully wrong. Web browsing is actually one of the key reasons we created Optimus. When you're browsing, Optimus will disable the GPU and use the IGP, since browsing is not intensive, to achieve the best battery life. The only exceptions are when you use Flash 10.1 or 10.2 and browse to YouTube or another site that can benefit from the hardware acceleration the GPU provides. The moment you browse away from that site, Optimus disables the GPU again.

      IE9 and browsers supporting HTML5 and WebGL will also cause Optimus to enable the GPU to handle that content. Again, once you browse away from something that benefits from the GPU, Optimus will disable it and move back to the IGP for the best battery life.

    With thanks,

    Sean

    Sean Pelletier
    Senior Technical Marketing Manager - Notebooks
    NVIDIA
     
  38. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Sean

    - The whitelist is crap sometimes; driver support for a lot of the machines running Optimus is horrible at best. For instance, do you want to tell me why you still don't have an answer to the multiple PunkBuster DX9 errors when a couple of weekend warriors have already developed some? The whitelist on some notebooks may work great, and the theory may work great, but the majority of the systems equipped with it still don't fully work. The whitelist also seems to make the two GPUs switch FREQUENTLY, killing the possible battery life by what I would guess to be 10-20%. Sure, a power user can change it to where it works decently, but your average person who buys an Optimus-configured computer doesn't know how to change settings such as driver .inf files and configurations that are over the head of a lot of consumers.
    - The battery life while surfing the web is decent, but again the buggy frequent switching, even while idling on the desktop, makes the battery life decent at best on most machines. Honestly, I love my M11xR2, but I'd throw Optimus out the window for good manual switching at this point.
    - What you are trying to say in your post is the way NVIDIA tries to convey and sell Optimus: the way it looks on paper. Optimus is an amazing idea, and if NVIDIA will actually step up to the plate and get it working properly, along with proper driver support, it will be amazing; but the fact remains that it is still a buggy function that has irritated more people than you can imagine.

    What I would like to see is NVIDIA step up and admit that it doesn't work correctly on a lot of machines. Don't play dumb with your consumers and promise fixes for 4+ months with absolutely nothing actually seeing the light of day. The 265-series PunkBuster fix is just the tip of the iceberg. NVIDIA also needs to admit (even if not publicly) that you are getting killed in almost every sector of the market by ATI/AMD; the only thing NVIDIA seems to be winning on is that only NVIDIA has CUDA and PhysX support on the GPU. I love NVIDIA products, but I've been a bit disappointed by what I've seen in the last year. Please don't take this as bashing; I only intend it as a report on what real end users are seeing with the products you make, and recommendations for what I would like to see in the future.
     
  39. Paralel

    Paralel Notebook Evangelist

    Reputations:
    57
    Messages:
    396
    Likes Received:
    0
    Trophy Points:
    30
    I thought maybe it was just me...
     
  40. crazycanuk

    crazycanuk Notebook Virtuoso

    Reputations:
    1,354
    Messages:
    2,705
    Likes Received:
    3
    Trophy Points:
    56
    Reading my mind, are we?

    And also, to other users: you can NOT disable the IGP, as machines with Optimus use the IGP as the final display output. Hence the PunkBuster issues.
     
  41. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Yup, the NVIDIA GPU actually runs through the IGP, so both run at the same time at all times. Technically the NVIDIA GPU is always on to some extent; it just isn't always being utilized as the actual display driver.
     
  42. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Hello Sean! Nice to see you on this forum. Anyway, what do you know of the Toshiba X505-Q892? I just got this notebook after the Thanksgiving holiday and found that performance drops when I unplug it from an outlet, hurting framerates in my graphically demanding games.

    I've tried everything I know, from changing the power plan to High Performance, to tweaking the advanced power options, to even changing registry settings to expose hidden power options, yet nothing seems to work.

    I had this sort of problem with my previous gaming laptop and the solution was to use the very first nvidia verde drivers when they came out which gave me maximum performance gaming while even unplugged.

    Anyway, I've tried to install the most recent Verde driver, 260.99, but it seems to be incompatible with the Toshiba X505-Q892 and won't install. I'm guessing that since this notebook is very modern and fairly new, it uses state-of-the-art technology such as the NVIDIA GTX 460M and Intel's i5-460M (not to be confused with the 460M GPU), and since I can't game at high performance even after fiddling with the power options, I'm assuming it is also utilizing NVIDIA's Optimus technology.

    I did some research and came across the video of the new "Optimus User Interface" (Take a Look: New Optimus Interface NVIDIA), which also led me to the M11x video showing the Optimus interface in action (YouTube - Nvidia Optimus Feature on the Alienware M11x- R2). I can't find this user interface, nor the option to choose a GPU, in my notebook's NVIDIA control panel.

    These are my questions. First, is the Toshiba X505-Q892 using Optimus technology? If it is, could that be the cause of the drop in performance when gaming unplugged? Also, will NVIDIA support this particular notebook with future releases of the Verde drivers? And in the meantime, I was thinking about manually installing the 260.99 driver using the "have disk" method, by downloading the driver from Laptopvideo2go.com. Would the new Verde driver(s) give me the Optimus user interface in my NVIDIA control panel, so I could force it to use the GTX 460M?

    I'm in desperate need of answers, and I'd appreciate it if you could help me, Sean. And thanks in advance for your time, sir.

    Jacob
     
  43. PellyNV

    PellyNV Notebook Enthusiast

    Reputations:
    106
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    The fix is checked into our upcoming Verde driver stream. In addition, we have other bug fixes, performance improvements, etc. that are also being checked into the driver build. Unlike those at home who are able to code up solutions for themselves, NVIDIA still needs to send these driver builds through exhaustive QA and ultimately through Microsoft for final WHQL certification. Hang tight, though: we'll be releasing the fix in the next Verde driver soon!

    Not correct actually (although I didn't believe it when we were first designing Optimus either!). Due to the logic we've built into the GPU itself, we are able to literally turn the GPU 100% off. Not some low power state or some other trick.

    When Optimus doesn't see a need for the GPU to be enabled, it disables the GPU and the GPU consumes 0 W. This is the magic of Optimus. There's a reason ATI didn't have an immediate answer for this. The technology here is anything but trivial; doing all of this without multiplexers is not a simple thing to accomplish.

    In contrast, the IGP is always on to some degree. When Optimus has the GPU enabled, only the display controller on the IGP is active. Everything else is either shut down or running at the lowest possible power state.

    When Optimus has the GPU disabled, the IGP is fully on.

    In each case, the IGP display controller is active.

    Hey Jacob,

    Great system, and I'm sure you'll be impressed with the gaming chops of the GTX 460M! Unfortunately, Toshiba did not choose to implement Optimus on that notebook. Regardless, you still shouldn't be seeing a performance hit if your Windows power options are set correctly. I don't have that system handy, but I vaguely remember there being some sort of Toshiba power-management application. I'd disable that, as it's likely what's throttling CPU and GPU clock speeds and hurting performance.

    As for Verde driver support: your X505 is a very recent system, so it likely hadn't finished going through qualification and validation in our labs. You'll likely have support in the next Verde driver, which we should be posting in the immediate future (certainly within 30 days).

    Enjoy the system!
     
  44. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Sean

    Would you like to explain to me how the NVIDIA GPU can be completely shut off yet also be running at a 135 MHz core clock at 0.8 V and outputting temperature and other sensor information? The answer is that it's impossible; it would be like unplugging a TV and still having it give you information. The GPU may go into a dormant, extremely-low-power-draw state, but on the M11x-R2 it never shuts off 100%.

    I hope you guys get the driver right this time. I understand the process is long, but the amount of time one driver fix has taken is getting a little ridiculous. You're also turning a lot of people away from Optimus, and possibly from NVIDIA as a whole, because of the mess-ups that have happened across multiple sectors of NVIDIA's business. The 3XXM/3XX series rebranding was pure crap; the 4XX series, in my opinion, has been decent but also a major flop considering how hyped up it was; and the Optimus issues have been another major flop, not because there are issues but because of how long it has taken NVIDIA to respond.

    Again, I love NVIDIA as a company, but NVIDIA's recent products and support are strongly making me consider going to ATI. I really do hope that your company can start to show a stronger market presence. Thanks for the info and response.

    Dane
     
  45. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    YouTube - Optimus Engineering Coolness for GeForce and ION
     
  46. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    Do you seriously believe what you just said?

    This is pathetic.
     
  47. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Richteralan, I don't care what that video shows. I can't say for certain that some rigs don't completely shut off the discrete GPU, as I haven't used all of them, but I can tell you for certain that it would be impossible for the GPU in the M11x-R2 to be reporting 135/270/135 without being powered to some degree, meaning it is drawing power. Explain to me how an unpowered device can give readouts of temperature, clocks, usage, etc. And I can tell you that after a ton of time using this rig with constant monitoring, it has never once stopped giving readouts, meaning it has never fully shut off. If you'd like, I'll even go to the next step of proving it.
     
  48. PellyNV

    PellyNV Notebook Enthusiast

    Reputations:
    106
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    DaneGRClose,

    Although I can appreciate your skepticism, I can honestly tell you the dGPU is electrically off and not consuming any power. When you exit a game, Optimus turns the GPU off immediately. The only "exception" is Flash 10.1/10.2: exiting a Flash video drops the GPU down into its lowest power state for a second or two before turning it electrically off. (Optimus does this in case you're about to launch another video.)

    When the GPU is off (you can verify this using the Optimus taskbar tool in the Verde drivers), it is consuming 0 W and running at a clock frequency of 0 MHz.

    What tool are you using to report clock speeds? Several things could account for the odd readings you're seeing: the application could be mapped to the wrong sensors, the polling interval could be too long, and so on.
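    [Editor's note] The stale-read failure mode Sean describes (a monitoring tool showing the last cached sensor value after the GPU stops answering) can be sketched as a toy model. This is purely illustrative; the class and function names below are hypothetical and are not NVIDIA, Alienware, or monitoring-tool code:

    ```python
    # Toy model of a monitoring tool that caches its last successful
    # sensor read. If the driver stops answering once Optimus powers
    # the dGPU off, the tool keeps displaying the cached value, so the
    # user still sees e.g. a 135 MHz core clock on an off GPU.

    class GpuOffError(Exception):
        """Raised when the dGPU is electrically off and cannot be polled."""

    class SensorPoller:
        def __init__(self, read_sensor):
            self.read_sensor = read_sensor  # callable returning MHz; may raise
            self.last_good = None           # cached value shown in the UI

        def poll(self):
            try:
                self.last_good = self.read_sensor()
            except GpuOffError:
                pass                        # keep displaying the stale value
            return self.last_good

    # Simulate one read while the GPU idles at 135 MHz, then power it off.
    state = {"on": True}

    def fake_sensor():
        if not state["on"]:
            raise GpuOffError
        return 135

    poller = SensorPoller(fake_sensor)
    print(poller.poll())  # 135 -- live reading
    state["on"] = False   # Optimus powers the GPU off
    print(poller.poll())  # still 135 -- stale cache, not a live clock
    ```

    Under this model, several independent tools would all show the same frozen 135/270/135 figures, since each one is repeating its own last successful read rather than a live measurement.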

    Remember, there are no multiplexers here eating up power, and the GPU is not running in some low-power state. Instead, it is literally electrically off and has been measured to consume 0 W. More details can be found in the Optimus whitepaper I wrote.

    Note: This was written when Optimus originally launched so some of the applications being mentioned might be dated. Regardless, everything technical still applies.

    Have a good one guys!

    Sean

    Sean Pelletier
    Senior Technical Marketing Manager - Notebooks
    NVIDIA
     
  49. DaneGRClose

    DaneGRClose Notebook Virtuoso

    Reputations:
    1,805
    Messages:
    2,550
    Likes Received:
    30
    Trophy Points:
    66
    Sean,

    I've used MSI Afterburner, EVGA Precision, sidebar gadgets, and GPU-Z, and every single one of them reports the exact same 135/270/135 @ 0.80 V reading, so one of a few things is happening here:
    1 - it doesn't actually shut off;
    2 - Optimus has an issue and isn't working as it should, which also means #1 applies;
    3 - Alienware has somehow changed the Optimus programming to prevent a full shutoff.
    If someone can tell me how something electronic can draw no power and still give readouts, I'm all ears. There's no way this many programs are all giving a false read, so I highly doubt that's the issue. I appreciate you coming on here, Sean; it's nice to see someone from NVIDIA stick their head out without having to be called.

    Dane
     
  50. SimoxTav

    SimoxTav Notebook Evangelist

    Reputations:
    273
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    Hello Sean,
    I recently bought an XPS 15 with a GT 420M and a Core i3.
    I've also noticed that only the drivers provided directly by Dell seem to work with Optimus (the others don't work at all). Considering the advertising on NVIDIA's site about this laptop, may we expect an official NVIDIA release that supports our card's ID? My worry is how long the OEMs take to update their drivers :(
     