The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Integrated/discrete GPU

    Discussion in 'Alienware 14 and M14x' started by /Drakk_, Jul 1, 2011.

  1. /Drakk_

    /Drakk_ Notebook Consultant

    Reputations:
    2
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    So we all (mostly) know how Optimus works - most apps run on the integrated GPU in the Sandy Bridge processor, and whitelisted apps run on the Nvidia GT 555M (in theory).

    What I'm wondering is: when running an app on the GT 555M, what is the GPU in the processor doing? Sitting idle? Or does the processor's GPU still function and help take some load off the GT 555M?
     
  2. BeastRider

    BeastRider Notebook Evangelist

    Reputations:
    63
    Messages:
    580
    Likes Received:
    0
    Trophy Points:
    30
    Never thought of this, but I'm guessing the integrated graphics doesn't help; it just turns off. This is a guess of course, lol. :D
     
  3. everythingsablur

    everythingsablur Notebook Evangelist

    Reputations:
    89
    Messages:
    312
    Likes Received:
    16
    Trophy Points:
    31
    My understanding of Optimus is that it is a muxless design. The integrated GPU is connected to the display, and the dedicated GPU is routed through the integrated GPU. When an application uses the GT 555M, that GPU renders everything in its own memory and passes the result through the Intel GPU for display. The integrated GPU is always "active"; it's just not rendering anything. This is how Optimus can seamlessly switch between rendering engines on the fly. A mux design would cause a screen flash on switching (and both GPUs would connect to the mux, which is connected to the display).
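The muxless flow described above can be sketched as a toy model. Everything here (the whitelist contents, function names, return strings) is illustrative only, not a real driver API:

```python
# Toy model of a muxless Optimus design: the iGPU owns the display,
# so frames rendered on the dGPU are copied through the iGPU's
# framebuffer on their way to the screen. Illustrative names only.

WHITELIST = {"game.exe", "benchmark.exe"}  # apps routed to the dGPU

def render(app: str) -> str:
    """Return which GPU rendered the frame that reaches the display."""
    if app in WHITELIST:
        # GT 555M renders in its own memory...
        frame = "frame rendered by dGPU"
        # ...then the frame is copied through the iGPU for scan-out.
        framebuffer = frame + " (copied through iGPU)"
        renderer = "dGPU"
    else:
        framebuffer = "frame rendered by iGPU"  # iGPU renders directly
        renderer = "iGPU"
    # Either way the display is always driven by the iGPU: no mux,
    # so no screen flash when the renderer changes per application.
    return renderer

print(render("game.exe"))     # dGPU
print(render("notepad.exe"))  # iGPU
```

The key point the sketch captures: switching renderers only changes where the frame comes from, never which GPU drives the panel.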
     
  4. niko2021

    niko2021 Notebook Evangelist

    Reputations:
    93
    Messages:
    592
    Likes Received:
    6
    Trophy Points:
    31
    The Nvidia GT 555M's output goes through the CPU, which contains the Intel HD 3000 integrated GPU. That's how it switches seamlessly: the CPU package is both the CPU and the integrated GPU. And if you're not using the GT 555M, the PCIe lanes carry no current, according to Nvidia, so your battery isn't being drained by the Nvidia card. That's why it only works on Sandy Bridge processor laptops.
     
  5. CGSDR

    CGSDR Alien Master Race

    Reputations:
    285
    Messages:
    1,477
    Likes Received:
    10
    Trophy Points:
    56
    My understanding, before everythingsablur's post, was that the GT 555M handles the apps you assign to run on it, while the integrated GPU handles the display and any apps you didn't assign to the GT 555M. That makes it much more effective at power saving on battery, because when you're not running a game it simply uses the integrated GPU.
     
  6. niko2021

    niko2021 Notebook Evangelist

    Reputations:
    93
    Messages:
    592
    Likes Received:
    6
    Trophy Points:
    31
    Optimus is also designed with preset programs that it'll run on automatically. Whenever you watch a YouTube-type video, the GT 555M runs. Most popular games are also covered by the global settings in the Optimus control panel, which uses the GT 555M automatically. I just change all my games to use the GT 555M explicitly instead of relying on the global setting, just to be certain. Optimus will not really let you do something graphically intensive without the GT 555M being on almost always. I have the GPU activity taskbar icon activated, which shows when the GT 555M is in use, and it's on when I least realize I'm using it :)
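The profile behaviour described above (a global default plus per-application overrides) amounts to a simple lookup. A toy sketch, with made-up app names and setting values that don't reflect the actual driver's storage format:

```python
# Toy sketch of Optimus profile resolution: a per-application setting,
# if one exists, overrides the global default. App names and values
# here are invented for illustration, not the driver's real format.

GLOBAL_DEFAULT = "auto"  # driver decides, using its built-in whitelist

PER_APP = {
    "crysis2.exe": "GT 555M",     # manually forced to the dGPU
    "browser.exe": "integrated",  # manually forced to the iGPU
}

def gpu_setting_for(app: str) -> str:
    """Per-app override wins; otherwise fall back to the global setting."""
    return PER_APP.get(app, GLOBAL_DEFAULT)

print(gpu_setting_for("crysis2.exe"))    # GT 555M
print(gpu_setting_for("solitaire.exe"))  # auto
```

Forcing each game explicitly, as the post suggests, just means populating the per-app table instead of trusting the "auto" path.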
     
  7. /Drakk_

    /Drakk_ Notebook Consultant

    Reputations:
    2
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    Hm. I wonder if there's a way to make them both render (if they're not doing so already) when an app runs in discrete graphics mode. Something like a pseudo-SLI mode using both the integrated and the discrete GPU to render a scene. Perhaps good for another few FPS?
     
  8. everythingsablur

    everythingsablur Notebook Evangelist

    Reputations:
    89
    Messages:
    312
    Likes Received:
    16
    Trophy Points:
    31
    Exactly! The IGP is always drawing power (it's on the CPU die), but the dGPU needs minimal (if any?) power when not active = power savings.

    This is pretty much what AMD is doing with their Hybrid CrossFire/CrossFireX technology. It's pretty restrictive though as only specific GPU-APU combinations work. For the most part I think it's still a desktop-only technology, but with their new mobile APUs, it might not be too far off if some manufacturer is willing to pair a discrete GPU with an AMD APU.
    Hybrid CrossFire Combinations chart
     
  9. Tergazzi

    Tergazzi Notebook Enthusiast

    Reputations:
    6
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    WOW, that is a pretty complicated process just to make a battery last longer. Wouldn't it have been easier to just make better batteries?... lol
     
  10. everythingsablur

    everythingsablur Notebook Evangelist

    Reputations:
    89
    Messages:
    312
    Likes Received:
    16
    Trophy Points:
    31
    Actually no... Battery technology is limited by chemistry. Unless someone discovers new, easily mass-producible chemical formulations (might need some new elements while we're at it...), we're pretty much stuck. Lithium-ion batteries were invented in the 1970s, and that's still the height of our battery technology. It is much easier to work with what you can control, which is engineering silicon for greater efficiency.
     
  11. Tergazzi

    Tergazzi Notebook Enthusiast

    Reputations:
    6
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Wow, I did not realize that, thanks for the info. That's amazing: we haven't been able to improve what was invented in the '70s, but we can take what used to fill a whole room (a computer) and now fit it all into a 10" tablet. Seems kinda strange we can't improve batteries.
     
  12. everythingsablur

    everythingsablur Notebook Evangelist

    Reputations:
    89
    Messages:
    312
    Likes Received:
    16
    Trophy Points:
    31
    It's not really that hard to fathom. Think of computers and batteries as cars and gasoline. You want a car to go faster, you build a more complex, more powerful, more efficient engine. That could be everything from better headers, ignition systems, fuel injectors, super/turbo charging, you name it. Lots of tricks in that book. No matter what you do to make the car go faster, the gasoline is still pretty much the same today as it was 40 years ago.

    Chemistry is chemistry. A chemical reaction between two elements will always produce the same electrical current as the last time. You can't make the same two elements magically have a bigger reaction. The capacity you get from a given chemical reaction is never going to change, because you can't just make a lithium atom with more electrons. You need to create new chemical compositions entirely. Never mind mining said chemicals in enough quantities for mass production...
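The gap the car analogy points at can be put in rough numbers. These are ballpark specific-energy figures commonly quoted for the two chemistries, not exact specs for any product:

```python
# Rough specific-energy comparison (ballpark public figures, not exact
# specs): chemistry caps how much energy a kilogram of material holds,
# no matter how clever the engineering around it gets.

GASOLINE_WH_PER_KG = 12_000  # gasoline: roughly 12 kWh of chemical energy per kg
LI_ION_WH_PER_KG = 200       # typical lithium-ion cell: ~100-250 Wh per kg

ratio = GASOLINE_WH_PER_KG / LI_ION_WH_PER_KG
print(f"Gasoline stores roughly {ratio:.0f}x more energy per kg than Li-ion")
```

Even with generous cell figures, the ratio stays in the tens, which is why squeezing efficiency out of the silicon pays off faster than waiting on new battery chemistry.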

    My hope/guess is that lithium-titanate batteries (aka the Super Charge ion Battery, or SCiB) will be the next big step, but they are only starting to be made for electric buses now. It will be years before they get small enough to put into laptops commercially. Toshiba prototyped one last year. That, or lithium-sulfur, since sulfur is really cheap and quite plentiful. Not sure if it has enough capacity or longevity though...