The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How far have mobile GPUs progressed in the last 3 years?

    Discussion in 'Gaming (Software and Graphics Cards)' started by PIN360, Apr 6, 2011.

  1. PIN360

    PIN360 Notebook Enthusiast

    Reputations:
    0
    Messages:
    37
    Likes Received:
    2
    Trophy Points:
    16
    So I was wondering... I was trying to compare the power of my 8800M GTX SLI setup to the current generation of GPUs. I remember when the 260Ms and 280Ms came out, the power difference wasn't that much greater. The 260M, for example, was pretty much a rebranded 9800M GTX.

    Are today's single notebook GPUs more powerful than 8800M GTXs in SLI? If so, which ones?

    There's a website that lists mobile GPUs and what tier they are in, but that's not what I want to look at. I was hoping for some comparisons made by real people with real game experience.
     
  2. whitrzac

    whitrzac The orange end is cold...

    Reputations:
    497
    Messages:
    1,142
    Likes Received:
    7
    Trophy Points:
    56
    Even from a desktop standpoint, a friend has a pair of 8800 GTS cards and can play most modern games without issues...
     
  3. Baka

    Baka (・ω・)

    Reputations:
    2,228
    Messages:
    2,111
    Likes Received:
    20
    Trophy Points:
    56
    A single GTX 485M is probably stronger than 8800M GTX SLI ._.
     
  4. dttran83

    dttran83 Notebook Deity

    Reputations:
    272
    Messages:
    837
    Likes Received:
    7
    Trophy Points:
    31
  5. shinakuma9

    shinakuma9 Notebook Deity

    Reputations:
    172
    Messages:
    1,512
    Likes Received:
    0
    Trophy Points:
    55
    Moore's Law.
     
  6. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Moore's Law applies to Intel CPUs if I'm not mistaken. If anything, the GPUs in laptops seem to get larger and hotter with each performance increment.

    As already mentioned, some of the high-end single GPU solutions available now can trump SLI or Crossfire combinations of cards 1-2 generations ago.
     
  7. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    From what I've seen, a single GTX 485M is slightly stronger than my 9800M GTXs in SLI.
     
  8. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    Moore's Law doesn't just apply to Intel. It was an observation made by Intel co-founder Gordon Moore, and it more accurately states that the number of transistors on a chip doubles approximately every 18 months, which has pretty much held true. Just because the chips are larger and hotter doesn't mean they aren't adhering to Moore's Law, or rather, observation.
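
    To see what that doubling rate implies over the roughly three-year window the OP asked about, here is a minimal back-of-the-envelope sketch in Python (the 18-month interval is the figure quoted above; the starting transistor count is a purely hypothetical number for illustration):

        # Rough Moore's Law projection using the 18-month doubling interval quoted above.
        # The 500-million-transistor starting point is a hypothetical example value.
        def projected_transistors(start_count, months, doubling_months=18):
            """Project a transistor count forward by `months` months."""
            return start_count * 2 ** (months / doubling_months)

        print(projected_transistors(500e6, 36))  # 36 months -> 2^2 = 4x, i.e. ~2 billion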
     
  9. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    I stand corrected then, assuming of course that both Nvidia and ATI have actually doubled their transistor counts every other GPU iteration over the last 3 years.
     
  10. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    Well, Moore's Law is more an observation than anything else. There's no need to think so hard about it :D
     
  11. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    One GTX 485M will win by 40 to 50 percent.
     
  12. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
  13. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    GTX 485M: 40nm, 100W TDP, 256-bit GDDR5, 384:64:32 (unified shaders:TMUs:ROPs), clocks of 575MHz core / 1100MHz shader / 1500MHz memory (3000MHz effective, though that seems a bit high to me; that's what I've got down for this GPU). More info can be found on the wiki or by searching the forums.
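
    As a quick sanity check on those figures, the bus width and effective memory clock determine the theoretical memory bandwidth. A minimal sketch using only the numbers quoted above:

        # Theoretical memory bandwidth from the GTX 485M figures quoted above.
        bus_width_bits = 256        # 256-bit GDDR5 bus
        effective_clock_mhz = 3000  # effective data rate quoted above, in MHz
        bytes_per_transfer = bus_width_bits / 8
        bandwidth_gb_s = bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9
        print(f"{bandwidth_gb_s:.0f} GB/s")  # ~96 GB/s theoretical peak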
     
  14. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I tried looking on the wiki, and Google brought up no results at all, not even on Nvidia's main page for the card. Could you post a link?
     
  15. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    [image: GTX 485M spec sheet]

    Direct sauce.
     
  16. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Thanks bra, now that's some nice stats. I wonder where it will be in another 5 years. I'm hoping graphene is playing a part in it by 2020, personally, or that the fruits of "Unlimited Detail" are real and that just messes up graphics cards forever. I mean, if we could just have one piece of software like this guy claims, we'd only need to upgrade hardware for higher resolutions. He makes it all convincing, but without anything else released on how he does it, it all seems too shady.

    For people who don't know, Unlimited Detail Technology - Home is software a guy developed that shows, well... unlimited detail, defeating the purpose of more powerful video cards.
     
  17. Räy

    Räy Guest

    Reputations:
    0
    Messages:
    0
    Likes Received:
    0
    Trophy Points:
    0
    There was an old Tom's Hardware article from last year that went like this:
    Mobility Radeon Vs. GeForce M: The CrossFire Advantage : A Leopard That Changes Its Spots


    clevo x8100
    sli 5870m and 940xm cpu

    compared to

    desktop
    ati 5850 and intel i7 920

    The $4,000 Clevo matched exactly with $600 worth of desktop parts. Mind you, an Intel i7 920 and a 5850 can max just about anything nowadays besides Crysis and Metro 2033.

    Now, the Radeon 6970M is a single desktop ATI 6850 downclocked. Though the desktop 6850 is a bit slower than the desktop 5850, it is still a massive increase in performance. People with the Clevo X7200 can easily post scores in the 5000s in 3DMark11 Performance using dual 6970Ms with an Intel i7 960-990X.
    AVADirect's X7200: The GeForce GTX 485M SLI Mobile Graphics Giant : Better? Faster? Cheaper?
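
    To put that price gap into numbers, here is a trivial sketch using only the figures from this post (treating the two systems as roughly equal in gaming performance, as the article found):

        # Price comparison at roughly equal performance, using the figures above.
        laptop_price = 4000   # Clevo X8100 build quoted in the article
        desktop_price = 600   # i7 920 + Radeon HD 5850 desktop parts
        print(f"The laptop costs about {laptop_price / desktop_price:.1f}x as much")  # ~6.7x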
     
  18. Mr_Mysterious

    Mr_Mysterious Like...duuuuuude

    Reputations:
    1,552
    Messages:
    2,383
    Likes Received:
    15
    Trophy Points:
    56
    $4000 to $600???? Deeee-amn! :O

    Mr. Mysterious
     
  19. Mr_Mysterious

    Mr_Mysterious Like...duuuuuude

    Reputations:
    1,552
    Messages:
    2,383
    Likes Received:
    15
    Trophy Points:
    56
    Then again...the laptops have distinct advantages that desktops don't have ;)

    Mr. Mysterious
     
  20. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    That's why I have a decidedly midrange laptop and a desktop that can max everything out.
     
  21. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    It all depends on your individual needs, really. I have it the other way around because I like to be able to game and do other things wherever I wish. My desktop is an old dual core, and I use it for web surfing, downloads, word processing, etc.
     
  22. TomJG90

    TomJG90 Notebook Evangelist

    Reputations:
    46
    Messages:
    425
    Likes Received:
    0
    Trophy Points:
    30
    My desktop is a P4 and I never, ever want to use it. Either way, all you can say is wow at how GPU and CPU power has grown. I can't imagine what it will become in this decade. All from the humble P3 in 2000...
     
  23. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
    Ugh, I wish people would stop talking about Moore's Law (/observation).

    In these current times, it's not so much a prediction as a goal for Intel.
     
  24. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Ah, let's see, mobile gaming has waxed and waned compared to desktops. When I got my 1694WLMi it had a midrange X600 GPU, which was exactly the same chip as the desktop version, just downclocked. No matter, it overclocked 50% anyway.

    It got left behind a bit after the X800 series, came back around the HD4 series, took a dip for a while when we were stuck with the 4670/9600M, then with the 5 series and the GTX 460M things have come back again.