The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia 580M vs. Nvidia 485M

    Discussion in 'Gaming (Software and Graphics Cards)' started by hellhoundpro, Jul 10, 2011.

  1. hellhoundpro

    hellhoundpro Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Is it worth $200?

    At this point I'm finding it hard to get gaming benchmarks for both, because they generally ship in different laptops and the reviews compare different games. I know the results from benchmarking software aren't that different, but I want to see the true performance difference in games.

    Anyone got links or opinions to share?
     
  2. Nick

    Nick Professor Carnista

    Reputations:
    3,870
    Messages:
    4,089
    Likes Received:
    644
    Trophy Points:
    181
    The performance difference is only like 5-8%.

    I would stick with the cheaper 485m.
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I know a 6970 with an SSD is even better value :p
     
  4. KillerBunny

    KillerBunny Notebook Evangelist

    Reputations:
    207
    Messages:
    683
    Likes Received:
    0
    Trophy Points:
    30
    6990m @ the same price as a 6970m is even better :p
     
  5. mangos47

    mangos47 Notebook Consultant

    Reputations:
    84
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    I'd prefer the 485 and save that $200 for something else. The performance improvement is not as big as going from a 560m to a 485m, which costs $295 on some laptop models, and that's the more economical choice IMO.
     
  6. Kanashii

    Kanashii Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    But what about heat? And what about power draw?
     
  7. KillerBunny

    KillerBunny Notebook Evangelist

    Reputations:
    207
    Messages:
    683
    Likes Received:
    0
    Trophy Points:
    30
    The 580m is based on the same architecture as the 560 Ti; the 485m is based on an older architecture. The 580m is essentially a 485m, but with better heat management, hence the higher stock clocks.
     
  8. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    not that there is much difference in clocks
     
  9. KillerBunny

    KillerBunny Notebook Evangelist

    Reputations:
    207
    Messages:
    683
    Likes Received:
    0
    Trophy Points:
    30
    Enough to make a 5-10% jump over the 485m (depending on benchmark/game/test). More importantly, the better heat management translates to more overclock headroom.
     
  10. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    That is mostly because of some redesign; the difference is, if I'm not mistaken, 50 MHz on the core.
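    As a rough sanity check on that figure: taking the ~50 MHz bump against the 485M's published base clock of roughly 575 MHz (a spec figure, not something stated in this thread), a minimal back-of-the-envelope sketch in Python:

        # Rough estimate of what a ~50 MHz core bump is worth,
        # assuming the 485M's published ~575 MHz base clock
        # (clock figures are assumed, not taken from this thread).
        base_clock_mhz = 575.0
        bump_mhz = 50.0
        gain = bump_mhz / base_clock_mhz
        print(f"Core clock increase: ~{gain * 100:.0f}%")  # ~9%

    A best-case ~9% clock gain lines up with the 5-10% benchmark spread mentioned above, since games rarely scale perfectly with core clock.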
     
  11. animecrisis

    animecrisis Notebook Guru

    Reputations:
    18
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    I own an X7200 with dual 485s. I am waiting until they release the 585M cards before I upgrade. If you are buying new, you can either wait, or for $200 more I would buy the better video cards. I look at it this way: you are paying oodles for the laptop anyway, so why would you skimp on the video cards? For me it's different, because it would cost me $1600 to upgrade. A very good reason to wait, lol. Also, the performance difference between the 485 and the 480 cards was very pronounced when they came out. Just my thoughts.
     
  12. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    There will be no 585M.
     
  13. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I agree, there is just nowhere to go.
     
  14. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Really, I think that because Nvidia isn't coming out with Kepler for desktops until next year, there will probably be at least one more mobile graphics card in the 5xx series, and it will have to have more CUDA cores.
     
  15. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Um.... no. The GTX 570 (480 cores) is the next chip in line, and there's no way that Nvidia can fit that 219W TDP into the mobile space.

    It would be so severely downclocked, it'd be the 480M all over again. Maybe even worse.
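    As a rough illustration of how severe that downclock would be: assuming dynamic power scales roughly linearly with core clock at a fixed voltage and a high-end mobile power budget of about 100 W (both simplifying assumptions; the ~732 MHz figure is the desktop GTX 570's published core clock, not from this thread):

        # Back-of-the-envelope: how far a desktop GTX 570 would need to be
        # downclocked to fit an assumed ~100 W mobile power budget,
        # using the simplification that power scales linearly with clock.
        desktop_tdp_w = 219.0      # GTX 570 TDP quoted in the post above
        desktop_clock_mhz = 732.0  # published desktop GTX 570 core clock
        mobile_budget_w = 100.0    # assumed mobile (MXM) power budget

        scale = mobile_budget_w / desktop_tdp_w        # ~0.46
        mobile_clock_mhz = desktop_clock_mhz * scale   # ~334 MHz
        print(f"Estimated mobile clock: ~{mobile_clock_mhz:.0f} MHz")

    Undervolting would claw some of that back, but the clocks would still end up far below the desktop part, which is the 480M scenario described above.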
     
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Well unless they magic up another core or release the 480M all over again I don't see what they are going to do.
     
  17. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Maybe a dual-GPU solution, using something like a GTX 550 or something, I dunno, but since those Kepler cards won't be coming till late next year, they will need to cook up something.
     
  18. Patrck_744

    Patrck_744 Burgers!

    Reputations:
    447
    Messages:
    1,201
    Likes Received:
    41
    Trophy Points:
    66
    Even the X7200 wouldn't be able to handle that.
     
  19. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The problem with creating a dual-chip solution: even IF you fit it onto an MXM card AND make it faster than the single chip, you get diminishing returns. When you put two of those together you are effectively running quad SLI vs. dual SLI of the higher-end chip, and that's only ever going to end badly for the quad setup.

    All existing chips are in use already, so I don't even know which one you would use anyway, as the current 580m already has twice the power of the 560M and you would need beefier chips than that to get ahead.
     
  20. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    Maybe they'll refresh the GF100 (480m) and make something like a 590m; I can't really see Nvidia not releasing another mobile Fermi GPU before the release of Kepler.
    I'd bet that we'll see another Nvidia Fermi card before the end of the year.
     
  21. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    They are not going to release a GPU that performs worse, which that would.

    That or we get MXM-C.
     
  22. animecrisis

    animecrisis Notebook Guru

    Reputations:
    18
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    I disagree. I imagine they will at least clock it higher, like they did for the 580m. Due to the competitive nature of the market they have to release new cards fairly often. I'm going to venture that they probably held back a little on the 580 so the 585 will be about a 20% increase in performance when they release it. That would give them time to release their new architecture.
     
  23. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Actually, they didn't just overclock a 485m; they lowered the voltage from 1.00 V to 0.87 V, which means if you could raise it back to 1.00 V you would really have something, major overclockability, something like 30-40%. Another forum member has his overclocked to 820 MHz on the core, and the GTX 560 Ti runs at 822 MHz, so basically it's running at the clock rate of the desktop card.
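    A rough sense of what that voltage bump plus overclock does to power, using the common approximation that dynamic power scales with clock times voltage squared (the ~620 MHz stock clock is the 580M's published spec, assumed here rather than taken from the thread):

        # Rough dynamic-power estimate for the overclock described above,
        # using the simplification P ~ f * V^2.
        stock_clock_mhz = 620.0  # published GTX 580M core clock (assumed)
        stock_voltage_v = 0.87   # stock voltage from the post above
        oc_clock_mhz = 820.0     # overclock reported in the post above
        oc_voltage_v = 1.00      # voltage raised back to the 485M level

        ratio = (oc_clock_mhz / stock_clock_mhz) * (oc_voltage_v / stock_voltage_v) ** 2
        print(f"Estimated GPU power increase: ~{(ratio - 1) * 100:.0f}%")  # ~75%

    A roughly 75% jump in GPU power is consistent with the system-level draw discussed a few posts down.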
     
  24. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Yes, but the 560 Ti has quite a jump in power consumption too.

    I doubt you want to be drawing 125W or more from the slot.
     
  25. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    He told me he was pulling 224 watts when it was like that, and the power brick is only 180 watts.
     
  26. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Which means the brick was maxed out, assuming it's 80% efficient.
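    A quick check of that arithmetic (the 80% efficiency is the assumption from the post above, and 224 W is the reported wall draw):

        # Sanity check: 224 W measured at the wall times an assumed
        # 80% brick efficiency gives the DC power actually delivered.
        wall_draw_w = 224.0
        efficiency = 0.80
        dc_output_w = wall_draw_w * efficiency
        print(f"Estimated DC output: ~{dc_output_w:.0f} W")  # ~179 W, right at the 180 W rating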
     
  27. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, pretty much. I'm assuming he was reading the input wattage as opposed to the output.
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    If it were the output, his PSU would explode.
     
  29. animecrisis

    animecrisis Notebook Guru

    Reputations:
    18
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    I know; I am running two 300W power bricks with an adapter so they feed into my X7200.
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    It's very hard to get even power draw, though.
     
  31. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    These high-end mobile GPUs are already powerful enough.
    What's the point in raising voltages to gain a higher OC and risk pushing temperatures into dangerous territory?

    I would sooner undervolt the GPU and OC by about 10 to 20% (which should be doable).
    Lower temps, and still higher clocks.
     
  32. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    No way can cards at this level undervolt and OC that much.

    Mobile cards are still a level of play behind desktops, and at FHD resolutions with AA/AF it can make a difference.
     
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    So why was I able to undervolt my 9600M GT GDDR3 (albeit a mid-range card) and OC it by about 20%?
    I suspect high-end mobile GPUs would be able to do something similar.

    As for the high-end mobile GPUs being a level of play behind desktops... well, OC-ing them by 30% won't really make that big of a difference to begin with, and you risk your laptop by overvolting them too much.

    At 1920x1080, I highly doubt that one seriously needs AA that much.
     
  34. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    9600M GT is 4 generations old! From a completely different era.

    Also I can really tell the difference!