The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GTX580m vs GTX485m

    Discussion in 'Sager and Clevo' started by Dr.wahab, Jun 28, 2011.

  1. Dr.wahab

    Dr.wahab Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    I already bought a Sager with the GTX 485M. Is it worth it to upgrade to the GTX 580M?


    thanks
     
  2. TheGreatAnonymous

    TheGreatAnonymous Notebook Consultant

    Reputations:
    155
    Messages:
    213
    Likes Received:
    8
    Trophy Points:
    31
    In short, no. Perhaps if you don't mind paying a premium for the latest bleeding-edge technology; otherwise, stick with the 485M. I used to think the 485M was overpriced, but with the recent price reduction it's quite a steal now. You could probably overclock the 485M to just about match the 580M anyway.
     
  3. templar2323

    templar2323 Notebook Guru

    Reputations:
    0
    Messages:
    71
    Likes Received:
    0
    Trophy Points:
    15
    I agree. But I also think we need to give it some more time to get better benchmark results.
     
  4. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    The 580M is not just a higher clock; it also uses a different core than the GTX 485M. I think it would be a serious waste of money to upgrade from the GTX 485M, though.
     
  5. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    I'm in the market for a new gaming machine that can handle Battlefield 3. Unfortunately, the game isn't out yet, so there is no way to tell if the 580M will be a big improvement. I'm more inclined to pay the extra $200 for the 580M. From the benchmarks I've seen, these new games really tax these mobile GPUs, and I'm not sure even the 580M can handle BF3. When I say handle, I'm talking 1920x1080 at ultra and running at least 40 fps. I don't play any of that 720p or 1600x900 nonsense other than on the Xbox. I've seen the videos of BF3. I want THAT image quality. Nothing less. I don't think the 485M can handle it. I'm thinking I'll need CrossFire or SLI.
     
  6. bifnewman

    bifnewman Notebook Evangelist

    Reputations:
    6
    Messages:
    580
    Likes Received:
    0
    Trophy Points:
    30
    I suggest waiting until that game comes out, or at least until more information is released on it. That way, you might be able to snag an aftermarket 580M for a lower price if you need it. To be honest, from the reports I have seen on power efficiency, I don't think there will be much difference. I've heard it uses even more power than the 485M, and it's only 45 MHz over the 485M, which is a very small overclock.
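    A minimal back-of-envelope sketch of that 45 MHz point, assuming the commonly cited stock core clocks (575 MHz for the GTX 485M, 620 MHz for the GTX 580M); real-world gains vary by game:

    ```python
    # Back-of-envelope: how big is the 580M's core-clock bump over the 485M?
    # Clocks are the commonly cited stock values; actual gains vary by workload.
    gtx_485m_core_mhz = 575
    gtx_580m_core_mhz = 620

    delta_mhz = gtx_580m_core_mhz - gtx_485m_core_mhz   # 45 MHz, as noted above
    pct_gain = 100 * delta_mhz / gtx_485m_core_mhz      # ~7.8%

    print(f"Clock bump: {delta_mhz} MHz (~{pct_gain:.1f}% over the 485M)")
    ```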
     
  7. bartman8888

    bartman8888 Notebook Geek

    Reputations:
    23
    Messages:
    90
    Likes Received:
    0
    Trophy Points:
    15
    The consensus in the "GTX 580M" thread is "no" or "wait and see".

    If the 485M has already been delivered, it's not cost-effective to upgrade for a ~8-10% performance boost (until third-party benchmarks/overclocking stats say otherwise). If your reseller offers a $200 discount for the 485M, take the discount.

    Unless you want the personal satisfaction of having "the fastest GPU", stick with the 485M and take the discount.
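    For the value-for-money angle, a minimal sketch of that $200-for-8-10% trade-off, using only the figures quoted above:

    ```python
    # Rough cost-per-performance check on the $200 485M -> 580M upgrade,
    # using the ~8-10% boost quoted above.
    upgrade_cost_usd = 200

    for gain in (0.08, 0.10):                  # ~8-10% boost, per the thread
        cost_per_pct = upgrade_cost_usd / (gain * 100)
        print(f"{gain:.0%} faster -> ${cost_per_pct:.0f} per 1% of performance")
    ```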
     
  8. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    I bet the 580M will run cooler at idle with its 50 MHz idle clock,
    and be more overclockable... maybe 780~800 MHz core @ stock voltage :)
     
  9. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    *IF* (and this is a BIG IF) EA actually optimizes the Frostbite 2 engine PROPERLY (not to be confused with Frostbite 2.0 from MoH/BF:BC2), then the card will have plenty of power. Given the visuals the Source Engine is able to produce overall, and the visuals the Unreal Engine *CAN* pull off (as seen in Call of Duty: World at War on the PC) for lighting, airborne particles, bloom, etc. (there are a few maps that are simply breathtaking, and fire looks GREAT), there's absolutely no need for the visuals in a BF3 to require SLI 580Ms. It's in the power class of the desktop 560 Ti, it should be overclockable to a good degree, it will have better, more optimized drivers by then, etc. The list goes on. It should have enough power to push out BF3 at fantastic settings. My 280M will probably roll over and die, however.

    Again, this is a LARGE "IF", and I hope with all my nerdy gamer self that EA doesn't do the stupid thing and leave the game un-optimized just because they figure PC gamers will have overflowing, waste-able power. I *did* see them display it at E3 though, and even with a bare shell of an OS to run the game, they had multiple demos with PCs running in multiplayer, enjoying those graphics; those PCs would all have to have been pretty powerful. That being said, I don't think EA would have put SLI'd GTX 580s in desktops just to achieve such prowess, even if it was for E3; that'd be too expensive. So this gives me hope. 3D will probably be out of the question though. Just sayin'.
     
  10. VeeTee

    VeeTee Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    BF3 is running on Frostbite 2. BFBC2/MoH were running on 1.5. CoD:WaW runs on a slightly modified "CoDEngine" developed by IW for CoD2, and it is itself optimized for the PC.
     
  11. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    The difference is that CoD has very small maps and a small number of players, so the processing power required is low. I play both series. The BF series just has more happening in a scene than CoD does. On top of that, all the new visuals in BF3 probably put it on par with the Prey series in terms of graphical demand. Maybe even more.

    I'm definitely not buying a new machine right now. My hope is that I can play with my weak GTX 285 on my desktop, or on a PS3. I'd rather not spend $2-3k just for one game. But if the game is really that good and goes back to the days of 64-player maps, I might just have to buy a machine to join in one last hurrah for PC gaming.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I saw that Frostbite 2.0 (as in Frostbite 1, version 2.0) was being used for BF:BC2 and MoH, and Frostbite 2 (as in Frostbite 2, version 1.0) is to be used for BF3. I wasn't aware that Frostbite 1 version 1.5 was used; I've not even heard of something like that.

    Also, I saw multiple times in various places that CoD ran on a version of the Unreal Engine. I suppose what you say is not out of the question, but even a modified Unreal Engine is limited by the Unreal Engine, no? If it's got nothing to do with the Unreal Engine, then that makes that point moot; however, it's still a beautiful game to behold nonetheless. Best-looking CoD game to date, I would say; surpassing "successors" MW2 and Black Ops.

    Also, I wasn't saying that Source or World at War were unoptimized for the PC; quite the contrary. I simply mean that if they can squeeze all of those impressive visuals and feats out of good DX9 games, with such good optimization that they run flawlessly maxed out even on "dated" hardware like my 280M (a 9800GTX+ in desktop terms; a far step back from Fermi), then Frostbite 2 *can* be properly optimized to pull off all those amazing, beautiful visuals without requiring SLI GTX 580s or something.
     
  13. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    The difference between the GTX 580M and the 485M is only the core.

    The 580M uses the GF114, which is the exact same architecture but slightly more efficient (only the slightest bit), which will most likely mean the 485M overclocks to about 45 MHz less than the 580M.

    Don't know if this was already posted, but here's the Notebookcheck link:

    NVIDIA GeForce GTX 580M - Notebookcheck.net Tech
     
  14. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    I've seen that link on Notebookcheck, and that's what worries me. At high quality 1080p, the 580M can only do 42 fps in BFBC2. Surely BF3 will bring the 580M to its knees. People on this forum are saying a 10-15% improvement is not worth $200. On the contrary, that 10% may mean the difference between playable 1080p and downgrading to some chump resolution. It would be like watching Avatar 3D on an old SD tube. Some media are worth watching at max potential. Sure, CoD looks great, but mind-blowing visuals are not why I play CoD. CoD is all about scripted cinematics. BF3 is all about huge maps, destructible physics, a massive number of players, every vehicle you could want, and awesome visuals.
     
  15. Mirakel

    Mirakel Notebook Guru

    Reputations:
    0
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    I agree, sir.

    That FPS rating doesn't make sense. If you look at this review, you can clearly see that the GTX 485M, just like the HD 6970M, can get ~50 FPS in BFBC2 at 1080p at the highest settings. The GTX 580M should theoretically be 10% faster, so it should get ~55 FPS in BFBC2 at 1080p on the highest settings. That is, with a configuration similar to what NBC uses in its Eurocom Racer and Schenker XMG P501 reviews.

    You will be able to play BF3; however, I highly doubt you can get to the absolute max with a single mobile GPU, since their demos at E3 were running on desktop GTX 580 cards and DICE admitted it wasn't running at the highest settings, to ensure fluid gameplay. We'll know when the demo comes out. Wait and see, I guess..

    For gaming, a laptop's weak point will always be the GPU. I agree that the GTX 485M is very tempting at its current price, but the extra investment can be worth it for games like BF3. We just don't exactly know the benefits of the GTX 580M over the GTX 485M apart from the spec sheet.
    Let's just hope someone can do some gaming benchmarks, runtime measurements and overclocking on the GTX 580M before the GTX 485M disappears!
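    A quick sketch of that estimate, using only the figures quoted above (the ~50 FPS BFBC2 baseline from the review and the ~10% theoretical uplift); an extrapolation, not a measurement:

    ```python
    # Scale the 485M's reviewed BFBC2 result by the 580M's on-paper uplift.
    fps_485m_bfbc2_1080p = 50   # ~50 FPS at 1080p, highest settings (review above)
    expected_uplift = 0.10      # 580M theoretically ~10% faster

    fps_580m_estimate = fps_485m_bfbc2_1080p * (1 + expected_uplift)
    print(f"Estimated GTX 580M: ~{fps_580m_estimate:.0f} FPS in BFBC2 at 1080p")
    ```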
     
  16. VeeTee

    VeeTee Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    Frostbite Engine - Wikipedia, the free encyclopedia

    By no means am I trying to attack you; it's just that I get a bit frustrated when the wrong engines are cited. I can assure you that CoD does not run any code from Unreal; instead it has small, very very small traces of Quake. However, for CoD2, a completely new engine was coded to run on the next-gen PCs/consoles.
     
  17. physib

    physib Notebook Evangelist

    Reputations:
    231
    Messages:
    583
    Likes Received:
    0
    Trophy Points:
    30
    BC2 uses Frostbite 1.5, and as for CoD, IIRC it originally used a modified Quake engine (not sure which one), and now it uses whatever engine it has been using since CoD 4. But no, I don't think CoD uses a modified Unreal Engine.

    And honestly, I think CoD's visuals are pretty average.
     
  18. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well then, apparently I need to change mah sources. Hehe.

    But alright; the same train of thought still works: they could properly optimize it and design the whole new engine well. I just hope they do. I don't mind being shown the light; don't feel bad about doing so.

    As for CoD, it's *only* World at War, and only in a few locations, and I specifically mean lighting/ambient particles etc. It's got nothing to do with the game's overall visuals, which, I agree, aren't anything too special. I just meant it could be done with little load on a video card, if they *wanted* to. =)
     
  19. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    If you're buying new and can afford the $200, I'd say go for it. Can't hurt. But if you're at your spending limit, you could buy a nice SSD for $200 that would benefit you more.
     
  20. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Honestly, the graphics in the CoD series are lacking. You can tell by examining the characters' faces that BC2 is much more detailed; even on the 360 there is a major difference.
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yes. We get this. Are you trolling aduy? I was being very specific.
     
  22. alienlover11

    alienlover11 Notebook Consultant

    Reputations:
    21
    Messages:
    207
    Likes Received:
    0
    Trophy Points:
    30
    There has been no computer built yet that can max BF3.
     

    Attached Files:

  23. physib

    physib Notebook Evangelist

    Reputations:
    231
    Messages:
    583
    Likes Received:
    0
    Trophy Points:
    30
    I'm raging now lol
     
  24. alienlover11

    alienlover11 Notebook Consultant

    Reputations:
    21
    Messages:
    207
    Likes Received:
    0
    Trophy Points:
    30
    Lol, I was too. When you think about it, though, that's a bold statement, seeing how there are comps that have three-way SLI GTX 580s. Maybe the reporter misunderstood him or something.
     
  25. physib

    physib Notebook Evangelist

    Reputations:
    231
    Messages:
    583
    Likes Received:
    0
    Trophy Points:
    30
    Does that sound real to you though? Maybe he's talking about running BF3 with 32x AA and ten monitors.
     
  26. 1341

    1341 Notebook Guru

    Reputations:
    9
    Messages:
    73
    Likes Received:
    0
    Trophy Points:
    15
    Of course, the 580M.
     
  27. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Well, he surely wasn't talking about 1080p. He was probably talking about, like, a 30" 2560x1600 in 3D, of course with PhysX enabled on full, while running Crysis 2 and Metro 2033 on two other screens.
     
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    ^ Quad SLI GTX 580's + i7-990X OC'd to 4.5GHz, Liquid cooling, 2000W PSU. XD
     
  29. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    liquid... nitrogen
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Naw, liquid nitrogen can't be used for long. It fluctuates the temps of the hardware, apparently; it's only good for overclocking stress tests/benchmarks. Basically making the people at slirigs.com pee themselves wondering why your single 5850 is beating their SLI 5870s. And yes, I actually know someone who's done this.
     
  31. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    I was actually planning on playing BF3 on my 30", but at 1080p resolution. I hemmed and hawed over laptop vs desktop, and I ended up deciding that if I have to go nuts for one game, it's not worth it. I'll just settle for 1080p on a single-card laptop, and maybe even hang my head low and head for 900p.

    The game won't be out till around October/November, so there is a chance the next-gen chips will be out. The next-gen chips from Nvidia are supposed to be significantly faster and at lower thermals, since they're moving to 28nm. The desktop chips are scheduled for release at the end of this year. The idea of waiting to buy a laptop till early next year, with Intel Ivy Bridge and Nvidia's Kepler GPU, sounds real tempting. A generational jump in performance and a drop in thermals.
     
  32. vestitas

    vestitas Notebook Guru

    Reputations:
    20
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    The problem is that, with the speed at which new tech comes out, you could just keep waiting for the next thing forever....
     
  33. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    When I consider the sheer fun I have had playing games maxed on my current laptop at 1080p, I am glad I did not keep on waiting for the next best thing. He who waits for the next best shall perhaps miss the best first.
     
  34. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,150
    Trophy Points:
    931
    haha, nice one ;)
     
  35. SpoonPants

    SpoonPants Notebook Enthusiast

    Reputations:
    1
    Messages:
    16
    Likes Received:
    0
    Trophy Points:
    5
    very true! good buddhist... :D
     
  36. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Knowing how fast graphics cards get released, and knowing that both ATI and Nvidia could do a lot better with their current cards (optimization, etc...), there will always be a next best thing.
     
  37. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Lol, I just got mine and tested it on BFBC2 with everything maxed: no lag at all, and there were 24 people in the match on Heavy Metal.
     
  38. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    Considering the dearth of interesting games for PCs right now, I don't feel like I'm missing out yet. BF3 and Mass Effect 3 are the two I'm waiting on for PC. The rest that I'm waiting on are all for consoles (Rage and MW3). The only game I really want to play on the PC right now is Mass Effect 2. I need to replay that in preparation for ME3, but that game isn't that taxing.

    I may still get a laptop soon just to grab one of the last GTX 485Ms before it stops selling. After some thought, I decided I want to limit how much I spend on the PC's last hurrah. I'm even debating the need for a Blu-ray drive (another dying technology), since I mainly watch Netflix streaming. This laptop is only for gaming. I have no other use for Windows, not that I'm against it. All my software just runs on Macs.

    It's sad, since I cut my teeth playing games on PCs, but the reality is all the developers are spending more time on consoles and mobile than they are on Windows. Part of the blame goes to Microsoft, who tried to steer developers to the Xbox at the expense of the PC. It's not all bad, since I've spent a lot less money on PCs over the last few years because of the transition to consoles.
     
  39. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Rumor has it that Windows 8 will be able to play 360 games. Most likely it will have to reboot into a 360-like operating system, because an emulator would not be fast enough.
     
  40. niffcreature

    niffcreature ex computer dyke

    Reputations:
    1,748
    Messages:
    4,094
    Likes Received:
    28
    Trophy Points:
    116
    I haven't seen a thread go more off topic than this one. Likewise, I haven't seen this many people so wrongly think they know what they're talking about, ever, on these forums...
     
  41. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    I think it really comes down to personal taste. For example, games I have played lately that I thoroughly enjoyed are The Witcher 2, Plants vs. Zombies, GTA IV, Prototype, Saints Row 2, Red Faction 1, Red Faction Armageddon and Red Faction Guerrilla, and I have yet to play Dragon Age. Steam in particular has a wealth of interesting titles but, again, it is down to taste.

    As you can see, very few titles I mentioned are new ones. There are also mods that rival newly released games, in my opinion.
     
  42. calxn

    calxn Notebook Geek

    Reputations:
    80
    Messages:
    98
    Likes Received:
    4
    Trophy Points:
    16
    BF3 is the only upcoming game I'm interested in that will require the very best in GPUs. Considering PC gaming has been on the decline for years now, not many people are going to have super rigs, so I won't be so bad off not having the GTX 580 or SLI. Also, the idea of spending thousands just to play one game seems very idiotic. And although I've played almost every BF game, we cannot be sure that BF3 will actually be a good game.

    I'm going to wait a few days before putting in the order. I hear rumors of the 6990M possibly being released within days. A faster, cheaper, lower-power GPU? Sure, I can wait another 2-3 days.
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    And your contribution helps this out how?
     
  44. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    I think I will stick with the 485M. I have never paid more than $400 for a graphics card and never will, and I have managed to play games on max settings every time.
     
  45. 1341

    1341 Notebook Guru

    Reputations:
    9
    Messages:
    73
    Likes Received:
    0
    Trophy Points:
    15
    580M > 485M = ~8%

    Please wait for the 785M
     
  46. Dr.wahab

    Dr.wahab Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Thanks guys :)

    I have enough money to buy the GTX 580M, but it's not worth it for only 6-8% more than the GTX 485M. And I've had my Sager for about 6 months and will only start playing in about 2 months, so it's a waste of money to upgrade now.

    Yeah, that's what I will do. Thanks :)
     
  47. d2c

    d2c Notebook Consultant

    Reputations:
    6
    Messages:
    214
    Likes Received:
    5
    Trophy Points:
    31
    I think he was joking about the 785M. Trust me, what you got is more than enough. The next thing isn't always that much better. The only way it would be worth the money is if you had waited to buy a laptop with the 485M, and then the 580M had just come out.
     
  48. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Actually no, because the 485M is $200 off, so even then it wouldn't actually be worth it.
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, in my case, the next best thing is much better though. 280M --> 480M was a great difference. And now the 580M > SLI 280Ms OC'd.
     
  50. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Yes, but is it worth $200 more than the 485M, which is an amazing card?
     