The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    485m (3d/physX) or 6990m (performance)?

    Discussion in 'Gaming (Software and Graphics Cards)' started by vNaK, Jul 15, 2011.

  1. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Nah, looks like my socket got pulled out; putting in an RMA for a refix :D
     
  2. fgocards

    fgocards Notebook Consultant

    Reputations:
    10
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    Lol. Well, my point was that I'd much rather pay $300 for a 6990m and get 23 fps in Metro 2033 than pay $300 for a 485m and get 16-17 fps.

    I don't play Metro 2033... but it's good to use terribly designed code as an example. :)
     
  3. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    You mean paying $200 for a 6990m as compared to paying $300 for a 485m :D

    Do note you're also paying for it with an extra month's wait, btw.
     
  4. fgocards

    fgocards Notebook Consultant

    Reputations:
    10
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    The 6990m is $300 more than the 6870m on the Alienware website. I have no idea about elsewhere.
     
  5. gdansk

    gdansk Notebook Deity

    Reputations:
    325
    Messages:
    728
    Likes Received:
    42
    Trophy Points:
    41
    Well, we don't know that... I think it is a pretty solid rumor that Nvidia isn't in any of the three next-generation consoles, however. AMD is in the Wii U and is probably in the next Xbox. Sony has been quite mum, but who else would they have gone with?

    Anyway, OpenCL can't do physics? That's laughable. OpenCL is simply a compute library for GPGPUs (and CPUs too, actually). You can do just about any math via OpenCL, though for a lot of cases that's a bad idea. Where OpenCL and GPGPU work exceedingly well is on embarrassingly parallel and independent problems like the Fast Fourier Transform, which run much faster when computed in parallel. A lot of physical simulation, much like graphics work, is a pretty good match for such an approach, though not a perfect one. The fact of the matter is that there is nothing physically wrong with AMD's graphics cards that prevents them from running PhysX-like simulations in real time; in fact, there are multiple open-source solutions out there, like Bullet, that use OpenCL or DirectCompute to do physics simulations on compliant GPUs (see the sketch below). Heck, CPUs have been able to do real-time PhysX-like effects for some time as well.

    Nvidia is doing what any company would do: differentiating its products from the competition. It doesn't mean they've surrendered in the performance race, just that it's a secondary consideration compared to tying customers to their ecosystem. And that is what I find appalling, from the position of someone who wishes to maintain the ability to choose.
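
    Here's a minimal sketch of what that looks like, assuming an OpenCL 1.x runtime and headers: each work-item integrates one particle independently, which is exactly the kind of independent per-element math that maps well to GPGPU. The kernel and names are made up for illustration, not taken from Bullet or PhysX, and error checking is omitted.

    /* Illustrative OpenCL sketch: one independent particle per work-item. */
    #include <stdio.h>
    #include <CL/cl.h>

    #define N 1024

    static const char *kernel_src =
        "__kernel void step(__global float *pos, __global float *vel,\n"
        "                   float dt) {\n"
        "    int i = get_global_id(0);   /* one particle per work-item */\n"
        "    vel[i] += -9.81f * dt;      /* apply gravity */\n"
        "    pos[i] += vel[i] * dt;      /* integrate position */\n"
        "}\n";

    int main(void) {
        float pos[N] = {0}, vel[N] = {0};
        float dt = 0.016f;                      /* ~60 Hz timestep */

        /* Grab the first platform/device; works with GPU or CPU drivers. */
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Copy particle state to the device. */
        cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                     sizeof pos, pos, NULL);
        cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                     sizeof vel, vel, NULL);

        /* Build the kernel from source at runtime. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "step", NULL);

        clSetKernelArg(k, 0, sizeof dpos, &dpos);
        clSetKernelArg(k, 1, sizeof dvel, &dvel);
        clSetKernelArg(k, 2, sizeof dt, &dt);

        /* N independent work-items: the embarrassingly parallel part. */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, sizeof pos, pos, 0, NULL, NULL);

        printf("pos[0] after one step: %f\n", pos[0]);
        return 0;
    }

    Each work-item touches only its own element, so the same source runs unchanged on any compliant device, AMD, Nvidia, or even a CPU. Real engines are far more involved, but this is the general shape of the approach.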

    As for the 3D effects... I found the 3DS to be very painful to my eyes. If other methods produce results at all similar to this, I'm bound to never enjoy them.

    As for the 6990M vs. the 485M, I always err on the side of price and performance considerations. If they're the same price, it may be a tough choice for you. But if, in typical fashion, the AMD product is cheaper, definitely go for it. Laptops are generally GPU bound.
     
  6. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The answer to your question is 42.
     
  7. Darkness62

    Darkness62 Notebook Evangelist

    Reputations:
    242
    Messages:
    522
    Likes Received:
    4
    Trophy Points:
    31
    Wow, I have found the ultimate question to the ultimate answer!!! What does an AMD fanboi say when he has no evidence to back up his false claims? 42, obviously. Excellent; move the question "What is the meaning of life, the universe and everything?" over to the solved column. AMD fanbois, at least they are good for a laugh. :rolleyes:

    P.S. It was the waffling between "it's a done deal" and "Sony is not 100%" that gave away the false claims.
     
  8. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    Seriously? Videocard fanboy accusations? Who cares?
     
  9. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    42 as in post #42. Those are the links to the sources you asked for.

    I figured you'd miss that meaning though... just like how you missed the source links I had already posted when Mr MM requested them. Had you caught those links originally, you wouldn't have had to dig up that ancient link to a rumor Sony denied almost two and a half years ago.
     
  10. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    But the confusion was indeed amusing. So long, and thanks for all the fish.
     
  11. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    LOL, this is kind of going off topic now.

    I'm not a fan of either, but I much prefer Nvidia, mainly due to better driver support.
     
  12. alexUW

    alexUW Notebook Virtuoso

    Reputations:
    1,524
    Messages:
    2,666
    Likes Received:
    2
    Trophy Points:
    56
    Xbox 1 = Nvidia. But I do agree; I don't see them using Nvidia anymore.
     
  13. Darkness62

    Darkness62 Notebook Evangelist

    Reputations:
    242
    Messages:
    522
    Likes Received:
    4
    Trophy Points:
    31
    Oh, I get it: yes, post 42, where you posted a rumor as fact, which explains the waffling later. Sorry for the confusion; I actually thought your link was a joke.

    Hmmm, shoots this down; I can see how it comes off as fanboi, especially in light of your "evidence":

    Sounds like you have some hard evidence there. Any more links, or is that the only one you have?
     
  14. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    Does it really chap your hide that much that a stranger posted a rumor that next-gen consoles won't have Nvidia chips? Really, who cares?
     
  15. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    He seems to just be a huge fanboi/troll ....
     
  16. PlagueDoctor

    PlagueDoctor Notebook Evangelist

    Reputations:
    35
    Messages:
    383
    Likes Received:
    0
    Trophy Points:
    30
    Can the 485M only use one external monitor, or can it use two? I was hoping to use two external monitors with Eyefinity, but the laptop I might be getting doesn't have an AMD card option :(
     
  17. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Getting it after the fact isn't really getting it. It's okay though; as long as I get a laugh, my jokes are still funny.

    Call me a fanboy all you want... I never play at hiding it, like some who want to pretend their bias is the only truly objective one.
     
  18. GapItLykAMaori

    GapItLykAMaori Notebook Evangelist

    Reputations:
    28
    Messages:
    385
    Likes Received:
    0
    Trophy Points:
    30
    TBH, if you overclock the 485M it will have essentially the same performance as the 6990M. The 6990M can be overclocked as well, but when both cards are OC'd to their maximum I assume the difference between them lessens. Go for whichever is cheaper. Laptop cards are not like the desktop world; a card from either side can only be a small percentage faster than the other.
     
  19. granyte

    granyte ATI+AMD -> DAAMIT

    Reputations:
    357
    Messages:
    2,346
    Likes Received:
    0
    Trophy Points:
    55
    Yup, on laptops the difference is minimal because Nvidia can't put out a 385 W power-eating monster.
     
  20. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    That will depend on the overclocking headroom of each model (485M vs. 6990M) and of each individual card. In any case, given both cards' performance levels, I agree that going for whichever is cheaper is the optimal option.
     
  21. m1 grant

    m1 grant Notebook Enthusiast

    Reputations:
    1
    Messages:
    38
    Likes Received:
    0
    Trophy Points:
    15
    Sorry for reviving an old thread, but this question is relevant for me because I can get both for the same price. I guess the 6990M is stronger overall, but I could see myself using CUDA and/or PhysX. How much stronger is the 6990M than the 485M?
     
  22. lazard

    lazard Notebook Deity

    Reputations:
    112
    Messages:
    701
    Likes Received:
    2
    Trophy Points:
    31
    10% for now. The margin will most likely increase after AMD releases updated drivers for the 6990M.
     
  23. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    True, but the 485M is easy to overclock by at least 20%; on mine I have it OC'd by 27%.
     
  24. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    And the 6990M as well; 800-850 core clocks and 1000+ memory clocks are what people are getting with it.

    That's pretty close to the desktop 6870's clocks (900 and 1050).

    It's also the same with the 580M: people are getting OCs that reach 560 Ti clocks. That's not the norm, but the average is close to it.
     
  25. AlienTroll

    AlienTroll Notebook Evangelist

    Reputations:
    319
    Messages:
    598
    Likes Received:
    0
    Trophy Points:
    30
    6990M any day. 3D is an FPS killer, and PhysX is a real gimmick: it looks nice, but not many games support it (an estimated 18?).
     
  26. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    I think it's about double that. But besides the few games that use it: does it add anything meaningful (i.e., an enhanced gameplay experience)? I think not, since PhysX is always an afterthought, no matter how much nVidia pays for it.
     
  27. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I think he meant games that matter. I'm a fan of the Dance Dance games; that's why I always buy Nvidia hardware, physics FTW! :rolleyes:
     
  28. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    PhysX rules in Batman: AA, and you can only assume it will rule even harder in Batman: AC.
     
  29. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    I don't think you should base your buying decision on whether your GPU has PhysX or not. It's not a gimmick, but it's not fully used either. If Nvidia put more time and effort into it, it could potentially make or break a buyer's purchasing decision, but as of now, it's not worth it.

    It's still better to have than not to have, but it won't affect gameplay much, if at all. I think if you have seen PhysX in action, you will certainly notice its absence on an AMD card; otherwise you won't notice it.
     
  30. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    I do agree that whether the GPU has PhysX or not should not drive the buying decision... yet you contradict yourself in your 2nd paragraph... ;)
     
  31. AlienTroll

    AlienTroll Notebook Evangelist

    Reputations:
    319
    Messages:
    598
    Likes Received:
    0
    Trophy Points:
    30
    Well, PhysX is pretty awesome, but it isn't something that would determine a GPU buying choice.
     
  32. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Unless you plan on using CUDA for rendering images in professional programs like I would, go for ATI.
    You get higher performance for the same money.
     
  33. doombug90

    doombug90 Notebook Evangelist

    Reputations:
    15
    Messages:
    311
    Likes Received:
    1
    Trophy Points:
    31
    The difference between commercial GFX cards is minimal when it comes to CAD rendering. Although I don't trust Notebookcheck 100%, the 6990M performs better than the GTX 485M in almost all of the programs they tested.
     
  34. alxlbf2

    alxlbf2 Notebook Consultant

    Reputations:
    13
    Messages:
    230
    Likes Received:
    2
    Trophy Points:
    31
    You can use ATI Stream too, which performs a similar role to CUDA...
     
  35. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Most professional programs just use OpenGL. It's not a big deal; AMD is fully supported by the majority of professional software, and by ALL of the major professional packages.

    CUDA never actually took off. It only seems that way because the uninformed keep throwing the name around, which makes it seem more glorious than it ever was.

    If the major software companies are going to utilize GPGPU at all, they will be using OpenCL, not CUDA. There's a reason they use OpenGL instead of DirectX, and OpenCL is maintained by the same group that maintains OpenGL. Developers care about cross-platform code and easy portability, and CUDA is none of that; see the sketch below.

    Even Apple's pro studio desktops use AMD now, not Nvidia, frankly because most professional 3D applications render on the CPU rather than the GPGPU, since the CPU is more flexible and produces higher-quality results.
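
    To put a small sketch behind the cross-platform point (illustrative only, assumes OpenCL 1.x headers, error checking omitted): the snippet below just enumerates every OpenCL platform and device on a machine, and this same vendor-agnostic entry point works whether the hardware is from AMD, Nvidia, or Intel, GPU or CPU.

    /* Illustrative sketch: list OpenCL platforms and devices. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id plats[8];
        cl_uint nplat = 0;
        clGetPlatformIDs(8, plats, &nplat);
        if (nplat > 8) nplat = 8;               /* only the entries we fetched */

        for (cl_uint p = 0; p < nplat; p++) {
            char name[256];
            clGetPlatformInfo(plats[p], CL_PLATFORM_NAME, sizeof name, name, NULL);
            printf("Platform: %s\n", name);     /* whichever vendor's OpenCL stack */

            cl_device_id devs[8];
            cl_uint ndev = 0;
            clGetDeviceIDs(plats[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
            if (ndev > 8) ndev = 8;
            for (cl_uint d = 0; d < ndev; d++) {
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
                printf("  Device: %s\n", name); /* GPU or CPU, any vendor */
            }
        }
        return 0;
    }

    Link against the vendor's OpenCL library (typically -lOpenCL) and it prints whatever platforms are installed; the code itself doesn't care whose they are.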
     