The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    Should I switch out my 485M to the 6970M?

    Discussion in 'Sager and Clevo' started by meyer0095, Apr 5, 2011.

  1. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    Well, we had a good run. Thanks to those who made good discussion points. Boo to those of you who can't remain civil.

    PROTIP: quoting a moderator who is tired of arguing, then continuing to throw insults like "kid", is not indicative of logic processing.

    Peace.
     
  2. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    I agree. I'll remove it. But even Vulcans let off steam sometimes, so please forgive a mere emotional human like me.
     
  3. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    There are some examples of Bullet Physics on their homepage.
    Both AMD and Sony (it is used in Sony's physics SDK) seem to support this open-source physics engine, and it looks like it has also been used in games & movies.
    I guess if Nvidia wanted, they could add DirectCompute acceleration of Bullet Physics too.
    Maybe they have already done it?
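    For anyone curious what using Bullet actually looks like on the developer side, a minimal rigid-body world in its C++ API goes roughly like this (just a sketch against the standard btBulletDynamicsCommon.h header, not taken from their examples):

        #include <btBulletDynamicsCommon.h>
        #include <cstdio>

        int main() {
            // Standard Bullet setup: collision config, dispatcher, broadphase, solver.
            btDefaultCollisionConfiguration config;
            btCollisionDispatcher dispatcher(&config);
            btDbvtBroadphase broadphase;
            btSequentialImpulseConstraintSolver solver;
            btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
            world.setGravity(btVector3(0, -9.8f, 0));

            // A 1 kg sphere of radius 0.5 m dropped from 10 m.
            btSphereShape sphere(0.5f);
            btVector3 inertia(0, 0, 0);
            sphere.calculateLocalInertia(1.0f, inertia);
            btDefaultMotionState state(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
            btRigidBody body(1.0f, &state, &sphere, inertia);
            world.addRigidBody(&body);

            // Step the simulation at 60 Hz for one simulated second.
            for (int i = 0; i < 60; ++i)
                world.stepSimulation(1.0f / 60.0f);

            btTransform t;
            body.getMotionState()->getWorldTransform(t);
            printf("sphere height after 1 s: %.2f m\n", t.getOrigin().getY());
            return 0;
        }

    The same code runs on the CPU by default; as I understand it, the GPU/OpenCL work AMD was demoing accelerates the solver underneath without changing this API.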
     
  4. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    I'll say this right now: the hardware-based physics argument is an annoying one, because people have brought it up for years now and it hasn't made a damned difference, except in the eyes of those who want to devolve a discussion into Nvidia vs. AMD.

    The number of titles using PhysX exclusively has never grown beyond a small collection of heavily Nvidia-funded titles. Big mainstream titles use Havok and other non-proprietary physics - don't believe me? Look at titles from Valve and Blizzard. The fact that PhysX can degrade GPU performance should not be forgotten either - especially since performance on notebook-class GPUs is very precious.

    At the end of the day, justify how you want to spend your money the way you feel most comfortable. For me, personally, I see no point in brand loyalty - certainly not $250 worth of brand loyalty for perks I rarely encounter, when that $250 can be put towards an upgraded screen or an SSD which will be used daily.
     
  5. decayedmatter

    decayedmatter Notebook Evangelist

    Reputations:
    22
    Messages:
    409
    Likes Received:
    0
    Trophy Points:
    30
    I'm loving the PhysX effects in Batman so far; it definitely adds to the immersion. It only drops the framerate from 60 FPS to 53 FPS, so I'd say it's worth it.
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I personally haven't seen a huge difference between PhysX on or off in any game. IMHO there really isn't anything better about Nvidia's chip than AMD's that makes it worth $250. To be honest, I don't think there's anything better at all. I can fully understand remaining in your "comfort zone" if maybe all you've ever used were Nvidia GPUs. But in this case $250 is worth the switch to AMD regardless of your "comfort zone".

    Obviously the decision is yours how to spend your money, but in this case it's like debating whether to buy a Ford Focus or a Toyota Corolla.
     
  7. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    One more thing is tessellation.

    AMD cards have very weak support for this trendy new feature in graphics engines (used in games like Aliens vs. Predator, Stalker, Metro 2033...).
    Check the benches of simulation software for the tessellation results on AMD cards; they are almost null.

    So now we've got 5%-10% extra FPS, PhysX, tessellation, 3D Vision, and PureVideo encoding for the 485M.

    It may not be worth an extra $250 for many, but it is the card for the ultimate gamer.
     
  8. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    You mean those same tessellation results in highly Nvidia-biased benches like Unigine Heaven?

    Fact is, AMD cards have not suffered against Nvidia in actual games with tessellation - far from it, in fact, if you look at desktop benches, where AMD has actually done better in AvP, Stalker, Metro, etc. than its Nvidia counterpart.

    Bench numbers are nice to look at, but you can't play benches ;)
     
  9. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    I would personally sacrifice 2-4 frames per second for improved visuals, especially when talking about numbers in the upper ranges.

    Anyway, according to AnandTech's review of the 6970M, the 485M performed slightly better (numbers-wise) in those titles you mentioned.

    Eurocom Racer: Why the Radeon HD 6970M Rocks - AnandTech :: Your Source for Hardware Analysis and News

    But fair enough, we can't play benches; we can, however, enjoy the physics and tessellation eye candy the 485M provides on top.
     
  10. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    Hold on guys, are you saying that the AMD Radeon HD 6970M can't do heavy tessellation in the Unigine engine? :O

    Check the benches here.
    Saved in attached zips in each post.

    My run:
    http://forum.notebookreview.com/sag...-amd-radeon-6970m-goodness-d.html#post7348112

    Eivind's run:
    http://forum.notebookreview.com/sag...md-radeon-6970m-goodness-d-5.html#post7352457

    Seems like they perform basically the same.
    I also did some other benchies if you want to compare, but I don't really care about that. :D

    As long as I can finally play BF:BC2, I'm satisfied.

    DEagleson
     
  11. tduhon07

    tduhon07 Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
    Comparing the two cards is pretty simple.

    The 485 outperforms the 6970 by a small margin in most games.

    If you can justify the increase in price for that improvement, do it. If not, then you're still getting a great card.

    So long as they both run the games we need them to.
     
  12. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    You don't quite understand the benches there, do you? You mean it beats them by 1-3% in those titles at 1920x1080 on the SAME settings? Meaning there is no extra eye candy the 485M has over the 6970M, AND if you turn on PhysX, the 485M loses performance and would actually trail.
     
  13. Havoq

    Havoq Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    According to NotebookCheck.net, the AMD 6970M does better at Crysis 2 than the GTX 485M. I don't know how reliable their reviews are, though.
     
  14. tduhon07

    tduhon07 Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
    Where'd you get that from?

    The 485 is the highest-rated mobile card for Crysis 2:
    NVIDIA GeForce GTX 485M - Notebookcheck.net Tech

    Though, the 6970 is right on its heels
     
  15. Havoq

    Havoq Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
  16. tduhon07

    tduhon07 Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
    It shows both as 30 FPS on that page.

    Or are you talking about the other settings?

    I'm talking about Ultra, as I don't think anyone is going to play at a lesser setting than they have to.

    30 FPS in Crysis is pretty good.
     
  17. Havoq

    Havoq Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I was referring to the overall results across all the different settings, but you're right. Whichever card you have, you still win, because you should be able to play smoothly anyway lol.
     
  18. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    You know guys, I think this is really the core of the matter here. It's not even about whether the extra features of the 485M on the Clevo (like 3D out, PhysX, etc.) are worth the extra 250 bucks.

    I remember I saved up a ton of money to get the Nvidia GeForce 5200 back in the day, because I wanted to play Prince of Persia: The Sands of Time. Just for that game! This new thing called pixel shaders was needed to play it.

    It's the few games that you want to play that make you want the extra feature. For me, I somehow ended up buying Crysis 2, Mafia 2, and Batman: Arkham Asylum, and I am also looking forward to Deus Ex: Human Revolution (very, very important for me), Arkham City, and whatever UE3-based games are coming this year (see Samaritan). You can see I like story-driven FPS/TPS.

    Most of these titles need PhysX to show their full glory, and so I keep trying to tell people - without realizing that they don't give a crap about these games.

    What do you guys say? Have I hit the nail on the head?

    Plus, I may replay Crysis 2 a year later with a 120Hz projector and 3D glasses.
     
  19. Red Line

    Red Line Notebook Deity

    Reputations:
    1,109
    Messages:
    1,289
    Likes Received:
    141
    Trophy Points:
    81
    According to NotebookCheck, 6970M CF in Crysis 2 gains only 13 FPS from the second card! Do you think it might be a driver issue? Do we have any 6970M CF owners in here?
     
  20. kolias

    kolias Notebook Evangelist

    Reputations:
    251
    Messages:
    629
    Likes Received:
    88
    Trophy Points:
    41
    Hi guys, I have a Clevo P150HM. Do you know where I can buy the 6970M?
    Right now I have the GTX 460M.
    Maybe I'm going to need a new power supply adapter too, right?
     
  21. johnnyman27

    johnnyman27 Notebook Lover

    Reputations:
    354
    Messages:
    1,315
    Likes Received:
    2
    Trophy Points:
    56
    I'm from Greece too! Send me mail at [email protected]
     
  22. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    That's exactly it... it's a small selection of titles in the grand scheme of things. All the major mainstream games (SC2, Portal 2, etc.) use non-proprietary physics and do just fine, hence no one really worries about it.

    It's your call; in a year I can also see myself having put in a new 28nm GPU with the money I saved ;)
     
  24. Goldmax.cz

    Goldmax.cz Company Representative

    Reputations:
    11
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Hello,

    We also sell VGA modules, importing them exclusively from Eurocom. I am sure there are more options in Europe for getting these. We are located in the Czech Republic, so feel free to ask any further questions. 16GB of RAM is useless for gaming.
     
  25. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Most people in this thread seem to be citing personal reasons for getting one GPU over the other.

    What I have been trying to establish is this: leaving personal reasons aside, and assuming you are not buying anything today, dispassionately and critically analyze which card gives you the best feature set (forget the money for a second).

    The result is that we have ended up arguing about PhysX and whether it's even a feature or not. DEagleson referred to AMD supporting Bullet, and as a result my interest was piqued. In my free time, I have now learnt about all the physics engines used in games, some of which are, more importantly, also used in engineering simulations. But I am tired of people using their fanboy talk and whatnot. I didn't want to share my information initially, but some good people like Kevin_jack2.0, chewietobacco, mostwanted, ranma, etc. were good debaters, and this is for them.

    INFORMATION ON PHYSICS IN GAMES:

    A brief summary follows:
    The aim is to try and see:
    1. Which physics engine (Havok, PhysX (yes, PhysX is an engine), ODE, Bullet, etc.) will come to dominate and be the game developers' choice.

    2. But that's not enough. An engine can be developed by anyone, but it will not necessarily run on the GPU. Best example: Havok was trying to make itself run on GPUs but didn't succeed. "The company was developing a specialized version of Havok Physics called Havok FX that made use of ATI and NVIDIA GPUs for physics simulations,[8] but may have been cancelled." Source: the Wikipedia article on Havok. Whether CPUs are now fast enough to do physics is a separate debate. Currently, physics has to run on the GPU to give usable dynamic simulations, because GPUs are faster at this kind of work (see the sketch below).
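    To make point 2 concrete: the per-frame work of a physics engine is, at its core, integrating the state of thousands of mostly independent bodies, which is exactly the embarrassingly parallel kind of work GPUs are built for. A toy illustration in C++ (purely mine, not taken from any real engine):

        #include <vector>

        struct Particle {
            float px, py, pz;   // position (m)
            float vx, vy, vz;   // velocity (m/s)
        };

        // One simulation step: semi-implicit Euler under gravity.
        // Every particle is independent, so this loop parallelizes
        // trivially across GPU threads - one particle per thread.
        void step(std::vector<Particle>& ps, float dt) {
            const float g = -9.8f;
            for (Particle& p : ps) {
                p.vy += g * dt;      // integrate velocity
                p.px += p.vx * dt;   // integrate position
                p.py += p.vy * dt;
                p.pz += p.vz * dt;
                if (p.py < 0.0f) {   // crude ground bounce
                    p.py = 0.0f;
                    p.vy = -0.5f * p.vy;
                }
            }
        }

    Real engines add collision detection and constraint solving on top, which is where most of the time goes, but the independent-bodies structure is what makes GPU acceleration attractive.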

    With respect to AMD and its support for Bullet:
    Q1: "AMD's own 'road to physics' has certainly been a painful one - the company was always in defensive mode when it comes to physics, with company executives even calling 'GPU physics is dead' [later, the statement was clarified with 'until DirectX 11, and even then maybe']."
    Q2: For those of you obsessed with market shares: "In case you wondered, according to Game Developer Magazine the world's most popular physics API is nVidia PhysX, with 26.8% market share [if there was any doubt that nVidia PhysX isn't popular, which was the defense line from many AMD employees], followed by Intel's Havok and its 22.7% - but the open-sourced Bullet Physics Library is third with 10.3%."
    Source: AMD supports OpenCL Bullet; Are Havok and PhysX in trouble? - Bright Side Of News*
    This article is from 2009, so we need to extrapolate to 2011; keep reading.

    So this means devs using Bullet can run their physics code on AMD processors from 2009 onwards. Look at the list of Bullet games so far:
    Toy_Story_3
    Grand Theft Auto IV and Red Dead Redemption by Rockstar Games,
    Trials HD by RedLynx.
    Free Realms.
    HotWheels
    Gravitronix
    Madagascar Kartz
    Regnum Online
    3D Mark 2011 by Futuremark.
    Blood Drive
    Source: Wikipedia

    Summary: So, Havok doesn't run on either GPU, the Bullet engine runs on AMD, and PhysX runs on Nvidia.
    In a fight between PhysX, AMD's Bullet and Havok, Havok wins. Just see the marvellous list of games it has. Go to their website.
    In a fight between the GPU-based physics options, the loser is Havok, but the winner is subjective. (For me, PhysX wins, because the Bullet games to date are crap IMO. You be the judge. A list of PhysX games can be googled.)

    Extrapolation of market shares to 2011: UE3 adopts PhysX into its engine. This means no more PhysX patches for games; the developer can directly turn on PhysX while developing the game. SOURCE: Epic Adds DX11 and PhysX to Unreal Engine 3 | MissuAll
    This makes me declare PhysX the winner, because of the credentials UE3 comes with. (If this is going to spark another debate about UE3, count me out.) But mind you, this is an extrapolation. Plus, we have yet to see a good title release (other than GTA IV) with Bullet that supports processing on AMD GPUs. Plus, I have assumed that GPU-based physics automatically beats CPU-based physics and hence beats Havok. You just have to YouTube it / read more to believe this.
    EDIT: Bullet is ALSO supported by Nvidia. Thanks to you both.

    I am not going to keep arguing any more. I didn't want to share even this information, because people just don't seem to be capable of dispassionate analysis. Sorry, PhysX is not a gimmick, and I support both it and Bullet. You know why? Because I am a gamer!

    (I will later be making a blog on this for people who like to get information. I'll put the link in my sig. Would anyone be interested and give encouragement?
    Any corrections or conflicting information, please inform me. I'll edit.)
    -------------------------------------------------------------------------

    Unless there is some game for which I will need to upgrade (say BF4), I don't see myself upgrading to anything until it's time for a new laptop. I'll be buying another one only when this Clevo dies. This is a good reason for me to get the 485M now.
     
    Last edited by a moderator: May 8, 2015
  26. RAQemUP

    RAQemUP Notebook Evangelist

    Reputations:
    28
    Messages:
    383
    Likes Received:
    0
    Trophy Points:
    30
    Lots of good info there, mountain. What should also be noted is that Bullet physics isn't limited to just AMD cards; it works with Nvidia cards as well.

    I just got in my NP8150. The only GPU upgrade at the time was the 485M. I am so far extremely happy with my purchase, but if I were to order now, I would go with the 6970M. The high price premium is not justified for a minuscule bump in possible performance, and PhysX is in so few games that it should not be a major factor.

    Maybe it might be different when the UE3 titles come out, but even then I have a feeling most people would just disable the PhysX eye candy in preference for higher performance and fewer graphical distractions when playing multiplayer FPS.
     
  27. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Please give me the source for this and I'll add it and quote you.
    I am in doubt only because Nvidia is greedy and wouldn't allow other engines to use their GPUs.
    They prevented desktop machines with ATI cards from using an extra Nvidia card for PhysX processing, through driver support. Source: http://www.techpowerup.com/105329/H..._PhysX_on_Windows_7_with_ATI_GPU_Present.html

    Yes, price and money mean the 6970M, no doubt there. But I have saved up like crazy (for 10 months) :D Let me live a little, eh?
     
  28. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
  29. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    Again, justify your money however you wish.

    However, you are rehashing the PhysX argument that has been had for years - no exaggeration - not just on this board but on countless other boards about GPUs. And the fact remains: PhysX hasn't made a dent for gamers (or you'd see Nvidia's market share improving dramatically over the same period, when in truth it has gone down) - and you still ignore the fact that turning PhysX on degrades performance, which will turn the 485M's tenuous lead over the 6970M into a loss.

    You can continue to justify your purchase however you wish, but do understand you're arguing about something people have been discussing for 3+ years now; it hasn't convinced people before and it's not going to now. So weigh it for yourself.
     
  30. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    I thought I had read up on all points about everything physics-related in games. Well, I am humbled and my arrogance is removed. Thanks.

    Of course, I am no programmer and didn't go to the level of understanding physics programming.
     
  31. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Yes yes, but it's novel to me. I am new to all these forum discussions, and even to physics in games.

    And like I said, please leave aside the money matter. We have closed that matter with "justify your money however you wish" many times.

    I didn't forget the performance hit. But you really are making a weak point. You have to lose some to get some. Let's implement PhysX on the AMD card and see if that doesn't take a performance hit.
     
  32. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    Well put, mountainlifter.
     
  33. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    And FYI, Havok physics has been used in titles such as SC2, Super Smash Bros. Brawl, Company of Heroes, etc.; it has been around for a long time (and was bought by Intel) and doesn't require proprietary hardware either.

    http://en.wikipedia.org/wiki/Havok_(software)
     
  34. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    No one is saying it won't take a performance hit - which isn't even the point of my argument, because again, you're constantly posting about PhysX in a thread that has:

    a) Been settled already (the OP went 6970M)

    and

    b) Seemingly become a venue for convincing yourself and others about PhysX, which has been argued over for some time.

    One need only dig through the countless threads on this issue in this forum, on AnandTech, [H], XSForums, and countless other tech websites - threads which come around every time a new generation of cards is released - to see that the argument over PhysX has been repeated over and over, ad nauseam.
     
  35. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Well, I'm sorry, I should have posted in the thread I created, 485M vs 6970M. But that died out, and the two threads became similar, so I thought, why not post here.

    But in my defense, it doesn't matter how many sites have information on this stuff. At some point there is always an information explosion, like in this matter of physics in games, and somebody has to collect it and provide coherent documentation. If you didn't like this, I understand. But if I get a single bit of appreciation from someone, I'll be happy. I am sorry you are nauseated, but like I said, it's new to me. Please discontinue reading this thread.
     
  36. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    "Plus i have assumed that GPU-based physics automatically beats CPU-based Physics and hence beats havoc." I quote myself from post #126

    MY aim is not to convince anyone sir! why don't you go back and read what i wrote. It was just to provide information and an opinion

    Thanks
     
  37. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    Glad that PDF file was helpful to you.
    I read it myself, and I hadn't known until then that even my puny Wii supports that stuff, "if" developers add the feature.
     
  38. Bevil

    Bevil Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
    After more than 10 years of experience with gaming PCs (assembling, selling, overclocking),
    I can only conclude that PCs equipped with an Nvidia GPU last longer in the world of gaming than the competition.
    I'm certainly not a fanboy!!! In the past I bought a new video card every four months. I have purchased as many ATI as Nvidia graphics cards. From Nvidia: TNT2, GeForce2 GTI, GF3 Ti500, GF4 Ti xxx, GeForce 5700 ("bad one"), GF 6600GT, GF 8800GT SLI, GTX 260.
    My AMD collection: Mach64, 3D Rage II, Rage 128, Radeon 7500, Radeon 9600 XT, Radeon 9800 XT, Radeon X800XL, Radeon X1900XT, AMD 3850.
    I usually sell my used graphics cards to friends, or use them in my second LAN PC, and every time I realize that, through proper driver support from Nvidia, my old PCs are able to play new games.
    My old 8800GT SLI setup ran Crysis Warhead smoothly at 1080p; six months later I tested the game again with new drivers and had a performance boost of more than 35%.
    Something I never had with ATI!

    This is the reason I've opted for an Nvidia GTX 485M in my new gaming laptop. Maybe both are evenly matched at this time, but I'm sure this will change with time. "History repeats itself", believe it or not :)

    My GTX 485M test runs on YouTube: http://www.youtube.com/user/Bevilos83
     
  39. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Well, what a surprise. I have been faithfully watching your channel for new uploads. I didn't know you were also on NBR.
    Looking forward to more videos on your channel.
     
  40. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    I've been using the Nvidia 8400M GS since summer 2007 in my HP Pavilion dv6000 (including heavy gaming), and still counting.
     
  41. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    nvidia & amd both "optimize"(cheat) in driver
    but nvidia mostly commit in aa/af which cant easily figure out, amd commit in graphic very big........i prefer nvidia :D

    same system/game setting but different card/driver:
    gts450
    [​IMG]
    hd5770 see the missing grass
    [​IMG]
    gts450
    [​IMG]
    hd5770 see the missing shadow
    [​IMG]
     
  42. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    Not a fanboy? Might want to reexamine yourself on that one. Because yeah, that must be why Nvidia is the one that has had numerous recalls / dying GPUs (8 and 9 series, anyone?) in the mobility sector, and on the desktop side their drivers recently killed the GTX 590 :rolleyes:

    AMD has had problems with drivers, but it hasn't been the one flat-out killing cards in recent times - and the numbers speak for themselves. AMD went from 20-25% market share in 2007 to nearly 40-50% of the GPU market now. Nvidia hasn't had such a great track record in recent years.


    Exactly right. The 6970M isn't a budget selection - it's that the 485M's pricing is so absurd. You're paying a significant amount of money for a 5% gain at most, and if you turn on all the features the 485M has, you lose any performance advantage. And at 1920x1080, in benches, the 6970M actually climbs up to match the 485M.

    Think about desktop cards. Even the top flagship cards don't cost nearly as big a premium for 5% more performance (at least, not since the 8800 Ultra of 2007).

    I wouldn't call the 6970M the budget choice; I'd call it the smart bang-for-your-buck choice.
     
  43. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Please cite your source for those numbers; I'm interested in reading more.
    I cannot simply take your word on them. Cite your source, sir.

    Valid point. 3D will cut frame rates in half. But that happens on ANY card, desktop or laptop, AMD or Nvidia. What I mean is that if the Clevo with the 6970M had a 3D out, you couldn't even argue this point.

    I'll even say you can drop mentioning that 5% performance improvement. All benches coming out now say that the 6970M has the lead ever so slightly.

    But will PhysX drop frames as badly as 3D? I have so far not seen anyone post a Mafia II video running on the 485M. We cannot presume frame rates will drop by half. I won't try to guess by how much they will drop, because I like to work with facts.

    I hope these are the only two other features you are talking about. And I hope me mentioning "PhysX" doesn't get you angry again. :D Let's take it easy, eh!
     
  44. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    I'm afraid we (or maybe just I) lost you there.
     
  45. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    OK, here :(
    [image]
     
  46. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    Some more Google-translated comments, believe it or not :D

    Even more telling is with a new World of Warcraft character: run the game for a while and you can clearly see the 5770's graphics occasionally slow down slightly. There is noticeable lag, but the FPS counter does not drop. My colleague "suspects" it is because AMD's AI, instead of letting the FPS drop, repeats (copy-pastes) frames, so the frame count stays high while the picture lags. Think about it: replace three frames out of every 30 with copies, and you can boost performance by 10%. But such claims could also arise with users in other situations, so I cannot prove this as fact. I can only say that the AMD graphics obviously lag where NV's do not, and recently a lot of Internet cafe users on the mainland have reported that machines with AMD cards needed servicing not long after being switched on, while NV machines may not; it could be related to the graphics.

    ATI has done this before, but the recent change is rather extreme: since the 10.2 driver, users cannot turn off AI, so no matter what you run, it will "optimize" for you. As for what gets stolen, sorry, that's not something you get to decide; maybe the textures get worse, maybe something is missing...
     
  47. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
  48. RKG72MP

    RKG72MP Notebook Geek

    Reputations:
    8
    Messages:
    80
    Likes Received:
    0
    Trophy Points:
    15
    Wow, 15 pages! I have some questions, and I'm sorry if they've already been asked.

    1.) Upgradeability! Can the 485M be upgraded to the next series of GPU chip? As far as I know this is unknown territory so far, but if so, then this alone might be worth the $250 because of the future-proofing. If you go ATI/AMD, you can't come back to Nvidia, and vice versa, so keep that in mind.

    2.) Folding! If you are a folder (F@H, for example), then the 485M is your ticket to more PPD.

    3.) Thermals! How hot does one chip get compared to the other? Hotter chips might not be so great for your notebook's internal ambient temps. Do we have max temps on the 6970M yet?

    4.) Overclocking ability! We know the 485M will overclock to 660MHz on most units, but what about the 6970M? How flexible is it when pushed to the limits?
     
  49. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    Says who? Because who the hell made this one up? You can upgrade from ATI/AMD to Nvidia and vice versa with the 8150/8170. There is no restriction here.

    2) Yes, it's well known that Folding@home performs better on Nvidia hardware, but if you're worried about your notebook's longevity (seeing as how you asked question #3), I wouldn't run F@H on a notebook at all unless you're planning on using a notebook cooler. Besides, if you really do fold and want to keep doing so, spend the $250 on a desktop GPU, run it in the desktop, and get far more performance.

    3) AnandTech's reviews say it's fine, there haven't been reports of it doing badly in the Alienwares either, and no one has reported it doing worse than the 485M. And if you are planning on folding, then the 485M is far more likely to tax your notebook's cooling than the 6970M.

    4) People have clocked it high; check out the Alienware M17xR3 forums.

    Again, the fact of the matter is that the 485M's bonus features are the only difference between it and the 6970M. If you do not fold/CUDA/PhysX, the 485M's $250 price tag is hefty for a < 5% difference in performance.
     
  50. RKG72MP

    RKG72MP Notebook Geek

    Reputations:
    8
    Messages:
    80
    Likes Received:
    0
    Trophy Points:
    15
    "Says who? Because who the hell made this one up? You can upgrade from ATI/AMD to Nvidia and vice versa with the 8150/8170. There is no restriction here"

    1.) OK, my bad, but... chewietobbacca... can you please back that statement up, though? This is interesting in itself if true. Are you saying the socket on the motherboard of the 8170 can take either chip?

    2+3.) Yes, I have a desktop with SLI and I do fold often. I'm not worried if my GPU is sitting around 70°C, but at 80°C I get concerned. If I planned on folding on my new toy, I'd definitely have a notebook cooler to keep internal ambient temps from wreaking havoc on my HDD/SSD & memory.

    4.) I noticed that after I had already posted the above; quite impressive. It seems the 6970M can overclock quite well, but again, the temps start to get out of hand past 700MHz. Now I'm on the fence! Dammit! That $200 savings is quite tempting, to say the least. What I really want is something that will play the current modern games and last me longer than the damn X700 I purchased back in '05. I have had Nvidia in all my desktops and ATI in all my notebooks, and neither has given me grief outside of gaming.

    One last thing people forget about with all these benchmarks comparing the two chips: the key selling point is not which one gives us the highest peak frames in games, but which one gives the better average FPS. I would love to see detailed graphs for all these games showing min, max, and average; that's the proper way to really see who is king of the hill.
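    Incidentally, computing those min/max/average numbers from a frame-time log is only a few lines. A sketch in C++ (the "frametimes.txt" file name is made up; tools like FRAPS can dump per-frame times in a similar form):

        #include <algorithm>
        #include <fstream>
        #include <iostream>
        #include <numeric>
        #include <vector>

        int main() {
            // Read one frame time (in milliseconds) per line.
            std::ifstream log("frametimes.txt");
            std::vector<double> ms;
            for (double t; log >> t; ) ms.push_back(t);
            if (ms.empty()) return 1;

            // Min FPS comes from the longest frame, max FPS from the shortest.
            double mn = 1000.0 / *std::max_element(ms.begin(), ms.end());
            double mx = 1000.0 / *std::min_element(ms.begin(), ms.end());

            // Average FPS = total frames / total seconds, NOT the mean of
            // per-frame FPS values (that would overweight the fast frames).
            double total_ms = std::accumulate(ms.begin(), ms.end(), 0.0);
            double avg = 1000.0 * ms.size() / total_ms;

            std::cout << "min " << mn << " / max " << mx
                      << " / avg " << avg << " FPS\n";
            return 0;
        }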
     
← Previous page | Next page →