The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Should I switch out my 485M to the 6970M?

    Discussion in 'Sager and Clevo' started by meyer0095, Apr 5, 2011.

  1. meyer0095

    meyer0095 Notebook Guru

    Reputations:
    0
    Messages:
    70
    Likes Received:
    0
    Trophy Points:
    15
    For the record, I did switch from the 485m to the 6970M. Thank you for all the feedback.
     
  2. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    You made the right decision, as a gamer.
     
  3. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    As a gamer, the 485M is the right decision.

    As a gamer on a budget, the 6970M is the optimal one.
     
  4. ZahariasX

    ZahariasX Guest

    Reputations:
    0
    I couldn't agree more with you.
     
  5. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Yes, after seeing footage on YouTube, I was thinking: if the Nvidia 485M and the HD 6970M each cost $100, which one would you pick? If I'm not too wrong, the majority would go with the 485M, as it competes better over a range of features: 3D out, PhysX, slightly better performance, etc.

    I have to agree. It's only the pricing that is turning heads towards the 6970.

    I understand that we gamers have to vote with our wallets, but if our wallets are wide enough, we would be obliged to vote for the best feature set.

    The downside is that if many people with wide wallets go with the 485M, Nvidia will be encouraged to keep its prices high. But this is not going to happen, since the people wanting the 6970 are the majority.
     
  6. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    What is "right" about the GTX 485M?
     
  7. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    If prices were equal, isn't the 485m supposedly stronger on more benchmarks than the 6970m? I know the two cards are more or less equivalent but for those who absolutely need the top-end bleeding edge GPU the choice would probably be the 485m (for now).
     
  8. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    Sorta. According to Anandtech:

    "Comparing the 6970M and 485M once more, we find NVIDIA with a slight lead in five of the eight games, but if we call anything less than a 5% difference a tie there are really only three games where there’s even a moderate difference. NVIDIA is ahead by 8% in STALKER and 14% in DiRT 2; AMD leads by 10% in StarCraft II. Everything else is splitting hairs."

    This is referring to their 1080p benchmark suite.

    So it's less than a 5% difference either way in the majority of titles. The GPUs are virtually equal; neither does anything in-game that the other can't do just as well.
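    The under-5%-is-a-tie rule in the Anandtech quote above is simple enough to sketch. In this minimal Python illustration, the first three deltas are the ones named in the quote; the two "Other game" entries are hypothetical sub-5% stand-ins for the rest of the suite:

```python
# Kevin's rule from the Anandtech quote: call any per-game difference
# under 5% a tie. delta_pct > 0 means the 485M leads, < 0 the 6970M.
TIE_THRESHOLD = 5.0  # percent

def classify(delta_pct):
    """Label a benchmark delta as a tie or a win for one card."""
    if abs(delta_pct) < TIE_THRESHOLD:
        return "tie"
    return "485M leads" if delta_pct > 0 else "6970M leads"

# The three deltas named in the quote; the "Other game" entries are
# made-up sub-5% examples standing in for the rest of the 1080p suite.
deltas = {
    "STALKER": 8.0,
    "DiRT 2": 14.0,
    "StarCraft II": -10.0,
    "Other game A": 3.0,
    "Other game B": -2.0,
}

for game, delta in deltas.items():
    print(f"{game}: {classify(delta)}")
```

    With these numbers, only three games fall outside the tie band, which is exactly the quote's conclusion.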
     
  9. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    Except for Nvidia's eye-candy extras (PhysX, 3D).
     
  10. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    Add to that PhysX (a big difference in visuals in games like Batman) and 3D support.

    Again, it is the right decision for an absolute gamer who cares less about his wallet.
    For most of us it is a matter of compromise: we save the $250 by getting the 6970M and invest it in a better screen, buy a magazine-fed paintball marker, or just save it up.
    The absolute gamer buys all of these at the same time, then proceeds to pre-order the iPad 2 just for kicks, maybe uses it for a week or two, and then ditches it.
     
  11. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    I read that article, hence my argument that the 485m is slightly stronger than the 6970m based on benchmark scores. I fully agree both cards can do everything performance-wise for the latest games, the only differences being pricing and a few extra features available to the Nvidia card (PhysX, 3D Vision).
     
  12. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Why are you ignoring PhysX support and 3D out?? You have to consider an imaginary gamer consumer that wants all possible features, when judging the worth of the two choices.

    For you, these two may not be needed and so your conclusion is that barring a 5% improved performance, one is just as good as the other, if the prices were the same.
     
  13. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    The one issue with PhysX is that these notebooks don't have the horsepower to run it in most games, because they lack a second GPU dedicated to running it.

    But hey, I'm not here to change anyone's mind. It's your $250.

    To be honest, this is the first time I've ever chosen against Nvidia, so I guess I shall now see the difference firsthand. I did decide that MLAA means more than PhysX, for me.

    When you look at the list:

    Batman: Arkham Asylum
    Crazy Machines 2
    Cryostasis: Sleep of Reason
    Dark Void
    Darkest of Days
    Hot Dance Party
    Hot Dance Party II
    Mafia II
    Metal Knight Zero Online
    Metro 2033
    Mirror's Edge
    Nurien
    The Saboteur
    Sacred 2: Fallen Angel
    Sacred 2: Ice & Blood
    Shattered Horizon
    Star Tales
    Star Trek DAC
    Tom Clancy's Ghost Recon Advanced Warfighter 2
    Unreal Tournament 3 (and Extreme Physics Mod)
    U-WARS
    Warmonger: Operation Downtown Destruction

    There haven't been many games where the proprietary software actually mattered.

    And 3D... well, I can only speak for myself in saying that I just don't give a damn about it.

    EDIT: this list is probably missing a few insignificant games
     
  14. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Maybe a game will come out for which you may want PhysX.
    For example, I found out that Deus Ex: Human Revolution is getting APEX clothing, similar to PhysX (or the same thing, I don't know).

    And I think you are wrong in saying that a dedicated GPU is needed for PhysX. Enthusiasts use that config on desktops for extra performance. Notebookcheck maxed out Mafia II with the 485M and it runs at 59 FPS on Ultra settings (I'm assuming that means PhysX on): Computer Games on Laptop Graphic Cards - Notebookcheck.net Tech
     
  15. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    And you have to consider some successful titles which implemented PhysX and which will most probably have sequels using the same engines.
    I'm talking about Tom Clancy's and Metro 2033.
    Big fan myself.

    EDIT: @mountainlifter_k, as per notebookcheck, they disabled PhysX in all of these benchmarks.
    check here: http://www.notebookcheck.net/Mafia-2.35169.0.html

    But still, the 59 fps on Ultra would drop to what, say 40 fps with PhysX? Good enough.
     
  16. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Don't underestimate the performance hit of PhysX on single-card GPUs, even those as powerful as the 485m. Looking at the list helpfully provided above, it's quite clear that the vast majority of the PhysX-supported games are FPS with a few RPG and RTS exceptions thrown in. That should give a good indication of whether or not the 485m really is the most suitable card for the price offered.
     
  17. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    If the small chance of a game supporting PhysX matters to you, spend the money. It's that simple, and I can't argue against it.

    edit: btw, both Mafia II and Batman: AA recommend a dedicated 9800 GTX for PhysX processing.
     
  18. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    By the way, congrats on finally making a decision, Kevin. I decided to wait for the Ivy Bridge processor soon after summer.

    Did you order the default screen or request an upgrade? I see most resellers offer only one or two options: the default glossy "super clear" glare type, and the upgraded matte type. Who is actually offering the V.4 glossy, which is claimed to have the best quality?
     
  19. 1341

    1341 Notebook Guru

    Reputations:
    9
    Messages:
    73
    Likes Received:
    0
    Trophy Points:
    15
    "The way it's meant to be played" or 3D Vision: neither is my choice.

    In my experience, the Nvidia card's PureVideo is the key spec, because when I watch 1080p MKVs it can raise the frame rate from 24 to 60 fps via Splash HD Player Pro.

    In games I use medium/low settings and no AA to play on a 1080p screen.
    If the 470M can't play fluently, I think the 485M will also struggle.

    A next-gen 4GB VRAM mobile card will have to solve the low-performance problem.
     
  20. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Thanks for that info.

    It's not a small chance, I think. The list of games is small now, but it is definitely only going to increase, not decrease, right? I won't presume to know the titles you like to play, but if Arkham Asylum and Deus Ex are on your radar, I would say an Nvidia card is better. Given the number of years I am going to use the card, I'd say if six games with PhysX came out for my investment, I'd be fine.

    Personally, I'll be trying out 3D with a projector and an Nvidia kit a year from now, when the kit is cheap. This is a major bonus and is worth my investment. I know that 3D is crap now, since it's in the incubation stage. A year from now, it could be a game-changer.

    Thanks for reminding us of the PureVideo factor too.

    I... uh... don't think the gap between the 470M and the 485M is that narrow. At least that's not what these guys on YouTube are saying:
    YouTube - Crysis 2 'time square' @ EXTREME 1080P, GTX 485 m, clevo p170hm crysis 2 times square battle
    YouTube - Bulletstorm Performance Playtest 1080p video on Sager 8150 Laptop bullet storm with full AA + fraps running at ~ 30FPS
    YouTube - Crysis 2 Performance Playtest 1080p video on Sager 8150 Laptop Crysis 2 multiplayer running on Extreme on 1920x1080 res averaging a steady 30fps.

    The point is that only Crysis and Metro 2033 have been known to struggle on the 485M so far.

    There are many more videos coming up.
    But so far nothing shows PhysX running with the 485M, so we don't know the performance hit from PhysX on the 485M.

    Decayedmatter here on NBR, who was also keen on getting PhysX in his games, just got his 8170 with the 485M. I'll ask him to post some cool footage.

    This forum is so good. Good help from everyone. I am new here, and have thoroughly enjoyed discussing laptops and GPUs.
     
  21. meyer0095

    meyer0095 Notebook Guru

    Reputations:
    0
    Messages:
    70
    Likes Received:
    0
    Trophy Points:
    15
    I really don't think PhysX is worth anywhere near an extra $250. I am spending the money saved on a nice headset, which I think beats out PhysX. I personally would prefer high-quality sound over a little extra gimmick that hurts FPS. Actual gamers out there mostly sacrifice graphics for FPS if absolutely needed. Most hardcore shooter players are not going to drop from 60 FPS to 40 FPS just for some cool little extra effect. Same thing with SC2 (which I believe supports it): in a game like SC2, FPS drops even more during large battles, so 40 might go down to 25, which could cost you an entire game. Oh, and on top of this, let's not forget it is an extra $250, while the AMD outperforms it in a few games in the testing that's been done. But sure, go ahead and spend an extra $250 for the extra gimmicks, both of which hurt your FPS (3D/PhysX).
     
  22. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    There's no such thing as "actual gamers". There are only humans, and no two humans are alike. Please DON'T feel free to define terms like "actual gamers".

    As for sacrificing graphics for gameplay, I am with you. I have been doing this for the last 10 years. Never have I had a good card.

    Sorry, but it's not a gimmick that you cannot see, not a gimmick that runs somewhere in the background. See the YouTube videos. I am sure the programmers hard at work on PhysX programming would disagree with you.

    You said you like sound; I remember how the sounds in Gears of War made the shooting feel very visceral. The same thing goes for PhysX. More particles flying out when you shoot at something lends a tangibility to the action and creates IMMERSION. THIS IS THE CORE OF GAMES - IMMERSION!

    We don't want PhysX simply because they put it out there. We want PHYSICS. You are welcome to disagree.

    I can play at 40 FPS, even 30, no problem.

    Those that have money can spend it. Let's agree to disagree.
     
  23. harmattan

    harmattan Notebook Evangelist

    Reputations:
    432
    Messages:
    642
    Likes Received:
    55
    Trophy Points:
    41
    GTX485M pros:
    • Tried and true driver support
    • Physx support (having run Penumbra, Mafia II, Batman and a number of other games on both nV and ATI hardware, I can tell you Physx can make a nice visual difference.)
    • +/-10% better performance on average
    Con:
    • $200 more than 6970m

    In summation, are the pros worth $200 to you? Only you can come to that decision based on how much you game, what games you play and what your financial situation is.

    I, for one, will never give AMD/ATI a cent again for mobility products after past experiences with horrible/non-existent driver support. They seem to have cleaned up their act in the past year, but I'm still holding the grudge.
     
  24. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    I'm not gonna start with the whole debate over "I choose this stuff over that stuff", etc.

    My reason for choosing the AMD Radeon HD 6970m is this:
    Save money on the GPU to afford a snappy SSD.
    In Norway tech is expensive.

    I have owned both an ATI Radeon X1800 mobile and an Nvidia 8000-series mobile GPU.
    Both failed. D:
    The ATI because of Fujitsu Siemens' piece of s*** cooling solution, and the Nvidia because of production flaws.
     
  25. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    3D capability depends on the notebook, and not so much the GPU. HP has a 3D notebook and it uses a Radeon GPU.

    As far as PhysX goes, it is dying and didn't offer anything significant the way Nvidia implemented it. How many noteworthy 2011 or upcoming titles use PhysX?

    The only things the 485m has over the 6970m are better idle battery life and CUDA (since it has more support than STREAM). And those aren't nearly as important for gamers for $300 more.
     
  26. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    I'll bet that there won't be five games this year, which use PhysX.

    I requested the upgrade. XoticPC was the only site still offering the V.4 glossy, so I had to buy from them.

    My original plan was to buy the screen aftermarket, but I didn't feel like risking my warranty over a savings of $50.
     
  27. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Very likely, you are right. I can name three though: Deus Ex: Human Revolution, Arkham City, and, guess what, UE3 supports PhysX. Please also remember that Mafia 2 released this year.
    Sources: Unreal Engine Video Game, Engine 3 Features Overview | Video Clip | Game Trailers & Videos | GameTrailers.com
    NVIDIA APEX PhysX: CPU vs GPU Efficiency | NVIDIA APEX PhysX,PhysX CPU Performance,GPU Efficiency,NVIDIA APEX PhysX: CPU vs GPU Efficiency
    APEX Destruction | NVIDIA Developer Zone
    (I'll be clear and say that the first link only shows that UE3 supports APEX in-engine, and I can't find whether PhysX is separate from APEX.)

    This is all-important, because you can say many games won't use PhysX, but you cannot say many games won't use UE3.

    If this doesn't prove that PhysX is not a gimmick, I don't know what will: http://youtu.be/9lCkB77it-M

    We know, we know... discussions seem to be going round and round. We are talking about the 3D capability of the Clevo laptops with the 485M. So far, nobody has confirmed that there is 3D out with the 6970.

    When you post claims like "PhysX is dying", please post your source. I cannot take your word for it when you say "the way they implemented it", unless you work for Nvidia.

    I always take the time to post links to articles or videos. "ALWAYS CITE THE SOURCE" is my policy.

    Actually, some people in this thread http://forum.notebookreview.com/sager-clevo/568190-battery-life-np8150.html were saying that the 6970 has better idle power consumption.
     
  28. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    What he means is, PhysX support is a dying breed. Now that AMD has gained a much larger chunk of the market, fewer and fewer developers will risk alienating their potential customers.
     
  29. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    With respect, he was not saying what you are saying ^^. He was talking about PhysX dying because of the way they implemented it.

    You, however, have a valid point (as always). I could keep arguing and say that:

    1. Either AMD should develop its own GPU-based physics acceleration to compete with Nvidia.

    2. OR gamers should demand physics from game developers and AMD (because if you have seen this video, you want physics for sure: YouTube - Art Gallery Destruction Demo in UE3 using APEX Destruction with GRB's. The destruction is just too cool. An "actual" gamer, to use someone else's phrasing, probably should demand such tech in games.)

    3. The worst option is Nvidia dropping PhysX (due to the reasons you mentioned) and physics gets chucked. Then every gamer loses, until they (AMD, Nvidia and the devs) get it to work in some other way.

    And the other way cannot be CPU-based physics for now: I have already read here ( Analysis: PhysX On Systems With AMD Graphics Cards : Introduction) that some CPU-based physics is still done (in non-PhysX apps), but that there is no way the CPU is as fast as the GPU for physics calculations.

    But all three points border on speculation. Most likely, market shares will level out and everybody will find their niche. (Disclaimer: I am no market analyst, just an engineer, so this last opinion is purely speculative.)
     
  30. decayedmatter

    decayedmatter Notebook Evangelist

    Reputations:
    22
    Messages:
    409
    Likes Received:
    0
    Trophy Points:
    30
    I've been playing Batman: Arkham Asylum maxed out at 1080p with high PhysX and it runs at a smooth 50-60 fps; I ran the benchmark for it maxed out and got 53 fps average. The 485m handles PhysX like a champ, and I must say the effects it pulls off are amazing; the fact that you can cut a banner apart with your batarang piece by piece as it rips and falls down is awesome. I'm glad I didn't go with ATI; I didn't even realize ATI didn't support PhysX.

    Here's a vid i made of Batman AA
    http://www.youtube.com/watch?v=GBZdJNA8ad8
     
  31. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
  32. ronnieb

    ronnieb Representing the Canucks

    Reputations:
    613
    Messages:
    1,869
    Likes Received:
    0
    Trophy Points:
    55
    Reading through your comments I do realize that you are an Nvidia fanboy.

    OH WAIT RONNIE YOU MUST BE AN ATI FANBOY WITH YOUR AVATAR

    Nope, I just like whoever provides me with the most bang for my buck. PhysX? Really? What a joke. A few extra particles and one or two added cool effects are not worth $250 extra. Booting into my Windows desktop within 5 seconds, heck yeah, that's worth that $250 extra.

    5% difference for $250? Really? And people are trying to convince you? Let's not forget that ATI drivers mature VERY well (I have a mobile 5650 and it just gets better and better; same with my 4890 desktop).

    PhysX does take its toll on computers, as the recommended setup is a dedicated Nvidia card for PhysX (note: this recommended setup is for desktops only). You said earlier in this thread something like this:

    "Almost every game has a person whispering Nvidia and how it's the way it's meant to be played. It's the familiarity of it"

    If that's not fanboyism, I don't know what is, because it's not a legit/valid reason to spend another $250.

    I'm with Kevin on this: if you don't have an SSD, then trash that overpriced Nvidia card, grab the 6970 (which you already did, GJ) and grab an SSD.

    Done and done.

    Don't be stupid; nobody needs that 5% increase for $250.
     
  33. mostwanted115

    mostwanted115 Notebook Consultant

    Reputations:
    38
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    Is it same as this option?

    17.3" FHD 16:9 "Glare Type" Super Clear Ultra Bright LED Glossy Screen w/ 90% NTSC Color Gamut (1920x1080) (will add 4-7 business days to build time) (+$220)
     
  34. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    yes

    10char
     
  35. Ranma13

    Ranma13 Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    I'm under the impression that those who think PhysX and Nvidia 3D Vision are gimmicks have either never seen them themselves, or haven't seen them in the right game to really appreciate them. For games like Mafia 2 and Batman: AA, PhysX makes a noticeable difference, and I think most people would prefer it with rather than without. For some games, though, like Mirror's Edge, where the only additions are destructible windows and banners, it can look pretty gimmicky.

    As for 3D, opinions are pretty much polar opposites. Some people love it, and some people hate it. I'm one of those who love it and think that it adds so much more to the games that games without stereo 3D just end up looking 'flat'.

    In either case, you can look at it this way. Nvidia gives you the option of using PhysX and 3D. ATI doesn't. It seems like most of the argument revolves around whether these two features are worth the extra $250 for the 485M over the 6970M. To some, yes. To others, no.
     
  36. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    I think those who are actually contributing to this thread are pretty set in their ways. They aren't going to change their way of thinking any time soon lol.. There are good points from each side, and this has been yet another Nvidia vs. ATI debate (although more mild-mannered than what I would have expected).
     
  37. i_wheeldon

    i_wheeldon Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    ATI has always had better hardware, that's a fact. Their advertising and marketing are their biggest letdown, and their drivers sometimes suffered but were always fixed.

    Nvidia kills any company it feels threatened by (3dfx, Ageia, etc.), sells faulty goods (bumpgate), claims to have the fastest single card when it doesn't, and no company with any sense wants to deal with them anymore. Their Fermi is their future CPU/GPU processor that they had to rush into a GPU because they found out Intel shafted them on the x86 licence and stole Nvidia's tech for their Larrabee project; it will reappear when they feel they can't make money once their current supplies are exhausted. Then it will be the all-new singing and dancing thing.

    PhysX is a joke; IT WILL NEVER TAKE OFF. A handful of games are not going to save PhysX, which cost Nvidia badly. There are no serious developers thinking of implementing it seriously, and as for 3D, anyone who has gone with any of the current 3D tech (except the Nintendo 3DS) is just paying for the lost revenue of the people who have developed what is a terrible implementation of 3D. Nvidia has nothing going positive for it, and if you ask me it stems from the top and a hell of a lot of arrogance :)

    I'm not saying ATI/AMD are a good company, none of them are, but I know who I prefer to pass my money on to :)

    My 2 cents
     
  38. Ranma13

    Ranma13 Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    I think hizzaah hit it on the head. I vote that this thread be closed because people are going to feel one way or the other and no amount of back-and-forth can get them to change their minds.

    P.S. If you're going to offer your two cents, don't come off as a blind fanboy.
     
  39. i_wheeldon

    i_wheeldon Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Good job we're not all the same; the world would be a boring place, wouldn't it :)

    Sorry if I did. I am far from a fanboy. I just read up about anything I buy in as much detail as I can, and can only draw my own conclusions. No offense or flaming meant to be initiated :)

    Nvidia do win a lot of design awards and in general have good ideas. They've had some great cards in the past, but since the 8 series (of which I lost one) my faith in them has dwindled :)
     
  40. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    I agree.. the only things that need to be done now are benchmark threads. I have a http://forum.notebookreview.com/sager-clevo/555871-485m-overclocking-results.html thread. It kinda got off topic (not surprising), but with a bit of professionalism it should be OK. We just need a separate one for the 6970 :)
     
  41. Ranma13

    Ranma13 Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    It's not so much what your conclusions are, it's how you state it. How can you state for a fact that ATI has better hardware? Historically, Nvidia has launched new hardware first, then ATI follows suit with something faster and cheaper. Depending on where you start looking at it, either Nvidia is following ATI or ATI is following Nvidia. The only thing that can really be said for certain is that Nvidia tries to make the fastest cards bar none, whereas ATI tries to focus on the best bang for the buck.

    As for PhysX, how do you know it cost Nvidia badly? Are you a market analyst, or an insider for Nvidia? How do you know there are no serious developers considering using it? And how do you know that Nvidia's 3D approach is a terrible implementation? Have you looked at the alternatives and have considered the pros and cons of each?

    These things may be your conclusions, but they're baseless and extremely opinionated.
     
  42. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
  43. pay928

    pay928 Notebook Enthusiast

    Reputations:
    10
    Messages:
    33
    Likes Received:
    0
    Trophy Points:
    15
    The title is about the driver needed, though, rather than benchmarking. Maybe you could start a new thread specifically for benchmarks and whatnot.
     
  44. i_wheeldon

    i_wheeldon Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    The 6990 beats the GTX 590, and Nvidia has been pulled up on this, so they don't make the fastest single card :) Check AMD's failure rate compared to Nvidia's, if you could possibly find an honest comparison; tbh, I think you would find ATI's is pretty much lower. I've been around computers for nearly two decades, and this is my opinion from my experience and probably over-dedication to following graphics :)

    As for PhysX, Ageia cashed in, agreed; it's my opinion, but do you see any light at the end of the tunnel? There aren't even enough games (known, admittedly) being made this year to fit on one hand, never mind two! PhysX has been around a long time, and the Unreal Engine is used by a lot of games, so does that really count as more than one game?

    And 3D: using glasses for 3D cost the industry, not Nvidia, and now they flog as much of it to you as they can when it really just isn't good enough. A TV that's 3D without any other peripherals (which is in development) is TRUE 3D if you ask me. So I hope you see my point with the 3D malarkey. It won't last, and it will go the way of the Nintendo Virtual Boy hehe

    This isn't the place for us two to ramble. Sorry, but I just wanted to make clear what I meant by what I said. But you're right, it is my opinion, and sorry if I caused any offense; I take criticism on board and am open to people's points :)

    On topic: it is a matter of opinion, but solely on the basis of the two cards in question, I'd swap to ATI, as the advantages of PhysX and 3D would be zero for me :)
     
  45. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    I asked a moderator to fix the thread title.
    But I didn't want to remake the thread just for a new headline.
     
  46. lawtq

    lawtq Notebook Evangelist

    Reputations:
    145
    Messages:
    374
    Likes Received:
    85
    Trophy Points:
    41
    This "fanboy" word is getting OLD! I'm tired of seeing it. If someone wants to be a fan of one brand, that's up to them, whether ignorantly or not; let them be, for feck's sake!! I'm an Nvidia fan, but the 6970 is what I would purchase, solely due to price/performance.
     
  47. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,334
    Messages:
    36,639
    Likes Received:
    5,033
    Trophy Points:
    931
    We've reached the point in this thread where it's time for it to be closed, because now we're getting into arguments.
     
  48. lawtq

    lawtq Notebook Evangelist

    Reputations:
    145
    Messages:
    374
    Likes Received:
    85
    Trophy Points:
    41
    Totally agreed!
     
  49. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    wonder why it isn't closed yet lol..
     
  50. mountainlifter_k

    mountainlifter_k Notebook Consultant

    Reputations:
    39
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    What does "support" mean?
    (a) NVIDIA PhysX: This is what I think support means: Nvidia's hardware engineers design their hardware with their software package requirements in mind, which in this case is physics calculation. Then their software engineers develop software packages for physics algorithms, which means that their hardware was built for their algorithms. Their support is thus both hardware and software. (Note: this is my model of how things work in design, based on my work experience in the field of control and instrumentation.)
    (b) Does AMD design hardware for the requirements of the algorithms of the Bullet engine? Is it possible that the hardware design would extend itself to enable support for open-source physics calculations?
    For those who don't get my point: let's say you design a GPU that can only do additions. To compute 23*45, you would have to add 45 to itself 23 times = 23 clocks. Whereas, knowing that you would need to do multiplications, if you also developed a multiplier unit on the GPU, all this would take 1 clock.


    Thanks, man. You were true to your word. I asked Decayedmatter to inform us of the performance hit on the 485M when PhysX is on. But Batman: AA is Unreal tech, and we know they optimize the hell out of their engine.

    Very precise summary of the discussion/debate. "To each his own" must be the final say. But let's not forget that some of us learnt a lot along the way and it was fun (I speak for myself). Game tech has always fascinated me.

    I commend your attitude! But please post a source for your claims, or make a note saying that this is your personal, undocumented observation.

    SORRY all for the long post. If you want, let's discuss AMD's physics and speculate on what games will support the Bullet engine over PhysX, and please start by posting examples of AMD physics in games.
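    The adder-versus-multiplier point above can be sketched in a few lines of Python. This is a toy model of the post's own arithmetic (one "clock" per addition), not a claim about real GPU microarchitecture:

```python
# Toy illustration: a "GPU" that only has an adder multiplies 23 * 45
# by repeated addition (23 add steps), while dedicated multiply
# hardware would produce the same result in a single step.
def multiply_by_repeated_addition(a, b):
    """Return (a * b, clocks) using only additions, one 'clock' each."""
    total, clocks = 0, 0
    for _ in range(a):
        total += b
        clocks += 1
    return total, clocks

product, clocks = multiply_by_repeated_addition(23, 45)
print(product, clocks)  # 1035 in 23 add steps, vs. 1 step with a multiplier
```

    The same trade-off is what the post attributes to PhysX support: hardware designed around the algorithms it is expected to run.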
     
    Last edited by a moderator: May 8, 2015