For the record, I did switch from the 485m to the 6970M. Thank you for all the feedback.
-
You made the right decision, as a gamer.
-
As a gamer, the 485M is the right decision.
As a gamer on a budget, the 6970M is the optimal one. -
-
mountainlifter_k Notebook Consultant
The downside is that if many people with wide wallets go with the 485M, Nvidia will be encouraged to keep its prices high. But this is not going to happen, since the people wanting the 6970M are the majority. -
-
If prices were equal, isn't the 485m supposedly stronger on more benchmarks than the 6970m? I know the two cards are more or less equivalent but for those who absolutely need the top-end bleeding edge GPU the choice would probably be the 485m (for now).
-
"Comparing the 6970M and 485M once more, we find NVIDIA with a slight lead in five of the eight games, but if we call anything less than a 5% difference a tie there are really only three games where there’s even a moderate difference. NVIDIA is ahead by 8% in STALKER and 14% in DiRT 2; AMD leads by 10% in StarCraft II. Everything else is splitting hairs."
This is referring to their 1080p benchmark suite.
So it's less than a 5% difference either way in the majority of titles. The GPUs are virtually equal; neither does anything in-game that the other can't do just as well. -
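The review's "anything less than a 5% difference is a tie" rule can be expressed as a small helper. The FPS numbers in the examples below are hypothetical, chosen only to illustrate the threshold:

```python
# Sketch of the "under 5% is a tie" verdict rule from the quoted review.
# The FPS values used in the examples are made up for illustration.

def verdict(fps_a, fps_b, tie_threshold=0.05):
    """Call the matchup for two cards' frame rates, treating small gaps as ties."""
    diff = (fps_a - fps_b) / min(fps_a, fps_b)  # relative gap vs. the slower card
    if abs(diff) < tie_threshold:
        return "tie"
    return "A leads" if diff > 0 else "B leads"

print(verdict(54.0, 50.0))  # 8% gap  -> "A leads"
print(verdict(51.0, 50.0))  # 2% gap  -> "tie"
print(verdict(50.0, 55.0))  # 10% gap -> "B leads"
```

By this rule, the STALKER (8%), DiRT 2 (14%), and StarCraft II (10%) results register as real leads, while everything else collapses into a tie.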
Except for Nvidia's eye-candy extras (PhysX, 3D).
-
Again, it is the right decision for an absolute gamer who cares less about his wallet.
For most of us it is a matter of compromise: we save the $250 by getting the 6970M and invest it in a better screen, buy a magazine-fed paintball marker, or just save it up.
The absolute gamer just buys all of these at the same time, then pre-orders the iPad 2 for kicks, maybe uses it for a week or two, and ditches it. -
I read that article, hence my argument that the 485M is slightly stronger than the 6970M based on benchmark scores. I fully agree both cards can do everything performance-wise for the latest games, the only differences being pricing and a few extra features available to the Nvidia card (PhysX, 3D Vision).
-
mountainlifter_k Notebook Consultant
For you, these two may not be needed, and so your conclusion is that barring a roughly 5% performance difference, one is just as good as the other if the prices were the same. -
The one issue with PhysX is that these notebooks don't have the horsepower to run it in most games, because they lack a second GPU dedicated to running it.
But hey I'm not here to change anyone's mind. It's your $250.
To be honest, this is the first time I've ever chosen against Nvidia, so I guess I shall now see the difference firsthand. I did decide that MLAA means more to me than PhysX.
When you look at the list:
Batman: Arkham Asylum
Crazy Machines 2
Cryostasis: Sleep of Reason
Dark Void
Darkest of Days
Hot Dance Party
Hot Dance Party II
Mafia II
Metal Knight Zero Online
Metro 2033
Mirror's Edge
Nurien
The Saboteur
Sacred 2: Fallen Angel
Sacred 2: Ice & Blood
Shattered Horizon
Star Tales
Star Trek DAC
Tom Clancy's Ghost Recon Advanced Warfighter 2
Unreal Tournament 3 (and Extreme Physics Mod)
U-WARS
Warmonger: Operation Downtown Destruction
There haven't been many games where the proprietary software actually mattered.
And 3D... well, I can only speak for myself in saying that I just don't give a damn about it.
EDIT: this list is probably missing a few insignificant games -
mountainlifter_k Notebook Consultant
For example, I found out that Deus Ex: Human Revolution is getting APEX clothing, which is similar to PhysX (or the same thing, I don't know).
And I think you are wrong in saying that a dedicated GPU is needed for PhysX. Enthusiasts use that config on desktops for extra performance. Notebookcheck maxed out Mafia II with the 485M and it runs at 59 FPS on Ultra settings (I'm assuming that means PhysX on). Computer Games on Laptop Graphic Cards - Notebookcheck.net Tech -
And you have to consider some successful titles which implemented PhysX and which will most probably have sequels using the same engines.
I'm talking about Tom Clancy's and Metro 2033.
Big fan myself.
EDIT: @mountainlifter_k, as per notebookcheck, they disabled PhysX in all of these benchmarks.
check here: http://www.notebookcheck.net/Mafia-2.35169.0.html
But still, the 59 fps on Ultra would drop to what, say 40 fps with PhysX? Good enough. -
Don't underestimate the performance hit of PhysX on single-card GPUs, even those as powerful as the 485m. Looking at the list helpfully provided above, it's quite clear that the vast majority of the PhysX-supported games are FPS with a few RPG and RTS exceptions thrown in. That should give a good indication of whether or not the 485m really is the most suitable card for the price offered.
-
If the small chance of a game supporting PhysX matters to you, spend the money. It's that simple, and I can't argue against it.
edit: btw, both Mafia II and Batman: AA recommend a dedicated 9800 GTX for PhysX processing. -
Did you order the default screen or request an upgrade? I see most resellers offer only one or two options: the default glossy "super clear glare type" and the upgraded matte type. Who is actually offering the V.4 glossy, which is claimed to have the best quality? -
"The way it's meant to be played" or 3D Vision: neither is my choice.
In my use, the Nvidia card's PureVideo is the key spec,
because when I watch 1080p MKVs it can raise the frame rate from 24 to 60 fps via Splash HD Player Pro.
In games I use medium/low settings and no AA to play on a 1080p screen.
If the 470M can't play fluently, I think the 485M will also struggle;
it will take next-gen 4GB-VRAM mobile cards to solve the low-performance problem. -
mountainlifter_k Notebook Consultant
Personally, I'll be trying out 3D with a projector and the Nvidia kit a year from now, when the kit is cheap. This is a major bonus and is worth my investment. I know that 3D is crap now, since it's in the incubation stage. A year from now, it could be a game-changer.
I... uh... don't think the gap between the 470M and the 485M is that narrow. At least that's not what these guys on YouTube are saying:
YouTube - Crysis 2 'time square' @ EXTREME 1080P, GTX 485 m, clevo p170hm crysis 2 times square battle
YouTube - Bulletstorm Performance Playtest 1080p video on Sager 8150 Laptop bullet storm with full AA + fraps running at ~ 30FPS
YouTube - Crysis 2 Performance Playtest 1080p video on Sager 8150 Laptop Crysis 2 multiplayer running on Extreme on 1920x1080 res averaging a steady 30fps.
The point is that only Crysis and Metro 2033 have been known to struggle on the 485M so far.
There are many more videos coming up.
Decayedmatter here on NBR, who was also keen on getting PhysX in games, just got his 8170 with the 485M. I'll ask him to post some cool footage.
This forum is so good, with good help from everyone. I am new here and have thoroughly enjoyed discussing laptops and GPUs. -
I really don't think PhysX is worth anywhere near an extra $250. I am spending the money saved on a nice headset, which I think beats out PhysX. I personally would prefer high-quality sound over a little extra gimmick that hurts FPS. Actual gamers out there mostly sacrifice graphics for FPS if absolutely needed. Most hardcore shooter players are not going to drop from 60 FPS to 40 FPS just for some cool little extra effect. Same thing with SC2 (which I believe supports it): in a game like SC2, FPS drops even more during large battles, so 40 might go down to 25, which could cost you an entire game. Oh, and on top of this, let's not forget it is an extra $250, while the AMD outperforms it in a few games in the testing that's been done. But sure, go ahead and spend an extra $250 for the extra gimmicks, both of which hurt your FPS (3D/PhysX).
-
mountainlifter_k Notebook Consultant
As for sacrificing graphics for gameplay, I am with you. I have been doing this for the last 10 years; never have I had a good card.
You said you like sound; I remember how the sounds in Gears of War made the shooting feel very visceral. The same thing goes for PhysX. More particles flying out when you shoot at something lends a tangibility to the action and creates IMMERSION. THIS IS THE CORE OF GAMES - IMMERSION!
We don't want PhysX simply because they put it out there. We want PHYSICS. You are welcome to disagree.
Those that have money can spend it. Let's agree to disagree. -
GTX485M pros:
- Tried and true driver support
- Physx support (having run Penumbra, Mafia II, Batman and a number of other games on both nV and ATI hardware, I can tell you Physx can make a nice visual difference.)
- Roughly 10% better performance on average
- Costs $200 more than the 6970M
In summation, are the pros worth $200 to you? Only you can come to that decision based on how much you game, what games you play and what your financial situation is.
I, for one, will never give AMD/ATI a cent again for mobility products after past experiences with horrible/non-existent driver support. They seem to have cleaned up their act in the past year, but I'm still holding the grudge. -
I'm not gonna start with the whole "I choose this stuff over that stuff" debate, etc.
My reason for choosing the AMD Radeon HD 6970m is this:
Save money on the GPU to afford a snappy SSD.
In Norway tech is expensive.
I've owned both an ATI Radeon X1800M and an Nvidia 8000-series mobile GPU.
Both failed. D:
The ATI because of Fujitsu Siemens' piece-of-s*** cooling solution, and the Nvidia because of production flaws. -
As far as PhysX goes, it is dying, and it didn't offer anything significant the way Nvidia implemented it. How many noteworthy 2011 or upcoming titles use PhysX?
The only things the 485M has over the 6970M are better idle battery life and CUDA (since it has more support than Stream). And those aren't nearly as important to gamers for $300 more. -
I'll bet that there won't be five games this year that use PhysX.
My original plan was to buy the screen aftermarket, but I didn't feel like risking my warranty over a savings of $50. -
mountainlifter_k Notebook Consultant
Sources: Unreal Engine Video Game, Engine 3 Features Overview | Video Clip | Game Trailers & Videos | GameTrailers.com
NVIDIA APEX PhysX: CPU vs GPU Efficiency | NVIDIA APEX PhysX,PhysX CPU Performance,GPU Efficiency,NVIDIA APEX PhysX: CPU vs GPU Efficiency
APEX Destruction | NVIDIA Developer Zone
(I'll be clear and say that the first link only shows that UE3 supports APEX in-engine, and I can't find whether PhysX is separate from APEX.)
This is all-important, because you can say many games won't use PhysX, but you cannot say many games won't use UE3.
If this doesn't prove that PhysX is not a gimmick, I don't know what will: http://youtu.be/9lCkB77it-M
I always take the time to post the links to articles or videos. "Always cite the source" is my policy.
-
What he means is that PhysX support is a dying breed. Now that AMD has gained a much larger chunk of the market, fewer and fewer developers will risk alienating their potential customers.
-
mountainlifter_k Notebook Consultant
You, however, have a valid point (as always). I could keep arguing and say that:
1. Either AMD should develop its own GPU-based physics acceleration to compete with Nvidia.
2. Or gamers should demand physics from game developers and AMD (because if you have seen this video, you want physics for sure: YouTube - Art Gallery Destruction Demo in UE3 using APEX Destruction with GRB's The destruction is just too cool. An "actual" gamer, to use someone else's phrasing, probably should demand such tech in games.)
3. The worst option is Nvidia dropping PhysX (due to the reasons you mentioned), and physics gets chucked. Then every gamer loses until they (AMD, Nvidia, and the devs) get it to work in some other way.
And the other way cannot be CPU-based physics for now: I have already read here (Analysis: PhysX On Systems With AMD Graphics Cards: Introduction) that some CPU-based physics is still done (in non-PhysX apps), but there is no way the CPU is as fast as the GPU for physics calculations.
But all three points border on speculation. Most likely, market shares will level out and everybody will find their niche. (Disclaimer: I am no market analyst, just an engineer, so this last opinion is purely speculative.) -
I've been playing Batman: Arkham Asylum maxed out at 1080p with high PhysX and it runs at a smooth 50-60 fps; I ran the benchmark for it maxed out and got 53 fps average. The 485M handles PhysX like a champ, and I must say the effects it pulls off are amazing; the fact that you can cut a banner apart with your batarang piece by piece as it rips and falls down is awesome. I'm glad I didn't go with ATI. I didn't even realize ATI doesn't support PhysX.
Here's a vid i made of Batman AA
http://www.youtube.com/watch?v=GBZdJNA8ad8 -
Why would they make their own solution?
Downloads etc. here.
According to this forum post it even runs on the iPhone.
Physics Simulation Forum • View topic - Anytime Golf: Magic Touch (for iPhone) uses Bullet
Not a fan of iHardware myself, but I have to say that what the guy there did is pretty cool. -
"OH WAIT RONNIE, YOU MUST BE AN ATI FANBOY WITH YOUR AVATAR."
Nope, I just like whoever provides me with the most bang for my buck. PhysX? Really? What a joke. A few extra particles and one or two added cool effects are not worth $250 extra. Booting into my Windows desktop within 5 seconds, heck yeah, that's worth that $250 extra.
5% difference for $250? Really? And people are trying to convince you? Let's not forget that ATI drivers mature VERY well (I have a mobile 5650 and it just gets better and better; same with my 4890 desktop).
PhysX does take its toll on computers, as the recommended setup is an Nvidia card dedicated to PhysX (note: this recommended setup is for desktops only). You said earlier in this thread something like this:
"Almost every game has a person whispering Nvidia and how it's the way it's meant to be played. It's the familiarity of it."
If that's not fanboyism, I don't know what is, because it's not a legit/valid reason to spend another $250.
I'm with Kevin on this: if you don't have an SSD, then trash that overpriced Nvidia card, grab the 6970M (which you already did, GJ), and grab an SSD.
Done and done.
Don't be stupid; nobody needs that 5% increase for $250. -
17.3" FHD 16:9 "Glare Type" Super Clear Ultra Bright LED Glossy Screen w/ 90% NTSC Color Gamut (1920x1080) (Will add 4-7 business days to build time) (+$220) -
yes -
I'm under the impression that those who think PhysX and Nvidia 3D Vision are gimmicks, have either never seen it themselves, or haven't seen it in the right game to really appreciate it. For games like Mafia 2 and Batman: AA, PhysX makes a noticeable difference that I think most people would prefer it with rather than without. For some games though, like Mirror's Edge where the only additions are destructible windows and banners, it can look pretty gimmicky.
As for 3D, opinions are pretty much polar opposites. Some people love it, and some people hate it. I'm one of those who love it and think that it adds so much more to the games that games without stereo 3D just end up looking 'flat'.
In either case, you can look at it this way. Nvidia gives you the option of using PhysX and 3D. ATI doesn't. It seems like most of the argument revolves around whether these two features are worth the extra $250 for the 485M over the 6970M. To some, yes. To others, no. -
I think those that are actually contributing to this thread are pretty set in their ways; they aren't going to change their way of thinking any time soon, lol. There are good points from each side, and this has been yet another Nvidia vs. ATI thread (although more mild-mannered than what I would have expected).
-
ATI has always had better hardware; that's a fact. Their advertising and marketing are their biggest letdown, and their drivers sometimes suffered but were always fixed.
Nvidia kills any company it feels threatened by (3dfx, Ageia, etc.), sells faulty goods (bumpgate), claims they have the fastest single card when they don't, and no company with any sense wants to deal with them anymore. Their Fermi is their future CPU/GPU processor that they had to rush into a GPU because they found out Intel shafted them on the x86 license and stole Nvidia's tech for their Larrabee project; it will reappear when they feel they can't make money once their current supplies are exhausted. Then it will be the all-new singing and dancing thing.
PhysX is a joke; IT WILL NEVER TAKE OFF. A handful of games is not going to save PhysX, which cost Nvidia badly. There are no serious developers thinking of implementing it seriously, and as for 3D, anyone who has gone with any of the current 3D tech (except the Nintendo 3DS) is just paying for the lost revenue of the people who developed what is a terrible implementation of 3D. Nvidia has nothing positive going for it, and if you ask me it stems from the top and a hell of a lot of arrogance.
I'm not saying ATI/AMD are good companies (none of them are), but I know who I prefer to pass my money to.
My 2 cents -
I think hizzaah hit it on the head. I vote that this thread be closed because people are going to feel one way or the other and no amount of back-and-forth can get them to change their minds.
P.S. If you're going to offer your two cents, don't come off as a blind fanboy. -
Sorry if I did. I am far from a fanboy; I just read up on anything I buy in as much detail as I can, and can only draw my own conclusions. No offense or flaming meant.
Nvidia do win a lot of design awards and in general have good ideas. They've had some great cards in the past, but since the 8 series (one of which I lost) my faith in them has dwindled. -
-
It's not so much what your conclusions are, it's how you state it. How can you state for a fact that ATI has better hardware? Historically, Nvidia has launched new hardware first, then ATI follows suit with something faster and cheaper. Depending on where you start looking at it, either Nvidia is following ATI or ATI is following Nvidia. The only thing that can really be said for certain is that Nvidia tries to make the fastest cards bar none, whereas ATI tries to focus on the best bang for the buck.
As for PhysX, how do you know it cost Nvidia badly? Are you a market analyst, or an insider for Nvidia? How do you know there are no serious developers considering using it? And how do you know that Nvidia's 3D approach is a terrible implementation? Have you looked at the alternatives and have considered the pros and cons of each?
These things may be your conclusions, but they're baseless and extremely opinionated. -
Here are some benchmarks I ran on my notebook.
Stock clocks on AMD Catalyst 11.3 x64.
http://forum.notebookreview.com/sag...-p170hm-amd-radeon-6970m-driver-needed-d.html
But please don't spam useless FUD there.
Thanks in advance,
DEagleson -
-
Check AMD's failure rate compared to Nvidia's, if you can even find an honest comparison, tbh; I think you would find that ATI's is lower. I've been around computers for nearly two decades, and this is my opinion from experience and probably over-dedication to following graphics.
As for PhysX: Ageia cashed in, agreed. It's my opinion, but do you see any light at the end of the tunnel? There aren't even enough known games this year to fit on one hand, never mind two! PhysX has been around a long time, and the Unreal Engine is used by a lot of games, so does that really count as more than one game?
And 3D: using glasses for 3D cost the industry, not Nvidia, and now they flog as much of it to you as they can when really it just isn't good enough. A TV that's 3D without any other peripherals (which is in development) is TRUE 3D if you ask me. So I hope you see my point with the 3D malarkey. It won't last, and it will be out with the Nintendo Virtual Boy, hehe.
This isn't the place for us two to ramble. Sorry, I just wanted to make clear what I meant. But you're right, it is my opinion, and sorry if I caused any offense; I take criticism on board and am open to people's points.
On topic: it is a matter of opinion, but solely on the basis of the two cards in question, I'd go with ATI, as the advantages of PhysX and 3D would be zero for me. -
But I didn't want to remake a thread just for a new headline. -
This "fanboy" word is getting OLD! I'm tired of seeing it. If someone wants to be a fan of one brand, it's up to them, whether ignorantly or not; let them be, for feck's sake!! I'm an Nvidia fan, but the 6970M is what I would purchase, solely due to price/performance.
-
Charles P. Jefferies Lead Moderator Super Moderator
We've reached a point in this thread - time for it to be closed because now we're getting into arguments.
-
-
Wonder why it isn't closed yet, lol...
-
mountainlifter_k Notebook Consultant
(a) NVIDIA PhysX: This is what I think support means: Nvidia's hardware engineers design their hardware with their software package requirements in mind, which in this case is physics calculation. Then their software engineers develop software packages for physics algorithms, which means their hardware was built for their algorithms. Their support is thus both hardware and software. (Note: this is my model of how things work in design, based on my work experience in the field of control and instrumentation.)
(b) Does AMD design hardware for the requirements of the Bullet engine's algorithms? Is it possible that the hardware design would extend itself to support open-source code calculations?
For those that don't get my point: let's say you design a GPU that can only do additions. To compute 23*45, you would have to add 45 to itself 23 times = 23 clocks. Whereas, knowing that you would need to do multiplications, if you also developed a multiplier unit on the GPU, all this would take 1 clock.
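The add-versus-multiply point above can be sketched in a few lines of Python. The "clock" counting here is purely illustrative (one operation per clock), not a real hardware model:

```python
# Illustrative only: the "cost" of multiplying on a GPU that can only add,
# vs. one with a dedicated multiplier, counting each operation as 1 clock.

def multiply_by_addition(a, b):
    """Compute a*b by repeated addition; costs b 'clocks'."""
    total, clocks = 0, 0
    for _ in range(b):
        total += a   # one addition per 'clock'
        clocks += 1
    return total, clocks

def multiply_with_multiplier(a, b):
    """A dedicated multiplier unit finishes the same job in one 'clock'."""
    return a * b, 1

# The 23 * 45 example from the post: 23 additions vs. 1 multiply
print(multiply_by_addition(45, 23))      # (1035, 23)
print(multiply_with_multiplier(45, 23))  # (1035, 1)
```

Same answer either way; the only difference is how many clocks the hardware burns to get there, which is the whole argument for designing the silicon around the algorithms it will run.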
Sorry all for the long post. If you want, let's discuss AMD's physics and speculate on which games will support the Bullet engine over PhysX; please start by posting examples of AMD physics in games.
Should I switch out my 485M to the 6970M?
Discussion in 'Sager and Clevo' started by meyer0095, Apr 5, 2011.