Well, we had a good run. Thanks to those who contributed good discussion points. Boo to those of you who can't remain civil.
PROTIP: quoting a moderator who is tired of arguing and then continuing to throw insults like "kid" is not indicative of logical thinking.
Peace.
-
mountainlifter_k Notebook Consultant
-
There are some examples of Bullet Physics on their homepage.
Both AMD and Sony (it is used in the Sony Physics SDK) seem to support this open-source physics engine, and it looks like it has also been used in games and movies.
I guess if Nvidia wanted to, they could add DirectCompute acceleration of Bullet Physics too.
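For anyone wondering what "using Bullet" actually looks like on the developer side, here is a rough, minimal C++ sketch of the standard Bullet setup (my own toy example, not taken from their samples; any OpenCL/DirectCompute acceleration would sit underneath this API rather than change it):

#include <btBulletDynamicsCommon.h>

int main() {
    // Standard Bullet plumbing: collision config, dispatcher, broadphase, solver, world.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A 1 kg sphere dropped from 10 m.
    btSphereShape sphere(0.5f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody ball(1.0f, &motion, &sphere, inertia);
    world.addRigidBody(&ball);

    // Step the simulation at 60 Hz for one second.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);

    world.removeRigidBody(&ball);
    return 0;
}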
Maybe they have already added that acceleration? -
chewietobbacca Notebook Evangelist
I'll say this right now: the hardware-based physics argument is an annoying one because people have brought it up for years now and it hasn't made a damned difference, except in the eyes of those who want to devolve a discussion into Nvidia vs. AMD.
The number of titles using PhysX exclusively has never grown beyond a small collection of heavily Nvidia-funded titles. Big mainstream titles use Havok and other non-proprietary physics - don't believe me? Look at titles from Valve and Blizzard. The fact that PhysX can degrade GPU performance should not be forgotten either - especially since the performance available to notebook-class GPUs is very precious.
At the end of the day, justify how you want to spend your money the way you feel most comfortable. For me, personally, I see no point in brand loyalty - certainly not $250 worth of brand loyalty for perks I rarely encounter when that $250 can be put towards an upgraded screen or an SSD that will be used daily -
I'm loving the PhysX effects in Batman so far, definitely adds to the immersion. It only drops the framerate from 60 fps to 53 fps; I'd say it's worth it.
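(For a rough sense of that cost: 60 fps is about 1000/60 ≈ 16.7 ms per frame and 53 fps is about 1000/53 ≈ 18.9 ms, so the PhysX effects are adding roughly 2.2 ms per frame - a bit over 13% more frame time.)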
-
I personally haven't seen a huge difference between PhysX on or off in any game. IMHO there really isn't anything better about Nvidia's chip than AMD's that makes it worth $250. To be honest, I don't think there's anything better at all. I can fully understand remaining in your "comfort zone" since maybe all you've ever used were Nvidia GPUs. But in this case $250 is worth the switch to AMD regardless of your "comfort zone".
Obviously the decision is yours how to spend your money, but in this case it's like debating whether to buy a Ford Focus or a Toyota Corolla. -
One more thing is tessellation.
AMD cards have very weak support for this new, trendy feature in graphics engines (games like Aliens vs. Predator, Stalker, Metro 2033...).
Check the benches of simulation software for the tessellation results on AMD cards; they are almost null.
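(If you want to check what a card actually exposes rather than trusting benches, here is a rough sketch of my own, Windows/C++ only: Direct3D 11 hardware tessellation (hull/domain shaders) is part of feature level 11_0, so a quick D3D11CreateDevice call tells you whether the GPU/driver supports it at all. How fast it runs is a separate question.)

#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;

    // Ask the default adapter for the highest feature level it can provide.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, &level, &context);
    if (SUCCEEDED(hr)) {
        // Feature level 11_0 means hull/domain shaders (hardware tessellation) are available.
        std::printf("Feature level 0x%04x - tessellation %s\n", level,
                    level >= D3D_FEATURE_LEVEL_11_0 ? "supported" : "not supported");
        context->Release();
        device->Release();
    }
    return 0;
}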
So now we get 5%-10% extra fps, PhysX, tessellation, 3D Vision, and PureVideo encoding with the 485M.
It may not be worth an extra $250 for many, but it is the card for the ultimate gamer. -
chewietobbacca Notebook Evangelist
The fact is, AMD cards have not suffered in actual games with tessellation compared against Nvidia - far from it, in fact, if you look at desktop benches, where AMD has actually done better in AvP, Stalker, Metro, etc. than its Nvidia counterparts.
Bench numbers are nice to look at, but you can't play benches -
Anyway, according to AnandTech's review of the 6970M, the 485M performed slightly better (numbers-wise) in those titles you mentioned.
Eurocom Racer: Why the Radeon HD 6970M Rocks - AnandTech :: Your Source for Hardware Analysis and News
But fair enough, we can't play benches - yet we can enjoy the physics and tessellation eye candy the 485M provides on top. -
Hold on guys, are you saying that the AMD Radeon HD 6970M can't do heavy tessellation in the Unigine engine? :O
Check the benches here.
Saved in attached zips in each post.
My run:
http://forum.notebookreview.com/sag...-amd-radeon-6970m-goodness-d.html#post7348112
Eivind's run:
http://forum.notebookreview.com/sag...md-radeon-6970m-goodness-d-5.html#post7352457
Seems like they perform basically the same.
Also did some other benchies if you want to compare, but I don't really care about that. :D
As long as I can finally play BF:BC2, I'm satisfied.
DEagleson -
Comparing the two cards is pretty simple:
The 485 outperforms the 6970 by a small margin in most games.
If you can justify the increase in price for that improvement, do it. If not, then you're still getting a great card.
So long as they both run the games we need them to. -
chewietobbacca Notebook Evangelist
-
According to NotebookCheck.net, the AMD 6930M does better at Crysis 2 than the GTX 485M. I don't know how reliable their reviews are, though.
-
The 485 is the highest rated mobile card for Crysis 2
NVIDIA GeForce GTX 485M - Notebookcheck.net Tech
Though, the 6970 is right on its heels -
Computer Games on Laptop Graphic Cards - Notebookcheck.net Tech -
Or are you talking about the other settings?
I'm talking about Ultra, as I don't think anyone is going to play at a lower setting than they have to.
30 FPS in Crysis is pretty good -
-
mountainlifter_k Notebook Consultant
I remember I saved up a ton of money to get the Nvidia GeForce 5200 back in the day, because I wanted to play Prince of Persia: The Sands of Time. Just for that game! This new thing called pixel shaders was needed to play it.
It's the few games that you want to play that make you want the extra feature. For me, I somehow ended up buying Crysis 2, Mafia II, and Batman: Arkham Asylum, and I am also looking forward to Deus Ex: Human Revolution (very, very important for me), Arkham City, and whatever UE3-based games are coming this year (see Samaritan). You can see I like story-driven FPS/TPS.
Most of these titles need PhysX to show their full glory, and that is what I keep trying to tell people - without realizing that they don't give a crap about these games.
What do you guys say? Have I hit the nail on the head??
Plus, I may replay Crysis 2 a year later with a 120Hz projector and 3D glasses. -
According to NotebookCheck, 6970M CF in Crysis 2 gains only 13 fps from the second card! Do you think it might be a driver issue? Do we have any 6970M CF owners in here?
-
Hi guys, I have a Clevo P150HM. Do you know where I can buy the 6970M?
Right now I have the GTX 460M.
Maybe I'm going to need a new power supply adapter?
Right? -
-
chewietobbacca Notebook Evangelist
-
We also sell VGA modules, which we import from Eurocom. I am sure there are more options in Europe for getting these. We are located in the Czech Republic, so feel free to ask any further questions. -
16GB of RAM is useless for gaming. -
mountainlifter_k Notebook Consultant
Most people in this thread seem to be citing personal reasons for getting one GPU over the other.
What I have been trying to establish is this: leaving the personal reasons aside, and assuming you are not buying anything today, dispassionately and critically analyze which gives you the best feature set (forget the money for a second).
The result is that we have ended up arguing about PhysX and whether it's even a feature or not. DEagleson referred to AMD supporting Bullet, and as a result my interest was piqued. In my free time I have now learnt about all the physics engines used in games, some of which are, more importantly, also used in engineering simulations. But I am tired of people using their fanboy talk and whatnot. I didn't want to share my information initially, but some good people like Kevin_jack2.0, chewietobbacca, mostwanted, ranma etc. were good debaters, and this is for them.
INFORMATION ON PHYSICS IN GAMES:
A brief summary follows.
The aim is to try and see:
1. Which physics engine (Havok, PhysX (yes, PhysX is an engine), ODE, Bullet, etc.) will come to dominate and be the game developers' choice.
2. But that's not enough. The engines can be developed by anyone, but that doesn't mean they will run on the GPU. Best example: Havok was trying to make itself run on GPUs but didn't. "The company was developing a specialized version of Havok Physics called Havok FX that made use of ATI and NVIDIA GPUs for physics simulations,[8] but it may have been cancelled." Source: just do a Wikipedia search for Havok. Whether CPUs are now fast enough to do physics is a separate debate. Currently, physics has to run on the GPU to give usable dynamic simulations because GPUs are faster.
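(To illustrate roughly why GPUs win here - my own toy C++ sketch, not from any of these engines: the per-body update below is independent for every body, so a GPU can run each loop iteration as its own thread via CUDA/OpenCL/DirectCompute instead of the CPU walking through the bodies one by one.)

#include <vector>

struct Body { float px, py, pz, vx, vy, vz; };

// Naive explicit-Euler step: every body only reads and writes its own data,
// which is exactly the kind of work that maps to one GPU thread per body.
void integrate(std::vector<Body>& bodies, float dt) {
    const float g = -9.81f; // gravity along y
    for (Body& b : bodies) {
        b.vy += g * dt;
        b.px += b.vx * dt;
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
    }
}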
With respect to AMD and its support for Bullet:
Q1: "AMD's own "road to physics" has certainly been a painful one - the company was always in defensive mode when it comes to physics, with company executives even calling "GPU physics is dead" [later, the statement was clarified with "until DirectX 11, and even then maybe"]."
Q2: For those of you obsessed with market shares: "In case you wondered, according to Game Developer Magazine the world's most popular physics API is nVidia PhysX, with 26.8% market share [if there was any doubt that nVidia PhysX isn't popular, which was the defense line from many AMD employees], followed by Intel's Havok and its 22.7% - but the open-sourced Bullet Physics Library is third with 10.3%."
Source: AMD supports OpenCL Bullet; Are Havok and PhysX in trouble? - Bright Side Of News*
This article is from 2009, so we need to extrapolate to 2011; keep reading.
So this means devs using Bullet can run their physics code on AMD GPUs from 2009 onwards. Look at the list of Bullet games so far:
Toy Story 3
Grand Theft Auto IV and Red Dead Redemption by Rockstar Games
Trials HD by RedLynx
Free Realms
Hot Wheels
Gravitronix
Madagascar Kartz
Regnum Online
3DMark 11 by Futuremark
Blood Drive
Source: Wikipedia
Summary: So, Havok doesn't run on either GPU, the Bullet engine runs on AMD, and PhysX runs on Nvidia.
In a fight between PhysX, AMD-backed Bullet, and Havok, Havok wins. Just see the marvellous list of games it has; go to their website.
In a fight between the GPU-based physics engines, the loser is Havok, but the winner is subjective. (For me, PhysX wins, because the games to date for Bullet are crap IMO. You be the judge. A list of PhysX games can be googled.)
Extrapolation of market shares to 2011: UE3 adopts PhysX into its engine. This means no more PhysX patches for games; the developer can directly turn on PhysX while developing the game. SOURCE: Epic Adds DX11 and PhysX to Unreal Engine 3 | MissuAll
This makes me declare PhysX the winner because of the credentials UE3 comes with. (If this is going to spark another debate about UE3, count me out.) But mind you, this is an extrapolation. Plus we have yet to see a good title released (other than GTA IV) with Bullet that supports processing on AMD GPUs. Plus I have assumed that GPU-based physics automatically beats CPU-based physics and hence beats Havok. You just have to look it up on YouTube or read more to believe this.
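(Again only as a rough sketch from my side, using the newer PhysX 3-style C++ API rather than whatever any particular UE3 title actually ships with, driving a PhysX scene looks roughly like this:)

#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    // Foundation + SDK objects, then a scene stepped at 60 Hz.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);  // CPU worker threads;
    desc.filterShader = PxDefaultSimulationFilterShader;   // GPU acceleration is configured separately
    PxScene* scene = physics->createScene(desc);

    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);                          // block until the step completes
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}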
EDIT: Bullet is ALSO supported by Nvidia. Thanks to you both.
I am not going to keep arguing any more. I didn't want to share even this information, because people just don't seem to be capable of dispassionate analysis. Sorry, PhysX is not a gimmick, and I support both it and Bullet. You know why? Because I am a gamer!
(I will later be making a blog on this for people who like to get information. I'll put the link in my sig. Would anyone be interested and give encouragement?
Any corrections or conflicting information, please inform me. I'll edit.)
-
Lots of good info there, mountain. What should also be noted is that Bullet physics isn't limited to just AMD cards but works with Nvidia cards as well.
I just got my NP8150. The only GPU upgrade at the time was the 485M. I am so far extremely happy with my purchase, but if I were to order now, I would go with the 6970. The high price premium is not justified for a minuscule bump in possible performance, and PhysX is in so few games that it should not be a major factor.
Maybe it will be different when the UE3 integration comes out, but even then I have a feeling most people would just disable the PhysX eye candy in preference for higher performance and fewer graphical distractions when playing multiplayer FPS. -
mountainlifter_k Notebook Consultant
I am in doubt only because Nvidia is greedy and wouldn't allow other engines to use their GPUs.
Through driver support, they prevented desktop machines with ATI cards from using an extra Nvidia card for PhysX processing. Source: http://www.techpowerup.com/105329/H..._PhysX_on_Windows_7_with_ATI_GPU_Present.html
Yes, price and money mean the 6970, no doubt there. But I have saved up like crazy (for 10 months), so let me live a little, eh?
-
@ mountainlifter_k
Check out this pdf here.
http://www.nvidia.com/content/GTC/documents/1077_GTC09.pdf
It's about Bullet Physics, and it's hosted by Nvidia too. -
chewietobbacca Notebook Evangelist
Again, justify your money however you wish.
However, you are rehashing the PhysX argument that has been discussed for years - no exaggeration - not just on this board but on countless other boards about GPUs. And the fact remains - PhysX hasn't made a dent for gamers (or you'd see Nvidia's market share improving dramatically over the same period, when in truth it has gone down) - and you still ignore the fact that turning PhysX on degrades performance, which will turn the 485M's tenuous lead over the 6970M into a loss.
You can continue to justify your purchase however you wish, but do understand you're arguing about something people have been discussing for 3+ years now; it hasn't convinced people before and it's not going to now. So weigh it for yourself -
mountainlifter_k Notebook Consultant
Of course, I am no programmer and didn't go to the level of actually understanding physics programming. -
mountainlifter_k Notebook Consultant
And like I said, please leave aside the money matter. We have closed that matter with "justify your money however you wish" many times.
I didn't forget the performance hit, but you really are making a weak point. You have to lose some to get some. Let's implement PhysX on the AMD card and see if that doesn't take a performance hit. -
-
chewietobbacca Notebook Evangelist
And FYI, Havok physics has been used in titles such as SC2, Super Smash Bros. Brawl, Company of Heroes, etc. It has been around for years (and was bought by Intel) and doesn't require proprietary hardware either
http://en.wikipedia.org/wiki/Havok_(software) -
chewietobbacca Notebook Evangelist
a) It's been settled already (the OP went with the 6970M)
and
b) You're seemingly trying to convince yourself and others about PhysX, which has been argued about for some time
One only needs to dig through the countless threads on this issue in this forum, Anandtech, [H], XSForums, and countless other tech websites - an argument that comes around every time a new gen of cards is released - to see that the debate over PhysX has been repeated over and over, ad nauseam -
mountainlifter_k Notebook Consultant
But in my defense, it doesn't matter how many sites have information on this stuff. At some point there is an information explosion on a topic like physics in games, and somebody has to collect it and provide coherent documentation. If you didn't like this, I understand. But if I get a single bit of appreciation from someone, I'll be happy. I am sorry you are nauseated, but like I said, it's new to me. Please discontinue reading this thread. -
mountainlifter_k Notebook Consultant
My aim is not to convince anyone, sir! Why don't you go back and read what I wrote? It was just to provide information and an opinion
-
Read it myself, and I had yet to find out that even my puny Wii supports that stuff, "if" developers add the feature. -
After more than 10 years of experience with gaming PCs - assembling, selling, overclocking -
I can only conclude that PCs equipped with an Nvidia GPU last longer in the world of gaming than the competitor's.
I'm certainly not a fanboy!!! In the past I bought a new VGA card every four months. I have purchased as many ATI as Nvidia graphics cards: from Nvidia, a TNT2, GeForce2 GTI, GF3 Ti 500, GF4 Ti xxx, GeForce 5700 ("bad one"), GF 6600GT, GF 8800GT SLI, GTX 260.
My AMD collection: Mach64, 3D Rage II, Rage 128, Radeon 7500, Radeon 9600 XT, Radeon 9800 XT, Radeon X800XL, Radeon 1900 XT, AMD 3850.
I usually sell my used graphics cards to friends or use them in my second LAN PC, and every time I realize that, through proper driver support from Nvidia, my old PCs are able to play new games.
My old 8800GT SLI setup ran Crysis Warhead smoothly at 1080p; 6 months later I tested the game again with new drivers and had a performance boost of more than 35%.
Something I never had with ATI!
This is the reason I've opted for a GTX 485M in my new gaming laptop. Maybe both are evenly matched at this time, but I'm sure this will change over time - "history repeats itself", believe it or not.
My GTX 485M test runs on YouTube: http://www.youtube.com/user/Bevilos83 -
mountainlifter_k Notebook Consultant
Waiting on more videos from you on your channel. -
-
Nvidia & AMD both "optimize" (cheat) in their drivers,
but Nvidia mostly does it in AA/AF, which you can't easily spot, while AMD cheats in the actual graphics in a big way........ I prefer Nvidia.
Same system/game settings but different card/driver:
GTS 450
HD 5770 - see the missing grass
GTS 450
HD 5770 - see the missing shadow
-
chewietobbacca Notebook Evangelist
AMD has had problems with drivers, but it hasn't been the one flat-out killing cards in recent times - and the numbers speak for themselves. AMD went from being at 20-25% market share in 2007 to now, where it holds nearly 40-50% of the GPU market. Nvidia hasn't had such a great track record in recent years
Think about desktop cards. Even the top flagship cards don't cost nearly as big a premium for 5% more performance (at least, not since the 8800 Ultra of 2007).
I wouldn't call the 6970M the budget choice, I'd call it the smart bang-for-your-buck choice -
mountainlifter_k Notebook Consultant
I cannot simply take your word on those numbers. Cite your source, sir.
I'll even say you can drop mentioning that 5% performance improvement. All benches coming out now say that the 6970 has the lead, ever so slightly.
But will PhysX drop frames as badly as 3D does? I have so far not seen anyone post a Mafia II video running on the 485M. We cannot presume frame rates will drop by half. I won't try to guess by how much they will drop, because I like to work with facts.
I hope these are the only two other features that you are talking about. And I hope me mentioning "PhysX" doesn't get you angry again. Let's take it easy, eh!
-
mountainlifter_k Notebook Consultant
-
-
Some more Google-translated comments, believe it or not:
Even more exaggerated is World of Warcraft: run the game for a while and you can clearly see the 5770's picture go slightly slower from time to time - there is significant lag, but the FPS counter doesn't drop. My colleague "suspects" that when the FPS drops, AMD's Catalyst AI simply repeats (copy-pastes) frames, so the frame count looks fine but the picture lags. Think about it: replace three out of every 30 frames with copies and you can "increase performance" by 10%. Such claims could also be made by users in other situations, so I cannot prove it as fact. I can only say that AMD graphics obviously lag while Nvidia's do not, and this matches recent feedback from a lot of internet cafes on the mainland: not long after the AMD machines are switched on, users ask for an Nvidia machine instead, which may be related to the graphics.
ATI has done this before, but the recent change is rather exaggerated: since driver 10.2 users cannot even turn off Catalyst AI, so no matter what you run, it will "optimize" for you. As for what gets stolen (cut), sorry, that is not something you can decide - maybe worse textures, maybe fewer details................. -
Wow, 15 pages! I have some questions, and I'm sorry if they have already been asked.
1.) Upgradeable! Can the 485M be upgraded to the next series of GPU chips? As far as I know, this is unknown territory so far, but if so, then this alone might be worth the $250 because of the future-proofing. If you go ATI/AMD, you can't come back to Nvidia and vice versa, so keep that in mind.
2.) Folding! If you are a folder (F@H for example), then the 485M is your ticket to more PPD.
3.) Thermals! How hot does one chip get compared to the other? Hotter chips might not be so great for your notebook's internal ambient temps. Do we have max temps on the 6970 yet?
4.) Overclocking ability! We know the 485M will overclock to 660 MHz on most units, but what about the 6970? How flexible is it when pushed to the limits? -
chewietobbacca Notebook Evangelist
3) AnandTech's reviews say it's fine, there haven't been reports of it doing badly in the Alienwares either, and no one has reported it doing worse than the 485M. And if you are planning on folding, then the 485M is far more likely to tax your notebook's cooling than the 6970M
4) People have clocked it high; check out the Alienware M17x R3 forums
Again, the fact of the matter is, the 485M's bonus features are the only difference between it and the 6970M. If you do not fold/CUDA/PhysX, the 485M's $250 price tag is hefty for a < 5% difference in performance. -
{quote} Says who? Because who the hell made this one up? You can upgrade from ATI/AMD to Nvidia and vice versa with the 8150/8170. There is no restriction here{quote}
1.) O.K., my bad, but... chewietobbacca... can you please back that statement up, though? This is interesting in itself if true. Are you saying the socket on the motherboard of the 8170 can take either chip?
2 + 3.) Yes, I have a desktop with SLI and I do fold often. I'm not worried if my GPU is sitting around 70°C, but at 80°C I get concerned. If I planned on folding on my new toy, I'd definitely have a notebook cooler to keep internal ambient temps from wreaking havoc on my HDD/SSD & memory.
4.) I noticed that after I had already posted the above; quite impressive. Seems the 6970 can overclock quite well, but again, the temps start to get out of hand past 700 MHz. Now I'm on the fence! Dammit! That $200 savings is quite tempting, to say the least. What I really want is something that will play current modern games and last me longer than the damn X700 I purchased back in '05. I have had Nvidia in all my desktops and ATI in all my notebooks, and neither has given me grief outside of gaming.
One last thing that people forget about with all these benchmarks comparing the two chips: it's not which one gives us the highest peak frames in games, but which one gives us the best "average FPS" that is the key selling point. I would love to see detailed graphs for all these games showing min, max, & average; that's the proper way to really see who is king of the hill.
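Something like the following is all it takes, by the way - a rough C++ sketch of my own (not from any review site) that turns a log of per-frame times in milliseconds (the kind of data FRAPS can record) into average, min, and max FPS:

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical per-frame times in milliseconds captured during a benchmark run.
    std::vector<double> frameMs = {16.7, 18.2, 15.9, 33.4, 17.1, 16.5, 25.0, 16.9};

    double total = std::accumulate(frameMs.begin(), frameMs.end(), 0.0);
    double worst = *std::max_element(frameMs.begin(), frameMs.end()); // longest frame = min FPS
    double best  = *std::min_element(frameMs.begin(), frameMs.end()); // shortest frame = max FPS

    // Average FPS is frames divided by total time, not the average of per-frame FPS values.
    std::printf("avg %.1f fps, min %.1f fps, max %.1f fps\n",
                1000.0 * frameMs.size() / total, 1000.0 / worst, 1000.0 / best);
    return 0;
} -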