Hi, everyone. Newbie here, and I'm a bit stumped. I'm planning on buying a new Sager NP8150, and I need help choosing the video card. There are four options: GTX 560M, GTX 485M, GTX 580M, and HD 6990M. The 560M is too weak for me, and the 580M is less powerful than the HD 6990M and GTX 485M yet $250 more expensive (what's up with that???). Anyway, I've narrowed it down to the GTX 485M and the HD 6990M. They both cost $245, and their benchmarks are very similar (http://www.videocardbenchmark.net/high_end_gpus.html). They both have 2 GB of GDDR5. So I was thinking I should probably get the HD 6990M because you get more bang for the buck, but, well, NVIDIA is, you know, NVIDIA. They're the industry standard, right? They also have PhysX and all that stuff, and I'm familiar with the NVIDIA control panel. So which one?
P.S. How does a program that uses PhysX work on ATI? What's the ATI equivalent of PhysX? Maybe I'm asking something very stupid, but I don't really understand what PhysX is or how it works, so if someone could clarify I would be grateful.
-
Oops, the link I gave appears to be broken. Try this and click on "High End Video Card Chart." PassMark Software - Video Card Benchmark Charts
-
Meaker@Sager Company Representative
PhysX does not run on AMD hardware; the game will use CPU-based physics instead.
The 580M is more powerful than the 485M (it's a new chip revision with higher clocks).
Basically, do you want the extra performance of the 6990M, or features like PhysX on the 485M?
As for who sets the industry standard, that's mostly Microsoft and DirectX 11, and both Nvidia and AMD follow it. -
Wait, I'm confused. First of all, do I want PhysX? I don't know. What does it do, and how will it help me? Will it make my games faster? Like I said, I don't really understand what PhysX is or how it works. Is it bad that the game will use CPU-based physics? And how can Microsoft be the industry standard if it doesn't make video cards? I mean, Microsoft isn't a video card company. And, no, the 580M isn't more powerful than the 485M. Look: PassMark Software - Video Card Benchmark Charts. Click on High End Video Card Chart. The 580M has a benchmark of 2,110 and the 485M has 2,132. I know it's not a noticeable difference, but for some reason the 580M is $250 more???
-
The 6990M is not more powerful than the 580M; they are pretty much equal in most respects. The 580M seems to have more overclocking headroom and lower temps, but considering the price, the 6990M is the better deal.
PhysX affects a few games, Batman: Arkham Asylum for one; you can Google PhysX and find out which games support it. It is hardly a deal breaker, since most games do NOT use it.
Passmark is NOT a good GPU comparison. 3DMark Vantage and 3DMark 11 are much better, but really, gaming benchmarks are the best. Most people on this forum use this website for quick graphics and CPU references: Notebookcheck.com graphics comparison. I have been reading on this extensively, and the 6990M and 580M are too close to get worried about. The 485M and 580M are the same GPU, but the 580M is faster; the Passmark scores you are looking at are an anomaly. Is the 580M worth $250 more than the 485M? It is if you have a 3D monitor or want to game in 3D, but otherwise I would say no.
I hope this helps. -
Meaker@Sager Company Representative
Let me correct certain parts for you:
-
I have been researching what card the 485M is based on for a while now, and I have concluded that it must be the GTX 560 Ti, since the 485M, the 580M, and the 560 Ti all have 384 cores and 256-bit memory. There are two differences between the 580M and 485M: the voltage is 1.00 V on the 485M versus 0.87 V on the 580M, and the clocks are 575 MHz on the 485M versus 620 MHz on the 580M. So basically the 580M is a more refined and tuned 485M that is mildly overclocked, and when I say mildly I mean just under 8% (620 MHz vs. 575 MHz is a 45 MHz bump, and 45/575 ≈ 7.8%), though I have seen 580Ms run at clocks faster than those of the 560 Ti.
Oh, and the 485M will do 3D, so I'm not sure where you heard that it would not. -
I would get the 6990 for sure
-
If this is purely a debate between the 485M and the 6990M, then it is easy: 6990M, unless you want 3D.
@aduy, you are right, the 485M does do 3D if your laptop has the right display. I would not expect the 580M to perform as well as a 560 Ti, though; only a seriously overclocked 580M could do that, and from what I've seen on the forum, those clocks are not stable for gaming, just benchmarking.
@Meaker, I see what you are saying, but I think we are both right. I was basically quoting this...
Regarding the 580m
" GF114 architecture
The GF114 core is a power optimized version of the GF104 (used in the GTX 485M) but with no architectural changes. Therefore, the performance per MHz stays the same, but Nvidia is able to clock the shaders higher while still remaining in the same power envelope. The GTX 580M offers all 384 shader cores in the GF114. More informations on the similar GF104 architecture can be found on the GeForce GTX 485M page."
So in my view we are both right: it is the same GPU architecture (which is what I said), but power optimized and faster (which is what you said). -
Thanks for the quick responses, everyone. Wait a minute, how is Passmark inaccurate? Their results are virtually the same as the Notebook Review benchmarks: the 485M is just a tiny bit less powerful than the HD 6990M, same as Passmark shows. Anyway, I think I'll be going with the 485M, because at some point I may want to game in 3D. How come ATI graphics cards never support 3D? Also, I see everybody's talking about overclocking. Just want to make it clear that I will absolutely NOT do any overclocking, because I believe it's too dangerous; I'm afraid to mess with a $2000 machine. So, I see people throwing a lot of information at me, but there doesn't seem to be a general consensus about which one I should get. I see most people leaning towards the 485M because of 3D. Since I really won't see a difference in performance between the 6990M and the 485M, I should get the NVIDIA one, right? The reason I was leaning towards the 6990M in my first post is that I forgot about NVIDIA's 3D. So, if anyone would make a list of the pros and cons of each I would be grateful. Thanks everyone, great forum!!
-
Go for the 485M and get 24 FPS; go for the 6990M and get 28 FPS. You decide.
-
So, basically, what's happening here is exactly what I was afraid of: everybody is throwing a bunch of information at me that I don't understand. Please, at the end of your message, say which one you conclude is better. When you start talking about shader clocks and that kind of stuff, I don't understand what is supposed to be better and what is supposed to be worse. How about you all just list the pros and cons of each? That would clear up a lot for me. Thanks everyone.
-
Karamazovmm Overthinking? Always!
And what stops you from learning something?
-
Look, both cards are really powerful. The HD 6990M might have a slight advantage overall compared to the 485M, but they perform on a similar level, so there is no huge difference at all.
Both will give you great performance, so the decision boils down to features. While both have 3D support, the NVIDIA solution is a bit more mature. The real differences are PhysX and CUDA. If you use professional apps, CUDA speeds a few of them up. As for PhysX, not a lot of games support it; it is basically an extra feature that uses NVIDIA's proprietary physics calculations (they can be done on the CPU instead, but that path isn't really optimized, so it runs like crap... damn you, NVIDIA).
Some games like Batman: Arkham Asylum and Mirror's Edge can use PhysX to enhance the experience a bit, though the feature tends to come with a performance hit regardless.
It is advisable to take your time and, little by little, get to know what you are buying and build up some general knowledge. Otherwise you are going to have headaches when small things like a driver issue prevent you from enjoying a specific game, etc.
I would take the HD 6990M personally, but if the price of the 485M were lower, I could easily consider it, as it is a great card too. -
Crysis.
-
Mechanized Menace Lost in the MYST
Get the 6990M; the differences are minimal and you get the best bang for your buck with it, since you obviously don't need CUDA and don't know what PhysX is. It also performs better in Dirt, Dirt 2, and Dirt 3. Unless you want PhysX, in which case get the 485M.
Here is what PhysX will get you:
Batman Arkham Asylum - Physx and No Physx Comparison [HD] - YouTube
http://www.youtube.com/watch?v=j-44t4M_RSg&feature=related
This video is just awesome:
http://www.youtube.com/watch?v=_JHVxOgUyDo&feature=related -
Why are people still debating which is better? Obviously the 6990M is faster, so quit saying they are on par.
-
IMHO, take the 6990M over the 485M unless you need CUDA. PhysX on the 485M is not worth the performance loss, and 3D is not very feasible on a single mobile GPU, even the high-end ones.
The 580M is worth it only if budget is of absolutely no concern to you. -
Right now the 6990M and 580M are complete overkill, even for gamers.
-
Meaker@Sager Company Representative
Really? Let's see how many games fail to hit a 60 FPS average at 1920x1080 with the highest details, just off the top of my head.
Crysis Warhead
Crysis 2
Battleforge
Metro 2033
Dirt 2
Dirt 3
Mass Effect 2
Also, we are ignoring built-in 120 Hz displays and external ones.
So tell me again how, even for gamers, they are complete overkill? -
Well, they cannot play every game in ultra mode at 1080p, but they can handle almost everything except The Witcher 2 and Metro.
AMD Radeon HD 6990M - Notebookcheck.net Tech
You don't need 60 FPS to play a game. And you can take Dirt 2 and 3 and Crysis off your list.
For every game other than a select few, the GPU is complete overkill and produces too much heat and uses too much power. -
Karamazovmm Overthinking? Always!
What about 30 FPS?
Try to get Shogun 2: Total War to 30 FPS in an 8,000-man battle; on ultra it won't be possible, though it might be on high at 1080p. -
Meaker@Sager Company Representative
So in your OPINION the 6990M is overkill, and maybe for you it is. But for those with 120 Hz screens, or who like to have 60-120 FPS as often as possible, the high-end cards (or more performance) are needed.
Oh, and recheck your Dirt 3 numbers: they show 40 FPS at the highest settings, along with Crysis 2 at 34 FPS, Warhead at 42.7, and the original Crysis at 30 FPS.
Seriously. -
The HD 6990M packs more power, and if you don't know what CUDA is, chances are you won't be using it at all.
Get the HD 6990M! The only reason to get a 485M would be if its price were lower than the HD card's; otherwise get the Radeon. -
masterchef341 The guy from The Notebook
ATI also has its own proprietary GPU acceleration setup. CUDA had a head start in this whole thing, so there are a few notable apps, like Photoshop, that are currently accelerated on Nvidia GPUs using CUDA. Eventually OpenCL will likely take over.
Thanks, all. How about that Supersonic Sled thing by NVIDIA, though? Sounds like hours of fun to me (I know, I'm weird) - would it run just as well on the ATI card?
This is the Supersonic Sled: NVIDIA GTX 470 DX11 Supersonic Sled demo - YouTube
I heard that it uses NVIDIA stuff like CUDA and PhysX, since it was developed by NVIDIA. Would it run smoothly on the HD 6990M? NVIDIA recommends it for GTX 400 series or higher video cards.
P.S. I also read that NVIDIA puts fewer transistors in its video cards and then overclocks them, which means a less stable card that gets very hot, while ATI packs its cards with transistors and underclocks them, resulting in very stable and cool operation. Could anyone confirm whether this is true or not? Also, would it be possible, by any chance, to somehow tweak PhysX to run on ATI? Is this a software restriction by NVIDIA, or are ATI cards physically unable to run PhysX? -
Meaker@Sager Company Representative
Nvidia tech demos are locked to their cards.
No, it's not true. They are different designs, but both are equally valid and stable. -
And it just makes sense: OpenCL isn't just cross-platform across operating systems, it's cross-platform across devices as well. -
So I'm leaning more towards NVIDIA now, since I know that all of NVIDIA's awesome demos won't run on ATI. Anybody know if the following is true or not:
-
While the 6990M and 580M are fine GPUs and can definitely play most games at reasonable frame rates with a few sacrifices, you still aren't guaranteed a minimum above 30 FPS in every game. Never mind maximum, never mind average... the time you will care is when you dip below 30 FPS.
The 6990M is an underclocked desktop HD 6870.
The 580M is an underclocked desktop GTX 560 Ti.
Neither of those desktop cards is truly overkill for the latest games, even just at 1080p.
And that's when paired with an overclocked 4 GHz+ i5 or i7 desktop CPU as well.
(The mobile CPUs are good, but aren't quite on the level of a 4 GHz i5/i7.)
Gaming laptops have come a long way... but even the best are not quite "overkill" yet. -
I don't know how many people here are hardcore 3D animators or extreme video editors, but the seconds of improvement that CUDA gives doesn't justify it... considering the main focus of the card is gaming (seeing as this is posted in the gaming subforum...).
PhysX causes a large performance hit in games... which may be bearable in single-player games (but still really annoying; a ton of Mirror's Edge gameplay vids with PhysX on are horrendously laggy) but is just not acceptable for multiplayer games (can't really think of many multiplayer games that support PhysX anyway).
Ditch the favoritism that you may or may not have, and look for raw performance instead of superficial justifications. -
I am starting to think "Baloney" (hmmm...) is just stoking us all. Anyone else here suspicious?
-
Look, everyone, I'm a bit confused. You're all saying that I may get lag in video games, but here's my personal experience: my old laptop had a GT 130M. It is considered a mediocre mid-range video card, certainly not one for games at max settings. But here's the story: I could play Flight Simulator X at high-to-mostly-max settings, in 1080p, at maybe 15-25 FPS. That's very good. Now, how could a video card with a benchmark ALMOST 6 TIMES HIGHER possibly have any lag with the games I play?? A card with a score 6 TIMES LOWER in my old laptop got me about 25 FPS in FSX on mostly max settings in full HD. Since most car and simulation games aren't THAT demanding (not as demanding as war games, which I don't play), I don't think there should be a huge problem here (correct me if I'm wrong). I think it should all come down to features. Now, since NVIDIA's AWESOME Supersonic Sled demo uses PhysX, I think I should probably take the 485M. There is also another option: I could trade my 2860QM for a 2760QM, bring the RAM down from 12 GB at 1600 MHz to 8 GB at 1333 MHz, and get the 580M. I think this is a bad idea, though, right? Too much sacrifice. I don't really need that much RAM anyway, but the processor difference is big.
-
So, how about you read my post and stop dragging this discussion on. PhysX is not a justification to get the 485M. CUDA is useless for gaming. Unless you want to buy a card SPECIFICALLY to run that "AWESOME" sled demo (what?)... yeah.
For the record, Batman: Arkham Asylum is considered one of the most impressive PhysX games... and all it gains is a couple of banners, some smoke, and a floor that crumbles (trading performance for useless game features).
Basically, you sound like you've already made up your mind, and it's useless to ask questions if you're already set on blindly going for NVIDIA.
Oh yeah, and on your build... the 2760QM is more than enough for gaming. 12 GB of RAM does not make sense at all... it doesn't even run in proper dual channel because of the extra DIMM.
-
Meaker@Sager Company Representative
Well, the RAM does run dual channel with a 2x2 GB + 2x4 GB setup, but yes, stop posting and read the advice already given.
-
I guess this discussion is over then. Thanks, everyone; my final decision is to get the 580M and downgrade my processor and RAM.
-
Can we close this topic? It's going to make me develop a headache, hardcore.
-
Meaker@Sager Company Representative
I'd like to point out that, for gaming, he made the right choice in the end.
GTX 485M vs HD6990M?
Discussion in 'Gaming (Software and Graphics Cards)' started by Baloney, Sep 24, 2011.