Nah, looks like my socket got pulled out, putting in an RMA for a refix!
-
Lol. Well, my point was that I'd much rather pay $300 for a 6990m and get 23 fps in Metro 2033 than pay $300 for a 485m and get 16-17 fps.
I don't play Metro 2033... but it's good to use terribly designed code as an example.
-
You mean paying $200 for a 6990m compared to paying $300 for a 485m?
Do note you're also paying for an extra month's wait, btw. -
The 6990m is $300 more than the 6870m on the Alienware website. I have no idea about elsewhere.
-
Well, we don't know that... I think it's a pretty solid rumor that Nvidia isn't in any of the three next-generation consoles, however. AMD is in the Wii U and is probably in the next Xbox. Sony has been quite mum, but who else would they have gone with?
Anyway, OpenCL can't do physics? That is laughable. OpenCL is simply a compute library for GPGPUs (and CPUs too, actually). You can do just about any math via OpenCL, though for a lot of cases that is a bad idea. Where OpenCL and GPGPU work exceedingly well is on embarrassingly parallel, independent problems like the Fast Fourier Transform, which are much faster when conducted in a parallel manner. A lot of physical simulation, much like graphics work, is a pretty good match for such an approach, though not a perfect one (see the sketch below).

The fact of the matter is that there is nothing physically wrong with AMD's graphics cards that prevents them from running PhysX-like simulations in real time. In fact, there are multiple open source solutions out there, like Bullet, that use OpenCL or DirectCompute to do physics simulations on compliant GPUs. Heck, CPUs have been able to do real-time PhysX-like effects for some time as well.

Nvidia is doing what any company would do, and that is differentiate their products from the competition. It doesn't mean they've surrendered in the performance race, just that it is a secondary consideration compared to tying customers to your ecosystem. And that is what I find appalling from the position of one who wishes to maintain the ability to choose.
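To make that concrete, here is a minimal sketch of the kind of embarrassingly parallel physics step I mean, written in plain OpenCL C (the kernel and its parameter names are just illustrative, not taken from Bullet or PhysX). Each particle updates independently, so it runs on any OpenCL-capable GPU, AMD or Nvidia alike:

    // Illustrative OpenCL C kernel: one work-item integrates one particle.
    // No particle depends on another, so the problem is embarrassingly
    // parallel, which is exactly the shape GPGPU handles well.
    __kernel void integrate_particles(__global float4 *pos,
                                      __global float4 *vel,
                                      const float4 gravity,
                                      const float dt)
    {
        int i = get_global_id(0);   // this work-item's particle index
        vel[i] += gravity * dt;     // accumulate acceleration
        pos[i] += vel[i] * dt;      // explicit Euler position update
    }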
As for the 3D effects... I found the 3DS to be very painful to my eyes. If other methods produce results at all similar to this, I'm bound to never enjoy them.
As for the 6990M vs. the 485M I always err on the side of price and performance considerations. If they're the same price? It may be a tough choice for you. But if in typical manner the AMD product is cheaper, definitely go for it. Laptops are generally GPU bound. -
The answer to your question is 42.
-
Wow, I have found the ultimate question to the ultimate answer!!! What does an AMD fanboi say when he has no evidence to back up his false claims? 42, obviously. Excellent, move the question "What is the meaning of life, the universe and everything?" over to the solved column. AMD fanbois, at least they are good for a laugh.
P.S. it was the waffling between the "it's a done deal" and "Sony is not 100%" that gave away the false claims. -
Seriously? Videocard fanboy accusations? Who cares?
-
42 as in post #42. Those are the links to the sources you asked for.
I figured you'd miss that meaning though... just like how you missed the source links I had already posted when Mr MM requested them. Had you caught those links originally, you wouldn't have had to dig up that ancient link to a rumor Sony denied almost two and a half years ago. -
Karamazovmm Overthinking? Always!
but the confusion was indeed amusing, so long and thanks for all the fish -
LOL this is kinda going off topic now.
I'm not a fan of either, but I much prefer Nvidia, mainly due to better driver support. -
The original Xbox = Nvidia. But I do agree, I don't see them using Nvidia anymore.
-
Oh, I get it: post #42, where you posted a rumor as fact. That explains the waffling later. Sorry for the confusion, I actually thought your link was a joke.
Hmmm, this shoots that down. I can see how it comes off as fanboi, especially in light of your "evidence":
Sounds like you have some hard evidence there, any more links or is that the only one you have? -
Does it really chap your hide that much that a stranger posted a rumor that next-gen consoles won't have Nvidia chips? Really, who cares?
-
He seems to just be a huge fanboi/troll ....
-
Can the 485M only use 1 external monitor, or can it use 2? I was hoping to use 2 external monitors with Eyefinity, but the laptop I might be getting doesn't have an AMD card option.
-
Getting it after the fact isn't really getting it. It's okay though, as long as I get a laugh then my jokes are still funny.
Call me a fanboy all you want...I never play at hiding it like some who want to pretend their bias is the only truly objective one. -
GapItLykAMaori Notebook Evangelist
Tbh, if you clock the 485 it will essentially have the same performance as the 6990m. The 6990m can be clocked as well, but when both cards are OC'd to their maximum I assume the difference between them lessens. Go for whatever is cheaper. Laptop cards are not like the desktop world; a card from either side can only be a small percentage faster than the other.
-
Yup, on laptops the difference is minimal, because Nvidia cannot put out a 385 W-eating monster.
-
That will depend on the overclocking flexibility of each model (485m vs 6990m) and on the variation between individual cards. In any case, given both performance levels, I agree that going for whichever is cheaper is the optimal option.
-
Sorry for reviving an old thread, but this question is relevant for me because I can get both for the same price. I guess the 6990 is stronger overall, but I could see myself using CUDA and/or PhysX. How much stronger is the 6990 than the 485?
-
10% for now. The margin will most likely increase after AMD releases updated drivers for the 6990M.
-
True, but the 485m is easy to overclock by at least 20%; on mine I have it OC'd to 27%.
-
Karamazovmm Overthinking? Always!
And the 6990m as well; 800-850 core clocks and 1000+ mem clocks are what people are getting with it.
Pretty close to the 6870's clocks (900 and 1050).
It's also the same with the 580m: people are getting OCs that reach 560 Ti clocks. That's not the norm, but the average is close to it. -
6990M any day. 3D is an FPS killer, and PhysX is a real gimmick; it looks nice, but not many games support it (an estimated 18?).
-
I think it's more like double that. But besides the few games that use it: does it add anything meaningful (i.e., an enhanced gameplay experience)? I think not, since PhysX is always an afterthought, no matter how much nVidia pays for it.
-
Karamazovmm Overthinking? Always!
I think he meant games that matter. I'm a fan of the Dance Dance games; that's why I always buy Nvidia hardware. PhysX FTW!
-
Getawayfrommelucas Notebook Evangelist
The PhysX rules in Batman: Arkham Asylum, and you can only assume it will rule even harder in Batman: Arkham City.
-
I don't think you should base your buying decision on whether your GPU has PhysX or not. It's not a gimmick, but it's not fully used either. If Nvidia put more time and effort into it, it could potentially make or break a buyer's purchasing decision, but as of now, it's not worth it.
It's still better to have than not to have, but it won't affect gameplay much, if at all. I think if you have seen PhysX, then you will certainly notice its absence on an AMD card, but otherwise you won't notice it. -
I do agree that whether the GPU has PhysX or not should not be the buying decision... yet you contradict yourself in your 2nd paragraph...
-
Well, PhysX is pretty awesome, but it isn't something that would determine a GPU buying choice.
-
Unless you plan on using CUDA for rendering images in professional programs like I would, go for ATI.
You get higher performance for the same money. -
The difference for commercial GFX cards is minimal when it comes to CAD rendering. Although I do not trust Notebookcheck 100%, the 6990M performs better than the GTX 485M in almost all tested programs.
-
You can use ATI Stream too, which also performs like CUDA...
-
Most professional programs just use OpenGL. It's not a big deal, AMD is fully supported by the majority of professional software and by ALL the major professional software.
CUDA never actually took off. It only seems that way among the uninformed, because they keep throwing the term around and it seems more glorious than it ever was.
All the major software companies, if they are going to utilize GPGPU, will be using OpenCL, not CUDA. And there is a reason why they use OpenGL instead of DirectX: OpenCL is maintained by the same group that maintains OpenGL. Developers care about cross-platform support and something that is easily portable. CUDA is none of that.
Even Apple's pro studio desktops use AMD now, not Nvidia, frankly because most professional 3D applications render on the CPU rather than the GPU, since the CPU is more flexible and produces higher-quality results.
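For what it's worth, the portability argument is easy to see in code. Here is a minimal, hypothetical C sketch using only standard Khronos OpenCL host calls; the same program lists whatever OpenCL platform is installed, AMD, Nvidia, or Intel, whereas CUDA host code only talks to Nvidia hardware:

    /* Minimal sketch: enumerate OpenCL platforms via the standard
       Khronos API. Build with something like: gcc list_cl.c -lOpenCL */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint count = 0;

        /* Ask the runtime which vendors' OpenCL implementations exist. */
        clGetPlatformIDs(8, platforms, &count);

        for (cl_uint i = 0; i < count; i++) {
            char name[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof(name), name, NULL);
            printf("OpenCL platform %u: %s\n", i, name);
        }
        return 0;
    }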