From what I've heard, the Radeon 6000 series won't come until 2011, so the 5870 seems like a valid reference for Nvidia to use against the GTX 480.
-
usapatriot Notebook Nobel Laureate
To be honest, I would expect ATI 6xxx late 2010. Fermi, on the other hand, seems like it will stick around for a long while. After costing Nvidia huge amounts of cash, they will probably just focus on die shrinks and faster core and memory speeds rather than a new architecture.
-
ATI is planning a refresh of the desktop HD5870 this summer, code-named "Trillian". Basically, it looks like ATI will give the HD5870 the same treatment Nvidia gave the GTX 285M.
-
With the disaster that Ion and Tegra were, Nvidia is refocusing on their core business of GPU manufacturing. With that in mind I can see the GPU wars heating up again, and that can only be a good thing for the consumer.
The funny thing is desktop GPUs are already so overpowered, and I can't even think of any game coming out this year that would tax these cards one bit, thanks to the console effect on PC gaming.
Anyway, I think Nvidia could still surprise us with Fermi's capabilities yet, and hopefully they offer it at a sane price.
Nvidia is also showing greater interest in the mobile market, and although it's near impossible for a mobile Fermi to come out this year, we may see some mobile GPUs based on the desktop GTX 260 cores within Q3 of 2010.
ATI, however, has been giving excellent value for money to customers, and although I've been an Nvidia user all along, I wouldn't mind the switch if they offered something really interesting this year that would be compatible with my GPU slot. -
The problem with this is that the GTX480 will cost almost twice as much as the HD5870. Not sure how that is competition. That's like Lexus saying their competition is a Honda (the comparison being about price, of course).
I have no doubt ATi will release X2 versions of the HD5870, the HD5900s; those will be the competition for the GTX480s, and will likely still beat the GTX480 at a lower price. That's my prediction. Two dedicated tessellation units vs Nvidia shoving tessellation through its shader units; yeah, I think ATi wins here. -
I don't think it could get worse for nVidia than the Geforce FX fiasco, but I'm eager to see how this turns out.
-
I thought the HD 5970 was already the x2 of the 5870?
-
I think he's referring to the two GPU on one card solutions they had for the 4xxx gen.
-
Right. So the HD5970 does have two units. That would be closer to the GTX480 at an MSRP of $599. So still cheaper and probably about on par in performance. We have to wait and see.
-
usapatriot Notebook Nobel Laureate
Well, rumor has it that the GTX 480 is just a touch better than the 5870, in which case the 5970 would absolutely blow away the GTX 480. I wouldn't expect to see a Fermi dual-chip card for a long while, until they find a way to cool those things down. -
Nvidia officially shows GTX 480 score vs 5870
The dual-GPU HD5970 will still be the fastest single-card solution. ATI also has all the potential to reduce the price on the HD5970 and come in cheaper than Fermi. -
usapatriot Notebook Nobel Laureate
That graph is just propaganda. Just like ATI's graphs from a while back.
-
In my opinion, if the Fermi 480 absolutely and utterly destroyed the 5870, there is no way in HELL Nvidia could keep quiet about it. The only reason they are not releasing any information or benchmarks is because their cards are not performing as they had hoped. They could still be outperforming the 5870, but I think Nvidia set out to embarrass ATI with this card and touted it as such.
-
Yes. The wait has just been too long; there's gotta be something wrong. There have been plenty of rumors about it, yes, and I think some of them are true.
To someone on the first page - The MX400/MX440 actually was a great card. I would call it the best budget card ever made. The FX5200 on the other hand was junk. It was advertised as the first DirectX 9 capable card. True? Yes. Did it run DX9 games at any playable rate? No. -
I agree, and this fits with the lack of info presented at CeBIT. Showing a meager performance gain during 'parts' of one benchmark isn't doing much to promote the card. IMO, failing to deliver crucial gaming performance comparisons against the 5870 less than a month before launch is not a good sign.
On the other hand, CeBIT ends tomorrow, so we can only hope that nVidia has surprises left in the bag... else my expectations for Fermi will dim quickly -
I remember reading an article about the FX5200 that said "what is the point of touting DX9 capability if it runs the games like a powerpoint presentation?"
-
Agreed!
I saw the video on NVIDIA's official Facebook page yesterday... obviously made to make the GTX480 look better than the 5870... -
Very nice video, I enjoyed it thoroughly. As for the Heaven benchmark, I ran it with my 5870 a while back and didn't like the FPS dips very much. But then again I don't care much about the difference between DX10.1 and DX11 right now; I sure will when it's implemented in all games, but that isn't the case yet.
So, so far, I love my 5870 (soon to be twins), from the company that came out with most of what was shown here way, WAAAYY before Nvidia did. -
The problem I heard was that the 480s were coming out of the pre-production line with instant failure rates of 90%.
-
That just sounds like a pathetic rumor. If they did that, it would instantly kill their billion-dollar company. That is just not possible, even coming from Nvidia.
Source please? -
I'm hoping like hell that the week after Fermi launches, ATi launches their next wave of GPUs, showing 10%-20% stronger performance than Fermi at competitive pricing.
I'm only depressed that Nvidia commands so much attention in the laptop market. It's frustrating. -
Do any of us doubt that when DX11 games become more prevalent, ATi will do rigorous driver optimizations to make DX11 features run better and faster? If I were ATi, I'd also wait to see how developers implement DX11 before spending time and effort optimizing for DX11 hardware.
Don't forget, ATi also has a dedicated tessellation core and over 2 teraflops of computational power from their shader cores!
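(For anyone wondering where that "over 2 teraflops" comes from, here is the back-of-the-envelope math as a quick Python sketch, using the HD5870's published specs: 1600 stream processors at an 850 MHz engine clock, with a multiply-add counted as 2 floating-point ops per ALU per clock. It's a theoretical peak, not measured game throughput.)

    # Theoretical peak single-precision throughput of a desktop HD 5870
    stream_processors = 1600      # published shader (ALU) count
    engine_clock_ghz = 0.85       # 850 MHz engine clock
    flops_per_alu_per_clock = 2   # one multiply-add counts as 2 floating-point ops

    peak_gflops = stream_processors * engine_clock_ghz * flops_per_alu_per_clock
    print(f"Peak: {peak_gflops:.0f} GFLOPS (~{peak_gflops / 1000:.2f} TFLOPS)")
    # prints: Peak: 2720 GFLOPS (~2.72 TFLOPS), i.e. the "over 2 teraflops" above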
Nvidia, no doubt, optimized their drivers for the Heaven benchmark like they always do.
The initial drivers ATi provided Asus were good. But just with the 10.3 preview, the G73 saw 10% gains in benchmarks and all games! I have a feeling that for you desktop HD5870 owners there is untapped power still waiting. -
Fermi is 5-10% faster than ATI HD5800.
http://vr-zone.com/articles/nvidia-geforce-gtx-480-final-specs--pricing-revealed/8635.html -
That's pretty sad... an unreleased next-gen card is just barely faster than what ATI is shipping now. And dissipates more heat than two ATI chips. Sad.
-
To give credit, Semi-Accurate lived up to their domain name. While not 100% spot on, their claims regarding Fermi now look to have been well founded.
Releasing a cut-down version of the Fermi that was planned does make it look like Nvidia had problems manufacturing their GPU. -
Semi-Accurate is now claiming that in order to meet the deadline, Nvidia is just throwing out the lowest tier of the GF100, and we won't be seeing the big guns until next year.
Which doesn't matter to me until game developers actually focus on the PC anyhow; the whole porting from console to PC makes all of this a moot point. -
Until the card comes out and reputable reviewers get it, it is all questionable material.
-
masterchef341 The guy from The Notebook
I doubt it, yes. Drivers are great, but it's going to take new hardware to get real performance changes. Nvidia and ATI both make sure their drivers perform well on synthetic benchmarks, there is no getting around that. That is why intelligent people benchmark a variety of actual prevalent games to get an idea about performance. -
I wanna see Guru3D's take on these cards, darn it.
Anyway, if it was really only 5-10%, then that'd be an epic fail for Nvidia. The 5800 series (mainly the 5850, 5870 and the dual-GPU 5970) overclock quite a lot pretty easily.
E.g. the 5970 was only set to lower clocks so its power consumption wouldn't go over 300 W. I'd clock that baby to hell (in a positive [?] sense); it's not like a few extra watts would make someone go bankrupt.
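To put rough numbers on that 300 W point, here is a quick Python sketch using published figures (a single HD5870 is rated around 188 W at 850 MHz; the dual-GPU HD5970 ships at 725 MHz with a 294 W rating) and the usual first-order assumption that dynamic power scales with frequency times voltage squared. The voltage factor below is a hypothetical guess, not AMD's actual number.

    # Crude estimate of why the dual-GPU card ships downclocked to stay under 300 W
    hd5870_board_power_w = 188   # published board power of a single HD 5870 at 850 MHz
    hd5870_clock_mhz = 850
    hd5970_clock_mhz = 725       # shipping engine clock of the dual-GPU HD 5970
    voltage_scale = 0.95         # hypothetical modest voltage reduction at the lower clock

    per_gpu_scaled_w = hd5870_board_power_w * (hd5970_clock_mhz / hd5870_clock_mhz) * voltage_scale ** 2
    print(f"Two GPUs at full HD5870 clocks: ~{2 * hd5870_board_power_w} W (well past the 300 W PCIe limit)")
    print(f"Two GPUs downclocked to 725 MHz: ~{2 * per_gpu_scaled_w:.0f} W (near the HD5970's 294 W rating)")
-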
5-10%; that is in the territory where a focus on driver optimization will determine who performs better.
I wonder if they used the new 10.3 drivers, which have yielded huge gains for some, in their comparisons... -
Well, it'd actually be great if the cards had fairly similar performance hardware-wise.
Maybe we'd finally see more optimization in drivers from both ATI and Nvidia (constant competition).
Although ATI seemingly is already prepared to kick a**; the two 10.3 previews have been stunning.. -
Well, I totally agree with you, Ross... this all should be taken with a pinch of salt... I still doubt Fermi is faster...
-
TBH I think the biggest performance gains still reside with the developers. Luckily there are still BioWare and Blizzard, and other gems like Batman: Arkham Asylum. For online play, I admit DICE's BC2 runs pretty well, considering.
-
Agreed, I could run Batman and ME2 on mostly high settings with only 16 puny SPs. Great work by Epic. (check this out, very impressive)
-
I'm hoping the id Tech 5 engine will be like the next Unreal Engine 3.
-
Trillian (cherry-picked and/or tweaked HD5870s) could also potentially cover a 5-10% performance gap.
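To put a number on that: assuming, optimistically, that performance scales close to linearly with engine clock, covering a 5-10% gap from the stock 850 MHz only takes a modest bump, well within what retail HD5870s already manage. A rough Python sketch:

    # Engine clock needed to cover a 5-10% gap, assuming near-linear scaling with clock
    stock_clock_mhz = 850  # reference HD 5870 engine clock
    for gap in (0.05, 0.10):
        print(f"{gap:.0%} faster -> roughly {stock_clock_mhz * (1 + gap):.1f} MHz")
    # prints roughly 892.5 MHz and 935.0 MHz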
-
Yeah, it's not like they would collude to fix prices or anything, amirite?