This is huge if true.
http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/
excerpt:
For those of you waiting for a Fermi part (and especially those of you holding out for a new nvidia mobile part that isn't a re-badge), it may be time to start looking at what Evergreen has to offer.
edit: Updated title to reflect that it's a rumor story.
-
-
I have yet to see a credible source for any of this stuff, and his own articles on the same site don't count. -
-
Yeah...worth talking about because Demerjian looks funny wearing a tinfoil hat.
In all seriousness, though, I don't put any stock in rumors like this. I'll wait until Fermi comes out and see how it is, and then and only then will I pass judgment on it. -
Until Nvidia releases it and it's officially "broken",
this is just another article. -
Oh Charlie again, the man who hates Nvidia.
-
Well, NV30 was an utter piece of crap, and knowing that NV had serious problems with the early production of Fermi, I wouldn't automatically dismiss this article... Wait and see, I guess.
-
Yeah, this dude's hate for Nvidia is impressive and well known. But his articles are fun to read; just don't believe it all.
-
My first laptop had an 8600M GT with 512MB DDR2.
That dreaded card disappointed me so much. The heat was... sigh.
I've had many horrible GeForce cards like the MX 400 and the GeForce FX 5200... low-budget crap cards that came with the computers I got. I have also had a few decent ones, but as a whole, I don't think highly of Nvidia.
But I have only had Nvidia cards in my entire life... Why?
I guess I am just one of those people who sticks to what they know.
It's intimidating seeing "Runs great on Nvidia" sequences at the startup of so many of my favorite games.
Having an ATI card would be interesting, but I don't know about their drivers, stability, control center and all that...
As a consumer you just want less trouble and more stability. If you buy an expensive laptop, you just can't get a new card. Nvidia got away with all those faulty mobile GPUs in the 8000 series!
And so many people got screwed over... so many laptop manufacturers got screwed over.
I saved for over a year to get that laptop I wanted. Six months later, the GPU scandal became apparent -
And the "DustBuster" is reborn once again!!
-
usapatriot Notebook Nobel Laureate
Actually, Charlie has been pretty damn accurate regarding "Fermi"...
-
His conclusions and opinions re: Nvidia may sound as apocalyptic as anything Nostradamus or the Mayans could come up with, but you've got to give it to the guy for knowing how to line up the facts to support his case.
-
I haven't had an ATI card before, but when I look back at Nvidia's bad-quality cards, I'm seriously considering one. I'm not familiar with ATI's driver issues, but they can't be that bad, can they? I mean... it would be kind of ridiculous if you couldn't game with your ATI graphics card.
Whatever Nvidia brings out, if it doesn't support DirectX 11 decently I'm not going to be interested (the 8600M GT supposedly supports DirectX 10, unless you try playing Assassin's Creed, which I can only play with DX9). -
Yeah don't know about the credibility of that article.
-
That said, Nvidia is not really any better than ATI. Their issues with the GeForce 8 cards are well known (although I have a laptop with an 8600M GT that is turning 2 years old this month and the card has not given me any problems so far). I'm hoping that Intel finally makes its way into the GPU arena, although they don't seem to be doing well thus far.
BTW, there is simply no way you will see the Fermi GPUs in a laptop in their current incarnation. Charlie usually exaggerates, but the fact that these cards devour a lot of power (200W+) is well known and not even the Clevo monstrosities can handle that. It will take at least one die shrink and one revision before this makes sense for mainstream notebooks. I suppose Nvidia could try doing a laptop version of the GT200, but I suspect they'll just re-badge G92 for the nth time. -
I might as well wait another few weeks for these things, and then decide. I'll pick up a 5870 (for my desktop) if they're overpriced.
-
-
I really don't understand Fermi. Why would someone want all these general shader units to do general processing? If one is willing to spend this much money on a GPU, I'd imagine they would have at least a Core 2 Quad, i5, or i7, which begs the question: why do we need the GPU to do general processing? Especially when the GPU is going to consume 250 watts or more and the i7 is far more powerful and power efficient? I just can't wrap my head around it.
-
I think the drama and speculation will finally end when it comes out. Either it's going to be a big leap or a big failure.
-
Something like the GT 240M is a good card, just 8 months too late. -
-
I thought so. I also used the FX 5200 and FX 5600 a lot, and they never let me down.
-
I loved my old FX5200, and loved it right up to the day I got my 6600GT.
-
I have never had problems with Nvidia desktop cards; my first Nvidia notebook card was the ill-fated Go 7600 GT. It died on me in January after 3.5 years, although it started to overheat last summer.
-
-
masterchef341 The guy from The Notebook
in a sense, a gtx 280 has 240 cores at 600 mhz each. obviously, it doesn't scale linearly, but the total compute power of the gpu is large given proper programming and an applicable task. -
Masterchef hit it on the head. Some tasks will take advantage of the huge number of cores, even if they are individually rather slow compared to a "normal" CPU. It's similar to the concept behind cloud computing, a la the @home projects.
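Roughly, in CUDA terms, the idea looks like this (just a sketch I'm making up to illustrate; the kernel name and sizes are arbitrary): every array element gets its own thread, so the chip's width does the work instead of clock speed.
[code]
#include <cuda_runtime.h>

// One thread per element: index = block offset + thread offset.
// Hundreds of slow cores each do a tiny piece of the work.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)              // guard threads launched past the end
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;              // ~1M floats, arbitrary size
    float *d;
    cudaMalloc((void **)&d, n * sizeof(float));
    // launch enough 256-thread blocks to cover every element
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();            // wait for the GPU to finish
    cudaFree(d);
    return 0;
}
[/code]
One slow core per element is fine when you have a couple hundred of them, which is the whole pitch behind GPU compute.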
-
I used a 7900 GT for 3 years and it didn't let me down until the graphics started to crash. It was overclocked for 3 years without a cooler, so it's understandable. This is my first ATI card (Mobility 5870) and it has not let me down. Performance is outstanding, and once the Catalyst 10.3 beta comes out I'm expecting it to run better than it does now. I like both of them, and I'm expecting a lot from Fermi. I agree, Fermi will be a failure or a great success. We'll find out eventually. A lot of people expected the Mobility 5870 to be about 20% faster than the GTX 260M, yet it's about 35% better. I hope this happens with Fermi also.
-
Like all news from Charles, I will wait and see.
-
-
To do something like that, an application has to be written to support CUDA. Needless to say, very few applications out there do that, and CUDA usage is mostly limited to professional/academic use.
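Just to show what "written to support CUDA" means in practice, here's a rough sketch (only illustrative, error handling trimmed): the application itself has to probe the CUDA runtime and keep a CPU fallback around, which is exactly the work most developers skip.
[code]
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    // a CUDA-enabled app has to ask the runtime explicitly
    // whether a capable GPU is even present...
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA device found: fall back to the CPU code path.\n");
        return 0;
    }
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    // ...and usually has to ship a plain CPU path as well, which is
    // part of why so few consumer applications bother
    printf("Using %s (%d multiprocessors)\n",
           prop.name, prop.multiProcessorCount);
    return 0;
}
[/code]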
-
masterchef341 The guy from The Notebook
just the same, it's those same professionals / researchers / scientists who need insane amounts of processing power, and i7's, and will probably welcome fermi for its gpu compute power.
if you are a gamer, then unless you are *really* particular you can get by on about $100 worth of processor and $100 worth of GPU. -
CPUs might be slower than GPUs in the professional/academic world, but GPUs have yet to be proven reliable. Nvidia and ATI both have yet to provide a product that can handle 100% usage 24/7. The failure rate for GPUs goes through the roof when they are put through that kind of stress. CPUs have been around much longer and have been proven in that regard.
-
masterchef341 The guy from The Notebook
link? -
Well, my XFX 8800 GTX was overclocked for over 3 years, and that GPU, which is an 8 series, didn't have any problems like the G92 architecture has.
-
-
The only thing I want to see changed is the business practice.
I want to stop seeing these bullcrap Meant to be Played on Nvidia ads in games.
I'd like to see Nvidia stop forcing companies to sign contracts to not use AMD products.
I'd like to see Intel sued and stopped for doing the same as above.
I'd actually like to see fair competition between AMD, Nvidia and Intel, and let us consumers decide for ourselves what we want, not have it decided by stupid contracts and marketing b.s. I don't get why game developers would even want "Meant to be played on Nvidia" in their games when they know ATI has a large following; ATI may not be the best, but it's the best in price/performance. Makes no sense to me why they would want to say FU to ATI users.
Especially considering AMD/ATI has been very cooperative with both game developers and Microsoft on DX requirements, whereas Nvidia has been the whiny, crying spoiled brat.
I'd like to see what kind of new technology we will see in the future without all the unfair business practices used by Nvidia/Intel and let competition show what new innovations we will see. -
I think it's a bit weird that ATI has taken the lead with the most powerful GPUs. Normally they bring out weaker but more budget-friendly GPUs, while Nvidia always focused on bringing out the most powerful, most expensive cards.
So even though Fermi might be a failure, I'm still wishfully thinking that Nvidia wants to take its place back by releasing an entirely new DirectX 11 card that will blow everything ATI has away. -
never mind. -
Nvidia, on the other hand, as you mention, usually targets performance and then trickles the architecture down via shrinks. I'd assume that's what they were planning on doing with Fermi, and it could still work out for them. What's critical for them currently is getting the new 40nm process (which ATI has much more experience with at the moment) working efficiently and effectively so they can produce acceptable yields. I doubt that Fermi won't be able to compete. We'll just have to wait and see how it pans out. -
Aw, you mean the mobile parts are going to be ANOTHER iteration of the bloody G8x/9x core?!
I might just have to switch to ATi next laptop.
(note: fully aware the GeForces are now GT200-based) -
-
The weird part is, these things are almost adequate. All they realistically need to run are console ports and these are generally not that GPU intensive. -
Seems that Charlie is right on this one:
http://www.xbitlabs.com/news/video/...e_Pull_Stride_Next_Quarter_CEO_of_Nvidia.html -
-
Btw, why does wishful thinking about a super-strong Nvidia GPU make me a fanboy? I would never intend to spend the price they ask, and would only applaud it since it would make ATI lower its prices. -
Oh, and Demerjian's more right than not if my desktop experience is anything to go by. It's just the way he says it that rubs people the wrong way. Reminds me of someone I know... -
masterchef341 The guy from The Notebook
$100 cards are sufficient for most people, even serious gamers.
if you are not in that category, nvidia offers quad sli solutions.
it doesn't really make sense for them to R&D and release $2500 graphics cards to the gaming consumer market, there just isn't enough of a market for that type of thing to pay off. So, the cheaper solution on their end is to minimize additional hardware requirements and come up with a clever software solution. quad sli. bam.
as far as gaming notebooks, again, it's just a matter of R&D dollars and time. we will need die shrinks to increase performance, as there is a serious power and heat wall when dealing with notebooks.
ATI has really competitive notebook offerings right now, the mobility 5870 beats out the gtx 285m, and their midrange cards also offer good performance against nvidia cards per the dollar. they don't have the same level of market penetration that nvidia does though, that is for sure.
although, i personally have always had issues with ATI drivers... -
H.A.L. 9000 Occam's Chainsaw
-
Yeah after my mobile Nvidia burnt itself out and reading that it wasn't Asus's fault but an Nvidia design flaw, I waited patiently and then bam, G73 arrived. I jumped on that boat immediately, screw Nvidia.
But right now, hardware is amazing. A Mobility HD 5870 with 800 shader cores and over 1.2 teraflops of computational power, paired with i7 quads that can process 8 threads, and I never imagined I would have 8GB of RAM at 1333 speed. Considering the Xbox 360 and the PS3 will not be replaced for another 4-6 years, and the increasing number of console gamers, it's the software that needs to improve. The PC hardware is overkill; it's the software, I believe, that's not up to par, because game developers just don't care to optimize games for PC hardware anymore. I think there is too much assumption that PC gamers will just upgrade yearly to make up for their half-assed crap work. It needs to stop.
Also I think PC gaming has gone down the drain. It's all the same. What do we look forward to? Games where you run around shoot someone in the head, and then run over and set a bomb. People camping spawn spots, snipers lying around just aiming for heads. This is a good game? It's good because it pushes our hardware to the limits?
If Blizzard holds the record for the best-selling game titles ever, why don't other game developers catch on? A good game isn't the one that looks the best, and the best-looking game is not the best-selling game.
PC Gaming needs to go back 10 years when graphics wasn't the main reason to play. More games that are intellectually invigorating and push the limits of our imagination. I've played one game in the last 2 years where I felt I was pulled into the game. Mass Effect 2. The list ends at one game. How pathetic is that?