The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia's Fermi is Broken and Unfixable.

    Discussion in 'Gaming (Software and Graphics Cards)' started by anexanhume, Feb 17, 2010.

  1. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    This is huge if true.

    http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/

    excerpt:

    For those of you waiting for a Fermi part (and especially those of you holding out for a new nvidia mobile part that isn't a re-badge), it may be time to start looking at what Evergreen has to offer.

    edit: Updated title to reflect that it's a rumor story.
     
  2. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    Imagine that. :rolleyes:

    I have yet to see a credible source for any of this stuff, and his own articles on the same site don't count.
     
  3. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    I only picked up on the article because Anand of AnandTech tweeted it. I don't know the source, but I trust Anand to only retweet stuff he thinks is worth talking about.
     
  4. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    Yeah...worth talking about because Demerjian looks funny wearing a tinfoil hat. :p

    In all seriousness, though, I don't put any stock in rumors like this. I'll wait until Fermi comes out and see how it is, and then and only then will I pass judgment on it.
     
  5. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Until Nvidia releases it and it's officially "broken", this is just another article. :)
     
  6. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Oh Charlie again, the man who hates Nvidia.
     
  7. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
    Well, NV30 was an utter piece of crap, and knowing that NV had serious problems with the early production of Fermi, I wouldn't automatically dismiss this article... Wait and see, I guess.
     
  8. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Yeah, this dude's hate for Nvidia is impressive and well known. But his articles are fun to read; just don't believe it all.

     
  9. luffytubby

    luffytubby Notebook Deity

    Reputations:
    354
    Messages:
    829
    Likes Received:
    10
    Trophy Points:
    31
    My first laptop had an 8600M GT with 512MB of DDR2.
    That dreaded card disappointed me so much. The heat was... sigh.

    I've had many horrible GeForce cards, like the MX 400 and GeForce FX 5200... budget, low-end crap cards that came with the computers I got. I have also had a few decent ones, but as a whole, I don't think highly of Nvidia.

    And yet I have only ever had Nvidia cards... Why?


    I guess I am just one of those people who sticks to what they know.

    It's intimidating seeing "Runs great on Nvidia" splash screens at startup in so many of my favorite games.

    Having an ATI card would be interesting, but I don't know their track record with drivers, stability, the control center and all that...


    As a consumer you just want less trouble and more stability. If you buy an expensive laptop, you just can't swap in a new card. Nvidia got away with all those faulty mobile GPUs in the 8000 series!
    And so many people got screwed over... so many laptop manufacturers got screwed over.

    I saved for over a year to get the laptop I wanted. Six months later, the GPU scandal became apparent :(
     
  10. OneCool

    OneCool I AM NUMBER 67

    Reputations:
    77
    Messages:
    713
    Likes Received:
    0
    Trophy Points:
    30
    And the "DustBuster" is reborn once again!! :D
     
  11. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    Actually, Charlie has been pretty damn accurate regarding "Fermi"...
     
  12. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    His conclusions and opinions re: Nvidia may sound as apocalyptic as anything Nostradamus or the Mayans could come up with, but you've got to give it to the guy for knowing how to line up the facts to support his case.
     
  13. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    I share your pain (check the card in my sig, lol). I worked the entire summer and had to endure a serious cut in my allowance for months to pay for my laptop (€1,300).
    I haven't had an ATI card before, but when I look back at Nvidia's bad-quality cards I'm seriously considering one. I'm not familiar with ATI's driver issues, but they can't be that bad... or can they? I mean, it would be kind of ridiculous if you couldn't game on your ATI graphics card.

    Whatever Nvidia brings out, if it doesn't support DirectX 11 decently I'm not going to be interested (the 8600M GT supposedly supports DirectX 10, unless you try playing Assassin's Creed, which I can only play in DX9).
     
  14. Slaughterhouse

    Slaughterhouse Knock 'em out!

    Reputations:
    677
    Messages:
    2,307
    Likes Received:
    2
    Trophy Points:
    56
    Yeah, I don't know about the credibility of that article.
     
  15. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Ha! That was exactly what I thought before I assembled a desktop with an ATI Radeon card: "How bad can their drivers possibly be?" The desktop works very nicely with 3D graphics... but give it YouTube videos or Windows Media Player or anything else 2D, and it very quickly crashes, and crashes hard (the display goes monochromatic and there's nothing to be done except push the reset button). I've checked the ATI forums and I'm not the only one with this problem; the solution is to effectively over-volt your card at idle (the issue is that the drivers are too aggressive at lowering the voltage in 2D mode to save power). They've released driver 9.12 (and a hotfix, because without it there was a different crash) and then 10.1, but the problem is still there. So yeah, the answer is "pretty darn bad."

    That said, Nvidia is not really any better than ATI. Their issues with the GeForce 8 cards are well known (although I have a laptop with an 8600M GT that is turning 2 years old this month, and the card has not given me any problems so far). I'm hoping that Intel finally makes its way into the GPU arena, although they don't seem to be doing well thus far.

    BTW, there is simply no way you will see the Fermi GPUs in a laptop in their current incarnation. Charlie usually exaggerates, but the fact that these cards devour a lot of power (200W+) is well known and not even the Clevo monstrosities can handle that. It will take at least one die shrink and one revision before this makes sense for mainstream notebooks. I suppose Nvidia could try doing a laptop version of the GT200, but I suspect they'll just re-badge G92 for the nth time.
     
  16. Alien-bear

    Alien-bear Newbie

    Reputations:
    5
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    I might as well wait another few weeks for these things, and then decide. I'll pick up a 5870 (for my desktop) if they're overpriced.
     
  17. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Agreed... Fermi looks like an utter failure so far...
     
  18. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    I really don't understand Fermi. Why would someone want all these general shader units to do general processing? If one is willing to spend this much money on a GPU, I'd imagine they'd have at least a Core 2 Quad, i5 or i7, which raises the question: why do we need the GPU to do general processing? Especially when the GPU is going to consume 250 watts or more and an i7 is far more powerful and power-efficient? I just can't wrap my head around it.

     
  19. satan194p

    satan194p Notebook Guru

    Reputations:
    3
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    I think the drama and speculation will finally end when it comes out. Either it's going to be a big leap or a big failure.
     
  20. downloads

    downloads No, Dee Dee, no! Super Moderator

    Reputations:
    7,729
    Messages:
    8,722
    Likes Received:
    2,247
    Trophy Points:
    331
    Or both. It could be substantially faster than the current ATI HD 5K generation but end up becoming available shortly before the new ATI HD 6K parts hit the market.
    Something like the GT 240M – a good card, just 8 months too late.
     
  21. unnamed01

    unnamed01 Notebook Deity

    Reputations:
    194
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
    Wasn't the MX 400 pretty good back in the day?
     
  22. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    I thought so. I also used the FX 5200 and 5600 a lot, and they never let me down.
     
  23. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    I loved my old FX5200, and loved it right up to the day I got my 6600GT. :p
     
  24. hardhousehead

    hardhousehead Notebook Consultant

    Reputations:
    6
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    30
    I have never had problems with Nvidia desktop cards; my first Nvidia notebook card was the ill-fated Go 7600 GT. It died on me in January after 3.5 years, although it had started to overheat last summer.
     
  25. Zero

    Zero The Random Guy

    Reputations:
    422
    Messages:
    2,720
    Likes Received:
    0
    Trophy Points:
    55
    Maybe I can help provide a possible answer to this. nVidia has traditionally always started by designing and producing big chips with a large die area and then putting them to market. Then, in the meantime, they'd focus on shrinking the chip down and reducing the power requirement to an acceptable level, so it could be implemented in lower-power systems such as notebooks. Having a GPU that some general load could be offloaded onto in those circumstances would be quite beneficial, particularly if your intended market segment usually comes with a selection of slow CPUs.
     
  26. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    For some tasks, the programmable shaders (in their large quantity) are much more powerful than a modern CPU.

    In a sense, a GTX 280 has 240 cores at roughly 600 MHz each. Obviously it doesn't scale linearly, but the total compute power of the GPU is large, given proper programming and an applicable task.
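
    To make that concrete, here's a minimal sketch of the kind of data-parallel code those shader units are built for, written in CUDA. Everything in it (the kernel name, the sizes, the launch configuration) is illustrative rather than anything Nvidia ships; the point is just that assigning one thread per data element lets hundreds of individually slow cores work at once.

    Code:
    #include <cuda_runtime.h>

    // Each thread handles exactly one element; the GPU spreads the
    // threads across its many (individually slow) cores.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;                  // 1M elements, illustrative
        const size_t bytes = n * sizeof(float);

        // Device buffers; a real program would copy initialized
        // host data in with cudaMemcpy before launching.
        float *a, *b, *c;
        cudaMalloc(&a, bytes);
        cudaMalloc(&b, bytes);
        cudaMalloc(&c, bytes);

        // Enough 256-thread blocks to cover all n elements.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(a, b, c, n);
        cudaDeviceSynchronize();

        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }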
     
  27. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    Masterchef hit it on the head. Some tasks will take advantage of the huge number of cores, even if they are individually rather slow compared to a "normal" CPU. It's similar to the concept behind distributed computing, à la the @home projects.
     
  28. euisungkim

    euisungkim Notebook Deity

    Reputations:
    203
    Messages:
    714
    Likes Received:
    0
    Trophy Points:
    30
    I used a 7900 GT for 3 years and it didn't let me down until the graphics started to crash. It was overclocked for 3 years without a cooler, so that's understandable. This is my first ATI card (Mobility 5870) and it has not let me down. Performance is outstanding, and once the Catalyst 10.3 beta comes out I'm expecting it to run even better. I like both companies, and I'm expecting a lot from Fermi. I agree, Fermi will be either a failure or a great success. We'll find out eventually. A lot of people expected the Mobility 5870 to be about 20% faster than the GTX 260M, yet it's about 35% better. I hope the same happens with Fermi.
     
  29. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Like all news from Charles, I will wait and see.
     
  30. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    Does this mean that a laptop with a weak CPU, like the AW M11x, could perform many times better with a Fermi GPU?
     
  31. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    60
    Trophy Points:
    466
    To do something like that, an application has to be written to support CUDA. Needless to say, there are not many applications out there that do (very few, in fact), and CUDA usage is mostly limited to professional/academic use.
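
    Concretely, "written to support CUDA" means the application ships host code roughly like the sketch below: probe for a CUDA-capable device and only dispatch work to it if one exists, otherwise fall back to a CPU path. This is a hypothetical minimal example, not any particular application's actual code.

    Code:
    #include <cstdio>
    #include <cuda_runtime.h>

    int main()
    {
        // Ask the CUDA runtime how many capable devices are present.
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);

        if (err != cudaSuccess || count == 0) {
            std::printf("No CUDA device found - using the CPU path.\n");
            return 0;
        }

        // Inspect the first device before offloading work to it.
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);
        std::printf("Offloading to %s (%d multiprocessors)\n",
                    prop.name, prop.multiProcessorCount);
        // ...kernel launches for the GPU-accelerated path go here...
        return 0;
    }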
     
  32. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Agreed.

    Just the same, it's those same professionals/researchers/scientists who need insane amounts of processing power, and i7s, and who will probably welcome Fermi for its GPU compute power.

    If you are a gamer, then unless you are *really* particular, you can get by on about $100 worth of processor and $100 worth of GPU.
     
  33. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    CPUs might be slower than GPUs in the professional/academic world, but GPUs have yet to be proven reliable. Nvidia and ATI have both yet to provide a product that can handle 100% usage 24/7. The failure rate for GPUs goes through the roof when they are put under that kind of stress. CPUs have been around much longer and have proven themselves in that regard.
     
  34. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I haven't heard of any sort of OpenCL/GPGPU-related hardware failure.

    Link?
     
  35. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Well, my XFX 8800 GTX was overclocked for over 3 years, and that GPU, which is an 8-series part, didn't have any of the problems the G92 core has.
     
  36. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    It has nothing to do with GPGPU except that these cards are expected to be working at full load the entire time. The issue Rorschach is referring to is that GPUs are at least an order of magnitude less reliable than CPUs. Think about how often you see "help, I think my GPU died" relative to "help, I think my CPU died" on the internet. To be honest, I recall practically no instances of the latter at all (only dead-on-arrival and incompatible parts). Here's a site that tries to track GPU failure rates by the number of returns. It's probably not entirely accurate, but you can get an order of magnitude out of it and GPUs range from about 2% to about 10% (high end ones are more likely to fail since the chip is physically larger).
     
  37. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    The only thing I want to see changed is the business practice.

    I want to stop seeing these bullcrap Meant to be Played on Nvidia ads in games.

    I'd like to see Nvidia stop forcing companies to sign contracts to not use AMD products.

    I'd like to see Intel sued and stopped from doing the same.

    I'd actually like to see fair competition between AMD, Nvidia and Intel, and let us consumers decide for ourselves what we want, not have it decided by stupid contracts and marketing B.S. I don't get why game developers would even want "Meant to be Played on Nvidia" in their games when they know ATI has a large following; ATI may not be the best outright, but it is the best in price/performance. It makes no sense to me why they would want to say FU to ATI users.
    - Especially considering AMD/ATI has been very cooperative with both game developers and Microsoft on DX requirements, whereas Nvidia has been the whiny, crying spoiled brat.

    I'd like to see what kind of new technology we will see in the future without all the unfair business practices used by Nvidia/Intel and let competition show what new innovations we will see.
     
  38. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    I think it's a bit weird that ATI has taken the lead with the most powerful GPUs. Normally they bring out weaker but more budget-friendly GPUs, while Nvidia has always focused on bringing out the most powerful, and most expensive, cards.
    So even though Fermi might be a failure, I'm still wishfully thinking that Nvidia wants to take its place back by releasing an entirely new DirectX 11 card that will blow everything ATI has away.
     
  39. ajreynol

    ajreynol Notebook Virtuoso

    Reputations:
    941
    Messages:
    2,555
    Likes Received:
    0
    Trophy Points:
    55
    That's not been the case at all. They've been going back and forth for literally YEARS. They don't "normally bring out weaker" anything.

    Oh. You're an Nvidia fanboy. That explains the first paragraph.

    Never mind.
     
  40. Zero

    Zero The Random Guy

    Reputations:
    422
    Messages:
    2,720
    Likes Received:
    0
    Trophy Points:
    55
    The last couple of generations of GPU releases have definitely seen ATI releasing more price-competitive products instead of going straight for the performance crown. It was probably the smart thing to do at the time, considering AMD wasn't particularly profitable and going for performance with a new chip would have been quite a risky strategy. I guess they believed playing it safe with a decently performing but, critically, well-priced product would be more beneficial.

    nVidia, on the other hand, as you mention, usually targets performance first and then trickles the architecture down through die shrinks. I'd assume that's what they were planning on doing with Fermi, and it could still work out for them. What's critical for them right now is getting the new 40nm process (which ATI currently has much more experience with) working efficiently and effectively so they can produce acceptable yields. I doubt that Fermi won't be able to compete; we'll just have to wait and see how it pans out.
     
  41. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Aw, you mean the mobile parts are going to be ANOTHER iteration of the bloody G8x/9x core?!
    I might just have to switch to ATI for my next laptop.

    (Note: I'm fully aware the desktop GeForces are now GT200-based.)
     
  42. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    I'm not aware of any architectural change since the 8000 series. They've just been alternating die shrinks and overclocks each cycle.
     
  43. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Probably.

    The weird part is, these things are almost adequate. All they realistically need to run are console ports, and those are generally not that GPU-intensive.
     
  44. mujtaba

    mujtaba ZzzZzz Super Moderator

    Reputations:
    4,242
    Messages:
    3,088
    Likes Received:
    516
    Trophy Points:
    181
  45. Chirality

    Chirality Notebook Consultant

    Reputations:
    62
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
  46. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, I'm a total Nvidia fanboy; my 8600M GT has given me nothing but joy with the black-screen freezes, rainbow sparks and purple triangle lines -_-'. I'm going to replace it with an ATI HD 4650, and if it works well, I'll probably never go back to Nvidia.
    Btw, why does wishful thinking about a super-strong Nvidia GPU make me a fanboy? I never intend to pay the price they ask, and would only applaud such a card since it would make ATI lower its prices.
     
  47. Vogelbung

    Vogelbung I R Judgemental

    Reputations:
    3,677
    Messages:
    4,067
    Likes Received:
    699
    Trophy Points:
    181
    Well - they need to run console ports better. The last SLI gaming notebook I had was the M1730 with the 8800s, and I never really used it for gaming (beyond an occasional session of NFS: ProStreet and maybe BF2), so I've been out of touch with what notebook GPUs can do - specifically those from Nvidia. I must say that, looking into it recently with a view to replacing my desktop, I was somewhat surprised at the apparent lack of progress in this regard - and the market is Nvidia's to lose, with ATI seemingly hanging on by the skin of their teeth to release competing products.

    Oh, and Demerjian's more right than not if my desktop experience is anything to go by. It's just the way he says it that rubs people the wrong way. Reminds me of someone I know...
     
  48. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    $100 cards are sufficient for most people, even serious gamers.

    If you are not in that category, Nvidia offers quad-SLI solutions.

    It doesn't really make sense for them to R&D and release $2,500 graphics cards to the consumer gaming market; there just isn't enough of a market for that type of thing to pay off. So the cheaper solution on their end is to minimize additional hardware requirements and come up with a clever software solution. Quad SLI. Bam.

    As far as gaming notebooks go, again, it's just a matter of R&D dollars and time. We will need die shrinks to increase performance, as there is a serious power and heat wall when dealing with notebooks.

    ATI has really competitive notebook offerings right now: the Mobility 5870 beats out the GTX 285M, and their midrange cards also offer good performance per dollar against Nvidia's. They don't have the same level of market penetration that Nvidia does, though, that's for sure.

    Although, I personally have always had issues with ATI drivers...
     
  49. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    I do agree with this 100%. But then again NVIDIA's high-end cards are always stupid-expensive.
     
  50. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Yeah, after my mobile Nvidia GPU burnt itself out and I read that it wasn't Asus's fault but an Nvidia design flaw, I waited patiently and then, bam, the G73 arrived. I jumped on that boat immediately. Screw Nvidia.

    But right now, hardware is amazing: a mobile HD 5870 with 800 shader cores and over 1.2 teraflops of computational power, paired with i7 quads that can process 8 threads, and I never imagined I would have 8GB of RAM at 1333 MHz. Considering the Xbox 360 and the PS3 will not be replaced for another 4-6 years, and the increasing number of console gamers, it's the software that needs to improve. The PC hardware is overkill; it's the software that isn't up to par, I believe, because game developers just don't care to optimize games for PC hardware anymore. I think there is too much assumption that PC gamers will just upgrade yearly to make up for their half-assed work. It needs to stop.

    Also, I think PC gaming has gone down the drain. It's all the same. What do we have to look forward to? Games where you run around, shoot someone in the head, and then run over and set a bomb. People camping spawn points, snipers lying around just aiming for heads. This is a good game? It's good because it pushes our hardware to the limits?
    - If Blizzard holds the record for the best-selling game titles ever, why don't other game developers catch on? A good game isn't the one that looks the best, and the best-looking game is not the best-selling game.

    PC gaming needs to go back 10 years, to when graphics weren't the main reason to play. We need more games that are intellectually invigorating and push the limits of our imagination. I've played one game in the last 2 years where I felt pulled into the game: Mass Effect 2. The list ends at one game. How pathetic is that?

     