The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia's Fermi is Broken and Unfixable.

    Discussion in 'Gaming (Software and Graphics Cards)' started by anexanhume, Feb 17, 2010.

  1. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    I feel the same way, except my game of choice is Spore. I wish LBP would be released to PC's, because that would definitely steal some months of my life.
     
  2. Levenly

    Levenly Grappling Deity

    Reputations:
    834
    Messages:
    1,007
    Likes Received:
    0
    Trophy Points:
    55
    how old are you though? is it not unwise to say you enjoyed games more back then, given that entertainment was much easier to come by (should you be ~20-25 years old currently)? 10 years ago i was 12. i thought legos were fun. kirby was a lot of fun, but now kirby is boring and incredibly easy. of course i have my favorites (the LOZ series), but even back then the graphics were considered amazing for such games. graphics back then were considered a real reason to purchase games, just like they are today. Final Fantasy X had incredible graphics for the time, especially the cut scenes. many people recommended the game because of its graphical complexity and the 'realism' it provided, and this was 9 years ago.

    we're not playing Zork or anything anymore - the games have GUIs, and to build upon the complexity of the interface, the graphics need to be improved upon. there are a myriad of games released weekly, and the bulk of them never see the spotlight, because what sells is fun, easy-to-pick-up games. there are still games out there that provoke the imagination and 'pull you in'.
     
  3. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    I wouldn't want us to go back ten years, but if you look at FFX, those graphics are still by far more than good enough for my needs. All I care about is gameplay. I just want a strong GPU so my laptop will last for a few years, which is practically impossible.

    The statement that you need good graphics to get sucked into a game is nonsense. The PlayStation One Spyro games created the best environment/setting to lose yourself in, thanks to creative use of colours.
     
  4. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    I am gonna go to ATI... All I have had with NVIDIA is a hot, hot GPU... I had so much trouble with this laptop, and only after months of tweaking does it run cool now... but I have to wait till September... hopefully by then they have a new 5000 series ATI GPU with 1600 shaders...
     
  5. Levenly

    Levenly Grappling Deity

    Reputations:
    834
    Messages:
    1,007
    Likes Received:
    0
    Trophy Points:
    55
    well that isn't what i was saying - i was saying flashiness has always been a reason to check out a game. style and fashion have always been fused into games once games went mainstream. not that i'm saying graphics will be the sole reason to be 'addicted' to a game. if you're looking for imagination, check out Infocom's games...

    even though you might not notice you said it, Spyro created a good environment for you because of the GRAPHICS. i remember Donkey Kong 64 was a phenomenal looking game for the 64. Majora's Mask was so impressive with its environment and detail. these games were very fun, but also so new in what they did. when LOZ went 3D it was AMAZING. you could do so much in the game. it felt like a breath of fresh air. 12 years later, 3rd person action is very normal for character control.

    would you play spyro again had it just come out now in 2010, with nothing but a texture/polygon update, and the series hadn't come out yet? i know i probably wouldn't get Mario 64 and many other games. i'm older. i like different things than when i was a kid. i can't really translate the entertainment value, as entertainment was so different back then. i was so fascinated with many games, especially because they were so new and groundbreaking on the PS1 and N64.

    gameplay is always pure win for everyone; however, how many wii games do you own? aren't they the most purist of 'gameplay over graphics' games around? they're downright fun, but i'm not looking to get those games.

    the best-selling 'gamer' titles do have GOOD graphics. Blizzard's games have excellent graphics. WoW is 5-6 years old or something, yet with each expansion they update the graphics and detail the environment much, much more.

    i think most people are more disgruntled with the use of physics in games versus graphics. graphical engines are now heavily physics based, which taxes the hardware very much.

    think of it this way - there will be a point in time where graphics can no longer improve in terms of visuals. they will be at the utmost visual complexity they can be. nothing can be realer than life itself. as the market progresses towards being able to run extreme graphical content easily, we will see a slow transition back to much more 'imaginative' gameplay. graphics are already good enough for the majority of people. this brought forth the Wii. games with great gameplay are out there. they're just not games that 'hardcore' gamers would play.

    compare the average game from 2000 to the average from 2010 - there is a vast improvement. look at 1990 compared to 2010. imagine what visuals will be like in 2020 and 2030.

    especially if we're able to get on exascale computing for personal usage, we'd be able to crunch out very, very impressive results for games in the future.

    back on topic - i think ATI will still hold the marketshare for new gpus, since many people are waiting for Fermi so the 5000 series gets a price drop.
     
  6. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    I had a great time reading that article and would mostly agree with that guy.

    The article aside, I have said it before: I hate being taken for a fool, and although I'm not a particular fanboy of Nvidia, I always tended (is that a word) to go more towards them. but renaming after renaming, and the 8600 crap (my GF's PC's 8600 burned, and... w/e, long story), I just feel, well, I feel marketed ><.

    And I am stubborn. my desktop's card is an ATI and I have had nothing but enjoyment from it, and because I am stubborn, there is no way I am giving my money to people I feel consider me nothing more than a money bag, so I'll be an ATI fanboy. Until ATI pulls that same crap, then I'll just give up.

    Anyway, great read, that article.
     
  7. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    I like your setup. I use vintage JBL monitors, the L series that JBL created, studio monitors for home use.

    Like your AKGs - they look like the studio K240s? I've got the Sennheiser HD650 and Audio Technica AD900 headphones with a dedicated amp like you. Audio is fun, right?

    I've got the ExactMat also, but hate it. Thinking of ordering a cloth mat - a Goliathus or the SteelSeries QcK.

     
  8. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    Nice JBLs :)

    AKGs are k271 mkII

    I don't have a dedicated amp, but an external sound card; thinking about the amp tho.

    Audio is the best.

    I love the mat, but not too sure about the armrest. I had a cloth one before - never again. It's all about personal taste, I think.

    /OFF TOPIC I FEEL T3H BAN STIKZ ABOVE MY H3ADZ!
     
  9. Kuu

    Kuu That Quiet Person

    Reputations:
    765
    Messages:
    968
    Likes Received:
    18
    Trophy Points:
    31
    From what little I understand about transistors and whatnot, this seems like Nvidia is trying to jump too many steps in the evolution of technology, or is being lazy/stupid in how they're designing the card.

    It also seems like whoever they're getting to make the cards has issues with making them, or the tech to control manufacturing defects isn't there yet either.


    I should study more.
     
  10. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    From what I read, it seems to be a mix of both. When they try to cheat and don't do their homework, they get mad and try to cope with verbal abuse. I have to hand it to ATI on this one... they've done what NVIDIA just can't seem to get through their thick corporate skulls.

    I think that too. I've known about TSMC for quite a while going back, and it seems every time they have a litho shrink, they get worse. I've always wondered what would happen if Intel would share some of that AWESOME fab insight they have with other companies, but that's just wishful thinking on my part. Granted, it's a vastly different architecture than what Intel builds, but they seem to always get it right, plus they have their OWN fabs. I've also wondered: if NVIDIA were to one day go completely "in-house" with the whole process, what would happen to their yields then?

    IDK... it just seems to me that TSMC is trying to take the cheapest way out possible, even if that means sub-par transistors and substrate layers, and that someone else could step up to the plate and show TSMC how it's done. Granted, ATI did what NVIDIA can't, but they had to re-design to compensate for TSMC's shortcomings.
     
  11. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    the problem with nvidia is that in the past year or so they have been trying to become a company that's NOT a gpu maker. they have dumped considerable resources into their nvidia ion and tegra mobile platforms.
    unfortunately for them, they've been hit by major problems. intel filed a legal case against them, since they are not registered with the SEC as cpu manufacturers - a legal battle that nvidia will likely lose.
    the only gadget to use the tegra is the zune hd, and unfortunately devastating for nvidia is the fact that they are NOT a development partner for windows phone 7.
    these are gigantic setbacks indeed. and nvidia released a statement to investors last quarter that they are indeed refocusing on their core business - making gpus.
    the mobile refreshes, i believe, are a direct effect of this former shifting of priorities. fermi was also definitely a 'casualty', as it was set back in favor of tegra i believe. now they seem to have no choice but to focus on gpu tech for both the professional and enthusiast markets.
    that could bode well for everyone, as they will not only be forced to come out with newer tech, but also cheaper products to fight off the ati tide.
     
  12. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    Yes, spyro attracted me because of its nice environment, but the graphics were not that good. I mentioned that game because it managed to create a wonderful world with very limited graphics options.

    Would I buy it in 2010? YES! My first console was a PS2 (which I bought 2-3 years after it was released), and I started with games like FFX, Timesplitters 2, and so on. So by the time I got to know spyro, the graphics were already very outdated. Nevertheless, it's one of the best games I ever played, thanks to gameplay and brilliant colours.

    I own 0 wii games, but that has nothing to do with their graphics. Nintendo consoles tend to bring out +-5 good games in the first 1-2 years after the console's release; after that your console is just a dust collector. Nonetheless, I bought a Gamecube just for Zelda: The Wind Waker. Yes... just for that 1 game. Why? I love good gameplay with flashy colours ^^ (like a moth to a flame rofl).

    About the Fermi release, I'm with you. I'm also waiting for nvidia to release a super strong GPU just so the ATI 5 series will drop in price. Nvidia's quality has an even worse reputation than Toyota's to me now.
    I think I'm one of the many who just want a GPU that will last a long time and that will be affordable. By answering all your points about games and graphics, I hope you're convinced that hardcore gamers like myself care more about gameplay than graphics. Though I must admit that new graphical wonders like Assassin's Creed would not have had as many fans with PS1-PS2 graphics...
     
  13. desu

    desu Notebook Evangelist

    Reputations:
    139
    Messages:
    341
    Likes Received:
    0
    Trophy Points:
    30
    but if nvidia brings out a super strong gpu, it is going to be priced higher, because nvidia really can't afford to drop prices. However, ATI is in a position to drop prices to put the hurt on nvidia. if we see a price drop at all, it's because ATI is trying to force nvidia to lose market share, not because nvidia is trying to put the hurt on ATI.
     
  14. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Exactly. That was my thinking. Fermi and Nvidia's route just don't make much sense when the i7 quad is cheaper and more powerful; I don't need a GPU to do general processing. Some rail against ATI for their lack of 16 reference frames in DXVA support, but honestly the i7 barely uses 10% CPU to run the best 1080p encodes I've seen.

    Nvidia should focus on the GPU. As it is, most users still view Nvidia's parts as GPUs, but since they really aren't anymore, it throws things off for the gaming community. Nvidia didn't just ruin it for themselves but ruined it for the rest of us.

    If Nvidia had been on board with making DX-compliant GPUs, we'd have far more DirectX 10 games, and DirectX 10 would be what DX11 is now, since MS had to strip so much of DX10 away because of Nvidia's refusal to comply with the rest of the gaming community. That's why I don't like Nvidia: they didn't just screw themselves over, they screwed us all - the gamers, and the developers too.

     
  15. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    You're being melodramatic, Nvidia didn't strip anyone of anything. We know you dislike Nvidia, can we move on?
     
  16. classic77

    classic77 Notebook Evangelist

    Reputations:
    159
    Messages:
    584
    Likes Received:
    0
    Trophy Points:
    30
    "A broken clock is correct twice a day"

    Charlie's nVidia bashing is so constant that eventually he was bound to get it right... I think they really have dropped the ball on Fermi.

    I just put together a desktop for the first time in a long time (not for me), and it was unbelievable how much more appealing ATI's products are at the moment, and will be for a while. (I'm betting Fermi will flop pretty hard; this article makes a severely good case for that idea, much better than Charlie's usual stuff.)

    Knowing the market's history though, and with nVidia not implementing Tegra into anything substantial, I think we will see nVidia come back into its traditional market strong with a solid traditional GPU... their new ideas have failed and they will retreat back to their strength. Their last quarter's plan stated so. (Keep in mind, though, it will be a year at the earliest before this happens.)
     
  17. desu

    desu Notebook Evangelist

    Reputations:
    139
    Messages:
    341
    Likes Received:
    0
    Trophy Points:
    30
    yeah they did, and you know how they did it: by refusing to support the full dx 10. That's why we have dx 10.1, which also flopped, and why Nvidia is so late with Fermi - because they refused to cooperate with the industry. They have been dragging their heels implementing dx10.1 and dx11. If you would ever bother to check anything, you would see DirectX 10 was supposed to include dx 10.1 and parts of dx11.
    Microsoft, game developers, ATI and Nvidia have a committee, set up by Microsoft, to decide what gets put in each version of DirectX. Notice Nvidia hasn't been playing nice: they haven't ever supported dx 10.1 and still have yet to come close to supporting dx 11. It isn't an accident, and couldn't be an accident, when all parties have knowledge of what is going into each DirectX years in advance. you think DirectX 12 specs aren't being worked on now? probably even finalized and in development.
     
  18. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Never say never.
     
  19. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    I know, right? I play BioShock in 10.1. Works great, but there's really not much difference, I have to say.
     
  20. desu

    desu Notebook Evangelist

    Reputations:
    139
    Messages:
    341
    Likes Received:
    0
    Trophy Points:
    30
    i didn't forget the n - it was supposed to be 'ever'
     
  21. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Taken in context, "hasn't ever" is probably grammatically incorrect and should be "has never". I'm no literary professor, but I know how to read, write, and listen.

    Nvidia has, in fact, released DX10.1 GPUs. Any of their 40nm parts are DX10.1 as well, and they have been available for the past 6 months.
     
  22. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    "Has not ever" can be written or spoken both ways as far as i know. The meaning definitely shifts slightly depending on whether you consider the negation as part of the verb or part of the adverb, which the contraction makes explicit.
     
  23. rl2e

    rl2e Notebook Guru

    Reputations:
    0
    Messages:
    55
    Likes Received:
    2
    Trophy Points:
    16
  24. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    surprise surprise, the truth is not what the extremists say.
     
  25. hardhousehead

    hardhousehead Notebook Consultant

    Reputations:
    6
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    30
    It's been confirmed: a March 27th release date for Fermi, in the US at least.
    So what that guy said may not be the truth; I never thought it was at all. :)
     
  26. Lostinlaptopland

    Lostinlaptopland Notebook Consultant

    Reputations:
    18
    Messages:
    168
    Likes Received:
    0
    Trophy Points:
    30
    A launch does not necessarily mean availability.
     
  27. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Most probably won't be available till June... I think charlie is right on this one...
     
  28. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    My suspicion also. Launched but not readily available.

    I'm still curious if it will release consuming the 250 watts the previews suggest, and whether people will need larger cases to house this thing. If so, then that's a fail right off the bat on launch date.

    Anyone going to Pax East to find out?
     
  29. desu

    desu Notebook Evangelist

    Reputations:
    139
    Messages:
    341
    Likes Received:
    0
    Trophy Points:
    30
    I remember checking 4 months ago when I had heard a rumor that nvidia was doing that. I never found any dx10.1 nvidia parts. As the rumor went, HP, Dell, and eMachines were pressuring nvidia to put out dx 10.1 parts.
     
  30. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
  31. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    The Fermi looks a hell of a lot like the 3dfx Voodoo5 cards. Which came out just before Nvidia bought 3dfx... hmmm
     
  32. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    I saw that on Engadget. I secretly hope it doesn't blow my 5870 away ><

    EDIT: and if it doesn't, be sure I'll be here telling you all how bad nvidia really is!
     
  33. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    Largest I've ever seen by far.
     
  34. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Hell, it's massive!

    Maybe they are getting all the 3dfx people to fix nvidia, haha.
     
  35. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Haha, next we'll see a picture of an external power source for the FERMI!
     
  36. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    That's not the GPU die. It is the heatspreader/random piece of aluminum covering the die.
     
  37. Lostinlaptopland

    Lostinlaptopland Notebook Consultant

    Reputations:
    18
    Messages:
    168
    Likes Received:
    0
    Trophy Points:
    30
    Exactly. But it is still going to be huge. 50% bigger than the 5870 isn't it?

    Or is that just the amount of transistors?
     
  38. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    NVidia's high end GPU dies have always been much larger than ATI's, ever since the 6 series vs. the X series 5-6 years ago.
     
  39. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    Dude, I know we both hate nvidia, but you have to wait for it to come out and fail before you can start bashing it. you're making us look bad! :p
     
  40. Lostinlaptopland

    Lostinlaptopland Notebook Consultant

    Reputations:
    18
    Messages:
    168
    Likes Received:
    0
    Trophy Points:
    30
    And bar the 8 series, they have always struggled to beat them, or lost. That line of thinking does not bode well.
     
  41. Kuu

    Kuu That Quiet Person

    Reputations:
    765
    Messages:
    968
    Likes Received:
    18
    Trophy Points:
    31
    That special cooling system needs holes embedded in the PCB itself? Can't be good.

    Does anyone make green PCBs anymore?
     
  42. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    Those are just screw holes for the cooling solution mounts.
     
  43. Lostinlaptopland

    Lostinlaptopland Notebook Consultant

    Reputations:
    18
    Messages:
    168
    Likes Received:
    0
    Trophy Points:
    30
    I think blaze-senpai was referring to the quarter-circle cutouts on the PCB. I have heard these have been done before and are nothing new, though.
     
  44. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Desktop Fermi requires a minimum 600w power supply.


    Some of it, maybe, but most of the area underneath that is the die. Fermi is reported as being 530 sq. mm; ATI's RV870 is 384 sq. mm.


    There's no way in the world the GPU in that picture is ever fitting into a notebook. It's going to have to get cut down.... a lot....and whatever is left will have to go up against a notebook card based on ATI's desktop HD5770.
     
  45. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Define "notebook." I know a few Clevos that would be able to fit this bad boy in. ;)
     
  46. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    My mistake.
     
  47. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    There is no way. According to those values, the NVidia GPU is only 17.5% longer on each side than the ATI GPU. That isn't a whole lot, and is way smaller than the heatspreader.
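    (That 17.5% figure does follow from the die areas quoted earlier in the thread, 530 sq. mm for Fermi vs. 384 sq. mm for RV870: assuming roughly square dies, side length scales with the square root of area. A quick back-of-the-envelope check:)

```python
import math

fermi_area = 530.0  # reported Fermi (GF100) die area, sq. mm
rv870_area = 384.0  # reported ATI RV870 die area, sq. mm

# For square dies, side length scales with the square root of area,
# so the ratio of side lengths is sqrt(area ratio).
ratio = math.sqrt(fermi_area / rv870_area)
print(f"Fermi's side is ~{(ratio - 1) * 100:.1f}% longer")  # ~17.5%
```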


    Of course not, but just like you say, ATI isn't using their top gun either. Nvidia isn't going to use a cut-down Fermi, but another of their GPUs, it looks like.
     
  48. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    It doesn't seem this has been posted yet:

    http://www.youtube.com/v/vpdPSZB8A8E
     
    Last edited by a moderator: May 6, 2015
  49. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    nice video
     
  50. ronnieb

    ronnieb Representing the Canucks

    Reputations:
    613
    Messages:
    1,869
    Likes Received:
    0
    Trophy Points:
    55
    Yes, that's the 5870. It's been out for a couple of months now. You have yet to release your new GPUs, and soon enough AMD will have their new lineup.
     
    Last edited by a moderator: May 6, 2015