The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    NVidia's attempts to produce a 40nm version of GT200 were "disastrous at best"

    Discussion in 'Gaming (Software and Graphics Cards)' started by Jlbrightbill, Mar 4, 2009.

  1. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Via DailyTech:

    NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die-shrunk version of its predecessor the GTX 280. However, ATI has been able to gain a larger market share due to aggressive pricing and ramping of smaller geometries. This has led to price pressure on NVIDIA, especially in the performance mainstream segment.

    NVIDIA's original GT200 chip -- which is used in the GTX 280 and GTX 260 -- is too big, too costly, and consumes too much power to be used effectively in a mobile solution. NVIDIA has already switched to TSMC's 55nm process from the baseline 65nm node to deal with these issues for the GTX 285, but it is still not suitable for the majority of laptop users. Battery life is too short, the cooling fan is too loud, and the cost is too high.

    One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series.

    Without a power efficient GT200 based GPU solution for the mobile or mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to meet these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink, and is currently used in the 9800 GTX+. All G92 chips are only DirectX 10 capable, and will not support the full feature set of DirectX 10.1 or DirectX 11 that will come with Windows 7.

    The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same or similar to the GTX 280, when it is actually just a 9800 GTX+.

    There is currently no GeForce 100 series for desktop or mobile markets.

    http://www.dailytech.com/article.aspx?newsid=14480
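The article's "two shrinks in a row" point comes down to simple geometry: die area ideally scales with the square of the feature-size ratio, so a 65nm-to-40nm shrink would cut area by roughly 62% in the best case. A quick sketch of that ideal scaling (the ~576 mm² GT200 die size is a commonly reported figure, not from the article, and real shrinks save less than this ideal because analog blocks, I/O pads, and routing don't scale linearly):

```python
# Idealized die-area scaling for a process shrink: area scales with the
# square of the feature-size ratio. Treat the results as an upper bound
# on the savings; real shrinks (like the 55nm GT200b) land above them.

def scaled_area(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Return the ideal die area after shrinking from old_node_nm to new_node_nm."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

gt200_65nm = 576.0  # commonly reported GT200 die size at 65nm, in mm^2

print(round(scaled_area(gt200_65nm, 65, 55)))  # ideal 55nm shrink -> 412
print(round(scaled_area(gt200_65nm, 65, 40)))  # hypothetical 40nm shrink -> 218
```

The actual 55nm GT200b came in around 470 mm², well short of the ~412 mm² ideal, which is one reason skipping straight to 40nm without a redesign was such a stretch.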
     
  2. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Haha, nvidia is playing catch-up. This is exciting, because now the competitiveness of both companies will come out and ultimately one person will win.........














    The consumer :D
     
  3. gengerald

    gengerald Technofile Extraordinaire

    Reputations:
    674
    Messages:
    1,961
    Likes Received:
    0
    Trophy Points:
    55
    *goes and cries in corner*
     
  4. v_c

    v_c Notebook Evangelist

    Reputations:
    124
    Messages:
    635
    Likes Received:
    0
    Trophy Points:
    30
    Ati are going to take the HD4860 and go running with it.

    Just imagine - they are comfortable with 40nm technology, and they are also willing to put GDDR5 in a mobile card. What's next? - a GDDR5 card with 800 pipelines (ie same as 4870) but built in 40nm? Of course that's not going to happen any time soon, but it just shows that they will be ready to counter anything nvidia do with some amazing cards going into 2010 and beyond.
     
  5. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
    Well, technically 800 shaders / GDDR5 was supposed to have happened already; it just hasn't reached consumers yet. The 40nm 128-bit GDDR5 4860 being better than the 55nm 256-bit GDDR3 4850 is pretty amazing though, since memory bus width has until now been by far the most important factor in GPU performance. This will give ATi a HUGE advantage, because a narrower memory bus is much, much cheaper to manufacture.
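The reason a 128-bit GDDR5 bus can keep up with a 256-bit GDDR3 bus is that theoretical bandwidth is bus width times effective transfer rate, and GDDR5 moves roughly twice as much data per clock. A quick sketch with illustrative transfer rates (these are round numbers for the comparison, not official specs for any particular 4800-series card):

```python
# Theoretical peak memory bandwidth = (bus width in bytes) * (effective rate).
# GDDR5's roughly doubled per-pin data rate offsets a halved bus width,
# which is how a 128-bit GDDR5 card can match a 256-bit GDDR3 one.

def bandwidth_gb_s(bus_bits: int, effective_mtps: float) -> float:
    """Theoretical peak bandwidth in GB/s for a bus width (bits) and rate (MT/s)."""
    return bus_bits / 8 * effective_mtps / 1000

print(bandwidth_gb_s(256, 2000))  # 256-bit GDDR3 at 2000 MT/s -> 64.0 GB/s
print(bandwidth_gb_s(128, 4000))  # 128-bit GDDR5 at 4000 MT/s -> 64.0 GB/s
```

Same bandwidth, half the traces and a smaller memory controller, which is where the manufacturing-cost advantage comes from.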
     
  6. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    From what I'm reading around places, the reason we're seeing 40nm so soon is that ATI's yields are much better than even they expected. They're able to launch at full scale in full quantities earlier than they thought, and this of course also means they're doing it at lower cost, because fewer chips need "binning".
     
  7. terminus123

    terminus123 Notebook Deity

    Reputations:
    4
    Messages:
    766
    Likes Received:
    4
    Trophy Points:
    31
    "the first 40nm chips from NVIDIA will be in the GeForce 300 series."

    is that the Mobility 300? or desktop?
     
  8. hgfdsa

    hgfdsa Notebook Consultant

    Reputations:
    0
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    30
    I think desktop first and laptops after (I hope)...
    ATI FTW...
     
  9. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    i think someday we are gonna start seeing the release of notebook GPU's and desktop GPU's at the SAME time.
     
  10. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    The 4860 is coming out for notebooks first.
     
  11. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    ohhhhhhh, people used to hate on ATI and be like "yea, I got an nvidia xxxx" -- now I'm like "pshhh, I got ATI, FTW" lol

    the 40nm will make everything so much faster and better and cheaper, what more can you ask for?
     
  12. Ahmed_p800

    Ahmed_p800 Notebook Evangelist

    Reputations:
    205
    Messages:
    350
    Likes Received:
    0
    Trophy Points:
    30
    OOOOh man bad news for nvidia

    i think its time to switch to ATI

    Did you people see the AMD Cinema 2.0 trailer??

    HERE

    that was amazing in-game footage
     
  13. terminus123

    terminus123 Notebook Deity

    Reputations:
    4
    Messages:
    766
    Likes Received:
    4
    Trophy Points:
    31
    I have a question, what exactly is Cinema 2.0? I thought it was something to play videos at higher quality with, but then...why do they have an "in-game" trailer for it? Is it like CUDA or something?
     
  14. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
    That's the name of the trailer. The video was rendered in real time by the GPU; what they mean is that in theory every game could look just as good as that video, and playing games would be pretty much like watching a movie, hence the Cinema 2.0 title.
     
  15. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    The Asus W90 is going to melt faces with ATI cards, 6GB RAM, a quad-core CPU, and a price on par with your average gaming notebook and cheaper than your average Mac.

    Their top version is dual 4870s right now; I wonder if they will put 4860s in there at some point. I guess you may lose a bit of rendering power, but it would make it run cooler, give longer battery life, and lower the cost.
     
  16. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    That trailer reminds me of Elephant's Dream...
     
  17. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    This competition is great! The moment one wins, it's bad news for the consumer.
     
  18. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131
    I'm losing faith in Nvidia now :(

    Rebranding the same old crap all the time.
     
  19. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    I read about how ATi started shrinking their manufacturing process much earlier than Nvidia, around last year or thereabouts, back when they were behind Nvidia in the mobile market. It seems the payoff is being seen now that Nvidia's performance lead has been cut down to size.

    I'm going to wait 2 more years before my next upgrade, but I reckon the next system I get may be equipped with an ATi performance card, the rate things are progressing.
     
  20. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    No, actually Cinema 2.0 is their name for their architecture, basically meaning that they're trying to blur the line between traditional games and a movie-like experience.

    It's the exact same thing as nVidia's name for their GPU architecture that they used to call CineFX, but now they call it Graphics Plus.

    What's sad here is that nVidia as a whole company is worth more than 3x as much as AMD, which is AMD/ATI combined since the buyout. And yet they're still recycling the same chips and branding them as new since 2006. The 8800GTX came out in 2006, and every GPU after that has basically been a die shrink of THAT card, with very minor modifications made, there has yet to be an actual change in the design for the most part. Much like what AMD has been doing.
     
  21. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Let's hope whoever is buying this isn't planning on taking it with them :eek:
     
  22. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    well, one could carry it around instead of going to a gym)) stay fit --> save money)))))
     
  23. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    lol you are right who needs weights anyways? :D
     
  24. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The W90's 4870 cards aren't much more than 4850+. As designed the 4860 reference card from ATI might outperform the W90 cards.
     
  25. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    well i can imagine the OC potential on the 4860 will possibly be greater :D
     
  26. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The 4800s so far have only allowed +50 on the core and memory. Not so impressive.
     
  27. gengerald

    gengerald Technofile Extraordinaire

    Reputations:
    674
    Messages:
    1,961
    Likes Received:
    0
    Trophy Points:
    55
    In relation to Nvidia's failings, they just made Engadget, once again, in regards to issues with the MBP 9600M (only) showing signs of death. I wonder if a class action suit would be applicable, legal, or logical at this point. I am (or was, not sure) a Nvidia fanboy, yes I will admit it. However, the last year or so has been pretty darn tough, I may have to return to neutrality and bounce over. Gar, Nvid, get your cr*p together like you did so in the past.
     
  28. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
    Well, the only good news about the rebranding is that when people who bought the old 8800M GTX see this, they'll say, "Oh, looks like my card from 2006 is going to last a bit longer." Come on, if I knew that I would have gotten it in SLI and would have been good for even longer. Lol. (I love my whitebook and ATI.)

    Game developers won't make a game that can't be played by almost everyone (excluding Crysis, because EA was a part of it).

    But I'm glad ATI is doing what they're doing. They just need to fix their clock speeds and separate their shader clock from their core clock. (Maybe I'm not getting why they didn't already.)
     
  29. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Wow, props to you my friend for being one of the only people with the guts to admit you were an nvidia fanboy but now see the light. I salute you.
     
  30. Beric1

    Beric1 Notebook Evangelist

    Reputations:
    2
    Messages:
    511
    Likes Received:
    0
    Trophy Points:
    30
    Heh, I was a Nvidia fanboy as well. However, if there's an ATI card available when I buy, I'll definitely give it some consideration. Nvidia has made WAY too many mistakes recently.
     
  31. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Yes, and it will be hard to forgive them (especially for those who faced the 8 series problem).

    But if this makes them try harder to make better GPUs for all notebook buyers (low, mid, and high end), I'm all for it :D
     
  32. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Paper thin notebook having problems cooling a mid-range GPU? I'm not surprised. There's no other product having 9600 heating issues.
     
  33. Beric1

    Beric1 Notebook Evangelist

    Reputations:
    2
    Messages:
    511
    Likes Received:
    0
    Trophy Points:
    30
    Yup. People love Macs because they're thin and run quiet. Then they wonder why they get so hot. Maybe lack of proper cooling and the thinness would be the issue?

    I've got a Macbook, and boy does it run hot.
     
  34. joshthor

    joshthor 100% Crazy Sauce

    Reputations:
    163
    Messages:
    1,119
    Likes Received:
    0
    Trophy Points:
    55
    i love it, nvidia is just dominating (at least performance wise, not price/performance) the desktop market but ati is quickly picking up steam with the notebook market. i was getting worried there was gonna be another intel/amd situation.
     
  35. nizzy1115

    nizzy1115 Notebook Prophet

    Reputations:
    2,557
    Messages:
    6,682
    Likes Received:
    1
    Trophy Points:
    205
    I think we will have the desktop ones first since they are easier to make.
     
  36. gengerald

    gengerald Technofile Extraordinaire

    Reputations:
    674
    Messages:
    1,961
    Likes Received:
    0
    Trophy Points:
    55
    I guess the only salvation will be one of their high end cards in an effort to not get a dud. But then the issue is the laptop's cooling. It would be nice if Nvidia set a thermal cooling standard and co-op'ed with designers to ensure it would be reached. It may be fairly extreme, but if there was some kind of standard, it may be helpful. Like set level 1, 2, and 3. Level 1 cooling is required for all low-mid range cards and provides adequate cooling. Level 2 would be great cooling for low/mid plus adequate cooling for high. Then 3 would be awesome cooling for high. I would certainly purchase a certified level 3 for an extra $500 if it covered the key points. Just a couple of my cents :D...
     
  37. jb1007

    jb1007 Full Customization

    Reputations:
    165
    Messages:
    1,230
    Likes Received:
    0
    Trophy Points:
    55
    The GPU wars continue, and that's what we should all be thankful for. There was a time when ATI dominated, then Nvidia picked it up and they dominated, now ATI is coming back strong.

    I really don't care for either company, whether it says ATI or Nvidia in front who cares. What matters is that they push each other to deliver better products and we, the user, stand to benefit from their competitiveness.
     
  38. MAG

    MAG Notebook Deity

    Reputations:
    459
    Messages:
    1,036
    Likes Received:
    0
    Trophy Points:
    55
    Can't agree more. :)
     
  39. Exostenza

    Exostenza Notebook Evangelist

    Reputations:
    252
    Messages:
    493
    Likes Received:
    14
    Trophy Points:
    31
    The way I see it, I hope there is push and pull between the two companies forever. I was with ATi with the Rage 128, then I went nVidia with the GeForce 2, then back to ATi with the 9800 Pro, then the ATi 1900 XTX, then I got my 7800 GT, then over to the 8800 GTX, and now I have my 9800M GTS in my laptop. I can't wait to go back to ATi and then back to nVidia and so on and so forth! Love the competition! Same with AMD / Intel, back and forth we go!

    So far, though, my favorite cards were the 9800 Pro and the 8800 GTX.

    Being a fanboy is for morons imo, why not just get what's best on the market?!
     
  40. gengerald

    gengerald Technofile Extraordinaire

    Reputations:
    674
    Messages:
    1,961
    Likes Received:
    0
    Trophy Points:
    55
    Good thing it's your opinion ;)
     
  41. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    ....what?

    4870 X2 = GTX 295 (Entirely game dependent for who wins)

    GTX 260 Core 216 = HD 4870 1GB (Game dependent for who wins)

    GTX 260 Core 192 = HD 4870 512MB (Game dependent again)

    HD 4850 > 9800 GTX+ & 9800 GTX

    HD 4830 = 9800 GTX (Game dependent) and > 9800 GT

    HD 4670 > 9600 GSO

    The only unchallenged NVidia card is the GTX 285; ATI is either at a dead heat or winning at every other price point. The HD 4890, dropping next month, will change that as well (and note that NVidia has no new GPUs coming for months and months).
     
  42. Quiz

    Quiz Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    Until you turn on AA and AF; then it's Nvidia > ATI.
     
  43. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    that was true with the HD2*** series, but not anymore.
     
  44. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    with the hd4000 series, ati>=nvidia when AA is active.
     
  45. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Like others have said before me, this is not the case with the HD 4000 series. G92 especially takes a swan dive when you raise the resolution and include AA.

    It's incredibly annoying when people who don't follow graphics very well at all post nebulous statements about things they are grossly misinformed about.
     
  46. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Though my GTX 260 doesn't take as big a hit when enabling AA -- sure, some, but not like the G92 core. It was the same with my 8800GTX; it could cope with AA enabled. I wish Nvidia could have made a real GTX 260M out of the new core. But nope. I'll pass until the next-gen Nvidia GPUs hit the mobile market.
     
  47. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    In every single game I play with my HD4850s, turning on 4xMSAA drops my frames by 5fps at most when the framerate is already over 150fps. I play every game at 1920x1080, usually with 4xMSAA or 8xMSAA, and never have performance issues. The only type of anti-aliasing that does kill my FPS is edge-detect AA; even 4x EDAA can halve my framerate, but I don't think any of the nVidia cards even support it.
     
  48. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    Wasn't ATI bashed in the past for being unable to handle AA?
     
  49. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Yes, mostly with the 2xxx series though the 3xxx series was only a bit better at AA. The 4xxx series finally fixed any problems.
     
  50. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I admit, I was a fan of Nvidia for a while, but then there was the heat death issue and the insistence of using one core over and over and over again in its GPUs (8800M GTX = 9800M GTX = GTX 280M, all using a G92).
    That's the problem with resting on your laurels - competitors will succeed in dethroning you, and your laurels will hurt after a while.
     