The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia 40nm

    Discussion in 'Sager and Clevo' started by Blacky, Jun 8, 2009.

  1. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  2. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    And in further news:
    http://www.fudzilla.com/content/view/14115/1/

    The GTX 280M on 55nm will remain Nvidia's fastest chip, while the fastest 40nm chip will have 96 shaders; it will thus be a die shrink of the former 9800M GT / 8800M GTX, but on 40nm and with DirectX 10.1 support.

    Expect them to be launched at the end of June.
    This also makes me speculate that the 40nm GTX 280M will see the light of day at the end of August, and why not, we might also get a "GTX 295M" based on the long-awaited core 216 :D.
     
  3. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    ...Where is the next-gen stuff with more than 128 SPs?
     
  4. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    If it's based on the GT200, it's clearly not a die shrink of the 9800M GT.
     
  5. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    I'd really like to know the power consumption of the 40nm 96-shader GPU. If it's 40-45W, then at least they're making progress.
     
  6. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    They say it's based on the GT200, but when I saw 96 shaders I assumed it was just a marketing gimmick.

    We just have to wait till the end of June.
     
  7. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I just remembered something.

    This chip, the GT 240M, is rumored to be based on the GT215, one of the performance-level chips of the 40nm GT200 family, which were reportedly scrapped from the desktop line. In other words, it's not considered an enthusiast-level chip, so expect the higher-level mobile cards to exceed 128 processing cores.

    Mobile 40nm has only moved up because the desktop line has been revamped. The enthusiast-level GT212-based chip was on the roadmap for Q4.
     
  8. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  9. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    If my grandmother had wheels, she'd be a wagon.
     
  10. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Call me stupid but I don't get it.
     
  11. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Nvidia's desktop cards have yet to transition over to GDDR5. Historically, Nvidia's mobile GPUs trail desktop GPUs by several months. Even if this were true and Nvidia does release both desktop and mobile GPUs with GDDR5, only the top-shelf cards would have them equipped, and they are not going to be cheap, much like the GTX 280M.
     
  12. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Aye... he's quoting Mr. Scott; he means it sounds far-fetched to him.
     
  13. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    ATI is using GDDR5 in their mobile 40nm parts to make up for a 128-bit bus. I'm guessing Nvidia is trying to do the same to keep the chip size down. Further, this could help offset the cost of the GDDR5 RAM.
    Asus is putting the GDDR5 ATI card in their new mainstream "no frills" K series, so I doubt it will be expensive.
     
  14. bestbacon

    bestbacon Notebook Evangelist

    Reputations:
    11
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    Bah, this sucks. I was hoping the 40nm chip would at least give performance close to the GTX 260M, but on a smaller 40nm manufacturing process.

    News of the K51AB has still been vague and few and far between, though.
     
  15. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    Agreed, but I believe that is because of the trouble ATI has had with 40nm, not related to GDDR5.
     
  16. Static Space

    Static Space Notebook Guru

    Reputations:
    2
    Messages:
    51
    Likes Received:
    0
    Trophy Points:
    15
    Hmm, this find is very interesting indeed. Should I wait, or should I get the 260M upgrade, knowing how slowly mobile GPUs are being released?
     
  17. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Nvidia already said that the 280M will remain top of the line, and I assume the 260M will as well. My guess is that the real 40nm high-end generation will be announced in August this year and become available in late September.

    For now, the 40nm generation will be midrange and lower high-end, with the most powerful GPU having 96 shaders and apparently GDDR5.

    It should all become official two weeks from now, and from there it should be at least one month until you get the new cards in notebooks.
     
  18. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Not so much AMD, but rather TSMC. It's rumored that the AMD R800 will be coming soon (recent price drops + clear inventory = R800, coincidence?).
     
  19. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    which in turn is giving AMD trouble :D
     
  20. emike09

    emike09 Overclocking Champion

    Reputations:
    652
    Messages:
    1,840
    Likes Received:
    0
    Trophy Points:
    55
    I think everybody puts far too much attention on GDDR5. Memory performance has the least effect on overall GPU performance; shader count and core/shader frequency are really what should be looked at. It is my opinion that hyping GDDR5 is nothing but a marketing scheme to make people think the card is better. Clock for clock, shader for shader, and memory for memory, nVidia outperforms ATI by leaps and bounds. How many shaders do high end ATI cards have? Something like 800?
     
  21. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    ATI has always marketed the number of "logical" shader processors rather than the number of "physical" shader processors; divide the number by 5 and you will get the real number of shader processors their DirectX 10 compliant graphics cards have.

    They market it as such because every one of their shader processors can handle 5 tasks at once, unlike Nvidia's, which can only handle 1 task at a time. It's a bit of a marketing gimmick because, often, Nvidia's single-task shader processors outperform ATI's multi-task shader processors.
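    A quick sketch of that divide-by-5 rule in code (the RV770's marketed "800 stream processors" is the commonly reported spec; this just restates the arithmetic above):

    Code:
    # ATI markets logical shader processors; each physical unit is
    # 5-wide (VLIW5), so divide the marketed count by 5.
    def physical_shader_units(marketed_count: int, width: int = 5) -> int:
        return marketed_count // width

    print(physical_shader_units(800))  # RV770 / HD 4870 -> 160 physical units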

    40nm Nvidia graphics cards sound rather interesting. Personally, I am waiting on GT300 so that I can start to build my desktop computer. (The leaked specs are amazing.)
     
  22. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    I agree GDDR5, like most things, is overhyped. However, that does not mean it is not significant. For instance, one reason Nvidia is falling behind in the laptop market is their reliance on a huge memory bus width. Unfortunately, a 512-bit memory bus just isn't practical in a laptop. So if you want more performance out of a 256-bit or even 128-bit bus, then you're going to need to move to GDDR5. Take a 128-bit memory bus, put GDDR5 on the other end, and suddenly you have something roughly equivalent to a 256-bit bus with GDDR3.
    What does it matter that Nvidia beats ATI "clock for clock, shader for shader, and memory for memory" when ATI has a more power-efficient design that is more scalable for laptops and cheaper to make? Besides, direct comparisons of these metrics between two different architectures mean nothing. I thought most people learned that after AMD started using ratings instead of marketing clock speed for their Athlon CPUs.
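    A rough sketch of the bandwidth math behind that equivalence (the 2000/4000 MT/s figures are illustrative, not quoted specs; GDDR5's roughly doubled per-pin data rate is the premise above):

    Code:
    # Peak memory bandwidth: bus width in bytes times effective data rate.
    def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
        return bus_bits / 8 * effective_mt_s / 1000

    # GDDR5 moves roughly twice as much data per pin per clock as GDDR3,
    # so a 128-bit GDDR5 bus can match a 256-bit GDDR3 bus.
    print(bandwidth_gb_s(256, 2000))  # 256-bit GDDR3 @ 2000 MT/s -> 64.0 GB/s
    print(bandwidth_gb_s(128, 4000))  # 128-bit GDDR5 @ 4000 MT/s -> 64.0 GB/s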
     
  23. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Well, Nvidia's new highest-end cards are going to sport a 512-bit bus and GDDR5 memory, basically a 1024-bit effective bus. But I agree, it would not be effective for a laptop, neither cost-effective nor power-effective, but it would be awesome nonetheless.

    It does not look like ATI has anything up their sleeve to counter that amount of raw performance, but they will continue to compete in the low, medium and medium-high markets.
     
  24. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    lol, yeah I certainly wouldn't turn down a 512-bit bus + GDDR5 :)
     
  25. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Me neither. I am waiting on GT300 just to start building my new gaming PC. Also, GT300 brings a completely new architecture and lots of features different from anything ever seen from Nvidia or ATI.

    More information on GT300:

    Nvidia's GT300 Specs Revealed | It is a CGPU!
     
  26. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    PM me when they have a mobile derivative. lol
     
  27. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    My guess is that GT300 for notebooks won't be released until GT400 and/or 32nm Nvidia graphics cards are released. Sadly, for now, laptop owners will have to stick to GT200.
     
  28. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Yeah, sign me up too.

    Damn, Crylo. You always pop up out of nowhere.

    +1 rep says that there will be at least three new threads in this forum regarding a mobile GT300 by 12:00 PST. Winner will be the third thread author.
     
  29. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Unfortunately. But on the bright side, GT200 will still be sweet for laptops, and apparently the upcoming 40nm mobiles will be sporting GDDR5 as well. I'm looking forward to the new architecture GT300 brings; playing a game will literally be like watching a movie, and not a CGI movie either, a real movie.

    Guess I'll have to build myself a desktop as well. :/
     
  30. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Yep I'm worse than a cold sore.
     
  31. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Alright, Soviet, I will send you a PM once GT300 is out, or if there is any more important and relevant information about the new architecture.
     
  32. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Agreed. I wonder what it will be like to play Far Cry 2 on two 24" monitors at 1920x1200 each... simply awesome.

    Yeah, when I bought my Sager NP9262, I realized that the "high-performance notebooks" world was not really for me, but it made me more experienced with notebooks and computers, so it was really worth it.

    However, when it comes to gaming, I guess desktops are the king of the hill.
     
  33. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Nvidia said it will make the GT300 series available at the end of this year, and they will be on 40nm tech. According to what I know, the 32nm transition at TSMC will be complete by spring next year, which leads us to speculate that the GT300 series in notebooks will probably be announced in summer next year and become available in August 2010. Nvidia has shown itself to take the notebook market really seriously now that it is larger than the desktop market; the fact that the first 40nm cards will be for notebooks is proof of that.

    See, with a bit of official information you can figure out the future :).

    For now, the 280M will remain king until probably autumn this year, when we should expect to get a notebook card based on the core 216. I actually expect it to use the 192-core version, with even some shaders cut out in order to maintain the 75W TDP. In such a scenario, the new card should be about 15-20% more powerful than the current 280M. But this, I admit, is more speculation. All I am saying is not to expect a huge increase in performance with this new card.
     
  34. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Thanks for posting that information, Blacky. I guess I will wait for the GTX 300 32nm series in spring; I know that at the end of the day, the wait will be worth it.

    True, very true. The future of the notebook world looks bright, I must say.
     
  35. Cookie

    Cookie Notebook Evangelist

    Reputations:
    32
    Messages:
    527
    Likes Received:
    0
    Trophy Points:
    30
    GT300 eh...looking good :D
     
  36. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I have to disagree with those who feel GDDR5 is not living up to the hype. It raises a 128-bit card to the level of a 256-bit one, despite it having fewer shaders. How, then, will the effectively 256-bit GT 240M not overtake the GTX 280M, when there's only a 32-shader difference? Unless they have to gimp the clocks, it will be faster.

    As has been said, ATI is keeping up with Nvidia just on the strength of GDDR5. I know it's not technically 1:1, but 800 of ATI's shaders are roughly equivalent to 160 of Nvidia's processing cores. So Nvidia is using double the memory bus width and way more cores, and the 4890 is only trailing the GTX 285 by 10-15%.

    The results speak for themselves.
     
  37. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Let me correct my previous post: the next production technology at TSMC is 28nm, not 32nm.

    And thanks again to Fudzilla (no, I am not affiliated with them, but I read them a lot, LoL), here is a clear comparison between GDDR3 and GDDR5. They use almost identical systems and video cards, the only difference being that one uses 1024MB of GDDR3 and the other 512MB of GDDR5. The GDDR5 card shows about 10% higher performance than the same card with GDDR3.
    http://www.fudzilla.com/content/view/13812/40/1/1/

    EDIT: I guess Kevin might be right. We just have to wait a bit to see.
     
  38. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
  39. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Correct, but in your original post you made it seem like AMD was the only one having 40nm problems. In fact, Nvidia is having more problems than AMD (larger die and all that). AMD, however, seems to have taken it in stride and looks poised to release the R8xx series soon: pics of working silicon can be found online, and there are the recent 48xx price drops (to clear inventory for the next gen?).
     
  40. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    The 4890 runs at a high clockspeed as well, and can overclock to ~1GHz. It's not just the GDDR5...
     
  41. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Turning out really, really interesting.
     
  42. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Here you go, finally some real news:

    http://www.fudzilla.com/content/view/14216/1/

    The top of the line of the 40nm cards will be the GTS 260M: 96 shaders, 1GB GDDR5, and slower than the GTX 280M.

    Quoting:

    Nvidia's first official 40nm chip with DirectX 10.1 support, the GeForce GTS 260M, will work at a 550MHz core clock with shaders clocked at 1375MHz. As we reported earlier, it will come with 1GB of GDDR5 memory, and this memory will be clocked at an impressive 1800MHz.

    It was kind of logical to expect a 128-bit bus, as GDDR5 can deliver twice as much bandwidth compared to GDDR3 on the same 128-bit bus. Since the GeForce GTX 280M has 128 shaders, it will remain the performance king, despite the fact that it is based on the G92b 55nm chip.

    Nvidia plans to unveil a total of five GPUs, all 40nm and DirectX 10.1, and the plan is that they will coexist with the existing GeForce 200M / 100M series.


    I was about to get the GTX 280M, but now I am curious if the new cards will be made on MXM 2.1.
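    For what it's worth, the quoted memory spec pencils out like this (assuming the quoted 1800MHz is the GDDR5 data clock, i.e. an effective 3600 MT/s; that assumption is mine, not Fudzilla's):

    Code:
    # Peak bandwidth for the rumored GTS 260M memory config.
    bus_bits = 128
    effective_mt_s = 1800 * 2  # assume 1800MHz data clock -> 3600 MT/s
    print(bus_bits / 8 * effective_mt_s / 1000, "GB/s")  # -> 57.6 GB/s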
     
  43. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
    I hope they will fit my socket as well.
     
  44. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    And because I enjoy keeping up to date, it is almost official:

    [IMG]

    Further info here:
    http://www.pcmag.com/article2/0,2817,2348694,00.asp

    I can't really understand why there is such a huge difference of 10W between the 250M and 260M in this case. A 10% OC can't really justify it.
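    One possible explanation: dynamic power scales roughly as frequency times voltage squared, so a 10% clock bump that also needs extra voltage costs far more than 10%. A back-of-the-envelope sketch (assuming the whole 28W-to-38W gap is dynamic power on the same chip, which is a simplification):

    Code:
    # P ~ C * V^2 * f: solve for the voltage bump that would explain
    # 28W -> 38W alongside ~10% higher clocks.
    p_ratio = 38 / 28                  # ~1.36x power
    f_ratio = 1.10                     # ~10% higher clocks
    v_ratio = (p_ratio / f_ratio) ** 0.5
    print(f"~{v_ratio:.2f}x voltage")  # -> ~1.11x, i.e. ~11% more volts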
     
  45. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
    But I don't think it will beat out the 280M unless it's heavily overclocked to make up for the narrower 128-bit bus.

    Take the 4770 vs the 4850: the 4850 outperforms it.

    And the 4850 vs the 4870 at the same exact clocks (4850 highly overclocked): the 4870 wins by only 10 percent.
     
  46. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, they're making plenty of progress ;)
     
  47. bestbacon

    bestbacon Notebook Evangelist

    Reputations:
    11
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    I'm really interested in the GTS 250M then, because of the 28W TDP. If one could overclock it without raising the voltage, then magic!
     
  48. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    I must say I'm impressed. I was pretty dead set on switching to ATI because of their TDP advantage. Now I have no idea which to get (which is a good thing).
     
  49. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Interesting, and impressive. The GTS 260M being 128-bit GDDR5 @ 28W is really intriguing.

    1800MHz on the memory clock is... just wow!

    btw, check the new sig... I'm back in the game.
     
  50. bestbacon

    bestbacon Notebook Evangelist

    Reputations:
    11
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    Well, technically, if the mobile 4860 releases, it should have a lower TDP and perform better than a 280M, which in turn is better than the GTS 260M for horsepower but worse for TDP. The mobile 4850 had a TDP of 25W, so it's expected that the 4860 won't be too far off.

    I still think that if ATI successfully releases the Mobility 4860, it'll be crowned the best single-card solution for performance-to-TDP ratio.

    It's 38W, bro.
     