Looks like the 40nm is just around the corner:
http://www.fudzilla.com/content/view/14107/1/
Updated:
The currently confirmed 40nm cards are as follows:
(image of the card list, not preserved)
Further info on Nvidia's website: http://www.nvidia.com/object/geforce_m_series.html
-
And in further news :
http://www.fudzilla.com/content/view/14115/1/
The 280M GTX on 55nm will remain Nvidia's fastest chip, while the fastest 40nm chip will have 96 shaders; it will thus be a die shrink of the former 9800M GT / 8800M GTX, but on 40nm and with DirectX 10.1 support.
Expect them to be launched at the end of June.
This also makes me speculate that the 40nm 280M GTX will see the light of day at the end of August and why not, we might also get a "295M GTX" on the long awaited core 216.
-
...Where is next gen stuff with more than 128 SPs?
-
-
I'd really like to know the power consumption of the 40nm 96 shader GPU. If it's 40-45w, then at least they're making progress.
-
They say it's based on the GT200, but when I saw 96 shaders I assumed it was just a marketing gimmick.
We just have to wait till the end of June. -
I just remembered something.
This chip, the GT 240M, is rumored to be based on the GT215, the performance-level chip of the 40nm GT200 line, which was reportedly scrapped from the desktop lineup. In other words, it's not considered an enthusiast-level chip, so expect the higher-level mobile cards to exceed 128 processing cores.
Mobile 40nm has only moved up because the desktop line has been revamped. The enthusiast-level GT212-based chip was on the roadmap for Q4. -
And it gets even better:
GDDR5 FTW!!!
http://www.fudzilla.com/content/view/14165/1/ -
Soviet Sunrise Notebook Prophet
If my grandmother had wheels, she'd be a wagon.
-
-
Soviet Sunrise Notebook Prophet
Nvidia's desktop cards have yet to transition over to GDDR5. Historically, Nvidia's mobile GPUs trail its desktop GPUs by several months. Even if this were true and Nvidia does release both desktop and mobile GPUs with GDDR5, only the top-shelf cards would have it equipped, and, like the GTX 280M, they are not going to be cheap.
-
Megacharge Custom User Title
-
Asus is putting the GDDR5 ATI card in their new mainstream "no frills" K series, so I doubt it will be expensive. -
-
-
Hmm, this find is very interesting indeed. Should I wait, or should I get the 260M upgrade, knowing how slowly mobile GPUs get released?
-
Nvidia already said that the 280M will remain top of the line and I assume the 260M will as well. My guess is that the real 40nm high-end generation will be announced in August this year and become available in late September.
For now, the 40nm generation will cover the midrange and lower high end, with the most powerful GPU having 96 shaders and, apparently, GDDR5.
It should all become official two weeks from now, and from there it will be at least a month until the new cards show up in notebooks. -
-
-
-
ATI has always marketed the number of "logical" shader processors rather than the number of "physical" ones: divide the marketed number by 5, and you get the real number of shader processors their DirectX 10 compliant graphics cards have.
They market it that way because each of their shader processors can handle 5 operations at once, unlike Nvidia's, which handle only 1. It's a bit of a marketing gimmick because, quite often, Nvidia's single-issue shader processors outperform ATI's multi-issue ones.
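The divide-by-5 rule above can be sketched as a quick sanity check; the helper name is just for illustration, and the factor of 5 reflects the 5-wide VLIW shader units in ATI's DX10-era designs:

```python
# ATI's DX10-era GPUs group their shader ALUs into 5-wide VLIW units, so the
# marketed "stream processor" count is 5x the number of physical units.
def physical_shader_units(marketed_sps: int, vliw_width: int = 5) -> int:
    """Convert ATI's marketed SP count to the physical shader unit count."""
    return marketed_sps // vliw_width

# e.g. a card marketed with 800 stream processors:
print(physical_shader_units(800))  # 160 physical units
```

This matches the rough 800-vs-160 shader comparison made elsewhere in the thread.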
40nm Nvidia graphics cards sound rather interesting; personally, I am waiting on the GT300 so that I can start building my desktop computer. (The leaked specs are amazing.) -
What does it matter that Nvidia beats ATI "clock for clock, shader for shader, and memory for memory" when ATI has a more power-efficient design that is more scalable for laptops and cheaper to make? Besides, direct comparisons of these metrics between two different architectures mean nothing. I thought most people learned that after AMD started using performance ratings instead of marketing clock speed for their Athlon CPUs. -
It does not look like ATI has anything up their sleeve to counter that amount of raw performance, but they will continue to compete in the low, medium, and medium-high markets. -
-
More information on GT300:
Nvidia's GT300 Specs Revealed | It is a CGPU! -
Megacharge Custom User Title
PM me when they have a mobile derivative. lol
-
-
Soviet Sunrise Notebook Prophet
Yeah, sign me up too.
Damn, Crylo. You always pop up out of nowhere.
+1 rep says that there will be at least three new threads in this forum regarding a mobile GT300 by 12:00 PST. Winner will be the third thread's author. -
Megacharge Custom User Title
Guess I'll have to build myself a desktop as well. :/ -
Megacharge Custom User Title
-
-
Yeah, when I bought my Sager NP9262, I realized that the "high-performance notebooks" world was not really for me, but it made me more experienced with notebooks and computers, so it was really worth it.
However, when it comes to gaming, I guess desktops are king of the hill. -
Nvidia said it will make the GT300 series available at the end of this year, and it will be on 40nm tech. From what I know, the 32nm transition at TSMC will be complete by spring next year, which leads me to speculate that the GT300 series will be announced for notebooks probably next summer and become available in August 2010. Nvidia has shown that it takes the notebook market really seriously now that it is larger than the desktop market; the fact that the first 40nm cards will be for notebooks is proof of that.
See, with a bit of official information you can figure out the future.
For now, the 280M will remain king until probably autumn this year, when we should expect a notebook card based on the core 216. I actually expect it to have 192 cores, with possibly even some shaders cut out, in order to maintain the 75W TDP. In such a scenario, the new card should be about 15-20% more powerful than the current 280M. But this, I admit, is more speculation. All I am saying is not to expect a huge increase in performance with this new card. -
True, very true, the future for the notebooks world looks bright, I must say. -
GT300 eh...looking good
-
I have to disagree with those who feel GDDR5 is not living up to the hype. It raises a 128-bit bus to the level of a 256-bit one. How, then, will the GT 240M, with its effectively 256-bit bandwidth, not overtake the GTX 280M when there's only a 32-shader difference? Unless they have to gimp the clocks, it will be faster.
As has been said, ATI is keeping up with Nvidia just on the strength of GDDR5. I know it's not technically 1:1, but 800 of ATI's shaders are roughly equivalent to 160 of Nvidia's processing cores. So Nvidia is using double the memory bus width and way more cores, and yet the 4890 only trails the GTX 285 by 10-15%.
The results speak for themselves. -
Let me correct my previous post, the next production technology at TSMC is 28nm not 32nm.
And thanks again to Fudzilla (no, I am not affiliated with them, but I read them a lot, lol), here is a clear comparison between GDDR3 and GDDR5. They use nearly identical systems and video cards, the only difference being that one card has 1024MB of GDDR3 and the other 512MB of GDDR5. The GDDR5 card shows a performance increase of about 10% over the same card with GDDR3.
http://www.fudzilla.com/content/view/13812/40/1/1/
EDIT: I guess Kevin might be right. We just have to wait a bit to see. -
dondadah88 Notebook Nobel Laureate
it's a 10 percent increase.
http://www.xtremesystems.org/FORUMS/showthread.php?t=192690&page=4
http://www.xtremesystems.org/forums/showthread.php?t=193085
http://www.xtremesystems.org/forums/showpost.php?p=3104944&postcount=4
provided to me by ichime -
-
-
Turning out really, really interesting.
-
Here you go, finally some real news:
http://www.fudzilla.com/content/view/14216/1/
The top of the 40nm line will be the GTS 260M: 96 shaders, 1 GB of GDDR5, and slower than the GTX 280M.
Quoting:
Nvidia's first official 40nm chip with DirectX 10.1 support, the Geforce GTS 260M, will work at a 550MHz core clock with shaders clocked at 1375MHz. As we reported earlier, it will come with 1GB of GDDR5 memory, and this memory will be clocked at an impressive 1800MHz.
It was kind of logical to expect a 128-bit bus, as GDDR5 can deliver twice as much bandwidth as GDDR3 on the same 128-bit bus. Since the Geforce GTX 280M has 128 shaders, it will remain the performance king, despite the fact that it is based on the G92b 55nm chip.
Nvidia plans to unveil a total of five GPUs, all 40nm and DirectX 10.1, and the plan is that they will coexist with the existing Geforce 200M / 100M series.
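The "twice the bandwidth on the same 128-bit bus" claim in the quote follows from GDDR5 transferring four bits per pin per command clock versus GDDR3's two. A rough sketch of the arithmetic — the 900 MHz clock here is purely illustrative, not a figure from the article:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, clock_mhz: float,
                        transfers_per_clock: int) -> float:
    """Peak memory bandwidth in GB/s (decimal GB).
    transfers_per_clock: 2 for GDDR3 (double data rate),
    4 for GDDR5 (quad data rate)."""
    return bus_width_bits / 8 * clock_mhz * transfers_per_clock / 1000

# Same 128-bit bus, same command clock, different memory types:
gddr3 = peak_bandwidth_gb_s(128, 900, 2)  # 28.8 GB/s
gddr5 = peak_bandwidth_gb_s(128, 900, 4)  # 57.6 GB/s
print(gddr5 / gddr3)  # 2.0 -- double the bandwidth at the same bus width
```

The same formula shows why a 128-bit GDDR5 card can roughly match a 256-bit GDDR3 card, as argued later in the thread.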
I was about to get the GTX 280M, but now I am curious if the new cards will be made on MXM 2.1. -
dondadah88 Notebook Nobel Laureate
i hope they will fit my socket as well
-
And because I enjoy keeping up to date, it is almost official:
Further info here:
http://www.pcmag.com/article2/0,2817,2348694,00.asp
I can't really understand why there is such a huge difference of 10W between the 250M and 260M in this case. A 10% OC can't really justify it. -
dondadah88 Notebook Nobel Laureate
but i don't think it will beat out the 280m unless it's heavily overclocked to make up for the missing 128 bits of bus width.
take the 4770 vs the 4850: the 4850 still outperforms it.
and in the 4850 vs the 4870 at the exact same clocks (4850 heavily overclocked), the 4870 wins by only 10 percent -
-
I'm really interested in the GTS 250M then, because of the 28W TDP. If one could overclock it without raising the voltage, then magic!
-
I must say I'm impressed. I was pretty dead set on switching to ATI because of their TDP advantage. Now I have no idea which to get (which is a good thing).
-
Interesting, and impressive. The GTS 260M being 128-bit GDDR5 @ 28w is really intriguing.
1800MHz on the memory clock is... just wow!
btw, check the new sig... I'm back in the game. -
well, technically, if the mobile 4860 releases, it should have a lower TDP and perform better than a 280m, which in turn is better than the gts 260m for horsepower but worse for TDP. The mobile 4850 had a TDP of 25W, so it's expected that the 4860 won't be too far off.
I still think that if ATI successfully releases the Mobility 4860, it'll be crowned the best single-card solution for performance-to-TDP ratio.
Nvidia 40nm
Discussion in 'Sager and Clevo' started by Blacky, Jun 8, 2009.