So nVidia is beginning to replace their 9000 series. Their desktop line is already completely replaced, and they have also come out with the 130M, which replaces the 9600M GT.
When are they releasing the 9800M's replacement? Is it worth waiting for?
-
If you don't need it right away, it's worth waiting for!
-
Red_Dragon Notebook Nobel Laureate
I believe it is called the 180M. I saw a chart of it somewhere, though I'm not sure where.
It is worth waiting for, as it's supposed to run quite a bit cooler. -
I think you are referring to
http://vr-zone.com/articles/nvidia-40nm-mobile-gpus-line-up-for-2009/6378.html?doc=6378
and
http://www.laptopvideo2go.com/forum/index.php?showtopic=22018
Looks like Nvidia is moving its former high-end desktop line, the 9800 GT and 9800 GTX+, to the low end of the 200 series, as the GTS 240 and GTS 250.
You can check that on Wikipedia, though that is not 100% reliable.
http://en.wikipedia.org/wiki/Compar..._Processing_Units#GeForce_9_.289xxx.29_series
The question is what will we see in notebooks?
I think the answer is in the power consumption. Notebook GPUs are modified versions of the desktop ones.
For example, you can be sure the desktop 295 will hardly ever make it into a notebook, because it consumes 300W even on 55nm. Watts = heat.
Given that they are moving to 55nm on the notebook cores as well, I suppose what we will see is a higher-clocked version of the previous 65nm cores.
Probably the current 9800M will become the 170M or a 180M GT/GTS, and the Quadro 3700M, which is basically the 9800M with 128 shaders, will become the 180M GTX.
They might also increase the frequencies, but that won't matter much in my opinion. I am already running the 9800M GTX at 600/1500/850 as opposed to its stock 500/1375/800, with no heat issues whatsoever, so I think overall performance will remain pretty much the same even with increased frequencies. People will run their cards overclocked anyway.
As for the 200 series on notebooks: I can see them putting in a 260 core, maybe with some shaders cut and on 40nm, but they will probably find it hard to go below 75W with it, which means you will only see it in big 17-inch notebooks (Alienware M17, Clevo D90X).
My money is on the 40nm technology. Once that is out, I will be looking to upgrade my video card - thank God for MXM. -
The 180M will be a die shrink, probably with higher clocks. The true successor will be the GT200M series, which is on a Q3-Q4 schedule.
-
Red_Dragon Notebook Nobel Laureate
Yes, 40nm looks really good, whenever that happens.
-
Any word on when these cards are coming out?
Will they increase the interface from 256-bit to, say, 512-bit? It seems to me this has the biggest effect on performance. -
Red_Dragon Notebook Nobel Laureate
I'm guessing around May/June; we'll have to wait it out, though.
-
It is unlikely to be as soon as May/June. The first 100M series will begin shipping in March, so probably in May/June they will be out with the 180M.
My guess is August for the 200M series. -
My guess comes from the GTX 280 being way too power hungry, and the GTX 300 will be too expensive to feasibly be used in notebook form. ATi's cheaper and more powerful 4000 series is about to obliterate the 9000 series in price/performance, so Nvidia will need to build from a chip which can be brought down in size and price quickly, and the 260-216 is the only chip I see as being possible. -
My guess is the fastest GT200M will be based on the GTX 260 with 216 shader cores. The next best one will be a 192-core part.
I doubt there will be any low-end or even mid-range GT200 part, since the chip is too big and selling a cut-down GT200 is a big loss, so nVIDIA definitely won't go that route. G92's lifespan is still very long, IMO.
If nVIDIA sells GT200M cards like they do now (~$600 - $800 just for a card), they are gonna be in big trouble (provided the HD 4870 Mobility is as fast as its desktop counterpart) -
Yeah, I have the same guess that it will be a 260 on 40nm. Still, the 55nm 260-216 is eating 170W.
Considering that they started from the 105W 9800 GT to get to a 75W 9800M GTX, even with shaders cut, I don't see how they will bring that down to below 100W.
Maybe we could figure that out by looking at shader counts vs. consumption. It's roughly 20 shaders per 20 watts, based on the comparison table above.
Hmm, 170W now; moving to 40nm it might drop to 125W, then cutting another 30 watts and adding some power saving, they might hit the 75-watt limit. Sounds good!
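The die-shrink arithmetic above can be sketched as a quick calculation. To be clear about assumptions: the 170W figure and the ~1W-per-shader rate are this thread's rough guesses, and the area-scaling rule is a naive assumption of mine, not how real shrinks behave (leakage and voltage floors make them messier):

```python
# Back-of-envelope die-shrink estimate. Naive assumption: dynamic power
# scales with feature area, i.e. by (new_nm / old_nm)**2. Ballpark only.

def shrink_power(tdp_watts: float, old_nm: float, new_nm: float) -> float:
    return tdp_watts * (new_nm / old_nm) ** 2

gtx260_216 = 170.0                       # watts, the figure quoted above
at_40nm = shrink_power(gtx260_216, 55, 40)
print(f"naive 40nm estimate: {at_40nm:.0f} W")   # ~90 W

# The thread's rough rate of ~20 shaders per 20 W is ~1 W per shader:
shaders_cut = 24                         # e.g. 216 -> 192 cores
print(f"after cutting {shaders_cut} shaders: {at_40nm - shaders_cut:.0f} W")
```

The pure area rule is more optimistic than the 125W guessed in the post; both are just rough models, so the truth is probably somewhere in between.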
The watt issue is not about consumption, it's about heat. All those watts turn into heat that the notebook needs to evacuate. -
Lower clocks are the only thing I can think of to cut down the wattage. But that makes the GT200M slower than the HD 4800 Mobility. Looks like it's quite a tough job being an engineer for nVIDIA right now.
-
Clocks are not really that important as long as they are reasonable; the bus width and shader processor count matter more.
-
You are right, I am sorry, my mistake. I was looking on Wikipedia for that and... it's not that reliable.
From this link it seems it's 140W for the 55nm 260:
http://www.hardware-infos.com/tests.php?test=54&seite=13
And these guys say 110W:
http://www.xbitlabs.com/articles/video/display/evga-geforce-gtx260-216-55nm_5.html#sect0
Edit:
I double-checked and it's probably very optimistic to put it at 105-110W. On the same benchmark they've put the 65nm GTX 280 at 180W consumption, while officially it eats 236W.
http://www.nvidia.com/object/product_geforce_gtx_280_us.html
So I think the 55nm 260-216 is about 140W, not 170W. Probably on 40nm and with some power saving, it just might get to 75W. So yeah, it looks like the best option for the next Nvidia high-end.
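That correction can be cross-checked with the GTX 280 numbers just quoted: if the benchmark under-reads official TDP by the same ratio for both cards, scaling the 110W reading lands near the same 140W figure. A minimal sketch, using only the wattages quoted above:

```python
# Scale the measured 110 W xbitlabs reading by the ratio of the 65nm
# GTX 280's official TDP (236 W) to its measured draw on the same bench
# (180 W), assuming the bench under-reads both cards by the same factor.
official_280 = 236.0
measured_280 = 180.0
measured_260_216 = 110.0

estimate = measured_260_216 * official_280 / measured_280
print(f"{estimate:.0f} W")   # ~144 W, close to the 140 W guess above
```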
They might also make a 280 40nm version for notebooks, maybe as a quadro or something. They did that with the 9800 GTX -> 3700M.
-
It's not only heat; if you're talking about a notebook, you're talking battery life. 100W = quick death for a notebook on battery. Who wants only 30 minutes on battery? If netbooks have proven anything, it's that fewer watts are much better, and people are willing to sacrifice performance for them.
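To put numbers on the battery point: runtime is just battery capacity divided by draw. The 56 Wh capacity below is an assumed typical figure for a gaming notebook of this era, not a quoted spec:

```python
# Minutes of battery life at a given sustained power draw.
def runtime_minutes(battery_wh: float, draw_watts: float) -> float:
    return battery_wh / draw_watts * 60

print(runtime_minutes(56, 100))   # 33.6 -> roughly the "30 minutes" above
print(runtime_minutes(56, 25))    # 134.4 minutes at a more modest draw
```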
-
The other problem with fitting the GT200 GPU in a notebook is the physical die size. It would have been very difficult with the original GT200 GPU (576mm²), because that would take up a lot of space on a typical MXM PCB. Recent nVidia GPUs are monolithic, so it appears the 40nm shrink will bring the die size down to where it can be used in mobile platforms.
-
gary_hendricks Notebook Evangelist
ATi is way ahead of nVidia on this front. -
Nvidia has this hybrid technology which lets the notebook run on integrated graphics when the GPU power is not needed; this should help out with battery life a lot.
-
But you need an nVIDIA chipset for hybrid.
-
We just got new info today. So it seems we got the idea right that the next high-end video card will have 128 shaders, based on the Quadro 3700M. But we got the naming wrong... it's not the 180M, it's the 280M.
http://www.nvidia.com/object/notebooks.html
So then... what happened to the 180M? It was probably dropped.
And what about the 260 - 216 40nm ? Will that become the 295M ?
I also updated my roadmap post:
http://forum.notebookreview.com/showthread.php?t=355818 -
Anything in the 100 series is not worth "waiting" for, since it's already here, and those are only minor changes...
The 200M series is what you really want. -
Yawn. GTX 280/260M are still G92b.
Next Gen Graphics from nVidia
Discussion in 'Gaming (Software and Graphics Cards)' started by CuriousN, Feb 21, 2009.