The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Next Gen Graphics from nVidia

    Discussion in 'Gaming (Software and Graphics Cards)' started by CuriousN, Feb 21, 2009.

  1. CuriousN

    CuriousN Notebook Evangelist

    Reputations:
    0
    Messages:
    359
    Likes Received:
    2
    Trophy Points:
    31
    So nVidia is beginning to replace their 9000 series. Their desktop line is already completely replaced, and they also came out with the 130M, which replaces the 9600M GT.
    When are they releasing the 9800M replacement? Is it worth waiting for?
     
  2. F!nn

    F!nn Notebook Consultant

    Reputations:
    62
    Messages:
    182
    Likes Received:
    0
    Trophy Points:
    30
    If you don't need it right away, it's worth waiting for!
     
  3. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    It should be; it's called the 180M, I believe. I saw a chart of it, though I'm not sure where.

    It is worth waiting for, as it's supposed to run quite a bit cooler.
     
  4. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I think you are referring to
    http://vr-zone.com/articles/nvidia-40nm-mobile-gpus-line-up-for-2009/6378.html?doc=6378

    and

    http://www.laptopvideo2go.com/forum/index.php?showtopic=22018

    Looks like Nvidia is moving its former high-end desktop line, the 9800 GT and 9800 GTX+, to the low end of the 200 series, that is, the 240 and 250 GTS.
    You can check that on Wikipedia, but that is not 100% reliable.
    http://en.wikipedia.org/wiki/Compar..._Processing_Units#GeForce_9_.289xxx.29_series

    The question is what will we see in notebooks?
    I think the answer is in the power consumption. Notebook GPUs are modified versions of the desktop ones.

    For example, you can be sure the desktop 295 will hardly ever get into a notebook, because it consumes 300 W even on 55nm technology. Watts = heat.

    Given that they are moving to 55nm on the notebook cores as well, I suppose what we will see is a higher-clocked version of the previous cores that were on 65nm.
    Probably the current 9800M will become the 170M or a 180M GT/GTS, and the Quadro 3700M, which is basically the 9800M with 128 shaders, will become the 180M GTX.
    They might also increase the frequencies, but that won't matter much in my opinion. I am already running the 9800M GTX at 600/1500/850 as opposed to its stock 500/1375/800 and there are no heat issues whatsoever, so I think performance will remain pretty much the same overall even with increased frequencies. People will run their cards overclocked anyway.

    As for the 200 series on notebooks: I can see them putting in a 260 core, maybe with some shaders cut and on 40nm. However, they will probably find it hard to go below 75W with it, which means you will only see it in big 17-inch notebooks (Alienware M17, Clevo D90X).

    My money is on the 40nm technology. Once that is out, I will be looking to upgrade my video card - thank God for MXM.
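    For the record, the overclock quoted above (9800M GTX at 600/1500/850 vs. stock 500/1375/800) works out to roughly +20% core, +9% shader, and +6% memory. A quick sketch to check the arithmetic:

    ```python
    # Quick check of the overclock percentages quoted above
    # (9800M GTX at 600/1500/850 vs. stock 500/1375/800).

    def oc_gain_pct(stock_mhz, oc_mhz):
        """Percentage gain of an overclock over the stock frequency."""
        return (oc_mhz - stock_mhz) / stock_mhz * 100.0

    stock = {"core": 500, "shader": 1375, "memory": 800}
    oc = {"core": 600, "shader": 1500, "memory": 850}

    for domain in stock:
        print(f"{domain}: +{oc_gain_pct(stock[domain], oc[domain]):.1f}%")
    # core: +20.0%, shader: +9.1%, memory: +6.2%
    ```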
     
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The 180M will be a die shrink, probably with higher clocks. The true successor will be the GT200M series, which is on a Q3-Q4 schedule.
     
  6. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Yes 40nm looks really good for whenever that happens :p
     
  7. CuriousN

    CuriousN Notebook Evangelist

    Reputations:
    0
    Messages:
    359
    Likes Received:
    2
    Trophy Points:
    31
    any word on when these cards are coming out?
    will they increase the interface from 256-bit to, say, 512-bit? Seems to me like this has the biggest effect on performance.
     
  8. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    I'm guessing around May/June; we'll have to wait it out, though.
     
  9. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    It is unlikely to be that soon. The first 100M series will only begin shipping in March, so they will probably be out with the 180M in May/June.

    My guess is August for the 200M series.
     
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    My guess has been that the mobile GT200 will be based on the 40nm shrink of the GTX260 Core 216, so if I'm right, that would mean a 448-bit bus. A number of shaders will need to be cut, so the GDDR5 will probably be compensation for this.

    My guess comes from the GTX280 being way too power hungry, and the GTX300 will be too expensive to feasibly be used in notebook form. ATi's cheaper and more powerful 4000 series is about to obliterate the 9000 series in price/performance, so Nvidia will need to build from a chip which can be brought down in size and price quickly, and the 260-216 is the only chip I see as being possible.
     
  11. nhat2991

    nhat2991 Notebook Consultant

    Reputations:
    54
    Messages:
    205
    Likes Received:
    0
    Trophy Points:
    30
    My guess is the fastest GT200M will be a GTX260 with 216 cores. The next best one will be a 192-core part.
    I doubt there will be any low-end or even mid-end GT200 part, since the chip is too big and selling cut-down GT200s is a big loss, so nVIDIA definitely won't go that route. So G92's lifespan is still very long, IMO.
    If nVIDIA sells GT200M cards like they do now (~$600 - $800 just for the card), they are gonna be in big trouble (provided the HD4870 Mobility is as fast as its desktop counterpart).
     
  12. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Yeah, I have the same guess that it will be a 260 on 40nm. Still, the 55nm 260-216 is eating 170W.

    Considering that they used the 105W 9800 GT to get to a 75W 9800M GTX even while cutting shaders, I don't see how they will bring that down to below 100W.

    Maybe we could figure that out by looking at shader numbers vs. consumption. I think it's around 20 shaders per 20 watts, based on the comparison table above.

    Hmm, 170W moving to 40nm would be around 125W; cutting another 30 watts and adding some power saving, they might hit the 75-watt limit. Sounds good! :D

    The watt issue is not about consumption, it's about heat. All those watts turn into heat that the notebook needs to evacuate.
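    The back-of-the-envelope estimate above can be written out as a sketch. All the numbers are this thread's own guesses (the 170W -> ~125W shrink saving, the ~20 shaders per 20 watts rule of thumb), not official figures:

    ```python
    # Rough, purely illustrative TDP estimate for a die-shrunk, cut-down
    # desktop chip. All the numbers are guesses from this discussion, not specs.

    def estimate_mobile_tdp(desktop_tdp_w, shrink_saving_w=0.0,
                            shaders_cut=0, watts_per_shader=1.0,
                            power_saving_w=0.0):
        """Estimate a mobile TDP from a desktop part's TDP.

        shrink_saving_w: assumed watts saved by the process shrink.
        shaders_cut: number of shader units disabled.
        watts_per_shader: the ~20 shaders per 20 watts rule of thumb (1 W each).
        power_saving_w: extra savings from power-management features.
        """
        return (desktop_tdp_w - shrink_saving_w
                - shaders_cut * watts_per_shader - power_saving_w)

    # The thread's numbers: ~170 W on 55nm, ~125 W guessed after the 40nm
    # shrink (45 W saved), then shader cuts and power saving toward 75 W.
    print(estimate_mobile_tdp(170, shrink_saving_w=45,
                              shaders_cut=40, power_saving_w=10))  # -> 75.0
    ```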
     
  13. nhat2991

    nhat2991 Notebook Consultant

    Reputations:
    54
    Messages:
    205
    Likes Received:
    0
    Trophy Points:
    30
    Lower clocks are the only thing I can think of to cut down the wattage. But that makes the GT200M slower than the HD4800 Mobility. Looks like it's quite a tough job being an engineer for nVIDIA right now.
     
  14. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Where'd you get 170W? The 65nm part was 182W, so there's no way they only cut 12W from the power consumption.
     
  15. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    Clocks are not really that important, as long as they are reasonable; the bus and shader processors are more important.
     
  16. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    You are right, I am sorry, my mistake. I was looking on Wikipedia for that and... it's not that reliable.

    From this link it seems it's 140W for the 55nm 260:
    http://www.hardware-infos.com/tests.php?test=54&seite=13

    And from these guys 110W:
    http://www.xbitlabs.com/articles/video/display/evga-geforce-gtx260-216-55nm_5.html#sect0

    Edit:
    I double-checked, and it's probably very optimistic to see it as 105-110W. On the same benchmark they've put the 65nm 280 GTX at 180W consumption, while officially it eats 236W.
    http://www.nvidia.com/object/product_geforce_gtx_280_us.html

    So I think that the 55nm 260-216 is about 140W, not 170W. Probably on 40nm and with some power saving, it just might get to 75W. So yeah, it looks like the best option for the next Nvidia high-end.

    They might also make a 40nm 280 version for notebooks, maybe as a Quadro or something. They did that with the 9800 GTX -> 3700M.

     
  17. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's not only heat; if you're talking about a notebook, you're talking battery life. 100W = quick death for a notebook on battery. Who wants only 30 minutes on battery? If netbooks have proven anything, it's that fewer watts are much better, and people are willing to sacrifice performance for them.
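    The arithmetic behind that figure: runtime is roughly battery capacity (Wh) divided by total draw (W). A minimal sketch, assuming a hypothetical ~50 Wh notebook battery:

    ```python
    def runtime_minutes(battery_wh, total_draw_w):
        """Rough battery runtime in minutes: capacity (Wh) / draw (W) * 60."""
        return battery_wh / total_draw_w * 60.0

    # A hypothetical 50 Wh battery feeding a 100 W total system load
    # lasts about half an hour, as suggested above.
    print(runtime_minutes(50, 100))  # -> 30.0
    ```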
     
  18. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    The other problem with fitting the GT200 GPU in a notebook is the physical die size. It would have been very difficult with the original GT200 GPU (576mm²), because that would take up a lot of space on a typical MXM PCB. Some of the recent nVidia GPUs are monolithic, so it appears the 40nm shrink will bring the die size down to where it can be used on mobile platforms.
     
  19. gary_hendricks

    gary_hendricks Notebook Evangelist

    Reputations:
    29
    Messages:
    561
    Likes Received:
    0
    Trophy Points:
    30
    i agree.

    ATi is way ahead of nVidia in this issue :)
     
  20. CuriousN

    CuriousN Notebook Evangelist

    Reputations:
    0
    Messages:
    359
    Likes Received:
    2
    Trophy Points:
    31
    nVidia has this hybrid technology which allows the notebook to run on integrated graphics when the graphics power is not needed; this should help out with battery life a lot.
     
  21. nhat2991

    nhat2991 Notebook Consultant

    Reputations:
    54
    Messages:
    205
    Likes Received:
    0
    Trophy Points:
    30
    But you need a nVIDIA chipset for hybrid.
     
  22. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  23. Beatsiz

    Beatsiz Life Enthusiast

    Reputations:
    95
    Messages:
    1,411
    Likes Received:
    0
    Trophy Points:
    55
    Anything in the 100 series is not worth "waiting" for, since it's already here, and those are only minor changes...

    The 200M series is what you really want ;)
     
  24. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    Yawn. GTX 280/260M are still G92b.