The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Will the GTS 250m be just a shrunken 9800GT/8800GTX?

    Discussion in 'Gaming (Software and Graphics Cards)' started by jk6959, Aug 27, 2009.

  1. jk6959

    jk6959 Notebook Consultant

    Reputations:
    7
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30
    I'm waiting to upgrade my ancient Dell laptop and trying to find out more about the upcoming graphics cards/CPUs. I'm very interested in the GTS 250m for its incredible performance-per-watt potential, but all of the information looks the same apart from the graphics core (the GTS 250m uses the GT215 core vs the G92 core for the others).
    Are there any significant improvements brought about by this core change, or is it just a name change on top of a name change, i.e. a shrunken old chip?

    Any links to information on this GT215 core would be appreciated.
     
  2. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    [IMG]

    All you need to know. It's not a shrunken 8800M GTX, it's an entry level GT200 chip.
     
  3. PanzerVIZeke

    PanzerVIZeke Notebook Consultant

    Reputations:
    10
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    Now I see: Nvidia does use newer tech in laptops, but it's slower.

    So you have to choose between newer tech and better performance. The GTS 250/260M uses the famed GT200 core and even uses GDDR5, but it's much slower than the GTX 280M, while the 280M uses the old G92 core with only GDDR3.

    It's a toss-up. You either choose newer, more efficient tech that's slower, or older tech with better performance. The GTS 250 is more ideal for a laptop though, because it probably runs cooler while still giving decent performance.
     
  4. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    They're certainly not slower; these are the mid-range replacements, and they blow the previous crap out of the water.

    The GTS 260M is on the same level as a 9800M GT, which is pretty good. But on top of that, it consumes less energy and produces less heat, and being 40nm it has great overclocking potential.
     
  5. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Well, the GDDR5 models have yet to be tested. The GT 240M seems to be on par with the HD 4670, so it's nothing revolutionary. GDDR5 (whether from ATI or Nvidia) is basically what the mid-range field is anticipating.
     
  6. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The GTS 350M/360M will be replacing the GTX 260M/280M.
     
  7. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    One has to wonder how the GT300s will turn out...
     
  8. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I still think that one of the high-end chips will be a 40nm GTX 260, probably the Core 216, but with a 256-bit bus and GDDR5.
     
  9. jk6959

    jk6959 Notebook Consultant

    Reputations:
    7
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30
    My main point wasn't about the overall specs of the new 200 series GPUs; it was more about what difference the core change would really bring. Here are the specs of the GTS 250m and the 8800GTX/9800GT:

    Card                     8800GTX/9800GT   GTS 250m
    Based on (Desktop Core)  G92              GT215
    Pipelines                96               96
    Core Speed               500 MHz          500 MHz
    Shader Speed             1250 MHz         1250 MHz
    Memory Speed             800 MHz          1600 MHz
    Memory Bus Width         256-bit          128-bit
    Memory Type              GDDR3            GDDR5
    Current Consumption      65 W             28 W


    So specs-wise, the main differences are (8800GTX vs GTS 250m order):
    DX10 vs DX10.1 // 512 MB max memory vs 1 GB max memory // G92 core vs GT215 core


    Are there any significant differences arising from the core itself (or is it just the DX10.1 compatibility), or is it basically a 9800GT (the same apart from the core, with 128-bit + GDDR5 roughly equal to 256-bit + GDDR3) at a mere 43% of the energy usage?
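    For what it's worth, here's a quick sanity check of those two numbers using only the figures in the spec list above. It's a rough sketch: it assumes the two listed memory speeds are directly comparable and that peak bandwidth simply scales with memory speed times bus width.

        # Rough check of the "same bandwidth" and "43% of the energy" claims,
        # using only the spec figures quoted above.
        def relative_bandwidth(mem_speed_mhz, bus_width_bits):
            # Peak memory bandwidth scales with memory speed x bus width.
            return mem_speed_mhz * bus_width_bits

        old = relative_bandwidth(800, 256)    # 8800GTX/9800GT: GDDR3, 256-bit
        new = relative_bandwidth(1600, 128)   # GTS 250m: GDDR5, 128-bit
        print(new / old)                      # 1.0 -> identical peak bandwidth on paper

        print(28 / 65)                        # ~0.43 -> the "43% of the energy usage" figure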
     
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The memory speed is way higher, and the core is said to be much more efficient clock for clock.
     
  11. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    These are still pretty high-end cards; the doubled memory speed makes up for the 128-bit bus, essentially making it a 256-bit card.

    As for the GDDR3 models, they are essentially the same as this gen, but with some more shader processors.

    So to answer your question: yes, basically the same high-end card, just with less power consumption.
     
  12. NAS Ghost

    NAS Ghost Notebook Deity

    Reputations:
    297
    Messages:
    1,682
    Likes Received:
    0
    Trophy Points:
    55
    Which basically means that those who got a system with the 8800M GTX are missing out on almost nothing in this performance class.
     
  13. jk6959

    jk6959 Notebook Consultant

    Reputations:
    7
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30
    The memory speed is higher because it's GDDR5 compared to GDDR3. When you take into account that the 8800/9800 is 800 MHz/256-bit/GDDR3 and the GTS 250m is 1600 MHz/128-bit/GDDR5, they come out the same.

    However, if the core is more efficient clock for clock, then I could expect something more powerful than the 8800/9800 rebrands.
     
  14. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    The GTS 250m should be a bit slower than the 8800M GTX (though this could be different if we're talking about games that support DX 10.1). Although it uses GDDR5, it's also using a 128bit bus, and because of GDDR5's relatively high latency, 256bit + GDDR3 will be faster than 128bit + GDDR5 granted that everything else (shader count, etc.) is still the same.
     
  15. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    If the memory clock is double that of the 8800M GTX, which it is, then the performance will be the same, 250m-wise. I highly doubt memory latencies will affect performance, and it's likely that lower-latency memory will be put in because it's graphics memory; it needs to be quick. The extra money saved by using a 128-bit bus could be put towards that.

    Still, I don't think latencies will be an issue.
     
  16. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    There will be a performance hit despite the fact that the memory clocks are doubled, and it will be due to the latency unless there is improvement in newer GDDR5 chips. Remember, the difference in latency is not like what is witnessed when comparing GDDR3 vs GDDR4 memory. Matter of fact, Computerbase.de did a test where they downclocked a desktop ATi 4870's memory clock to effective GDDR3 speeds and compared them to a 4850. This can be seen here.

    This also explains why a desktop 4770 isn't significantly faster than a 4830, even though the latter has a lower core and effective memory clock.
     
  17. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    But the clocks are not the same. I see what you're saying and what that article proves, but latencies won't be an issue because GDDR5 is clocked so much higher; there would be no reason to use it if it were clocked the same as the GDDR3. The bus differences will make them technically the same in this case, but for the price and power consumption drop, and if you're correct about the latencies, maybe a 10% drop in performance?

    The article says just 1600x1200, but on what exactly? It's not very informative; it doesn't say what game or application was used, what the other settings were, etc.

    Can't really say until they come out though. This is mostly just a speculation thread, so take anything said with a grain of salt.
     
  18. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    I care more about the pricing. Nvidia's official site lists the GTS 250/260M as 'high performance' cards. I hope they can keep the price down.
     
  19. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    I think they removed the original article, but other people have tested this on their own and got similar results. Just google GDDR5 vs GDDR3.

    As for the clocks, of course GDDR5 is clocked higher to make up for the bandwidth lost by using a 128-bit bus instead of a 256-bit interface. But that's just bandwidth, without taking into account the speed at which all that bandwidth is processed (the latency, or how long the delay is between transfers). You're right that it's all speculation at this point, but from what has been tested, it should be slower than an 8800M GTX when running DX9 and DX10, and it could actually be faster in games that support DX10.1.
     
  20. Kamin_Majere

    Kamin_Majere =][= Ordo Hereticus

    Reputations:
    1,522
    Messages:
    2,680
    Likes Received:
    0
    Trophy Points:
    55
    I'm still jonesing for the 260GTS. That (theoretical) performance for 38 watts... WOW

    Two of the 260GTS's in SLI are only going to be pulling 11 watts more than an 8800GTX and will give a pretty huge boost in performance.

    With power draws this low, I wonder who's going to be the first laptop maker to offer a 3- or 4-way SLI configuration :p (j/k)
    But it would be amazing to have SLI in a form that doesn't take insane cooling to make usable.
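    Rough math behind that 11-watt figure, as a sketch: it takes the 38 W number above for the GTS 260M and the 65 W consumption listed for the 8800GTX earlier in the thread at face value.

        # SLI power estimate using figures quoted in this thread (rough numbers).
        gts_260m_tdp = 38          # watts, theoretical figure quoted above
        single_8800_gtx_tdp = 65   # watts, from the spec comparison earlier in the thread

        sli_draw = 2 * gts_260m_tdp              # 76 W for two GTS 260Ms
        print(sli_draw - single_8800_gtx_tdp)    # 11 -> about 11 W more than one 8800GTX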
     
  21. NAS Ghost

    NAS Ghost Notebook Deity

    Reputations:
    297
    Messages:
    1,682
    Likes Received:
    0
    Trophy Points:
    55
    Clevo, and no, I'm not jk :p
     
  22. PanzerVIZeke

    PanzerVIZeke Notebook Consultant

    Reputations:
    10
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    Wow..... put a couple of GTS 250/260M's in SLI, and you have a great gaming laptop with less power draw, less heat, newer tech, and better efficiency. Where can I get one? :p

    Heck, I'll even settle for a single-card configuration. Can't wait to see this get into laptops.
     
  23. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    260M's are readily available now. Check out XoticPC.com
     
  24. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The GTS is not available, anywhere.
     
  25. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The question is whether the mobile versions will be 300 series in name only, or whether they will also share the new architecture.

    Nvidia still has room left after the GTS 260m to add shaders and a wider memory bus to a G200-based 40nm die and produce cards with better performance than the GTX 260m and GTX 280m. What's more, even though the G214 and G212 GPUs couldn't be shrunk to 40nm, Nvidia still has plans to release a 40nm G215 GPU, which is a good indication that they're not ready to abandon the G200 tech completely.

    Given Nvidia's history of milking their GPUs until they shrivel up to dust and blow away, I'd be knocked-to-the-floor stunned if the mobile 300 series cards were based on the new desktop 300 series architecture, rather than something like 112- to 128-shader, 256-bit bus versions of the G200 series GPUs.
     
  26. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I'm betting on 300M being G200 based, with at least one being the desktop GTX 260 in 40nm form. I'm not sure if they can go higher than 256-bit, so maybe Nvidia will go 256-bit w/ GDDR5 memory. I'd be fine with that, but there had better be more than a measly 128 shaders on the highest model.
     
  27. jk6959

    jk6959 Notebook Consultant

    Reputations:
    7
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30
    Man, I wouldn't mind seeing GTS 250m SLI. That'd only be around 56 W TDP, less than a 160m and around 25% less than a GTX 260/280m, with potential for great scaling as more games add dual-GPU support... Still dreaming of a power-efficient gaming monster, hoping the GTS 250m will cover my battery-life and performance wishes. The GTS 260m doesn't seem like it would be a significant performance increase for a more than 35% increase in TDP.
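    The numbers behind that, as a rough sketch: the 28 W and 38 W figures come from earlier in this thread, while the ~75 W for the GTX 260M/280M is an assumed ballpark rather than something quoted here.

        # Rough TDP comparisons for the SLI idea above.
        gts_250m_tdp = 28    # watts, from the spec figures quoted earlier
        gts_260m_tdp = 38    # watts, quoted earlier in the thread
        gtx_280m_tdp = 75    # watts, assumed ballpark for GTX 260M/280M (not from this thread)

        sli_250m = 2 * gts_250m_tdp             # 56 W for two GTS 250Ms
        print(1 - sli_250m / gtx_280m_tdp)      # ~0.25 -> roughly 25% less than a GTX 260/280m
        print(gts_260m_tdp / gts_250m_tdp - 1)  # ~0.36 -> the "more than 35% increase in TDP"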
     
  28. Huskerz85

    Huskerz85 Notebook Evangelist

    Reputations:
    48
    Messages:
    508
    Likes Received:
    0
    Trophy Points:
    30
    Same here....
     
  29. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    It should be cheaper due to the 40nm process. The GTX 280M came out cheaper than the 9800M GTX, after a die shrink...
     
  30. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Well, like I said, Nvidia/TSMC couldn't shrink the G214 to 40nm, so figuring that a 300M GPU's shader count would fall between the 96 of the GTS 260m and the 240 of the G214 would be a reasonable guess. ;)
     
  31. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    But it's still over $500; a GTX 285 is cheaper than that.
     
  32. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    But $400 cheaper than the 9800M GTX, nonetheless. Hopefully the 40nm chips can come in well under $300. That would be progress.
     
  33. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    What? You mean the 9800M GTX used to cost $900?
     
  34. leah2255

    leah2255 Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    wow, that can't be true.
     
  35. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yes, at launch it ran $800 to $900 for a single GTX, while the 9800M GT was going for $500.
     
  36. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    Desktop cards have always been cheaper and more powerful than notebook cards on average. So I don't really see the point of this comparison.