The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
 Next page →

    NEW NVIDIA 200 series gpus

    Discussion in 'Gaming (Software and Graphics Cards)' started by jacobxaviermason, Jun 18, 2009.

  1. jacobxaviermason

    jacobxaviermason Notebook Consultant

    Reputations:
    329
    Messages:
    260
    Likes Received:
    1
    Trophy Points:
    31
  2. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Revisions, revisions, revisions... this is getting so stupid. Nvidia needs to focus on making truly new products, and the development time spent on these revisions can't be very cost effective. I prefer ATi's approach, where the product line's naming scheme is not too deceptive and, as far as revisions go, you pretty much only see one: that level of performance relative to the entire series is then raised by the next series, with a brand new top-of-the-line GPU, the mid-range performer becoming the low end of the next series, the old high end becoming the mid-range, and a new high-end foundation GPU on top.

    Now if more computer companies would just offer ATi cards. Glad to see HP pushing ATi in their newer notebook products, be they AMD or Intel based. Haha, I'd actually love to see HP not even offer an Intel machine with Intel integrated graphics. Man, that would tick off the suits at Intel.
     
  3. Cicero

    Cicero Notebook Enthusiast

    Reputations:
    24
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5
    There already is a thread about this.

    And these aren't simply revisions. Notice that with the exception of the G 210M, all have either higher stream processor counts or use GDDR5 memory. The GT 230M and GT 240M are both positioned to take over for the GT 120M and GT 130M and aren't mere die shrinks: they have 48 stream processors as opposed to 32. Furthermore, the GTS 250M and 260M have 96 stream processors along with GDDR5 memory paired with a 128-bit bus. Lastly, read the article: they are all DirectX 10.1 compliant, whereas the previous generation was only DirectX 10. These are new cards--not revisions.
     
  4. Mormegil83

    Mormegil83 I Love Lamp.

    Reputations:
    109
    Messages:
    1,237
    Likes Received:
    0
    Trophy Points:
    55
    Looks promising too :)
     
  5. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    There already is an Nvidia GPU with 48 shader units. I can't remember its codename, but the 9700GT/GTS both used it. Halving the memory bus and using GDDR5 is a type of revision in my eyes, because the core component, the GPU, is still the same design even at 40nm. Yeah sure, it gives us higher clock speed potential, but realistically probably nothing more than 20 or 30 percent at best over each GPU's previous incarnation. And I don't think it's that difficult to make a GPU shader compliant to something it was previously not; ATi and Nvidia did the same thing when revising many of their lower end GPUs, like ATi's X600 (DX 9.0b) being revised into the X1300 (DX 9.0c). A good example from Nvidia is the 6200 (DX 9.0b) being revised into the 7300GS (DX 9.0c).
     
  6. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    You don't know what you're talking about. These cards have been in development for a while now: GT214, GT215, GT216, GT218, etc. It's a completely new core that's a little late, but welcome.
     
  7. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Yeah...these aren't revisions, or anything like what they did with the 8800M GTX/9800M GT, GT 130M, etc. These cards are based on new mobile technology, specifically what nVidia would label the GT215-GT218 cores.

    And to be fair, although ATi prefers to base its mobile GPUs directly on their desktop counterparts (which is why we didn't see a Mobility HD2900XT), they are starting to use tactics similar to those nVidia employs in the notebook market. For instance, the Mobility 4860 is based on the RV740/desktop 4750 and the Mobility 4830 is based on the desktop 4670. It's not really deceptive in terms of performance, because they're named according to how they perform.
     
  8. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    incorrect. the mobility 4830 has the rv740 core.

    back to topic, these are not revisions; not revolutionary, but at least no lame rebranding. i'm interested in the gts 250, looks promising if they price it well.
     
  9. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Well, as far as I'm concerned it's a revision. Die shrinks with a different memory interface and VRAM don't count as GPU redesigns, and I'd hardly consider it a "new" product when it's actually an older product changed up a bit with some bolted-on features; frankly, I had 96 shaders on the 8800GTS in my desktop. I think Nvidia needs to actually create new mobile GPUs instead of acting like Intel with their Netburst architecture back in the GHz wars.
     
  10. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    yeah ur right. It's just the memory that's different.
     
  11. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    These GPUs are built on the genuine GT200 architecture. They are not G92 based chips. That completely negates the "just a die shrink" line of thought.

    The reason they only have 96 shaders is that these were originally intended to be the entry-level performance chips on the desktop side. They are not in the same class as the upcoming enthusiast-class GT212 mobile GPU, which is still on the schedule for Q4.

    These. Are. New. GPUs. Period.
     
  12. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    you're so stubborn...
     
  13. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Well, as far as I can ascertain, these are revision die shrinks: there is no mention of real architectural changes, though I did see mention of the ROPs being redesigned just to make these new cards DX10.1 compliant, in which case I'd be wrong, but still partially right, because it's a partial redesign. ATi did the same thing IIRC with the 4000 series, notably the 43xx and 45xx, redesigning the TMUs and ROPs, which lets them perform on the same level as the 3650, a card with 50 percent more stream processors.

    As for the desktop GTX260s and 280s, I believe they still use the same shader design as the original 8800s, just more of them, plus a die shrink and higher clock speeds. I've seen mention of Nvidia having to redesign the shader structure for the 300 series because they've hit a performance wall with the design used in the 8, 9, and 200 series, and will need a new design to get better performance out of higher shader counts. Communication between hundreds of little simple processors probably makes such high shader counts difficult; I can see where it would be a big issue.

    I can't see Nvidia putting a "brand new GPU" in a line where the top of the heap, the GTX280M (a die-shrunk 9800M GTX), is getting beaten by its lower-numbered brethren. Otherwise Nvidia would actually be touting these "new" cards as being above and beyond its own king of the hill, the GTX280M.

    And yes, I am stubborn.
     
  14. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    It's a revision in the sense that it's still a GPU with the same unified shader principle. But everything else about it has been changed. Nvidia is a year late with DX10.1 because they never prepared for DX11 like ATi had. Nvidia can be stubborn, because they feel they are the larger company and have the say in things.

    GT300 will be built upon these cards. When it comes to a large fabrication shrink, your entry and mid level cards always come first. It will take time before they have 256-bit and larger GPUs ready, because 40nm is shaky territory for now.
     
  15. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    If it's just a "revision" then show me an older nVidia card that has 96 shader processors with a 128 bit memory interface that uses DX10.1. Heck, show me one with just 96 SPs on a 128 bit interface or anything remotely close to that. Using your reasoning, ATi is guilty of this as well.

    Fanboyism is all fine and dandy, but at least be reasonable with your assertions.
     
  16. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I never said ATi didn't do it, but Nvidia unreasonably does it too much. They should be creating actual new products that truly break from the previous ones. I think the "GTX280M" was kind of a slap in the face, considering it's not relatable to the desktop GTX280 at all. Feature wise, Nvidia has been catching up; performance wise, though, ATi has been catching up and in many respects has surpassed Nvidia, because Nvidia wants to keep creating rebrands and revisions instead of a new GPU to give us truly higher capabilities. They're milking it to the last drop, I suppose, before they have to create something truly new.

    ATi is guilty of rebranding and revisioning too. To expand on my previous example of the X600 becoming the X1300, I didn't mention that the old Radeon 9600 was the previous version of those chips. ATi used to be as bad as Nvidia, if not worse, and they are not being as prudent about their naming scheme as they were with the 3000 series; with the desktop 4xxx series they are releasing numbers all over the place too. But luckily their mobile line has stayed relatively uncluttered.

    Besides, didn't Nvidia just release the GT100 series mobile GPUs recently? This is getting out of hand. People are going to think the GTS210 is better than a GT160 when it's not true. Nvidia should clear out the GT100 line first before moving on to these 2xx cards. I don't think it's a very responsible practice toward consumers. Sure, consumers should be informed, but there is so much to know, just like with a car.
     
  17. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    I apologize for my initial rebuttal, but it seems as though you take issue with the naming scheme as well as the way in which mobile technology is ported from desktop technology. To the first point, you shouldn't view the numbering scheme the way you seem to (the GT 210 vs the GTS 160M). Relative performance is based on the last set of digits, just like when they had the 7000, 8000 and 9000 naming.

    To your last point, I do agree that nVidia should wait till the G100 series of cards is "used up", but this is actually a last-minute marketing decision in response to the ATi Mobility 4000 series of chips. Matter of fact, the GTX 280M is really the GTX 180M (you can see this in certain parts of the BIOS description and even ORB), but had they gone with naming it the GTX 180M or GTX 170M, it probably wouldn't sell as well as with the 200M series connotation.

    But pertaining to the topic, I wouldn't say that these chips (probably with the exception of the GT 210) are mere revisions.
     
  18. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Having read both sides of the argument, could someone provide a brief summary of the actual differences between the newly announced set of chips and existing products?

    I've seen differences in manufacturing size, VRAM used, shader units and what not, but I'm still not sure what's the aggregate difference between the newly announced lineup and the GT line of refreshes currently on sale.
     
  19. v_c

    v_c Notebook Evangelist

    Reputations:
    124
    Messages:
    635
    Likes Received:
    0
    Trophy Points:
    30
    I suspected they were G92 revisions based on the specs/numbers, but some of the articles say they are proper G200 architecture. We'll see in time, I guess.

    Anyway, this kind of confusion was bound to be caused by Nvidia's stupid naming schemes.
     
  20. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    I don't understand why they couldn't have just started with the "GeForce 1" and gone from there... so we'd be at like 100 now or something. That way the numerical value would actually mean something, instead of a 7900 destroying a 9300. But oh well, that would have been way too simple.
     
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The first number is the generation, the second is the class, the third is a revision. It's that simple.

    As an aside, after this point, anyone who thinks these are G92 chips is being dense.
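    That three-digit rule can be sketched in a few lines of code. This is only an illustration of the convention described above; the `decode_model` helper and its field names are made up for this example, not anything from Nvidia:

```python
import re

def decode_model(name: str) -> dict:
    """Split an NVIDIA model number into the three digits described above:
    first = generation, second = performance class, third = revision."""
    digits = re.search(r"(\d{3})", name).group(1)
    return {
        "generation": int(digits[0]),
        "class": int(digits[1]),
        "revision": int(digits[2]),
    }

# e.g. the GTS 250M: generation 2, performance class 5, revision 0
print(decode_model("GTS 250M"))  # {'generation': 2, 'class': 5, 'revision': 0}
```

    Read this way, a GTS 160M (class 6) still outranks a GT 210 (class 1) despite the smaller leading digit, which is the point being argued in this thread.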
     
  22. boypogi

    boypogi Man Beast

    Reputations:
    239
    Messages:
    2,037
    Likes Received:
    0
    Trophy Points:
    55
    nvidia should introduce a new generation of cards that will bring a huge performance jump like the 79xx to the 88xx ;)
     
  23. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    They already have... it's called GT200, and it's still desktop only.

    If you didn't know already, with a large fabrication shrink the entry level and mid range GPUs normally come first, followed by the high end GPUs, because the tech hasn't been refined enough. That's how it's working out with the 40nm GT200 followups. Even more interesting this time is that they began with mobile cards, which just makes sense for an entry level platform, and the mobile market is huge.

    With GT200, Nvidia made a mistake and went straight for the expensive-to-produce GT200, which is 512-bit and 55/65nm and never had a chance of making it to mobile form. This is why Nvidia had to cut their prices in half to compete with the 4870, and why the mobile 4870 should be a killer card right now (but Nvidia still had something decent in G92b, and ATi was slow to bring GDDR5 to mobile). So they've had to skip the mobile version of GT200b, redo the whole architecture at 40nm, and evolve their DX10 tools to be DX10.1 and DX11 capable while they're behind.

    At least they are doing things right this time, and before you know it you'll be looking at official DX11 mobile GT300 specs. This is the precursor to that.
     
  24. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    Well, call me crazy or a fanboy, but ATI is ruling these days ;). Nvidia must wake up!
     
  25. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    How? Nvidia has the most powerful desktop and notebook GPUs out right now (the GTX 295 and the GTX 280M).
     
  26. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I'm quite familiar with the numbering and naming scheme. And even Nvidia seems to chart and compare their "new" mobile graphics cards the same way I do, much like revisions/rebrands. Shrinking the die, revising the ROPs to allow DirectX 10.1 compliance, and slapping on the words "new GPU!" doesn't sit well with me. I'll grant that these are new "graphics cards" in the sense that the card and the graphics system as a whole are new products, but I still look at the GPUs themselves as revisions. My view on revisions/rebranding doesn't have much to do with the memory interface or VRAM type, but with the actual GPU itself. That's how I look at these things, especially since Nvidia only creates guidelines on how the GPUs are set up; I bet some manufacturer will try to release these GPUs with something below the intended spec laid out by Nvidia. It's possible. If that doesn't happen, and cards intended to be GDDR5 equipped stay that way (no GDDR3 or GDDR2 on a 128-bit bus), then I'll be quite happy. I don't think any card these days should be using anything below GDDR3.
     
  27. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    ATi has newer technologies: it already uses 40nm, GDDR5, and DirectX 10.1 support (since long ago...), and Nvidia is only starting now... Btw, FOR ME the most powerful card today is the HD4890.
     
  28. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    The 4890 is a 256-bit card... incredibly weak compared to newer desktop GPUs.
     
  29. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
  30. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    You're comparing 2x crossfired ATI cards to 1x Nvidia card. If you did that same test with 295s in SLI, the 295s would rock ATI.

    Edit: Actually, the 295 GTX beat the 4890 in 3DMark Vantage and most real gaming... so there's your real world performance. Your own source disproves you.
     
  31. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    Oh yeah... Anyway, the ATI ones have better Blu-ray support and are also cheaper :p. With newer tech before Nvidia.
     
  32. 000022

    000022 Notebook Consultant

    Reputations:
    20
    Messages:
    243
    Likes Received:
    0
    Trophy Points:
    30
    Well, I'm more of a [image missing from archive] kind of a guy, regardless of other factors (tech and whatnot). hehe.
     
  33. crash

    crash NBR Assassin

    Reputations:
    2,221
    Messages:
    5,540
    Likes Received:
    13
    Trophy Points:
    206
    Let's try to stay a little on topic and leave out the desktop card discussions, ok? Thanks everybody.
     
  34. spradhan01

    spradhan01 Notebook Virtuoso

    Reputations:
    1,392
    Messages:
    3,599
    Likes Received:
    5
    Trophy Points:
    106
    Every month new GPUs. :mad:
     
  35. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    What does this mean?
     
  36. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    I hope they aren't the usual overclocked versions of the previous generation of Nvidia cards. :p I thought it was the ol' G92 core, but found out it was a GT216 core. I wish they could have used GDDR5, though, like in the 4870. Has anyone taken note of how long the gap was between the 100M series and the 200M series? Plus, anyone got an idea how long it would take for these new GPUs to make it to the notebook market? If a new series (a 300M, perhaps?) is out by Xmas, then I'll be postponing buying a laptop till then. :D
     
  37. jacobxaviermason

    jacobxaviermason Notebook Consultant

    Reputations:
    329
    Messages:
    260
    Likes Received:
    1
    Trophy Points:
    31
    Not sure if you're referring to the 65nm 200 series cards, Howitzer, but the 40nm versions do sport gddr5: "The new chips are also notable for being the first from NVIDIA to support GDDR5" --Dailytech
     
  38. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
    well, i can say i had both, and i sold my 3870 crossfire because it wasn't beating nvidia's old tech (8800m gtx sli). and even though they have been renaming cards, they are still on top.

    ati has more features and a lot of great ideas that nvidia is just starting to use, but ati just produces the hardware and not the software to bring the best out of it.

    but anyways, i think the 280m is the last renaming they are going to do. the core is at its limits with clock speed. so far i haven't seen a 280m overclock like a 9800m gtx, and the 9800m gtx is very close to, if not surpassing, the 280m in overclocking. so i think they are going to focus on the new gts and a next card.
     
  39. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    The lower end of the 200 series (210M, 230M) does sport 40nm technology. But is the GDDR5 you refer to the default memory for these cards? I'm also wondering why the more powerful cards (GTX 260M, 280M) are manufactured on 55nm and still based on the G92 core rather than the new GT216.

    Crossfire 3870 does edge out the 8800 GTX SLI, though, wonderfully. :D I agree with ATI being more innovative than Nvidia, taking the example of the GDDR5 in the 4870. I guess the G92 core has reached its limits, then. Let's see where the GT216 core leads Nvidia next.
     
  40. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Those two are G92 because they were a stopgap, just a way of holding consumer interest until these GT200 chips were converted to the mobile market. So consider the GTX 260M dead, and Nvidia is only holding onto the 280M until the higher level 40nm chip is ready to go.

    Look at this:

    [image no longer available]

    All according to keikaku.
     
  41. dondadah88

    dondadah88 Notebook Nobel Laureate

    Reputations:
    2,024
    Messages:
    7,755
    Likes Received:
    0
    Trophy Points:
    205
    well, it seems that laptop gpus are moving fast like desktops now :(

    even worse, so far you have to buy a new laptop to upgrade. :(
     
  42. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    That's very helpful, Kevin_Jack2.0. Thanks. :) I was expecting that the 200M series GPUs would stem from the same core. So they'll release a card above the 260M and 280M, I would presume, on 40nm. But would it be for the current 200 series or for a later series?
     
  43. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    My speculation has long been that the top card will be a 40nm version of the desktop GTX 260 Core 216, but with a 256-bit bus and GDDR5 memory.

    I guess we'll have to wait and see if I'm right.
     
  44. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    I'd expect basically 2x the GTS 250, hopefully with identical if not faster clocks. 256 bit, 192 shaders, and GDDR5 with 40nm tech for high clocks is nothing to complain about.
     
  45. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    something to keep in mind is that the Nvidia GTX 295 is ~a $530 card.

    the ATI 4890 is ~a $170 card. you could literally buy two of the ATI cards and still have a much cheaper machine than with the GTX 295.

    but, whatever.
     
  46. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    256-bit + GDDR5 = 512-bit + GDDR3 in terms of data bandwidth. So no, it's not incredibly weak, considering how it performs against the GTX 275 and GTX 285.

    The GTX 295 is essentially two GTX 260 cards in one, just like the 4870X2 is two 4870s in one, so this comparison is valid. It would be better if it were two GTX 275s in SLI, seeing as the 275 is clocked higher than each of the 295's cores.

    Anyway, thanks to GDDR5, you don't need to manufacture a huge die with a 512-bit interface to get high performance. And as we're seeing with nVidia's new lineup, they (meaning nVidia) agree.
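    The equivalence above is simple arithmetic: peak bandwidth is bus width (in bytes) times transfers per second, and GDDR5 moves roughly twice the data per pin-clock of GDDR3, so halving the bus while doubling the data rate comes out even. A minimal sketch; the helper name and the sample data rates are illustrative assumptions, not the specs of any particular card:

```python
def bandwidth_gb_s(bus_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_bits / 8 * data_rate_gtps

# At comparable base clocks, a 256-bit GDDR5 bus matches a 512-bit GDDR3 bus:
print(bandwidth_gb_s(512, 2.0))  # 128.0 GB/s (GDDR3 at 2.0 GT/s)
print(bandwidth_gb_s(256, 4.0))  # 128.0 GB/s (GDDR5 at 4.0 GT/s)
```

    The same arithmetic explains the new mobile parts: a 128-bit bus with GDDR5 lands in the same bandwidth ballpark as the older 256-bit GDDR3 designs, on a much smaller, cheaper die.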
     
  47. jaa003

    jaa003 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    very true, but i really prefer nvidia over ATI. i've had problems with ATI video cards in the past, but never had problems with nvidia, so i would prefer spending more on the gtx 295. this is just my personal preference tho.
     
  48. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    I'm guessing that if ATI continues to release graphics cards at a faster rate than Nvidia and keeps pushing, Nvidia will likely fast-track the 200M series, and we could see the 300 series pretty soon. :rolleyes:
     
  49. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    All this hinges on the improvement of the 40nm manufacturing process.

    I'm really enjoying this stiff competition though. Christmas looks like a nice time for a new laptop.
     
  50. BlitZX

    BlitZX Notebook Consultant

    Reputations:
    5
    Messages:
    166
    Likes Received:
    0
    Trophy Points:
    30
    Hmm. That GTS 250M looks mighty fine. This clearly isn't a case of simple rebranding (thank god). I wish they would lose the 64-bit bus on their entry models already. GDDR5 would also have made a huge difference in smaller entry notebooks.

    But tbh, i'm more interested in the G300 architecture. Win7+DX11 is just around the corner. So what comes after 40nm? 32/28/22? 2010 will be one heck of a year. You won't need 17-inch desktop-replacement ovens to outperform consoles anymore.
     