The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    since the ATI 5000 series cards will be out soon ... when will the Nvidia 300 series cards come out too ???

    Discussion in 'Sager and Clevo' started by Sparky894, Oct 16, 2009.

  1. Sparky894

    Sparky894 Notebook Evangelist

    Reputations:
    4
    Messages:
    323
    Likes Received:
    0
    Trophy Points:
    30
    I was wondering, since the ATI 5000 series cards will be out soon ...

    whether Nvidia will follow suit with the Nvidia 300 series cards ...

    :confused:
     
  2. Slaughterhouse

    Slaughterhouse Knock 'em out!

    Reputations:
    677
    Messages:
    2,307
    Likes Received:
    2
    Trophy Points:
    56
    I was wondering the same thing recently. Anyone know?
     
  3. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    The ATI cards haven't come out for notebooks, last I checked, so we're still waiting to see.
     
  4. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    Ummmm, the 5000 series won't be out for a couple of months.
     
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
  6. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    Either the OP is asking for desktop card info in a notebook manufacturer sub-forum, or your post isn't relevant to what he's asking about next-gen mobile cards.
     
  7. Pman

    Pman Company Representative

    Reputations:
    327
    Messages:
    1,882
    Likes Received:
    0
    Trophy Points:
    55
    The 300 series has been rumoured for Q2.
     
  8. Sparky894

    Sparky894 Notebook Evangelist

    Reputations:
    4
    Messages:
    323
    Likes Received:
    0
    Trophy Points:
    30
    no ... no ...

    I am asking about the mobility laptop cards ...

    :)

    I would feel very bad if I suddenly bought a new laptop and then found out that a few weeks later they refresh and upgrade the graphics card to a better one ...

    ;)

    that's why I am asking ...
     
  9. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    you will always feel bad .... pick the one with the highest chance of upgrading and suck it up
     
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    One of my links said not to expect real 300 mobile chips until later in 2010, but that was just an educated guess.

    Whether Nvidia releases G200 chips as 300M before then is anyone's guess.

    In other words, no one knows anything about anything, so buy whatever you want now.
     
  11. emike09

    emike09 Overclocking Champion

    Reputations:
    652
    Messages:
    1,840
    Likes Received:
    0
    Trophy Points:
    55
    The desktop GTX 300 was scheduled to be released at the end of this month, but has now been pushed back to Q1 2010-ish. So, going by past trends, you can expect a mobile variant around Q4 2010 or Q1 2011.
     
  12. nobodyshero

    nobodyshero Notebook Speculator

    Reputations:
    511
    Messages:
    879
    Likes Received:
    0
    Trophy Points:
    30
    I know we're just throwing ourselves into the exciting but fruitless cycle of speculation, but it's incredible to think NVIDIA would let themselves be tossed around by the GDDR5 4870s for THAT long...
     
  13. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    Well, they have no choice ATM.

    Also, the market for high-end laptops is pretty small.
     
  14. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Where did you hear this? All the sources I've read said it would be lucky to hit December, mid-late Q1 more likely.

    That's right. They won't be tossed around by the 4870. If ATI plays this right, they'll be destroying them with the mobile 5870.
     
  15. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    See the main thing everybody here is forgetting is that ATI still has not found a way to translate GDDR5 memory into mobile video cards. Their GDDR5 counterparts get their butts whopped quite often by nVidia's GDDR3 bits. If ATI could translate the GDDR5 into the mobile versions of their cards and nVidia does not release proper translations of their GT200 series architecture for the mobile market (albeit most likely named 300m series), *THEN* ATI will dominate. Because remember, the 4870 X2 was ATI's only answer to nVidia's GTX 285. If the mobile 4870 (crossfire and SLI versions apply as well) lose out to the mobile versions of a 9800 GTX+, then it shows there's a big jump in power.

    If ATI keeps churning out GDDR3 cards for laptops and nVidia arrives with their GT200 core translations, then even their beloved 5870 will not save them. Not to mention that nVidia has a huge advantage over ATI when it comes to certain gamers... PhysX is a beautiful thing to behold, and not every game is going to use the Havok engine. A friend of mine complained about playing RE5 when he reached the water levels. He has a 4890 and an AMD quad core in his desktop. My 9280 breezed through it with better framerates all around and also at a FAR higher screen res (mine was 1920x1200 and his was 1280x1024). It happened in NFS Shift as well, and in Unreal Tournament 3 he had to disable all PhysX in the options menu. ATI's saving grace is their memory and the raw power of their cards. nVidia's saving grace is their PhysX.

    If an ATI card could outperform a nVidia card by a significant amount (like how the 5870s murder anything that doesn't say GTX 295 on it) then I'll be looking toward an ATI card. PhysX or not. If they can't do that, then nVidia for me. I love my PhysX mmm hmm
     
  16. ronnieb

    ronnieb Representing the Canucks

    Reputations:
    613
    Messages:
    1,869
    Likes Received:
    0
    Trophy Points:
    55
    maybe the most anticipated game of the decade = starcraft 2, which uses


    dun dun dun


    HAVOK. OH NOEESS. Doesn't Alienware have GDDR5 cards coming out for the all-powerful M17x? The twin 4870s?
     
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    No, there are currently no mobile cards using GDDR5 memory on them. For some reason, ATI hasn't been able to get it onto mobile cards yet. GDDR3 is the fastest in the mobile set.

    And what use could there be for the Havok engine in a real-time strategy game? Thinking about Warcraft 3 and the original StarCraft, I don't see any physics in them.
     
  18. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    I was thinking more of environment physics.. stuff blowing up, debris fields, etc.. but yeah, I still don't think it's going to use it extensively.
     
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That would cause a huge processor drain if it used the Havok engine for that... there are tons of simultaneous explosions and death animations in games like that... it'd need a quad core in its recommended specs if it used Havok that way. Especially since StarCraft is futuristic and therefore all about the explosions.
     
  20. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    You seem to forget that ATI doesn't manufacture the boards, their AIB (mobile) partners do.

    The memory type does not matter in the way you are trying to describe it: memory speed + bus width = bandwidth. For example, 128-bit GDDR5 can deliver the same bandwidth as 256-bit GDDR3, because GDDR5 effectively doubles the transfer rate per clock. The reason GDDR5 is used is that it allows a narrower bus, which reduces cost, and GDDR5 also allows for power savings over GDDR3. To summarize, the core does not care which type of memory it has, so long as it isn't bandwidth limited.
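
    Just to make that arithmetic concrete, here is a rough back-of-the-envelope sketch (the clock figures are purely illustrative, not numbers from any specific card, and the helper name is made up):

        # Peak memory bandwidth (GB/s) = (bus width in bytes) * effective transfer rate (GT/s)
        def bandwidth_gbs(bus_width_bits, effective_rate_gtps):
            return (bus_width_bits / 8) * effective_rate_gtps

        # 256-bit GDDR3 at 2.0 GT/s effective vs. 128-bit GDDR5 at 4.0 GT/s effective
        print(bandwidth_gbs(256, 2.0))  # 64.0 GB/s
        print(bandwidth_gbs(128, 4.0))  # 64.0 GB/s -- same bandwidth on half the bus width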

    I wouldn't exactly say it was the answer to the 285 since a single 4870 was able to perform within 10-15% of the 285 for half the price (when first released on the desktop side). Continuing...

    You seem to be forgetting that the 5850 slightly outperforms the 285 (for much less cost and power usage). Assuming both parties follow their usual paths, the mobile 285 will be a cut-down desktop 285 and the mobile 5850 will just be a clock-reduced desktop 5850, meaning the 5850 should destroy any mobile 285. I do, however, agree that AIBs need to start making cards with GDDR5, since I can't see any reason to make them without it.

    Currently, PhysX is a gimmick. Developers will not use it for more than after-effects since they won't want to cut down the install base to a single GPU manufacturer. With OpenCL gaining steam (esp. since it's not GPU specific), PhysX will eventually go the way of the dodo. What they may decide to do is to implement OpenCL and then translate those calls into PhysX function calls.

    - He is CPU limited, not GPU limited, based on the resolution.
    - Since PhysX is not compatible with ATI GPUs, it seems it's forced to run on the CPU (since you say performance increased once it was disabled). If this is the case, then he would be insanely CPU limited.
    - He seems to be having other issues, since the desktop 4890 is faster than a mobile 280.
     
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    ATI cards were having serious performance issues with RE5 until the 9.9 Catalyst. It's still not perfect, but it's a big improvement.
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    @notyou

    He has a quad-core AMD Black Edition clocked at 3.2 GHz, so he does not have a CPU problem. He also has 8GB of DDR2 RAM @ 1066MHz, so no problems there either. The physics in games is what is problematic for ATI cards. His computer kicks my laptop's butt at other games that inherently use the Havok engine (like Crysis Warhead) or that just don't use physics at all (I don't remember any games like this offhand... X-Men Origins may be one, since little is moveable/destructible in that game).

    Also, the 4870s completely destroy the 9800GTX+ cards. The 280m is the 9800GTX+ with slightly lower clocks. In fact, I cannot overclock my card to the base specs of the 9800GTX+ without it crashing. The mobile 4870s are *not* as powerful as the 280m. 280ms in SLI also outperform mobile 4870s in Crossfire. If the 285m is based on the GT200 chip architecture (meaning 240 stream processors and not 128) and all other things are held the same (maybe slightly higher clock speeds on the card), it will blow the other mobile GPUs out of the water. The ATI mobile cards already have 320 stream processors in them; there's nothing to increase there. Their desktop cards have the edge over nVidia's counterparts because of their faster memory, but their best in a series has never been able to beat nVidia's best. I don't know why.

    Price is a serious factor in buying a card, but let's face it, dual 4870s in Alienware's M17x are $100 more expensive than dual 280ms, and with the 5000 series on the way, those cards should have dropped in price. If ATI brings out GDDR5 memory on their mobile cards, then nVidia is in trouble. But as it stands, if a lower-clocked GDDR3 5000 series card comes out and nVidia releases one of its GT200 architecture cards, ATI will be the ones with a problem. A GDDR3 4870 cannot compete within 10-15% of a GTX 285 in desktops; you need the GDDR5 version.

    While ATI doesn't manufacture the boards, they provide the partners with the technology. If they provide them with GDDR5 mobile tech, it will be produced at some point.

    And a 128-bit bus doesn't even address beyond a certain memory size. While your point is valid, with 256-bit being the base maximum for mobile cards, going back to 128-bit would make no sense, as I believe they would not be able to produce cards with more than 256MB of usable memory on them, which holds back a decent bit of performance. Not all games may use it, but enthusiast-type settings and AA love your video memory; they eat it up like free pizza. The way I meant it, all other things held equal, GDDR3 would have to replace GDDR5 RAM on ATI cards.
     
  23. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I can assure you that the 40nm technology used for the 285M or 290M (if they ever come out) will not allow the use of 240 shaders; the TDP is just impossible to bear in a laptop. The best you will see is a core with maybe around 170-180 shaders and clocks somewhere around 550/1375/950. I base this on the fact that the 260M GTS uses 96 shaders and has a 38W TDP (although the bus is just 128-bit). If all else is kept equal, that means 190 shaders for 75W. But things don't really work this way; for example, the 250M GTS uses the same core but with a 10% decrease in clocks, and it has a 28W TDP.
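
    (For what it's worth, here is the naive linear scaling behind that "190 shaders for 75W" estimate, as a sketch: the 96-shader/38W figures are from the post above, the rest is straight proportionality and ignores clocks, voltage, and the rest of the board.)

        # Naive shaders-per-watt scaling from the 260M GTS figures quoted above
        ref_shaders, ref_tdp_w = 96, 38   # 260M GTS: 96 shaders at a 38W TDP
        target_tdp_w = 75                 # rough power budget for a high-end mobile GPU
        est_shaders = ref_shaders * target_tdp_w / ref_tdp_w
        print(round(est_shaders))         # ~189, i.e. roughly "190 shaders for 75W"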

    My bet is on a 40nm 290M GTX with 160 shader cores being the next best thing from Nvidia, with clocks of 550/1375/900; expecting anything more is unrealistic. Still pretty good if you ask me :).
     
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You see, the GTX 280m has 128 SP and, I believe, a TDP of 85W. It is a 65nm core base. If you were to drop that to 40nm, the sheer number of open possibilities is immense. Also, ATI's 320 SP run on 65nm as well, so I am sure 240 SP in nVidia's tech can work, especially with a smaller die technology. A difference of 10nm may say no to 240, but a difference of 20 or 25nm may surprise you to no end =)

    Also, lower clocks do affect TDP of course; that's why cards on the MXM 2.1 boards need to be overvolted to allow for overclocking past a certain point. But this is still all speculation; however, I'm sure it could be done. The old C2D and C2Q had TDPs of 25 and 35W; if the new mobile i7s can match that, then we shall have a winner for the video cards ^^
     
  25. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Since when was the GTX 280M 65nm with an 85W TDP?
     
  26. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Half the programs that check for it tell me 55nm and the other half say 65nm. Some websites say 55nm and others say 65nm. I see 65nm more often, so I'll admit I assumed it was 65nm. As for the TDP, I don't remember where I heard that actually, but it is stuck in my head for some reason that it's an 85W card.

    Your post tells me I'm wrong; would you clear it up for me?
     
  27. ChinNoobonic

    ChinNoobonic Notebook Evangelist

    Reputations:
    273
    Messages:
    638
    Likes Received:
    1
    Trophy Points:
    30
    The GTX 280M is a 55nm card and has a 75W TDP.
     
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Huh... so GPU-Z and 1/3 of those websites are right, and CPU-Z and 2/3 of those websites are wrong... Oh well. Well, that makes a bit of my post invalid then O.O

    I think...

    Oh well... *sigh*
     
  29. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    You have been added to the KGB watch list.
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    >.> why? 10char.
     
  31. Sparky894

    Sparky894 Notebook Evangelist

    Reputations:
    4
    Messages:
    323
    Likes Received:
    0
    Trophy Points:
    30
    hahaha ... what watch list ???

    :D

    ;)
     
  32. Sparky894

    Sparky894 Notebook Evangelist

    Reputations:
    4
    Messages:
    323
    Likes Received:
    0
    Trophy Points:
    30
    I was wondering if Nvidia is able to keep this up forever ...

    I mean, every time ATI releases a new card ... all Nvidia does is release a new card with an increased number of engine cores ... but can this last forever ??? First they release a base card ... then they simply increase the number of engine cores ... just to match up with the competition that is ATI ...

    and they also endlessly play the rebranding game over and over ...

    and now that DirectX 11 is just around the corner ... how is Nvidia going to match up to the competition ...

    :(

    :confused:
     
  33. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Only time will tell, but I hear what you're saying. I hope Nvidia plays their trump card soon.
     
  34. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    No worries :). So, as I was saying: 550/1375/950 with 160 SP :-D The best you will ever see on 40nm. Although... you never know.

    If my laptop were able to take 40nm cards, that would be great, but I guess I will have to skip them and wait for Fermi (or whatever ATI is throwing out) and the next die shrink. What is the next die shrink anyway?
     
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I believe nVidia's Fermi die shrink was said to be 32nm? And ATI was supposed to be 35nm? I can't be completely sure. I was told this in class last week when my A+ teacher was excited because the new 5000 series cards from ATI looked awesome, but he said he isn't ready to give up PhysX and will probably wait for nVidia's GT300 series to come out before he upgrades his dual 9800GTs.
     
  36. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
  37. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    nVidia would be foolish to release their next mobile iteration as a DX10 variant. ATi will have DX11 mobile cards on the market, which will force nVidia to release a mobile version of Fermi. The problem is, Fermi's extra transistors are geared towards making it more of a CPU than a GPU, so I'm not sure if it will even beat out the ATi 5000 series in performance. The added heat from all the transistors sure won't be worth it.
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Architecture is one thing; supporting DX is another. I do believe nVidia can make their GT200 core cards support DX11 when they port them to mobile, as nVidia already has cards which support and use DX11, just not open to the general public as yet.

    And a GPU is basically a CPU that can only process graphics; a smarter graphics processor will be immensely good for performance. I think they are just making it more adaptable for workstation-class users, like the people who make games and movies like The Incredibles.

    Either way, nVidia wouldn't let its gamers down; they still do a lot of advertising in games where you see "nVidia... the way it's meant to be played". I don't think they would still be pushing that if their new GPUs were not slated to demolish the competition.
     
  39. dalingrin

    dalingrin Notebook Evangelist

    Reputations:
    59
    Messages:
    515
    Likes Received:
    27
    Trophy Points:
    41
    I think Fermi will be great on the desktop, possibly outperforming AMD/ATI by a long shot in some situations. However, I can't believe how nVidia is investing all of its energy into one market that hasn't even proven to be a money maker yet. By focusing on the GPGPU market, they have nearly abandoned the mobile (laptop) sector. Let's not forget how much laptop sales are growing vs. desktops. This will now be the second generation where nVidia has engineered a GPU that is ill suited for the laptop market. On the bright side, I do think Fermi is more scalable than the GT200. Hopefully we will see Fermi reach the mobile market sooner than the GT200 did. Unfortunately, I think it may come at the cost of performance.

    I see Fermi beating ATI in performance on the desktop, while ATI beats Fermi in the mobile sector. ATI will probably have a primarily downclocked version of their desktop cards for their mobile cards, whereas nVidia will have to chop their GPUs in half. At 40nm, Fermi is still a very large chip.
     
  40. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    FYI, the 4870s do have GDDR5 in the M17x.
     
  41. SoundOf1HandClapping

    SoundOf1HandClapping Was once a Forge

    Reputations:
    2,360
    Messages:
    5,594
    Likes Received:
    16
    Trophy Points:
    206
    Who else is excited for Intel Larrabee? I know I am! Anyone else?




    I kid, I kid.
     
  42. brianvp

    brianvp Notebook Consultant

    Reputations:
    49
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    Hahah, I wish you could have seen my face when I was reading the first part of your post... I was like... what? :confused:

    Fermi seems like an OK move for nVidia, but I read they got pushed out of the chipset market (I remember something about nVidia ticking off Intel somehow), so they're leaving ATI open to catch the ball here. Honestly though... all of this rebranding stuff nVidia has been doing makes sense to me now (it still makes me mad sometimes), but they have been working on Fermi this whole time. Fermi is nVidia's wild card, and they're putting a lot of chips into this round, I think. With OpenCL, I could see nV really coming out on top here in a few years with this though... we'll see.