The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nivida 300 Series? (Post anything you know)

    Discussion in 'Gaming (Software and Graphics Cards)' started by mechrock, Nov 10, 2009.

  1. mechrock

    mechrock Notebook Evangelist

    Reputations:
    85
    Messages:
    594
    Likes Received:
    0
    Trophy Points:
    30
    I know there is not much info out now, but I'm sure many including me would like to know what the specs of these cards will be. Even If it's a guess of the spec it would be interesting to see what people think they will be and what really comes out.

    Edit: Haha I spelled "Nvidia" wrong.
     
  2. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Dual SLI configurations... 4 cards! Which I've heard can then be connected via Crossfire for a total of 8! INSANE!
     
  3. mechrock

    mechrock Notebook Evangelist

    Reputations:
    85
    Messages:
    594
    Likes Received:
    0
    Trophy Points:
    30
    We could only wish. Haha
     
  4. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Well, I do know people who've put 4+ GTX 285's in their desktop, but not for gaming purposes. ;)
     
  5. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    It's not a rebranded old chip... lol, that's all I know :p
     
  6. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Maybe for the desktop Fermi, but the jury is still out on mobile cards, and it's not looking good. Fermi is a huge chip, and the 300M model numbers that were shown in drivers match up pretty well with Nvidia's recently launched 40nm GT200 DX10.1 chips.


    There have been a couple dozen threads on the subject; a quick search would save people from rehashing the same rumors.
     
  7. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Based on what I've heard both in my 'Advances in Computer Architecture' course and online, it's going to be a "weak" card for gaming, since a lot of its features are geared toward the GPGPU market and can't be used well (or at all) in a gaming environment (double-precision floating-point operations, for one). This is the same reason they're going to cripple it for the GeForce line (cutting out the "unnecessary" parts). When I see this, I think bad design, since the architecture shouldn't need to be chopped down unless part of the die is defective.
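    The double-precision point can be made concrete. As a rough sketch (not tied to any particular GPU): single precision (FP32) is what game shaders typically use, while double precision (FP64) is the scientific-computing feature in question.

    ```python
    import struct

    def to_fp32(x):
        # Round a Python float (which is FP64) to single precision and back,
        # mimicking what 32-bit shader math would keep.
        return struct.unpack('f', struct.pack('f', x))[0]

    # A tiny increment survives in FP64 (~16 significant digits)
    # but is rounded away in FP32 (~7 significant digits).
    print((1.0 + 1e-8) == 1.0)          # False: FP64 keeps the increment
    print(to_fp32(1.0 + 1e-8) == 1.0)   # True: FP32 rounds it away
    ```

    Games get away with FP32 because errors that small are invisible on screen, which is why the extra FP64 hardware mostly benefits scientific workloads.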
     
  8. emike09

    emike09 Overclocking Champion

    Reputations:
    652
    Messages:
    1,840
    Likes Received:
    0
    Trophy Points:
    55
    I disagree that it will be 'weak' for gaming. The GPGPU will contain the exact same instruction sets found in the current G80-era architecture (G80-G200), with additional instructions and abilities geared closer toward x86 instructions. Rendering examples were shown on Fermi of pure ray tracing running in real time, on a car model that was nearly impossible to tell apart from the real thing.

    I think the more realistic expectation is that Fermi won't be utilized properly until either new DX API and OpenGL releases support a more flexible processing environment, or until developers start developing in OpenCL/GL. So naturally, given the prices of Fermi and how long it takes for a market standard to catch up, I don't think we'll see gaming take full advantage of what OpenCL and the GPGPU platform have to offer for several years. We've only just barely gotten the majority of the market off the G72 platform and into a DirectX 10 compatible world.
     
  9. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    According to many skeptical rumors, future mobile GPUs won't arrive until the second half of 2011, and all current ATI and Nvidia mobile GPUs capable of gaming are being cancelled due to shortages of 40nm chips.
     
  10. emike09

    emike09 Overclocking Champion

    Reputations:
    652
    Messages:
    1,840
    Likes Received:
    0
    Trophy Points:
    55
    I agree that the mobile variant will take a while to get to us. I don't think we'll see it till they can shrink the die to 32nm (or whatever is next in their line). The reason we didn't see the G200 in mobile is that it needed to be shrunk down to 40nm to even be feasible. And G200 couldn't be shrunk without devastating results. If they do manage a mobile 40nm G300, the best bet is that it will be severely limited, but naturally more powerful than current G92b mobile chips. So I am curious to see what Nvidia pulls out in the mobile department for the GTX 285M and GT300M cards soon to come out. Speculation is as far as we can go right now.

    There is a shortage of 40nm chips, but the mobile
     
  11. Alexrose1uk

    Alexrose1uk Music, Media, Game

    Reputations:
    616
    Messages:
    2,324
    Likes Received:
    13
    Trophy Points:
    56
    Well, right now TSMC is having some issues, which may well result in 40nm mobile cards being delayed; they're struggling as it is with desktop availability. (Apparently they're now producing yields of 40%, whereas previously it was 60%, which affects the cost and availability of all current 40nm designs, AMD and Nvidia alike.)
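    Those yield figures translate directly into unit cost. A minimal sketch with made-up wafer numbers (the $5000 wafer cost and 100 dies per wafer are illustrative assumptions, not TSMC figures):

    ```python
    def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
        """Each wafer costs the same regardless of yield, so the cost of a
        sellable chip is the wafer cost spread over the good dies only."""
        return wafer_cost / (dies_per_wafer * yield_rate)

    at_60 = cost_per_good_die(5000.0, 100, 0.60)  # ~$83.33 per good die
    at_40 = cost_per_good_die(5000.0, 100, 0.40)  # $125.00 per good die
    print(round(at_40 / at_60, 2))  # 1.5: a 60% -> 40% yield drop raises unit cost by half
    ```

    That is why a yield regression hurts availability and pricing across every product built on the same process, regardless of vendor.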

    On the other hand, AMD's Global Foundries is apparently working on 28nm graphics tech right now (likely with AMD sitting by with a keen eye on where to produce the 6 series, due next autumn).

    Really, at the moment everything is touch and go. If TSMC can get their 40% yield issues sorted out, then we may see mobile chips sooner, even within the Q1 window originally specified by AMD before TSMC's issues; however, it'll really depend on how supply, demand, and technical issues are dealt with.

    I think it may well theoretically be possible for AMD to be first to market with mobile chips this round: even just cutting clock speeds on the 5 series cards, they already have an immensely low idle power threshold. It's just peak power consumption that needs to be halved; they have the tech, they just need to finalise their mobile design and ensure TSMC can actually produce the damned chips until Global Foundries can be utilised.

    As it stands, though, with Nvidia still struggling to get a Fermi-based card to the desktop market, and admitting we may not see them on the desktop till Q1, I believe they'll likely release *another* (possibly die-shrunk) G92 part, or a GT200 mobile part, again die-shrunk, before we see Fermi-based chips, at least in the performance sector. Nvidia may utilise some early-yield chips as low-clocked low/medium-end parts, though.

    Even if the next iteration of performance parts from Nvidia is GT200 not 300, it'll still be a decent performance jump from the aging G92 architecture, that's for sure.
     
  12. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    I should have better stated my meaning of "weak". By it, I meant that relative to the increase in transistor count, we won't see the same performance improvement in games, e.g. 50% more transistors = 50% (give or take) more performance (and I do realize this is the optimal situation).
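    That "transistors vs. performance" expectation can be written out as a toy model; the function and all its numbers are hypothetical, purely to illustrate the argument:

    ```python
    def expected_perf(base_perf, transistor_ratio, usable_fraction=1.0):
        """Naive scaling model: performance grows with the extra transistors,
        discounted by the fraction of the new silicon games can actually use."""
        return base_perf * (1 + (transistor_ratio - 1) * usable_fraction)

    # Optimal case from the post: 50% more transistors -> 50% more performance.
    print(expected_perf(100.0, 1.5))        # 150.0
    # If only half the added silicon helps games (the rest being GPGPU-only):
    print(expected_perf(100.0, 1.5, 0.5))   # 125.0
    ```

    The gap between those two numbers is the "weakness" being claimed: the die grows by the full ratio, but gaming performance only tracks the usable fraction.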

    Do you have a link to that rendering demo? The only thing I managed to find was a static car rendering (or one moving very minimally), which to me didn't look realistic at all.

    As for the development environment, again we return to the direction Nvidia is going with Fermi. Fermi has been created to be general purpose, but the performance gains primarily target the scientific environment and are thus nearly useless for gaming. Don't get me wrong, I think Fermi will be good for the general-purpose aspect, but I don't believe we'll see a great improvement in games because it was not designed to run games.

    PS. I'll go talk to my professor (researching general computing) and I'll see how 'in-depth' I can get with Fermi.
     
  13. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Why do both companies have to use TSMC? It always seems to screw up the die shrinks and delay all the new GPUs. Isn't there some other company that ATI/Nvidia can use instead?
     
  14. Alexrose1uk

    Alexrose1uk Music, Media, Game

    Reputations:
    616
    Messages:
    2,324
    Likes Received:
    13
    Trophy Points:
    56
    Not easily right now.

    AMD will be able to fall back on Global Foundries in the next year or so, but right now they're in the hands of TSMC. If this works out and Global Foundries produces well, I'd be surprised if Nvidia didn't begin to use them as well (even if that results in an indirect profit for AMD).
     
  15. fdrominjacks

    fdrominjacks Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Hello,

    I am fdrominjacks. This is my first visit to the site, but I've enjoyed posting in your forums. I agree about the 300 series. Unless the 300 series is a lot better/different than the 200, I'd upgrade now. When we get a few 300 cards, they will all be high end and cost a bunch. Nvidia lately hasn't done a good job of getting the newer cards down into the lower end. Grab a GTX 275 now, and then upgrade again to a 300 card 12-15 months from now, when/if they finally do ship a midrange part.

    I agree that the mobile variant will take a while to get to us. I don't think we'll see it till they can shrink the die to 32nm. We didn't see the G200 in mobile because it needed to be shrunk down to 40nm to even be feasible.

    Thank you very much and Stay connected with me.
     
  16. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
  17. grbac

    grbac Notebook Deity

    Reputations:
    137
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
  18. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    Wow, the specs for the 310M are identical to the 210M, while the 330M is only a slight OC of the 240M. Then again it's based on the 2xx series so nothing to be shocked about... back to rebadging.
     
  19. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Can't NVIDIA do anything without rebadging? What is it with that? GRRRR (and that is why I prefer ATI)
     
  20. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Is there even a die shrink this time like the last rebadge? >.> Judging from those G310M specs, they're exactly the same as the G210M; not even a die shrink was applied. Both have the same DirectX version, same clocks, same number of shaders, same manufacturing process...
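    A rebadge check like this is literally just field-by-field comparison of the spec sheets. The values below are placeholders standing in for whatever the real G210M/G310M datasheets list, not confirmed numbers:

    ```python
    # Hypothetical spec sheets; the entries stand in for the real datasheet values.
    g210m = {"directx": "10.1", "shaders": 16, "core_mhz": 625, "process_nm": 40}
    g310m = dict(g210m)  # same silicon under a new name

    def rebadge_diff(old, new):
        """Return only the fields that actually changed between two spec sheets."""
        return {k: (old[k], new[k]) for k in old if old[k] != new[k]}

    print(rebadge_diff(g210m, g310m))  # {} -> nothing changed: a pure rebadge
    ```

    An overclock-only refresh (like the 240M to 330M described above) would show up here as a single changed clock field, with everything else identical.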
     
  21. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    And that is called renaming/rebadging GPUs.

    They are simply overclocked/rebranded GPUs, in the case of the 330M.

    Sounds like the 8xxx to 9xxx or 9xxx to 1xx.
     
  22. grbac

    grbac Notebook Deity

    Reputations:
    137
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
    Yes but what about the new architecture of Fermi based mobile cards. Shouldn't that bring the expected improvements?
     
  23. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    "Should" is the key word there.

    Fermi is out as Tesla right now (for some thousands of dollars of course).

    The Fermi-based notebook chips should have been called GT3x0M... that post seems to counter that. Time will tell.
     
  24. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Called it.
     
  25. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Some of Nvidia's last rebrands had die shrinks to smaller processes as well as overclocks, though. This rebadge apparently wouldn't even have that.
     
  26. unlogic

    unlogic Notebook Evangelist

    Reputations:
    24
    Messages:
    310
    Likes Received:
    56
    Trophy Points:
    41
  27. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    There really shouldn't have to be one, as the latest mobile GPUs are already 40nm die shrinks of reduced-scale versions of the desktop G200 architecture.

    No one is doing 32nm GPUs yet. (Rumor is ATI will skip to 28nm in 2011 for the 6000 series)
     
  28. WankelRotor

    WankelRotor Notebook Consultant

    Reputations:
    50
    Messages:
    214
    Likes Received:
    0
    Trophy Points:
    30
    Since the power of a mobile GPU is really hard to advance for any manufacturer, I would like to see more SLI configs. 3x or 4x GTX 280M would give you a huuugee power boost without even changing any design process. Now with 21' laptops, it shouldn't be a problem anymore to fit more than 2 cards inside. If anyone will make it, it's Clevo :D
     
  29. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    I'd rather see IGP+GPU advances on all laptops, with support for Hybrid Power and switchable graphics.
     
  30. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    I don't think many people get 21" laptops... and I'm pretty sure NO ONE gets 21' ones (21 feet lol). :D
     
  31. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    http://www.youtube.com/v/wkJTgv6IA7c

    Fermi demonstrated
     
    Last edited by a moderator: May 6, 2015
  32. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
  33. grbac

    grbac Notebook Deity

    Reputations:
    137
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
    Hopefully that's not how the rest of the Fermi cards will start.
     
  34. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
  35. grbac

    grbac Notebook Deity

    Reputations:
    137
    Messages:
    982
    Likes Received:
    0
    Trophy Points:
    30
    More news. Coming later than expected.

    http://www.fudzilla.com/content/view/16685/34/
     
    Last edited by a moderator: May 8, 2015