The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    NVIDIA previews next-gen "Fermi" GPU design

    Discussion in 'Gaming (Software and Graphics Cards)' started by Serg, Oct 1, 2009.

  1. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    NVIDIA previews next-gen "Fermi" GPU design


    Wikipedia currently has some speculation on Fermi's design. Follow the link

    NVIDIA this evening provided an early look at the next generation of its graphics processors. Nicknamed Fermi, the architecture for future GeForce, Quadro and Tesla chips will jump from 240 cores to a much larger 512 and should be much faster per core courtesy of some industry-first techniques. Fermi chips will be the first GPUs to have a real cache hierarchy, with Level 1 caches to keep specific information on hand and a single, shared Level 2 cache for larger tasks; they will also have a new GigaThread engine that can transfer data in both directions at once and handle "thousands" of tasks concurrently.
    ECC memory will also be supported for the first time and should help Quadros and Teslas avoid errors in large-scale chores, especially in PC clusters.

    The chipmaker is short on full performance claims but says Fermi is much better tuned for general purpose computing. Its double-precision floating point math should run about eight times faster than the company's previous best and should provide much better performance with current and upcoming standards, including OpenCL, Microsoft's DirectCompute and NVIDIA's own CUDA.
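    The double-precision claim is easier to appreciate with a quick numerical sketch (plain Python, purely illustrative and unrelated to any NVIDIA API): single precision has a 24-bit significand and simply runs out of bits where double precision keeps going, which is why HPC code cares so much about fast double-precision hardware.

    ```python
    import struct

    def to_f32(x: float) -> float:
        # Round a Python float (64-bit double) to IEEE 754 single precision.
        return struct.unpack('f', struct.pack('f', x))[0]

    # float32 carries a 24-bit significand, so integers above 2**24 stop
    # being exact; float64 has 53 bits and resolves them without trouble.
    limit = 2.0 ** 24                    # 16,777,216
    print(to_f32(limit + 1.0) == limit)  # True: the +1 vanished in single precision
    print((limit + 1.0) == limit)        # False: double precision still sees it
    ```

    Workloads that accumulate millions of such small contributions (physics solvers, finance, clustering) are exactly where an 8x double-precision speedup pays off.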

    Release dates for hardware based on Fermi aren't available, but the first GeForce 300 series cards will be based on Fermi and may launch late this year.


    Via Electronista
     
    Last edited by a moderator: May 8, 2015
  2. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    I wonder what the size of this thing is going to be)
     
  3. italian.madness

    italian.madness Notebook Consultant

    Reputations:
    44
    Messages:
    249
    Likes Received:
    9
    Trophy Points:
    31
    - Transistors: from 681 million (G80) to 1.4 billion (GT200) to 3.0 billion (Fermi)
    - CUDA cores: from 128 (G80) to 240 (GT200) to 512 (Fermi)

    Impressive, right?
     
  4. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Looks nice indeed. But what will be the power draw of this monster!?
     
  5. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    That would depend on the memory bus. If it's beyond 256-bit, it will be huge.
     
  6. neilnat

    neilnat Notebook Evangelist

    Reputations:
    255
    Messages:
    655
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, I'm not seeing this monstrosity going mobile anytime soon. I mean, the G200-based chips have just hit the laptop market, and they max out at 96 cores at the moment (fewer than G92). The highest-end Nvidia cards are still using the G92 core.
     
  7. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    it'll be 384-bit wide, GDDR5
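    Back-of-the-envelope arithmetic shows why the bus width matters so much here. A quick sketch, where the 4.0 Gbps effective per-pin rate is an assumed illustrative figure for GDDR5 of that era, not a confirmed Fermi spec:

    ```python
    # Peak memory bandwidth ~= (bus width in bits / 8 bits per byte) * data rate.
    bus_width_bits = 384   # rumored Fermi memory interface width
    data_rate_gbps = 4.0   # assumed GDDR5 effective rate per pin (illustrative)

    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
    print(bandwidth_gb_s)  # 192.0 GB/s under these assumed numbers
    ```

    By the same arithmetic, a 256-bit bus at the same data rate would top out at 128 GB/s, which is why enthusiasts watch the bus width so closely.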
     
  8. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    UPDATE 1:
    Three NVIDIA Fermi-based chipsets on launch?

    NVIDIA's Fermi architecture will start with three models if it ships before the end of the year as promised, one leak (via Fudzilla) from Friday claims. A flagship single-chip model would have the 512 cores NVIDIA is advertising, but a second model would, like the GeForce GTX 295, have two slightly less powerful chips on one card that combined would be much faster. The slowdown may be necessary as Fudzilla believes the card would have a peak thermal power of 300W.
    Rounding out the line would be a slightly lower-end, single-chip model that would take the place of cards like the GTX 275 and would represent NVIDIA's most important model in terms of sales. Mid-range and budget cards aren't expected until 2010, and notebook chips may not show until much later in the same year.

    Fermi is NVIDIA's first major architectural overhaul since the GeForce 8 series and should be much faster, especially for OpenCL and other tasks that depend on general purpose computing rather than just graphics.

    Via Electronista
     
    Last edited by a moderator: May 8, 2015
  9. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    UPDATE 2:

    NVIDIA: Snow Leopard will benefit from Fermi

    The just-unveiled Fermi graphics architecture will find its way into Macs and play an important role in Mac OS X Snow Leopard, NVIDIA chief scientist Bill Dally said today. While it's expected that NVIDIA would continue to play an important part in future Macs, the researcher drew a particular connection between the new GPU design and Apple's new OS, expecting that it would provide a significant boost for those apps that implement OpenCL. Windows 7 will also get support through DirectX 11 and DirectCompute.
    "A lot of [the new] features accelerate key consumer applications," Dally told CNET. "Both Snow Leopard and Windows 7 enable the GPU to be used as a co-processor... [using a] discrete GPU they can get very good performance on these applications."

    He added that Fermi may lose some of its potential in raw graphics performance because of its heavy emphasis on general computing but that it should prove itself once more software starts using the hardware for more than video. Games can exploit unused resources on a Fermi chipset to render physics more accurately without as much of a performance hit, for example, while media creation tools like Photoshop can speed up filter rendering or other duties that normally wouldn't benefit from a faster video chipset.

    Few Mac apps are optimized for OpenCL at present. Most that do use general-purpose computing on graphics cards still rely on NVIDIA's proprietary CUDA framework instead, which doesn't work with AMD's ATI-branded cards or other competitors' products.

    Dally wouldn't say when GPUs based on the design would appear, but he did state that the first gaming and workstation parts would ship close together. Rumors have a minimum of three high-end cards launching before the end of 2009, targeting gamers and other performance users, while mobile and low-end parts aren't due until 2010.

    For Apple, higher-end desktop Macs with discrete graphics, such as the 24-inch iMac and Mac Pro, are most likely to get the upgrade first. 20-inch iMacs, Mac minis and all MacBooks use integrated mobile chipsets for graphics.

    Via Electronista
     
    Last edited by a moderator: May 8, 2015
  10. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
  11. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    That is bad news. If ATI is left alone, then they will charge HUUUUGE amounts of money for everything!!
    And Intel can't compete with ATI in gaming.

    NVIDIA going to HPC and mobiles, and Intel going into HPC. ATI stays in gaming alone. Oh no...
     
  12. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
  13. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
  14. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    That is massive!! 256-bit with 2GB GDDR5?
    How odd. DAAMIT always has something in the works but never really advertises it; they simply launch it or show it, and that's it, then sell it.
    Intel and NVIDIA, on the other hand, advertise all the time.
     
  15. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581

    Well, if ATI can get its 5000 series into a lot of notebooks, the market is now theirs to dominate, without much of a fight.
     
  16. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    If the info shared by Phinagle proves to be correct, then the notebook market is ATI's, especially with the introduction of the HD 5000 series.

    Wait, if there is no competitor, then ATI could sell the HD 3450 as a high-end gaming GPU and nobody could say no, since there is nothing else!! :eek:!!
     
  17. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Stop panicking, people... NVIDIA hasn't said anything about giving up on the gaming market yet, so for now we don't have to worry. NVIDIA's new GPUs should still be great for games... but for now we don't know if they're going to be as great for games... after all, we gamers are the ones who make them the most money by buying their gaming GPUs...
     
  18. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Intel has killed Nvidia's plans to make chipsets supporting Nehalem and future CPUs. Where you'll probably see ION is paired with Tegra in MIDs, PDAs, and netbooks.

    http://www.fudzilla.com/content/view/15785/1/