NVIDIA previews next-gen "Fermi" GPU design
Wikipedia currently has some speculation on Fermi's design; follow the link.
NVIDIA this evening provided an early look at the next generation of its graphics processors. Codenamed Fermi, the architecture for future GeForce, Quadro and Tesla chips will jump from 240 cores to a much larger 512 and should be significantly faster per core courtesy of some industry-first techniques. Fermi chips will be the first GPUs with a true cache hierarchy: per-core Level 1 caches keep frequently used data close at hand, while a single, shared Level 2 cache serves larger tasks. They will also have a new GigaThread engine that can transfer data in both directions simultaneously and juggle "thousands" of tasks at once.
ECC memory support will also appear for the first time and should help Quadro and Tesla cards avoid errors in large-scale tasks, especially in PC clusters.
The chipmaker is short on full performance claims but says Fermi is much better tuned for general-purpose computing. Its double-precision floating-point math should run about eight times faster than the company's previous best, and it should provide much better performance with current and upcoming standards, including OpenCL, Microsoft's DirectCompute and NVIDIA's own CUDA.
Release dates for hardware based on Fermi aren't available, but the first GeForce 300 series cards will be based on Fermi and may launch late this year.
Via Electronista
-
I wonder what the size of this thing is going to be.
-
italian.madness Notebook Consultant
- Transistors: from 681 million (G80) to 1.4 billion (GT200) to 3.0 billion (Fermi)
- CUDA cores: from 128 (G80) to 240 (GT200) to 512 (Fermi)
Impressive right? -
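The jumps quoted above work out to roughly a doubling each generation. A quick sanity check of the ratios (illustrative arithmetic only, using the figures posted in this thread):

```python
# Generational scaling of NVIDIA GPUs, using the figures quoted above.
specs = {
    "G80":   {"transistors": 681e6, "cuda_cores": 128},
    "GT200": {"transistors": 1.4e9, "cuda_cores": 240},
    "Fermi": {"transistors": 3.0e9, "cuda_cores": 512},
}

names = list(specs)
for prev, curr in zip(names, names[1:]):
    t_ratio = specs[curr]["transistors"] / specs[prev]["transistors"]
    c_ratio = specs[curr]["cuda_cores"] / specs[prev]["cuda_cores"]
    print(f"{prev} -> {curr}: {t_ratio:.2f}x transistors, {c_ratio:.2f}x CUDA cores")
```

Transistor count roughly doubles at each step, while CUDA cores grow about 1.9x from G80 to GT200 and about 2.1x from GT200 to Fermi.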
Looks nice indeed. But what will be the power draw of this monster!?
-
That would depend on the memory bus. If it's beyond 256 bit, it will be huge
-
Yeah, I'm not seeing this monstrosity going mobile anytime soon. I mean, the G200-based chips have just hit the laptop market, and they max out at 96 cores at the moment (fewer than G92). The highest-end NVIDIA mobile cards are still using the G92 core.
-
It'll be 384-bit wide, GDDR5.
-
UPDATE 1: Three NVIDIA Fermi-based chipsets at launch?
NVIDIA's Fermi architecture will start with three models if it ships before the end of the year as promised, one leak (via Fudzilla) from Friday claims. A flagship single-chip model would have the 512 cores NVIDIA is advertising, while a second model would, like the GeForce GTX 295, pair two slightly less powerful chips on one card that combined would be much faster. The downclocking may be necessary as Fudzilla believes the card would have a peak thermal power of 300W.
Rounding out the line would be a slightly lower-end, single-chip model that would take the place of cards like the GTX 275 and would represent NVIDIA's most important model in terms of sales. Mid-range and budget cards aren't expected until 2010, and notebook chips may not show until much later in the year.
Fermi is NVIDIA's first major architectural overhaul since the GeForce 8 series and should be much faster, especially for OpenCL and other tasks that depend on general purpose computing rather than just graphics.
Via Electronista -
UPDATE 2:
NVIDIA: Snow Leopard will benefit from Fermi
The just-unveiled Fermi graphics architecture will find its way into Macs and play an important role in Mac OS X Snow Leopard, NVIDIA chief scientist Bill Dally said today. While it's expected that NVIDIA would continue to play an important part in future Macs, the researcher drew a particular connection between the new GPU design and Apple's new OS, expecting that it would provide a significant boost for apps that implement OpenCL. Windows 7 will also get support through DirectX 11 and DirectCompute.
"A lot of [the new] features accelerate key consumer applications," Dally told CNET. "Both Snow Leopard and Windows 7 enable the GPU to be used as a co-processor... [using a] discrete GPU they can get very good performance on these applications."
He added that Fermi may lose some of its potential in raw graphics performance because of its heavy emphasis on general computing but that it should prove itself once more software starts using the hardware for more than video. Games can exploit unused resources on a Fermi chipset to render physics more accurately without as much of a performance hit, for example, while media creation tools like Photoshop can speed up filter rendering or other duties that normally wouldn't benefit from a faster video chipset.
Few Mac apps are optimized for OpenCL at present. Most that do support general-purpose computing on graphics cards still use NVIDIA's proprietary CUDA platform instead, which doesn't work with AMD's ATI-branded cards or other competitors' products.
Dally wouldn't say when GPUs based on the design would appear, but he did state that the first gaming and workstation parts would ship close together. Rumors have a minimum of three high-end cards launching before the end of 2009, targeting gamers and other performance users, while mobile and low-end parts aren't due until 2010.
For Apple, higher-end desktop Macs with discrete graphics, such as the 24-inch iMac and Mac Pro, are most likely to get the upgrade first. 20-inch iMacs, Mac minis and all MacBooks use integrated mobile chipsets for graphics.
Via Electronista -
NVIDIA's Fermi takes direct aim at supercomputing, Intel
...leaving the dying PC gaming market to ATI. -
That's bad news. If ATI is left alone, they will charge HUUUUGE amounts of money for everything!!
And Intel can't compete with ATI in gaming.
NVIDIA is going to HPC and mobile, and Intel is going into HPC. ATI stays in gaming alone. Oh no... -
-
That is massive!! 256-bit with 2GB GDDR5?
How odd. DAAMIT always has something there, but they never really advertise it; they simply launch or show it, and that's it, then sell it.
Intel and NVIDIA, on the other hand, advertise all the time. -
Well, if ATI can get its 5000 series into a lot of notebooks, the market is now theirs to dominate, without much of a fight. -
If the info shared by Phinagle proves to be correct, then the notebook market is ATI's, especially with the introduction of the HD 5000 series.
Wait, if there is no competitor, then ATI could sell the HD 3450 as a high-end gaming GPU and nobody could say no, since there is nothing else!!!!
-
Stop panicking, people... NVIDIA hasn't said anything about giving up on the gaming market yet, so for now we don't have to worry. NVIDIA's new GPUs will be great for compute, but for now we don't know if they're going to be as great for games... After all, we gamers are the ones who make them the most money by buying their gaming GPUs...
-
http://www.fudzilla.com/content/view/15785/1/
Discussion in 'Gaming (Software and Graphics Cards)' started by Serg, Oct 1, 2009.