http://www.hothardware.com/News/ATI_Radeon_4000Series_to_Use_GDDR5_Memory/
"The higher data rates supported by GDDR5 up to 5x that of GDDR3 and 4x that of GDDR4 enable more bandwidth over a narrower memory interface, which can translate into superior performance delivered from smaller, more cost-effective chips."
-
Yeah, we heard the news, but will they live up to expectations? That's what we're all wondering...
-
Aww, I thought I was posting it first.
-
Go ATI!
-
Considering we barely saw an entry from the 2000 or 3000 series, maybe the 4000 series will fit the mobile power envelope AND give Nvidia some much needed competition.
-
Hehe, I wonder how much they'll cost?
-
TheGreatGrapeApe Notebook Evangelist
Well, it's been expected for months, which is why it's not news. Posting twice? Who cares, we can discuss it once, twice, three times a.....
The area where it will benefit the most: with the error-correction and shared-I/O changes, an Xfire card can share a single memory pool instead of duplicating the memory requirements, which will mean more effective and cheaper X2 solutions. It'll be interesting to see what they do with the R700-named solution.
For the standard RV770XT/4870 it really just means similar bandwidth to the R600 without the transistor and PCB cost of a 512-bit interface.
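Back-of-the-envelope with the clocks that have been floating around (rumored numbers, so treat them as assumptions):

# Rough comparison using rumored clocks, not confirmed specs:
# R600/HD2900XT: 512-bit bus, GDDR3 at ~1656 MT/s effective
# RV770XT/4870:  256-bit bus, GDDR5 at ~3600 MT/s effective
def gb_per_s(mt_per_s, bus_bits):
    return mt_per_s * bus_bits / 8 / 1000

print(gb_per_s(1656, 512))  # ~106 GB/s on the wide, expensive bus
print(gb_per_s(3600, 256))  # ~115 GB/s on half the pins

-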
Was GDDR4 even out yet?
-
ViciousXUSMC Master Viking NBR Reviewer
All that bandwidth, but probably horrid latencies. Not as important for a GPU as for system RAM in general, I think, but when you consider the big change in latency from DDR to DDR2, and from DDR2 to GDDR3, you've got to figure GDDR5's latencies are going to be horrid too, and it will suffer in some manner because of that.
-
Jayayess1190 Waiting on Intel Cannonlake
All of them, even the low-end ones?
-
ATI had GDDR4 for a while, and their cards still couldn't outperform Nvidia's common GDDR3...
-
I'm getting two and playing Age of Conan at high settings in DX10 at 60 FPS! Hopefully.
-
Never mind. I have no right to rain on your positive parade.
It is you young folk and your hope that keep me going! -
Wait, there was an ATi 3000 series?
I hope ATi will finally be able to challenge Nvidia. -
=O Now I'm wondering what you said -
Sorry for the OT post, but great signature! Top Gear rules... I wish they would shoot it in HD though! -
TheGreatGrapeApe Notebook Evangelist
Look at Qimonda's clamshell mode and think of its benefit for X2 solutions especially:
http://www.qimonda-news.com/download/Qimonda_GDDR5_whitepaper.pdf
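The gist, as I read the whitepaper (simplified, so don't quote me on the details): in clamshell mode two DRAMs share one 32-bit channel at 16 bits each, doubling the capacity per channel without adding controller pins.

# Simplified sketch of the clamshell idea (my reading of the whitepaper, not official):
# normal mode:    one x32 device per 32-bit channel
# clamshell mode: two devices share the channel at x16 each -> 2x capacity, same pins
def channel_capacity_mb(device_mb, clamshell=False):
    return device_mb * (2 if clamshell else 1)

print(channel_capacity_mb(128))                  # 128 MB per channel
print(channel_capacity_mb(128, clamshell=True))  # 256 MB, no extra controller pins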
In the end it's better to have support than not to, but memory alone doesn't make or break a card, as the R600's 512-bit interface and the G80's broken memory management both showed.
The main thing depends on the memory requirements of both textures and buffers, as well as the efficiency of the chip's memory management. -
The change in latency?
Good DDR2 RAM has lower latency than the best DDR.
DDR3 is still so new that its latencies are a bit higher, but that's only a matter of time; they'll reach parity with DDR2 soon enough.
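The cycles-to-nanoseconds math backs that up: the CAS numbers climb every generation, but so does the clock, so absolute latency stays roughly flat. Quick sanity check with typical retail timings (from memory, so approximate):

# Absolute CAS latency (ns) = CAS cycles / I/O clock (MHz) * 1000
# Timings below are typical retail parts as I remember them, not a spec sheet.
def cas_ns(cas_cycles, io_clock_mhz):
    return cas_cycles / io_clock_mhz * 1000

print(cas_ns(2, 200))  # DDR-400   CL2: 10.0 ns
print(cas_ns(4, 400))  # DDR2-800  CL4: 10.0 ns
print(cas_ns(7, 667))  # DDR3-1333 CL7: ~10.5 ns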
So no, I don't see the problem. -
-
GDDR3 is analogous to DDR2 with a few enhancements, right? So what are GDDR4 and GDDR5? Is 4 another set of enhancements and 5 based on DDR3? Or have they split from the DDR base and created a completely different set of standards?
-
It's different.
Even though GDDR4 was supposedly better, the ATI cards didn't show any difference compared to the same card with GDDR3 -
-
I've not even heard of GDDR4, and now ATI's going forward with GDDR5?
Why do I feel out of the loop? -
TheGreatGrapeApe Notebook Evangelist
In the same card, the HD2600XT showed that despite any perceived latency issues, the extra 400(800) MHz made a big difference. At the same clocks the benefit of either wouldn't be very big (it would take a lot of work to even expose a difference).
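The bandwidth delta there is easy to work out (clocks from memory, so treat them as approximate):

# HD2600XT, 128-bit bus; clocks from memory, approximate:
# GDDR3 version ~700 MHz (1400 MT/s effective), GDDR4 version ~1100 MHz (2200 MT/s)
def gb_per_s(mt_per_s, bus_bits=128):
    return mt_per_s * bus_bits / 8 / 1000

print(gb_per_s(1400))  # GDDR3: ~22.4 GB/s
print(gb_per_s(2200))  # GDDR4: ~35.2 GB/s, the "extra 400(800)" MHz in practice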
The problem with the R600, however: what is gigantic memory bandwidth going to do for a card that has a relatively low number of TMUs and only rarely gets a chance to use its buffers to even half their potential? It wasn't the memory that was the limiting or freeing factor.
The G80 would've benefited more from that level of bandwidth, but it had its own issues with its memory bug.
Also, some of the main benefits of GDDR4, like the heat and efficiency gains, were lost because they didn't use its low 1.5V mode; they ran it in 1.8V mode to get the higher speed.
It's like moving process nodes. Going from one process to another doesn't guarantee lower power, less heat, and higher yields if you clock the chip so high that it wastes those benefits. -
ATI's GDDR4 failed... let's see how it scales with GDDR5
-
TheGreatGrapeApe Notebook Evangelist
Really, just like with DDR2/GDDR2, it was the close proximity of the replacement that made it easy for manufacturers to simply skip the generation and move on to the refined next one.
Had the time frames been longer (like with desktop RAM) you likely would've seen more effort spent on refinement, but at this point why bother; simply move to GDDR5 and go one step further.
Still would prefer to see XDR make a move into this market while it still has an advantage.