The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    New ATI Radeon 4000's to use GDDR5 memory!

    Discussion in 'Gaming (Software and Graphics Cards)' started by nizzy1115, May 21, 2008.

  1. nizzy1115

    nizzy1115 Notebook Prophet

    Reputations:
    2,557
    Messages:
    6,682
    Likes Received:
    1
    Trophy Points:
    205
  2. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Yea, we heard the news, but will they live up to the expectations? That's what we're all wondering...
     
  3. nizzy1115

    nizzy1115 Notebook Prophet

    Reputations:
    2,557
    Messages:
    6,682
    Likes Received:
    1
    Trophy Points:
    205
    Aww, I thought I was posting it first. :nah: :nah:
     
  4. Kain

    Kain Notebook Evangelist

    Reputations:
    22
    Messages:
    588
    Likes Received:
    1
    Trophy Points:
    31
    Go ATI! :D
     
  5. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Considering we barely saw an entry from the 2000 or 3000 series, maybe the 4000 series will fit the mobile power envelope AND give Nvidia some much needed competition.
     
  6. Xirurg

    Xirurg ORLY???

    Reputations:
    3,189
    Messages:
    7,375
    Likes Received:
    3
    Trophy Points:
    206
    hehe, I wonder how much they will cost?
     
  7. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Well, it's been expected for months, which is why it's not news. Posting twice? Who cares; we can discuss it once, twice, three times a.....

    The area where it will benefit the most is the error correction and shared I/O changes: an Xfire card whose GPUs share the same memory pool instead of duplicating the memory requirements will mean more effective and cheaper X2 solutions. It'll be interesting to see what they do with the R700-named solution.

    For the standard RV770XT/4870 it really just means similar bandwidth to the R600 without the transistor and PCB cost of a 512-bit interface.
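    To put rough numbers on that point: peak memory bandwidth is just bus width times per-pin data rate, so a narrow GDDR5 bus can match a wide GDDR3 one. A minimal sketch, using illustrative data rates close to the shipping HD 2900 XT and HD 4870 specs (the function name and exact figures are my own assumptions, not from this thread):

```python
def bandwidth_gbps(bus_width_bits, data_rate_mtps):
    """Peak bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

# R600 (HD 2900 XT): 512-bit bus, GDDR3 at ~1656 MT/s effective
r600 = bandwidth_gbps(512, 1656)   # ~106 GB/s
# RV770 (HD 4870): 256-bit bus, GDDR5 at ~3600 MT/s effective
rv770 = bandwidth_gbps(256, 3600)  # ~115 GB/s
```

    So the 256-bit GDDR5 part edges out the 512-bit GDDR3 part on raw bandwidth while needing half the memory interface pins.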
     
  8. Nirvana

    Nirvana Notebook Prophet

    Reputations:
    2,200
    Messages:
    5,426
    Likes Received:
    0
    Trophy Points:
    0
    Was GDDR4 even out yet?
     
  9. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    All that bandwidth, but probably horrid latencies. Not as important in general for a GPU as for system RAM, I think, but when you consider the big change in latency from DDR to DDR2, and DDR2 to GDDR3, you gotta figure GDDR5 is going to be horrid, and as such will suffer in some manner due to that.
     
  10. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    All of them, even the low ends?
     
  11. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    ATI had GDDR4 for a while, and their cards still could not outperform Nvidia's common GDDR3...
     
  12. bmwm3oz

    bmwm3oz Notebook Consultant

    Reputations:
    13
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30
    I'm getting two and playing Age of Conan at High settings with DX10 60FPS! Hopefully :)
     
  13. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Uhh...

    Nevermind. I have no right to rain on your positive parade.

    It is you young folk and your hope that keep me going!
     
  14. nbaumann

    nbaumann Notebook Deity

    Reputations:
    105
    Messages:
    722
    Likes Received:
    0
    Trophy Points:
    30
    Wait, there was an ATi 3000 series? :D

    I hope ATi will finally be able to challenge Nvidia.
     
  15. bmwm3oz

    bmwm3oz Notebook Consultant

    Reputations:
    13
    Messages:
    291
    Likes Received:
    0
    Trophy Points:
    30

    =O Now I'm wondering what you said :rolleyes:
     
  16. mattldm

    mattldm Notebook Geek

    Reputations:
    5
    Messages:
    87
    Likes Received:
    0
    Trophy Points:
    15
    To Xirurg:
    Sorry for the OT post, but great signature! Top gear rules... I wish they would shoot it in HD though!
     
  17. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    I'd still take 70+% more speed over slightly elevated latencies, though, especially if it comes with benefits like lower heat and better I/O management such as split I/O and error correction.

    Look at Qimonda's clamshell mode and think of its benefit for X2 solutions especially:
    http://www.qimonda-news.com/download/Qimonda_GDDR5_whitepaper.pdf

    In the end it's better to have support than not, but memory alone doesn't make or break the card, as both the R600's 512-bit interface and the G80's broken memory management showed.

    The main thing depends on the memory requirements of both textures and buffers, as well as the efficiency of the chip's memory management.
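    The buffer side of those memory requirements is easy to estimate: a color buffer scales with resolution, bytes per pixel, AA samples, and buffer count. A rough back-of-the-envelope sketch (the function and its default parameters are illustrative assumptions, not anything from the thread):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, aa_samples=1, buffers=2):
    """Rough color-buffer footprint in MiB: resolution x bpp x AA samples x buffer count."""
    return width * height * bytes_per_pixel * aa_samples * buffers / 2**20

# 1920x1200, 32-bit color, 4x MSAA, double-buffered (illustrative only)
fb = framebuffer_mb(1920, 1200, 4, 4, 2)  # ~70 MiB before textures and Z
```

    Even a heavy AA configuration like this leaves most of a 512 MB card's memory for textures, which is why raw bandwidth rather than capacity tends to be the pressure point.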
     
  18. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Huh?
    The change in latency?
    Good DDR2 RAM has lower latency than the best DDR.
    DDR3 is still so new that its latencies are a bit higher, but that's only a matter of time. They'll reach parity with DDR2 soon enough.

    So no, I don't see the problem.
     
  19. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
    He did say improved performance on narrower memory interface so I think the low-end is included too.
     
  20. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    GDDR3 is analogous to DDR2 with a few enhancements, right? So what is GDDR4 and GDDR5? Is 4 another set of enhancements and 5 DDR3? Or have they split from the DDR base and created a completely different set of standards?
     
  21. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105
    It's different.

    Even though GDDR4 was supposedly better, the ATI cards didn't show any difference compared to the same card with GDDR3.
     
  22. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    It was just hype... to the common layman, GDDR4 naturally sounds better than GDDR3...
     
  23. Sotsu

    Sotsu Notebook Guru

    Reputations:
    11
    Messages:
    57
    Likes Received:
    0
    Trophy Points:
    15
    I've not even heard of GDDR4, and now ATI's going forward with GDDR5?

    Why do I feel out of the loop?
     
  24. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    In the same card or at the same clocks?
    In the same card, the HD2600XT showed that despite any perceived latency issues, the extra 400(800)MHz was a big difference. At the same clocks, the benefits for either wouldn't be very big (it'd take a lot of work to even expose a difference).

    The problem with the R600, however: what is gigantic memory bandwidth going to do for a card that has a relatively low number of TMUs and only rarely gets a chance to use its buffers to even half their potential? It's not the memory that was the limiting or freeing factor.

    The G80 would've benefited more from that level of bandwidth, but it had its own issues with its memory bug.

    Also, some of the main benefits of GDDR4, like the heat and efficiency gains, were lost because they didn't use its low 1.5V mode; they ran in 1.8V mode to get the higher speed.

    It's like moving processes. Going from one process to another doesn't guarantee lower power, lower heat, and higher yields if you clock it so high that it wastes those benefits.
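    The voltage point can be made concrete with the first-order CMOS dynamic-power relation P ∝ C·V²·f: running GDDR4 in its 1.8V mode instead of 1.5V costs roughly 44% more dynamic power at the same clock, before the higher clock is even counted. A sketch assuming only that simple quadratic model (the function is my own illustration):

```python
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """First-order CMOS dynamic power scaling: P ~ C * V^2 * f."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# GDDR4 in its 1.8 V high-speed mode vs its 1.5 V low-power mode, same clock:
ratio = dynamic_power_ratio(1.8, 1.5)  # 1.44 -> ~44% more dynamic power
```

    Real memory power also includes static leakage and I/O termination, so this is only the first-order trend, but it shows why the 1.5V efficiency headline evaporated at 1.8V.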
     
  25. brainer

    brainer Notebook Virtuoso

    Reputations:
    334
    Messages:
    2,478
    Likes Received:
    4
    Trophy Points:
    56
    ATI's GDDR4 failed... let's see how it scales with GDDR5.
     
  26. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    I guess it's a good thing we've had ATi's GDDR3 to fall back on and their GDDR5 to look forward to. ;)

    Really, just like DDR-2/GDDRII, it was the close proximity of the replacement that made it easy for manufacturers to simply skip the generation and move on to the refined next generation.

    Had the time frames been longer (like desktop RAM), you likely would've seen more effort spent on refinement, but at this point why bother; simply move on to GDDR5 and go one step further.

    Still would prefer to see XDR make a move into this market while it still has an advantage.