The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GDDR4

    Discussion in 'Gaming (Software and Graphics Cards)' started by Serg, Sep 15, 2009.

  1. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    What happened to GDDR4??
    We have GDDR3 as the standard, but according to this: http://en.wikipedia.org/wiki/GDDR4 GDDR4 has existed since 2006, and it is better and faster, yet I know of no GPU using it.

    What happened?
     
  2. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    GDDR4 just never caught on, except for a few limited GPUs. Possibly it just entered the market at the wrong time...
     
  3. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    But was it supposed to be any good?
    Was it worth it?
    It simply disappeared......?? WOW
     
  4. LisuPoland

    LisuPoland Notebook Deity

    Reputations:
    150
    Messages:
    830
    Likes Received:
    21
    Trophy Points:
    31
    I wondered the same; probably it turned out to be too expensive to manufacture, or it might have been sabotaged by the companies which developed GDDR3, to force us to buy the less effective one for a longer period of time...
     
  5. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    I came across this question since GDDR5 is becoming the standard on high-end GPUs and GDDR3 on mainstream ones, but (pretty obviously) I noticed a total lack of GDDR4... and it got me wondering whatever happened to it?
     
  6. LisuPoland

    LisuPoland Notebook Deity

    Reputations:
    150
    Messages:
    830
    Likes Received:
    21
    Trophy Points:
    31
    Well then, let's just hope that they will skip GDDR4 and go right to GDDR5 as mainstream soon.
     
  7. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    There was an attempt to make GDDR4 mainstream back in 2007 AFAIK, but it failed. Interestingly enough, hardly anybody even knew it existed, and I don't think it will suddenly appear out of nowhere and become a standard. I'd rather go GDDR5. :wink:
     
  8. LisuPoland

    LisuPoland Notebook Deity

    Reputations:
    150
    Messages:
    830
    Likes Received:
    21
    Trophy Points:
    31
    Yeah, GDDR5 is the best bet, since it's newer tech. Damn, soon we will have GDDRBFG20 as the standard before we notice... or even crazier stuff.
     
  9. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    HUH?
     
  10. Peter Bazooka

    Peter Bazooka Notebook Evangelist

    Reputations:
    109
    Messages:
    642
    Likes Received:
    1
    Trophy Points:
    31
    I put a Radeon 4670 that uses GDDR4 in my brother's desktop computer. He just wanted a computer to play WoW at 1440x900, so I picked up a Vostro desktop when it was cheap, for about $450, and the 4670 was the fastest available card that didn't require external power. All but one version of the 4670 use GDDR3, but for $10 extra and a giant heatsink I chose to get the one with GDDR4.

    Here is a link to some benchmarks, and it looks like, for almost no cost to the consumer, it provides 5-10% better real-world performance than the GDDR3 card, and even more if overclockability is considered.

    http://www.elites.com/?option=com_content&task=view&id=677&Itemid=27
    http://www.firingsquad.com/hardware/sapphire_radeon_4670_gddr4_review/page3.asp

    My guess is that it is a matter of cost, and manufacturers don't think it makes enough of a difference. Truth be told, I wish they offered GDDR4 over GDDR3, because memory speed does make a difference on the same card.

    Edit: Sorry, the first link doesn't work; it took me a minute to figure out why, and it is because the website is called elite-basterds, albeit spelled the correct way, and the filter here auto-deletes it...
     
  11. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    The 9600 GT won over the HD 4670??? Interesting.
    Oh, and a 5-10% increase is negligible.
     
  12. Garandhero

    Garandhero Notebook Deity

    Reputations:
    262
    Messages:
    1,522
    Likes Received:
    1
    Trophy Points:
    56
    I want it. I have stinky GDDR2 memory...

    Would upping to GDDR3 be worth it for me?
     
  13. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    GDDR4 didn't pick up because the manufacturing cost + yields weren't and aren't worth the small jump over the already prevalent GDDR3.
    Probably because the desktop 9600 GT is 256-bit, 64 shaders, and with minimum clock rates of 600/1500/900. Makes the "9600M" look like a joke.
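    The clock figures above translate directly into peak memory bandwidth, which is the main reason the 256-bit desktop 9600 GT outruns the 128-bit HD 4670. A minimal sketch of the arithmetic (the HD 4670 GDDR4 clock of ~1100 MHz is an assumption for illustration; GDDR3/GDDR4 both transfer twice per clock):

```python
def bandwidth_gbps(bus_bits: int, effective_mts: float) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    bus_bits:      memory interface width in bits (e.g. 256)
    effective_mts: effective transfer rate in MT/s (DDR: 2x the memory clock)
    """
    # bits -> bytes per transfer, times transfers per second, scaled to GB/s
    return bus_bits / 8 * effective_mts / 1000

# Desktop 9600 GT: 256-bit bus, 900 MHz GDDR3 -> 1800 MT/s effective
print(bandwidth_gbps(256, 1800))  # 57.6 GB/s

# HD 4670 GDDR4: 128-bit bus, assumed ~1100 MHz -> 2200 MT/s effective
print(bandwidth_gbps(128, 2200))  # 35.2 GB/s
```

    So even with faster GDDR4 chips, the 4670's narrower bus leaves it well behind the 9600 GT on raw bandwidth.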
     
  14. Peter Bazooka

    Peter Bazooka Notebook Evangelist

    Reputations:
    109
    Messages:
    642
    Likes Received:
    1
    Trophy Points:
    31
    Yeah, the 9600 GT is a more powerful card, and at almost the same cost as the 4670 it provides more performance. This is because it utilizes a 256-bit memory interface, but it also uses twice the power and requires an external power connector (I think they have a version now that doesn't require it, but it's new). The 5-10% performance gain may be negligible, and that is probably why GDDR4 didn't catch on, but I chose the card for its non-reference cooler, which is much quieter than stock while keeping the card cooler, and its included HDMI port; the extra speed was just a bonus that I would appreciate on any card.
     
  15. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Thanks for the info. I had for a moment forgotten those were desktop, not mobile, GPUs. lol
     
  16. Prydeless

    Prydeless Stupid is

    Reputations:
    592
    Messages:
    1,091
    Likes Received:
    3
    Trophy Points:
    56