The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    ATI R600 is going to be a monster!!!

    Discussion in 'Gaming (Software and Graphics Cards)' started by uncleG, Nov 15, 2006.

  1. uncleG

    uncleG Notebook Consultant

    Reputations:
    1
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
  2. Viperjts10

    Viperjts10 Notebook Evangelist

    Reputations:
    69
    Messages:
    447
    Likes Received:
    0
    Trophy Points:
    30
Nvidia wouldn’t be “gone”. It may temporarily be outperformed by ATI, but it will never be gone. Obviously, if ATI is able to do this, Nvidia will find a way as well and will probably overcome anything ATI brings at it.
     
  3. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
I do not really believe news from the Inquirer. And 2 GB of video RAM would be too expensive, and no game can make use of 2 GB of GDDR.

    Charlie :)
     
  4. gusto5

    gusto5 Notebook Deity

    Reputations:
    54
    Messages:
    760
    Likes Received:
    0
    Trophy Points:
    30
    ditto. totally agree.
     
  5. brianstretch

    brianstretch Notebook Virtuoso

    Reputations:
    441
    Messages:
    3,667
    Likes Received:
    0
    Trophy Points:
    105
    Most likely 2GB is for the FireGL series cards, not gaming cards, as the article says. 1GB for a gaming card is believable though.

    It looks like nVidia and DAAMIT will leapfrog each other for a while rather than release cards on similar schedules. Works for me.
     
  6. Fishy

    Fishy Notebook Evangelist

    Reputations:
    145
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
Even the most powerful card cannot utilise 2GB of video RAM. System RAM yes, video... no. And Nvidia has always been one step ahead of ATI and ALWAYS WILL BE!! Nvidia RULES!!
     
  7. oglsmm

    oglsmm Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    ...

    ATI or Nvidia, both make excellent products and both are always playing leapfrog with performance crowns.

At this point in time neither is "better". Each is better suited to different apps/programs. However, both will end up accomplishing the same thing. Personally I prefer ATI, however that is simply because it is Canadian (or was).

Nvidia has the crown now, ATI will take it with their next generation, however Nvidia will take it back.... they love each other. Without the other, neither one of them would be able to continuously entice people to keep upgrading their cards.

    The competition is what drives the business.
     
  8. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,090
    Trophy Points:
    931
    For future reference, please do not make thread titles with censored words in them. We have the word filters on for a reason - we have no desire to see those words in any form. Thanks.
     
  9. metalneverdies

    metalneverdies Notebook Evangelist

    Reputations:
    151
    Messages:
    314
    Likes Received:
    0
    Trophy Points:
    30
LOL, it's the Inquirer, never believe anything they say
     
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,913
    Trophy Points:
    931
Let's wait and see, shall we? ;)
     
  11. PC_pulsar

    PC_pulsar Notebook Evangelist

    Reputations:
    5
    Messages:
    479
    Likes Received:
    0
    Trophy Points:
    30
Both cards will be almost equally fast. Mark my words! The ATI card will be a little bit faster, but it won't be much faster.
     
  12. uncleG

    uncleG Notebook Consultant

    Reputations:
    1
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    I guess we'll just have to wait and see. WHOO HAHHAHA ;)
     
  13. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    less fanboyism (sounds kinda queer), and more objective speculation please



    hmmmmm,

i think by the time the r600 comes out, nvidia will probably have the low, mid, and ultra high end 8900s out, so while it may be true that the r600 will be a monster, it won't be the only one.

that's hard for me to say, especially since ati has served me faithfully, but it's the truth as nvidia came out with the first dx10 cards


    and history has shown that murphy's law may hold true for newer technology


anyhow, these gpus will undoubtedly be much better than current gen gpus and will have a much longer lifetime than the dx9 cards we all have
     
  14. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Not true. At *this* point in time, NVidia is clearly better. Geforce 8800 vs... uh... X1900? Yeah, I think we can safely say that *right now* NVidia has the "better" product.

    And there are plenty of rumors about the R600. Most of them say it will be faster, and most of them say so because they started back when the G80 was thought to be some weird hybrid design instead of unified shaders. And none of them are based on anything like actual facts. The G80 is quite a bit more powerful than a lot of people had expected, so maybe the R600 won't slaughter it after all...

    As hmmmm said, by the time the first R600's are out, NVidia will more or less be ready with 8900 cards.

And of course, as has already been said, don't take anything The Inquirer says as fact. They're good rumour-mongers, but that's all they're good for.

(On a related note, has anyone noticed their "GPUs won't consume 300W. It's so wrong of people to spread false rumours like this" article today? Anyone remember who first started claiming this generation of GPUs would use 300W? Yep, that's right, the Inq... Morons... :))
     
  15. Phillip

    Phillip Phillip J. Fry

    Reputations:
    1,302
    Messages:
    1,736
    Likes Received:
    0
    Trophy Points:
    55
    that is one impressive card
     
  16. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    "First of all, you need to know that this PCB (Printed Circuit Board) is the most expensive one that the graphics chip firm has ever ordered."

    Here goes my wallet....
     
  17. Phillip

    Phillip Phillip J. Fry

    Reputations:
    1,302
    Messages:
    1,736
    Likes Received:
    0
    Trophy Points:
    55
    Well, all I can do is dream. I don't have the means to make such a purchase.
     
  18. Ice-Tea

    Ice-Tea MXM Guru NBR Reviewer

    Reputations:
    476
    Messages:
    1,260
    Likes Received:
    0
    Trophy Points:
    55

Sure you do. I mean, you don't really need that left kidney, do you?
     
  19. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    HAhahahah

the comments in this thread are funny


nice one Ice-Tea
     
  20. Phillip

    Phillip Phillip J. Fry

    Reputations:
    1,302
    Messages:
    1,736
    Likes Received:
    0
    Trophy Points:
    55
Yeah I do, I need to pay for next semester's fees and books.