The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ati's comeback

    Discussion in 'Gaming (Software and Graphics Cards)' started by Rickyz80, Jul 26, 2007.

  1. Rickyz80

    Rickyz80 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    I have to admit, when I first saw the 2600XT come about I was less than impressed, because I thought it would be a midrange card. But upon further reading, it looks as if this is a contender for the 8800M series, judging by the number of stream processors in comparison to the 8700 and the 256-bit bus interface. Is this too good to be true, and am I missing something?

    Here's the link for the 2600XT's specs:
    http://http://ati.amd.com/products/mobilityradeonhd2600xt/specs.html

    and here's the link for the 8700m:
    http://http://www.nvidia.com/object/geforce_8700m.html

    Has the king of mobile cards been here under our noses, and we didn't know of it because of an x6xx in the name? :eek:
     
  2. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    Fix those links please.

    Anyways, yes, the 2600XT does seem to be a very powerful mobile card. I know it surpasses the 8600GT, but I do not know about the 8700M GT.
     
  3. Amblin42

    Amblin42 Notebook Guru

    Reputations:
    75
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    You have an extra "http://" in those links.
     
  4. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    you can't count stream processors the same for ati and nvidia.

    they are completely different. expect equal performing cards to have way fewer stream processors in nvidia cards than in ati cards.

    also, the mobility versions of the 8600m gt, 8700m gt, and hd 2600 xt all have a 128 bit memory bus.

    i guess you were referring to the desktop versions.
     
  5. Ayepecks

    Ayepecks Notebook Evangelist

    Reputations:
    11
    Messages:
    342
    Likes Received:
    0
    Trophy Points:
    30
    I don't know, but ATI's desktop cards absolutely pale in comparison to nVidia's at the moment. They are just as good statistically, but in actual game benchmarks they are very far behind. I'm going to bet the same happens with these laptop cards. Hopefully that'll change, though.
     
  6. baddogboxer

    baddogboxer Notebook Deity

    Reputations:
    144
    Messages:
    1,092
    Likes Received:
    0
    Trophy Points:
    0
  7. Alias

    Alias Notebook Deity

    Reputations:
    78
    Messages:
    714
    Likes Received:
    143
    Trophy Points:
    56
    Well, even though the performance of the 2600XT is impressive, the number of laptops employing the ATI card doesn't seem to be more than the fingers on one hand...

    Unless ATI can bring the 2600XT to mainstream 15.4" and 17" laptops, this particular card will be just a card that 'could' have been great...
     
  8. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    www.notebookcheck.net is unreliable. they pull a lot of those numbers out of thin air, and a lot of their rankings are obviously wrong.

    the hd 2600 xt is a really good card, approximately as good as an 8600m gt.

    the fact that it has 120 stream processors instead of 32 doesn't matter, because the nvidia stream processors are about 4x as efficient as the ati version.

    and they both have a 128 bit memory bus; no 256 bit memory buses for the midrange parts yet.

    one giveaway should be that the mobility hd 2600 xt is labeled as having 60 pixel shaders and 60 vertex shaders, when in reality it has 120 stream processors. another giveaway should be that 3dmark is a poor indicator of performance in the first place. a third indicator is that their 3dmark scores are collected from various sources taken under various resolutions, and 3dmark scores do not correct for resolution. if i run 1024x768 i will get a higher score than someone else who runs 1600x1200.
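
    to make the resolution point concrete, here's a quick python sketch (illustrative only) comparing raw pixel counts per frame. a score from a 1024x768 run and a score from a 1600x1200 run measure very different workloads:

    # rough pixel-count comparison: why 3dmark scores taken at different
    # resolutions can't be compared directly. illustrative sketch only.
    resolutions = [(1024, 768), (1280, 800), (1440, 900), (1600, 1200)]

    base = 1024 * 768
    for w, h in resolutions:
        pixels = w * h
        print(f"{w}x{h}: {pixels:,} pixels/frame ({pixels / base:.2f}x vs 1024x768)")

    # 1600x1200 pushes ~2.44x the pixels of 1024x768, so the same card
    # will post a much lower score at the higher resolution.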
     
  9. Rickyz80

    Rickyz80 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    No, I think I was referring to the mobile one in the link. (sorry about those links)
    And what is the memory bus exactly? I'd probably understand everything better if I knew what it did.
     
  10. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    when i read that, i see "128-bit 4-channel DDR2/GDDR3 memory interface"

    i don't know where you are getting "256 bit" from or what you mean by that.
     
  11. times

    times Notebook Evangelist

    Reputations:
    0
    Messages:
    316
    Likes Received:
    0
    Trophy Points:
    30
    i was just looking at these cards and wanted to know what cards in the nvidia range the x2300 and x2500 can be compared to.
     
  12. Phrozt

    Phrozt Notebook Geek

    Reputations:
    22
    Messages:
    98
    Likes Received:
    2
    Trophy Points:
    16
    As others have said, you cannot compare paper specs between ATI and nVidia as a basis for real world results.

    For instance, in desktop models, the 2900xt blows the 8800GTX away on paper, but actual benchmarks tell a FAR different story.
     
  13. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    well, the paper works, if you read enough. you can't just read the bullet points though.

    120 shaders on ati hardware and 32 shaders on nvidia hardware doesn't mean much. you have to find answers to bigger questions like:

    what is a shader?
    how does the shader work?
    how do the two different companies employ shaders?
    how do they count shaders?
    how much performance impact is a result of shader performance?
    what else affects performance?

    then you have to ask similar questions about other parts. when you are done you will see that ati and nvidia match up in the end, more or less.

    as for the x2300 and x2500, AVOID THEM.

    the x2300 is literally just an x1300/1400. the x2500 is an x1600/1700. from what i gather they are actually crippled even more than the previous-generation parts they are named after, because they are designed for people who don't know any better.
     
  14. times

    times Notebook Evangelist

    Reputations:
    0
    Messages:
    316
    Likes Received:
    0
    Trophy Points:
    30
    lol, and here i thought i could save £200 getting an x2300 or x2500. guess i'd better stick with my original plan and look at getting the 8400 or 8600 GT. don't really need to play games though. i got a 360 that will get red lights sooner or later for games lol
     
  15. Rickyz80

    Rickyz80 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    "Fully distributed design with 256-bit internal ring bus for memory reads and writes"
    What is this then?
     
  16. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    meaningless. especially for comparative purposes. that just describes how memory is handled internally. it is of no interest to us. trust me, both cards have a 128 bit memory bus.

    the only reason we even care about the memory bus is that it is directly related to the memory bandwidth.

    double the memory bus and keep the memory clocked the same, and you double the memory bandwidth.

    the more memory bandwidth you have, the faster you can manipulate data.
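
    to put numbers on that, here's a minimal python sketch of the arithmetic (the clock value is made up, purely for illustration):

    # peak memory bandwidth = bus width in bytes * effective memory clock.
    # the clock below is hypothetical; gddr3 is double data rate, so the
    # effective clock is 2x the real clock.
    def bandwidth_gb_s(bus_bits, effective_mhz):
        return (bus_bits / 8) * effective_mhz / 1000  # GB/s

    clock = 1400  # MHz effective, hypothetical
    print(bandwidth_gb_s(128, clock))  # 22.4 GB/s on a 128-bit bus
    print(bandwidth_gb_s(256, clock))  # 44.8 GB/s -- double the bus, double the bandwidth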
     
  17. baddogboxer

    baddogboxer Notebook Deity

    Reputations:
    144
    Messages:
    1,092
    Likes Received:
    0
    Trophy Points:
    0
    I agree with much of what you say. I have seen typos and impossible numbers sometimes, and 3DMark is just a benchmark, so it is not real world. With that said, I do not think it is as useless as you think. On the point of the shaders, if you go into a description of the card it is identified correctly as 120 unified; clearly the table they use has limitations, so instead of redesigning it they split the number over the columns they have. They made a decision to represent it the best they could; they know what the card has.

    They also say it competes with the 8600M GT, so they agree with everything you say, but they run 3DMark. :eek:
     
  18. Rickyz80

    Rickyz80 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    thanks masterchef, things have made more sense after reading and after a more thorough explanation was given about those pesky shaders.

    The m2900XT, however, is going to be at least 256-bit, I'm guessing.
     
  19. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    i would like to think it would be 512...
     
  20. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    yeah the top end parts are going to be 256 bit.

    midrange parts aren't ready for 256 bit yet. it's not like the last gen had them and now they are gone. the last gen actually had 64 bit x6xx parts, and only the high end ones were 128 bit. now all the x6xx parts are 128 bit, and the high end ones use gddr3 instead of ddr2.

    i'm sure the upgrade to 256 bit is just a power / heat nightmare.

    top end parts don't give as much priority to heat and power concerns, so they will get it (and have had it since the top end 6800 ultra series).

    of course, the bus width isn't everything. the 8600m gt memory is way faster than the 6 series', so it makes up for it.

    plus, massive memory bandwidth really starts making a difference at higher resolutions. at lower resolutions it's ok to have the 128 bit bus.

    it's all good.
     
  21. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    exactly, the majority of gamers using this card are not going to be running it at WUXGA or even WSXGA+ because of the power limitations
     
  22. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    well, the highest resolution you will find matched to this card is 1680x1050, which is getting up there. most should be 1440x900 or less. you will probably find some at 1280x800, too. that's probably the optimal resolution to be gaming at, just because you get a really sharp picture once you have 800 pixels of vertical detail. at 600 or so you can start to see the pixelation, in my opinion. obviously less than that is even more pronounced (480, standard def). but 800 already exceeds consoles, and that's doing good in my book.
     
  23. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    i'm on a 20 inch screen at 1280x1024, and i think that is as far as i would go. 1440x900 sounds perfect; on a smaller screen that would be more than enough, i would assume
     
  24. McKillenstein

    McKillenstein Notebook Consultant

    Reputations:
    28
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    Wasn't there an article posted a couple of days ago that showed the 2900xt making significant performance gains over the 8800 in dx10 games? And aren't they going to produce them in a 65nm process now, so the heat and power consumption will be more manageable? All this jibba jabba between the cards means nothing until something like Crysis comes out... a dx10 game that people actually want to play... I don't want to sound biased, but deep down I guess I'm an Ati kind of person.
     
  25. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    most likely, there was.

    there was also probably an article showing the 8800 to have significant performance gains over the 2900xt in dx10 games.

    there were a handful of "dx10 afterthought" games and demos released over the past few weeks. most all of them are highly preferential to either nvidia or ati.

    unfortunately both the high end cards are power suckers, regardless of their manufacturing process. you would think 65nm would help though.
     
  26. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    who cares about power for your desktop? seriously, i don't understand it; the people that get those cards will have large enough power supplies
     
  27. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    geforce 6800 ultra / x800 xt had a 400 watt power supply recommendation. that was a big deal back then. they were considered power suckers.

    a really good 400 watt power supply can be had for 50 bucks.

    ati hd 2900 xt has a 750 watt power supply recommendation.

    a really good 750 watt power supply can be had for 200 bucks.

    thats just entry, consider your electric bill also...

    power requirements should be going down, not up.

    intel has it right.
     
  28. Phritz

    Phritz Space Artist

    Reputations:
    68
    Messages:
    1,276
    Likes Received:
    0
    Trophy Points:
    55
    Wtf is an internal ring bus?
     
  29. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    well yeah it should be going down, i just wondered why it mattered so much
     
  30. McKillenstein

    McKillenstein Notebook Consultant

    Reputations:
    28
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    So the lights in your house won't dim whenever you play computer games; I don't think we should take electricity for granted. Besides, if it's cooler and uses less energy to begin with, won't that imply a better ability or ease in overclocking as well?

    This is also NotebookReview.com after all...
     
  31. imhungry29

    imhungry29 Notebook Evangelist

    Reputations:
    19
    Messages:
    473
    Likes Received:
    0
    Trophy Points:
    30
    omg, i remember back then. i was barely getting by with an x800 xt and a thermaltake 450w true power. back then a 450w cost $150 USD. now they have 1kw for around $300 or so. how the price to performance ratio has dropped.

    oh, and i was reading in cpu magazine that falcon NW is shipping a system with dual x2900xt's with 512mb of 512-bit gddr4 on the card. how will that affect your system performance? masterchef, you seem very knowledgeable on this subject. can you please enlighten me?
     
  32. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    it will probably affect your gaming performance in a positive way ;)

    also, it will probably make some of your neighbors' lights turn out.

    SLI isn't for me, though. It's really only useful if you have a 2500x1800 or so monitor (i don't remember the exact res) and want to play the latest games on it.

    From a budget perspective it's hard to justify: 800 dollars for two high end gpu's, when just one could handle any modern game at any reasonable resolution. Then, when the next series of technology comes out, you could pay another 400, have the latest tech, and probably outperform what would have been your SLI rig.

    In both cases I spent 800 dollars and was able to play all the 5-star games, but the second method also allowed me to upgrade to the latest tech. I don't necessarily recommend doing either of those, but you can see why i don't like SLI. SLI is only good if you want to play flight simulator x or rainbow six vegas, at something above 1920x1200, right now, and it doesn't matter how much money it costs.
     
  33. RogueThunder

    RogueThunder Notebook Consultant

    Reputations:
    12
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    Well, I think I'll put forth my 3 cents.

    As for this whole "Zomg ATI went 128-bit memory bus too" bit, and "Zomg what about the 256-bit internal ringbus": I have a theory.

    Technically speaking... I think everyone's been right, ish. Sounds crazy, eh?

    What I think they're using is essentially a pair of 128-bit buses. Is it therefore 256-bit? Not exactly. Is it 128-bit? Certainly not, but at the same time yes...

    Think of it this way: the memory controller is 256-bit. The paths leading off it are 128-bit. The RAM is found in two clusters (or possibly four, with a configuration similar to the 8700M, swapping between them), connected by the 128-bit paths. There are, however, more 128-bit paths than found in, say, the 8600/8700's, by a multiple of two.
    Technically, the memory is running on a 128-bit interface; the bus itself, however, is running a 256-bit one. Mishmashed together in a manner that hopefully works very well. And which, unfortunately, is confusing as... yeah.
    This would mean they could use memory designed for a 128-bit interface with what is essentially a 256-bit interface. A good way to try to lower cost.
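
    (If it helps, here's a toy Python sketch of that theory. Purely hypothetical, nothing from any whitepaper; it just illustrates the interleaving idea above:)

    # toy model of the theory: addresses interleaved across two independent
    # 128-bit channels, so the controller's aggregate width is 256-bit while
    # each memory chip only ever sees a 128-bit interface. hypothetical!
    LINE_BYTES = 16  # one 128-bit access per channel

    def channel_for(address):
        # even 16-byte lines hit channel 0, odd lines hit channel 1,
        # so sequential reads keep both channels busy at once
        return (address // LINE_BYTES) % 2

    for addr in range(0, 96, LINE_BYTES):
        print(f"address {addr:3d} -> channel {channel_for(addr)}")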

    Now, does this mean it will perform better? At high res and with AA/AF, I damn well hope so. Otherwise, though, it's questionable. (Not to mention they have to fix having murdered AA/AF in its desktop variant...)

    If the one benchmark we have semi-legitimately heard of for the HD2600XT is valid, they should replace their desktop mid-end with the laptop one; it would perform better... Maybe in a nifty 7950GTX2-style setup. Significantly better, from what I've heard. That is, if the benchmark in any way transfers to a working video card with real world performance. Which we have yet to find out. If it doesn't, ATI fails this gen totally... Oh well, then they finally get their 5X00 series equivalent. Too bad. I'm going to stay optimistic, though, or grab the 8800 when it hits laptops. The performance I'm looking for requires a bit more oomph at high res than the 8600/8700 likes putting out... and SLI is a bit too large and heavy for me (likely due to its memory interface being...)

    Anywho, sorry for continuing to ramble on, but that's what I do. I hope the first bit of my post is helpful. And please keep in mind, it is a THEORY. I don't have insider information (damnit!), and even if I did, for some moronic reason I'm sure it would be under an NDA atm...
     
  34. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    I don't get these "this card has X [current-gen feature]" statements. It used to be pixel pipelines; now it's stream procs. Does this stuff really matter, or even correlate with average FPS measurements?
     
  35. Akilae Hunter

    Akilae Hunter Notebook Consultant

    Reputations:
    7
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    Well, in the old series, the 7xxx & x1xxx, the number of pipes/shaders was a pretty good indicator of performance. Now, not so much. It makes things more confusing, IMO. I usually go by published pixel fillrates. They're pretty solid and not much subject to propaganda.
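
    For reference, a published pixel fillrate is just the number of ROPs (render output units) times the core clock. A quick Python sketch of that arithmetic, with hypothetical placeholder numbers rather than any card's published specs:

    # theoretical pixel fillrate = ROPs * core clock.
    # the counts and clocks below are hypothetical placeholders.
    def fillrate_gpix_s(rops, core_mhz):
        return rops * core_mhz / 1000  # Gpixels/s

    print(fillrate_gpix_s(4, 475))   # 1.9 Gpix/s for a hypothetical midrange part
    print(fillrate_gpix_s(16, 575))  # 9.2 Gpix/s for a hypothetical high end part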
     
  36. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    oh indeed they are absolutely ridiculous at this point; you can tell by rogue thunder's theory. these features are made with marketing in mind. they are hoping you get caught up in them and just fall for their product and buy it.

    the "internal ringbus" simply deals with how memory is handled internally. it is a moot point, because that isn't a bottleneck. the whole ring bus thing is just a cost saving feature that makes the internal design scalable and consistent. its a good thing, but it doesn't affect performance. you see no benefit from your ring bus as opposed to the nvidia design. no extra anti aliasing or frames because of the ring bus. other things will give advantages or disadvantages, that is not one of them.

    the one thing you can look for that will give you an idea of performance (across brands) is memory bandwidth.

    memory bandwidth is a big deal. if you can't get data in, you can't manipulate it. simple as that. memory bandwidth is inherently a bottleneck because the memory is stored outside of the core. memory bandwidth is a combination of the memory interface and the memory clock. but really the only number that matters is the memory bandwidth. you can find that data on wikipedia among other places for individual cards.
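
    as a sketch of how you'd use that number (python, with hypothetical placeholder specs -- look up the real interface width and effective memory clock for each card):

    # comparing across brands by the one number that transfers: bandwidth.
    cards = {
        "hypothetical card A (gddr3)": {"bus_bits": 128, "effective_mhz": 1400},
        "hypothetical card B (ddr2)":  {"bus_bits": 128, "effective_mhz": 800},
    }
    for name, spec in cards.items():
        gb_s = spec["bus_bits"] / 8 * spec["effective_mhz"] / 1000
        print(f"{name}: {gb_s:.1f} GB/s")
    # same 128-bit bus, but the faster memory gives card A 1.75x the bandwidth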
     
  37. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
    Nvidia increased its market share in notebook graphics cards from 24% in Q1 2006 to 60% in Q1 2007.

    ATI/AMD correspondingly dropped from 73% to 39%.

    This might be attributable to NVIDIA's early introduction of DirectX 10 capable graphics chips for notebooks (the 8600M and 8400M series). ATI has already announced the HD 2400 and HD 2600 series, but so far no notebook has actually appeared with them (that should happen in the coming weeks).

    The study probably covers only dedicated graphics chips, since Intel is far ahead in integrated graphics chips.

    (translated from German)
     
  38. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    i hate how intel is the number 1 gpu maker in the world.

    bagh.
     
  39. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    My desktop motherboard comes with an Intel Extreme 2 chip; it's pretty worthless. HP installed slightly better GeForce4 MX 440s (AGP) into the a255c line. I later bought a PCI Radeon 9250. Half-Life 2 runs at 25-35 fps on the Nvidia card @ 1024x768, with no AA/AF filtering, high textures, shaders, and models, blob shadows, lowest specular, and mat_dxlevel 70.

    Now if only I could combine my 3 GPUs into something that could pull decent fps. :p
     
  40. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    lol, yeah, how did they do that? probably a conspiracy. and link, there is just the one notebook, the HP Pavilion HDX, with the hd2600xt in it, and it's about the same level as a 7900gtx (minus the high res problems due to the small bus)
     
  41. RogueThunder

    RogueThunder Notebook Consultant

    Reputations:
    12
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    Aiya, yeah, it is getting ridiculous. The best thing we can gauge by at this point is benchmarks... (Especially with the "standardized" benchmark programs getting more inaccurate at the same time...)
    And btw, by my theory, they may actually have significantly improved memory bandwidth over the 8600/8700, using older tech. (Well, the same tech as the 8600/8700 really, but older than the 8800...) Kind of a move we're used to AMD pulling off. That said, it's a slightly loose theory ^.^ that, without the NDA'ed white papers, I probably will never be able to completely prove. As for AA/AF performance, it wouldn't so much gain FPS as lose them more slowly :p same thing in your eyes, perhaps... but... Compare the AA/AF curves of the 8600GT and the 7900-7950s... (how much each step up in AA/AF reduces performance.)
    If you're wondering, normally the RAM system is a single matched system... ATI is branching off from this... How exactly, I'm actually curious about. (Either a very neat idea, or dumb marketing chatter... About 49.5/49.5 I would say, with 1% left over for banana. ^.^)
    That said, when we finally get our hands on one, it should be provable/disprovable by running some video memory benchmarks... hmmm, can't wait ^.^

    Caught up in it? Actually, the ringbus is a novel concept that, if implemented elegantly, WILL give performance increases over nvidia's design, and without huge price increases at that (and yes, it's a very scalable design...)... Nvidia this gen focused on an amazing core... (I would say they did pretty well, too, pulling such performance with a 128-bit memory system.) ATI, in theory at least, focused on an amazing memory bus. On the other hand, a poorly implemented ANYTHING won't work... (Or *cough* not truly implemented at all *cough*desktopHD2600aa/af*cough*)
    That said, it's hard to understand on paper... I imagine the whitepapers are far more frightening (though they contain enough information to figure it all the way out... damn them for not ever accepting my-ah, er, anyway ^.^, WELL, they HAVE/HAD a hobbyist in their application for access... Don't ask...)
    I'm waiting to see what it actually does. Then, if it delivers, I'll buy it. If not, hey, I'm betting the 8800Ms won't be a long wait by then.

    *shrugs* The best thing one can do, if interested, is wait. If you just need a system, though, go and get it; there's no point in waiting. I want a bit more than is available now in something below a certain window of size/weight. That means the HD2600XT or an 8800M of some flavor. (There should be a 22-25W version of it... the 8700GT obviously isn't, even though it seems to be the 8800GS in codename... so... 8800GT? lol XD who knows.) Both are a waiting game. And I'm still up for that... *pets his poor ancient bugger* At least as long as my current baby holds out...

    Anywho...