Why are their shader clocks so low compared to nVidia? Is it because they're not targeted at the gamer market?
-
each one of them follows a different architecture. NVIDIA runs its shaders at higher clocks than ATi, but at the same time ATi packs in more stream processors, so the two end up with roughly the same performance in some games (if you compare two cards of the same class).
-
In addition, I think that to compare the number of shaders between ATI and NVIDIA cards, you have to divide the number of ATI shaders by 5 or something like that.
-
masterchef341 The guy from The Notebook
the underlying technology between these cards is fundamentally different.
as stated, it is difficult to assess game performance by comparing the number of stream processors, clock speeds, etc. across generations or lines of graphics cards.
-
I just know that the stream processors of the ATi cards are 5-wide (VLIW5) shader cores; not sure about dividing the numbers by 5, though.
-
I believe to estimate performance, you can divide the shaders by 5 to compare with nVidia GPUs. For example, HD 3650 has 120 shaders, which divided by 5, is 24. The HD 4650 has 320, which divided by 5, is 64. Compare this to nVidia's 9600M GT/GT 130M, which has 32 shaders, and you can see why HD 3650 < 9600M GT/GT 130M < HD 4650. Not completely accurate, but gives a good general idea of where GPUs lie.
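The divide-by-5 rule of thumb above can be sketched in a few lines of Python, using the shader counts quoted in the post. It is only a crude heuristic (it ignores clock speeds, memory bandwidth, and drivers), and the idea that ATI's figure counts every lane of a 5-wide shader core is as discussed above, not an exact conversion:

```python
# Crude cross-vendor "shader count" comparison, per the divide-by-5 rule of thumb.
# ATI's marketing number counts every ALU lane of a 5-wide (VLIW5) shader core,
# so dividing by 5 gives a figure loosely comparable to NVIDIA's scalar
# stream-processor count. This is a rough estimate, not a benchmark.

def comparable_shaders(count, vendor):
    """Normalize a quoted shader count for a rough cross-vendor comparison."""
    return count // 5 if vendor == "ATI" else count

gpus = [
    ("HD 3650", 120, "ATI"),
    ("9600M GT / GT 130M", 32, "NVIDIA"),
    ("HD 4650", 320, "ATI"),
]

for name, shaders, vendor in gpus:
    print(f"{name}: ~{comparable_shaders(shaders, vendor)} comparable shader units")
```

Running it reproduces the ordering from the post: HD 3650 (~24) < 9600M GT/GT 130M (32) < HD 4650 (~64).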
ATi graphics cards
Discussion in 'Gaming (Software and Graphics Cards)' started by fred2028, Jul 6, 2009.