I was just comparing my desktop's GeForce 6200 AGP 256 MB with my laptop's Mobility Radeon X300 64 MB. I know my card is clocked at about 500/320, but all the cards listed were significantly slower and were actually slightly outperformed by the X300. Bumping the clock speeds in the comparison gave more expected results, but 300 --> 500 seems like a pretty large jump, and the memory clock was listed as 530 MHz (1060 DDR).
http://www.gpureview.com/show_cards.php?card1=198&card2=414#
This one is labelled "nVidia GeForce 6200A".
Default clock is 350/250. Overclocking to 350/500 only bumps the Memory Bandwidth to 8 GB/sec.
http://www.gpureview.com/show_cards.php?card1=192&card2=414#
This one is labelled "nVidia GeForce 6200 AGP".
Default clock is 300/275, meaning Shader Operations etc. are slower, but the Memory Bandwidth is already listed as 8.8 GB/sec!
I believe mine is the NV44A chip, so I would guess it's closest to the first one, but from what I know of mine (DDR2, 256 MB memory) the performance should be closer to the second. So what gives? Are these specs wrong, have I got some sort of "Super Card", or am I reading these specifications wrong?
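Aside: the listed bandwidth figures do line up if you assume the two entries use different memory bus widths (64-bit on the 6200A, 128-bit on the 6200 AGP). Neither the thread nor the links state the bus widths, so treat that as an assumption; this is just a minimal sketch of the arithmetic behind the numbers.

```python
# Sanity check of the memory-bandwidth figures quoted from gpureview.
# ASSUMPTION (not stated in the thread): 6200A/NV44A = 64-bit memory bus,
# plain 6200 AGP = 128-bit memory bus.

def ddr_bandwidth_gbps(mem_clock_mhz, bus_width_bits):
    """GB/s = clock x 2 (DDR) x bus width in bytes, divided by 1000."""
    return mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000

print(ddr_bandwidth_gbps(250, 64))   # 6200A at stock 250 MHz      -> 4.0 GB/s
print(ddr_bandwidth_gbps(500, 64))   # 6200A overclocked to 500    -> 8.0 GB/s
print(ddr_bandwidth_gbps(275, 128))  # 6200 AGP at stock 275 MHz   -> 8.8 GB/s
```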
-
I dunno, but the GF 6200 AGP has 300/275 (550 DDR) according to that link.
Otherwise, I think you're comparing PCIe x16 with AGP 8x.
-
Both cards are AGP.
-
I thought you were comparing with the X300. Anyway, the 6200A has 4 GB/s and the 6200 AGP has 8.8 GB/s, so you can guess which one is better.
-
If my desktop is slower than my laptop, then there's not much point to it except for delegating tasks.
Weird specifications for GeForce 6200 AGP
Discussion in 'Gaming (Software and Graphics Cards)' started by The General, Oct 6, 2008.