Nah, I didn't really mean it deliberately. There's no debate, I'm just trying to point it out.
No, the 512MB is not a waste. I got a 10 FPS increase on average when I switched over to the 512MB version of the XT. The 128-bit memory interface does keep it from going farther, but the extra memory definitely wasn't a waste.
If you don't have the XT version, then I'm not surprised by your results. The XT is a gaming card, the Pro is a multimedia card. That's a huge difference.
My max OC (if only I had a better cooling system) is 800 (engine), 1200 (mem).
But when I do it this way, my idle is 75C...
My idle at the moment is 58C with 700 engine, 800 mem.
-
Technically, the 8600M GT has more stream processors. What most people do not realize is that ATI's and NVIDIA's stream processors are different. While NVIDIA has a straightforward scalar stream processor that handles one operation at a time, ATI's can run 5 operations simultaneously. So while the HD 2600 can process 120 operations at once, it only has 24 physical stream processors, as opposed to NVIDIA's 32.
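The arithmetic behind those shader counts is simple enough to sketch (a rough illustration using the numbers from the post above, not vendor spec sheets):

```python
# ATI's R600-class shader units are 5-wide (VLIW), so each physical unit
# can issue up to 5 operations at once; NVIDIA's G8x units are scalar.
ati_physical_units = 24     # HD 2600 physical stream processors (per the post)
ops_per_ati_unit = 5        # simultaneous operations per physical unit
nvidia_scalar_units = 32    # 8600 GT scalar stream processors

ati_marketed_count = ati_physical_units * ops_per_ati_unit
print(ati_marketed_count)   # -> 120, the "120 stream processors" figure
```

So the marketed 120 is peak issue width, not 120 independent processors; sustained throughput depends on how often all 5 slots of each unit can actually be filled.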
Another problem with the HD 2000/3000 series is the low count of raster and texture units available, which really hurts them in older games and with anti-aliasing. Due to the low ROP count, ATI decided it was a better idea to have the stream processors handle AA; while this makes AA more customizable/programmable, it takes resources away from the actual number crunching. This is why the HD 2000/3000 cards suffer massive performance hits with AA. If I remember correctly, the 8600 has 16 TMUs and 8 ROPs, as opposed to the 2600's measly 4 TMUs/4 ROPs.
I'm not sure how a higher transistor count and clock speed on one card over the other is a good thing. Considering the GeForce is clocked at 475 MHz on the core, yet it can keep up with and beat the 2600 at 600 MHz, I think the lower clock at the same performance means the GeForce is more efficient and should produce less heat.
Image quality being better on the Radeon? Uh sure, maybe about six years ago during the Radeon 9000/GeForce FX era.
By the way, your laptop has the regular HD2600, not the XT. The only laptop I know that has the XT is a Fujitsu. -
You forgot the fact that NVIDIA clocks its shaders higher than the core clock, as opposed to ATI. Also, all previous generations have been bottlenecked by the ROPs, but the theoretical number-crunching throughput of the ATI cards is much higher than the NVIDIA cards'.
Tbh your post is utterly useless. You are comparing a car to an airplane: two totally different designs. The fact that you just copy-paste what people say about the ATI/NVIDIA shader ratio and ROPs etc. doesn't make you look smart either.
Also, your conclusion on how one chip consumes less power than the other made me giggle. You have no clue. You're forgetting the production process, the chip design and thus energy leakage, and last but not least the production size.
Image quality... This is debatable, so I'm not going into that one.
I'm not a fanboy on either side. My own main rig has two 98GX2s, my notebook a 3650, and I've advised loads of people on both brands. All I'm trying to say is that your comparison is wrong on some critical points, and that the internet has a lot of wannabe-smart people, so don't copy them. -
If you're trying to make me look like a fanboy, please have a look at my signature. Okthxbye. -
How in the hell do people OC their memory clock to 1200??? That is awesome http://www.youtube.com/watch?v=rItbzQksytQ
-
-
And ours is GDDR3? Which is worse? **** that's gay
I was wondering, is it possible to swap the GPU on the Gateway M-6864FX? Like from an ATI Radeon 2600 Pro to an ATI Radeon 2600 XT, can it be done? -
Due to the low number of controller units for the shaders, the cards struggle when we turn on AA/AF. This is where ATI's 2000 and 3000 series lose ground, and lose most benchmarks. Although I don't see a lot of us using AA on our notebooks anyway.
Now your point about where the 2600 loses vs. the 8600 GT. Simple. The G80 is just a superior product when it comes to games with AA, and the Catalyst drivers are not as well optimized as the NVIDIA drivers. And of course there are the games with an NVIDIA logo written all over them.
-
The video "card" is not upgradeable on the 6864FX. Even though it is a dedicated video "card", it is integrated on the mainboard, so it is not removable.
-
Aha, hehe. That's the only place where I ever saw that being said. Good thing I left that hellhole a long time ago.
-
51 MB/sec average transfer rate vs. 45 MB/sec for a Western Digital 5400 RPM drive.
-
steelcurtain11 Notebook Consultant
I tried searching around on this forum to find the answer to this, but couldn't really find anything except overclocking stuff.
I have the Gateway M-6862 with the ATI Radeon HD 2600 just like everyone else. I'm not sure if I should upgrade my graphics drivers. Here's what I have now:
version: 7.14.0010.0555
date: 1/10/2008
Anyone know if I need an update? And where/how do I get it? Links please.
Also... I've noticed that my sound is starting to crackle when playing audio from applications and YouTube and such. Anyone else getting this? -
I have a Fujitsu LifeBook N6470, and they said it came with a HD 2600 256MB GDDR3, stock clocks 500/600, but I can bump mine up to 700/750 and it never goes over 64C when gaming. I'm just working on the BIOS to change the speeds on it, so I don't have to OC it every time I turn the computer on. -
So what drivers are the best?
What's a safe GPU overclock? -
You're all overclocking the HD 2600 512MB GDDR3, right? But has anyone overclocked, or does anyone know how to overclock, the HD 2600 512MB GDDR2? I think I have this card, because GPU-Z shows DDR2, or maybe I'm wrong. So I tried to overclock it: the core was a piece of cake, I could take it from the default 500 to 750 with no artifacts. But with the memory I have a problem: if I add 100 MHz to the default 400 MHz, I get artifacts. Is this the limit for the memory, or what? I didn't try undervolting because my CPU is older than yours, it's a T5450 at 1.66 GHz, so I don't know if it's worth undervolting; by the way, it uses 1.36V at default.
P.S. Maybe I should buy a cooling pad for better memory overclocking, or no?
Sorry for my bad English. -
If you're getting artifacts, then a cooling pad won't help, sorry.
Run HWMonitor to watch temps, as it keeps track of the highest temp reached, and run benchmarks/games or ATITool's artifact scanner to make sure your OC is stable and your GPU doesn't get too hot (80C+ is a little too hot for most people's liking).
-
Hm, temps are OK: when playing GRID or other games it doesn't go above 59C, so I think that's normal. Another question: am I getting artifacts because it's not possible to raise the memory any further? Or is it something else?
-
Yes, your card is becoming unstable, which could be due to an insufficient power supply; laptop GPUs are not designed for overclocking. Every card has its limits, though, and I guess you just found yours.
It might be worth trying a few different drivers to see how they affect performance and how far you can overclock. -
I would try the Omega drivers, but there aren't any for Vista, so I'm staying with the ones from the Toshiba website, or should I upgrade? Catalyst power saver was turned on, so now I've turned it off. What about ATITool? When I hit "find max memory", it tests the video card for a very long time, testing and testing with those artifacts and heating up. Why is it so slow? How many MHz does it add each time?
And why can I raise the core so high but the memory so little? Is it because of the DDR2? -
Don't let any program OC your card for you! The GDDR2 memory is probably why you can only OC your memory so far. Although if you look at your OC as a percentage of your stock clocks, you can OC it almost as much as with GDDR3.
Try to find new drivers; not sure where. Try the forum search.
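To put that percentage argument in concrete terms, here's a quick sketch using clocks reported in this thread (the function name is just for illustration):

```python
def oc_headroom_percent(stock_mhz, oc_mhz):
    """Overclock headroom expressed as a percentage of the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100.0

# Clocks reported earlier in the thread:
print(oc_headroom_percent(500, 750))  # core: 500 -> 750 MHz = 50.0%
print(oc_headroom_percent(400, 450))  # GDDR2 mem: 400 -> 450 MHz = 12.5%
print(oc_headroom_percent(600, 750))  # GDDR3 mem: 600 -> 750 MHz = 25.0%
```

Comparing percentages rather than raw MHz is the fairer way to judge headroom across different memory types.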
-
-
So which drivers should I try, Catalyst 8.10? By the way, with these values, 750 core and 450 memory, I reached 4100 points in 3DMark06.
-
With Crysis, my screen goes blank after a while, and I had to underclock to 600/504 for it to work stably. But as you know, Crysis is expected to be heavy, and it gave me about 25-35 FPS, but mainly 28-30. Let me mention that DMC4 was at 1280x800, my native res, and Crysis was at 1024x720. DMC4 had everything on medium with 2x anti-aliasing, and Crysis was on medium. For Alone in the Dark, I had to set it to 648/450 to get it steady. At 800x600 with everything else on medium, I get an average of 28 FPS. No AA of course.
Can someone tell me why exactly certain games react differently to my GPU? Because I figured that with a Vista score of 5.6 on DMC4 and 5.0 on Crysis and Alone in the Dark, it would be stable?
ALPHA -
Is Catalyst 8.10 really out? I just downloaded 8.9 and fell in love with it already, haha.
ALPHA -
That I don't know, but I like the drivers I have: I get a stable 700 core and 750 memory, and it never goes over 64C when gaming.
-
Any ideas? In the past I could easily overclock my GPU from 500 to 700 and the memory from 400 to 475. Now I can't even get from 500 to 501: the AMD GPU Clock Tool program just stops, the screen turns white, then artifacts come out. It's not a cooling problem, temperatures are normal. This started after I installed the Catalyst 8.12 beta drivers, and now, after uninstalling them, no driver version helps. I can't overclock at all, no program works. So maybe somebody knows what the problem is? At default the card works great, I don't see any problems at all. But if I try to overclock, the screen goes blue, then white, and artifacts show up.
-
ATI MOBILITY RADEON HD 2600 graphics with 512MB GDDR3
Discussion in 'Gaming (Software and Graphics Cards)' started by X360proGamer, Aug 13, 2008.