At all max settings @ 1440x900 in BioShock I got an average of 15 FPS
-
It does, however, prove that the 2600 is a good card indeed. With some new drivers, it will be able to compete with the 8600 at its fullest potential. It did, however, score about 200 points less than the usual 8600 (3072 for the 2600 HD vs. 3200-3300 for the 8600M). -
Yeah, I thought the score was decent... Are you talking about GDDR3 8600s or DDR2? I thought most DDR2 ones didn't get much over 3000-3200 @ 1280x800.
Oh, I'm running a TL-60... a bit slower than most of the Core 2s, so don't forget that has an effect as well. -
I have a DDR2 version... I get around 32xx at 1280x1024.
-
Speaking of which (off topic...), does anyone know any programs for overclocking my CPU, and how safe/effective is it on my particular CPU?
-
I still don't get this. There is an XT version with DDR2?
Sorry if I am a bit slow... I just want to get this right.
//Fabian -
They're not talking about the HD2600 XT, they're talking about the regular one that's available with both DDR2 and GDDR3 memory. The XT is only available in the 20" HP HDX.
-
Fabarati: Well, that's the weird thing. HP Sweden says the HP HDX with the HD 2600 XT uses DDR2. I am trying to figure out if this is correct.
If it would use DDR3 (GDDR3?), I just might buy it...
//Fabian -
-
That is true, my bad guys... I've got a 2 GHz proc...
-
Crimsonman Ex NBR member :cry:
-
Here are the results with the 163.69 WHQL drivers; I'll post screens with 163.75 in a few...
3DMark Score: 3202 3DMarks
SM2.0 Score: 1309
HDR/SM3.0 Score: 1129
CPU Score: 1793
SM2.0 Graphics Tests:
GT1 - Return To Proxycon: 10.62 FPS
GT2 - Firefly Forest: 11.20 FPS
CPU Tests:
CPU1 - Red Valley: 0.56 FPS
CPU2 - Red Valley: 0.91 FPS
HDR/SM3.0 Graphics Tests:
HDR1 - Canyon Flight: 10.27 FPS
HDR2 - Deep Freeze: 12.31 FPS
Game tests, feature tests (fill rate, pixel/vertex shader, shader particles, Perlin noise) and batch size tests: not run -
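In case anyone wants to sanity-check how those subscores roll up into the 3202 total, here's a quick calculation using the 3DMark06 weighting formula as I recall Futuremark publishes it; the subscore values are just the ones from the run above:

```python
# Quick sanity check of the overall 3DMark06 score from the subscores above.
# Formula from memory: GS is the mean of the SM2.0 and HDR/SM3.0 scores, and
# the total is a weighted harmonic-style combination of GS and the CPU score.
sm2, hdr_sm3, cpu = 1309, 1129, 1793

gs = (sm2 + hdr_sm3) / 2
total = 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)

print(round(total))  # ~3201, which lines up with the reported 3202
```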
Ehm, but the thing is, they (HP) say the Swedish HDX has an HD 2600 XT with DDR2.
Isn't DDR2 "worse" than GDDR3? That's what's confusing me. I am trying to figure out if it's a "good" or not-so-good GPU in the HDX.
Am I totally missing something here? Could be.
Tell me to bugger off if I'm messing up this thread.
//Fabian -
DDR2 can't be clocked as high as GDDR3. That's the difference. And I doubt that DDR2 can be clocked high enough for it to be used in an HD 2600 XT.
-
The Forerunner Notebook Virtuoso
What's not believable about that?
-
GDDR2 and GDDR3 are the same memory chips, just built at different speeds. GDDR2 maxes out at ~499/998 (by labeling; many chips of course can't even do those speeds) and GDDR3 refers to 500/1000+ speed GDDR2 chips. A stupid little industry bragging difference. Sadly, several GDDR2 parts get randomly labeled GDDR3, because, well, it's the same chips, who's going to notice. And yes, it's technically false advertising; Dell has already been hit for that one once, when they SOLD GDDR2 8600M GTs as DDR3 8600M GTs, which... sadly has a bit of a performance gap... in the bad way.
In short: get your hands on the actual VRAM clocks (see the rough numbers below). GDDR2/GDDR3 is just too unreliable a label, due to them being the same damn thing at different speeds.
Why can't you then just overclock GDDR2 8600M GTs to GDDR3 speeds (or in this case, an HD 2600 to an HD 2600 XT)? Well, it's actually plausible you might find one or two that can. The chips, however, are unlikely to take it, and something about the RAM bus design in the 8600M GT doesn't tend to like more than 256 MB of DDR3, I suspect. Albeit the 8700Ms are a workaround for that, I'm thinking, which of course, being clocked so high AND having 512 MB, are power/heat... hogs.
Anyway, that's that really. Plus or minus a few theories that, even if I ever find out the truth, will probably be under NDA... XD... (For instance, I would swear the 8700 through 8400 GT are all the same production core somehow... perhaps even the 8400 GS as a reject of rejects of rejects... or just 8600 GS cores cut in half somehow... hmm, the pondering...) -
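To put the "actual VRAM clocks" point into rough numbers, here is a small sketch; the clocks and bus width are made-up illustrative values, not the specs of any particular card:

```python
# What actually matters for memory performance is the real VRAM clock and
# bus width, not the GDDR2/GDDR3 label. Example clocks below are assumptions.

def mem_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Effective bandwidth: clock * 2 (double data rate) * bus width in bytes."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

for clock_mhz in (400, 700):  # hypothetical "DDR2-class" vs "GDDR3-class" clocks
    print(f"{clock_mhz} MHz on a 128-bit bus ≈ "
          f"{mem_bandwidth_gb_s(clock_mhz, 128):.1f} GB/s")
```

Whatever the label says, the card running its memory at the higher clock gets the extra bandwidth; that's why the clock numbers tell you far more than the GDDR2/GDDR3 name on the spec sheet.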
GDDR2 isn't even proper DDR2; it's a halfway step only found on the GeForce FX (the 5800, I believe).
GDDR3 is optimized DDR2 meant for graphics. They do share the same technological base (as opposed to DDR3), but it's been tweaked. -
And yes, the 8600M-GT and 8700M-GT are based off of the same core (the G84m), and the 8600M-GS is based off of that same core as well, though with half of the stream processors and TMUs disabled. The 8400M-GT and 8400M-GS are based off the G86m, with the 8400M-G being based off of it as well, again with half the stream processors disabled. -
The Forerunner Notebook Virtuoso
^ What odin said.
-
Uhum...^^what I said
-
So about 3000, which is still a good score. And my theory that the underclocked HD 2600 is smack in between the DDR2 8600 GT and the GDDR3 one is still intact. -
Crimsonman Ex NBR member :cry:
-
I'm pretty sure that the clock readings are wrong, btw.
-
The Forerunner Notebook Virtuoso
I doubt it's in the middle of the two. It's probably on par with or worse than the DDR2 one. ATI's GPUs get higher 3DMark scores, which don't reflect their in-game performance well.
-
And I'm pretty sure the stock settings are 500/600. -
Crimsonman Ex NBR member :cry:
-
Well, we'll just have to wait for Forerunner to clear things up for us.
Night.
-
The Forerunner Notebook Virtuoso
Ah, I was referring to the DDR2 version. I didn't check what you guys were referring to before jumping into the conversation. I agree with the GDDR3 one being in the middle.
I'm not an NVIDIA fanboy, but it's known that ATI usually scores higher in 3DMark, and that's why many people, including myself, use 3DMark as a reference for driver performance boosts rather than as a cross comparison between ATI and NVIDIA.
Sorry for all the confusion. -
ViciousXUSMC Master Viking NBR Reviewer
To really compare the 2600 to the 8600 we need temperatures and overclocking results, preferably in the same notebook so the environment is the same for both.
I'm sure everybody has heard me say it before in another thread, but there are a lot of differences between the two that will not be apparent with just stock 3DMark scores.
Here are the things I want to see put to the test:
The 2600 is built on a smaller die process than the 8600; in theory this gives it less heat output, less power consumption, and more overclocking headroom.
The 2600 has a superscalar architecture vs. what NVIDIA uses (the name of the NVIDIA tech escapes me at the moment). What's important here is that in theory the ATI card has much more power than the NVIDIA card, but superscalar is very difficult to use and requires perfect drivers, while NVIDIA requires a lot less driver work. So while NVIDIA may get away with one small tweak to correct performance for a handful of games, ATI needs a tweak for each game specifically.
This means that over time the ATI card can win, even if it's behind now, as drivers improve.
The last thing is that the ATI card has been showing that it scales better when overclocked than the NVIDIA card (see the rough sketch below).
You put all of this together and I think the ATI card can very easily beat the NVIDIA card, especially for those willing to overclock it.
I get most of this information from the 8800GTX/GTS vs. 2900XT arguments and discussions, because the mobile cards are exactly the same tech, just with lower clock speeds and cut-down components. -
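As a rough illustration of why overclock headroom matters, here's a toy calculation of how peak shader throughput scales with core clock; the unit count and clocks are illustrative assumptions, not verified specs for either mobile card:

```python
# Toy model: theoretical shader throughput scales linearly with clock, so a
# card with more overclock headroom gains proportionally more raw power.
# Unit count and clocks are illustrative assumptions, not measured specs.

def peak_gflops(shader_units: int, clock_mhz: float) -> float:
    """Peak multiply-add throughput: units * clock * 2 ops per cycle."""
    return shader_units * clock_mhz * 1e6 * 2 / 1e9

stock_mhz, oc_mhz = 500.0, 600.0   # hypothetical stock vs. overclocked core
units = 120                        # hypothetical stream-processor count

for clk in (stock_mhz, oc_mhz):
    gain = clk / stock_mhz - 1
    print(f"{clk:.0f} MHz -> ~{peak_gflops(units, clk):.0f} GFLOPS "
          f"({gain:+.0%} vs. stock)")
```

Of course real games won't scale perfectly with clock (memory bandwidth, drivers, and CPU limits get in the way), which is exactly why the same-notebook overclocking comparison asked for above would be so useful.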
Crimsonman Ex NBR member :cry:
-
Nice scores, Seraph!!! What is the exact laptop and config you've got??
I've finally finished with fresh Vista on my 8510p, so I will be benching and o/c'ing after I download 3DMark. I will give o/c another shot, as previously it didn't go very well (crashes of drivers and so on). I should update you all in a couple of hours.
PS: Seraph, what drivers were you using?? -
I have an A7K-A1.
Modded 7.10 drivers. -
I've got really funny scores after o/c:
http://forum.notebookreview.com/showpost.php?p=2597232&postcount=23 -
Hello again.
I read through the thread again, especially paying attention to Seraph's screens, and I've noticed a big difference! Aha!
Seems like HP and Asus have different cards, from different sources; or, most likely, if they are from the same source, they have different BIOSes! Checking Seraph's GPU-Z screen and comparing it to mine, I noticed a different BIOS version than mine. It also has different 2D clocks and different default ones.
HP is giving 125/250 core/mem for 2D, where Asus is giving 300/400 for 2D, while 500/600 is stock for 3D.
I need to check a couple more things with you, Seraph.
Could you run 3DMark at the default resolution of 1280x1024 with clocks set to 300/400 and to 500/600??
And also, do you see the difference in fill rate etc. in GPU-Z after you change clocks?? Close the tool between checks.
Also, would you be keen to make a copy of your BIOS for me?? Do you have a DOS-bootable USB stick??
THANKS! -
The Asus BIOS can be found on the official Asus site, but it won't run on the 8510p (AMD CPU versus Intel CPU, different mainboards...)! I bet there's some sort of OC protection in the HP 8510p... maybe you could write to HP and ask them?
-
LOL.
I don't want the Asus system BIOS!! I need the HD 2600 BIOS from Seraph's notebook.
Update:
I wonder, Seraph, what you would get for your graphics card with that tool to copy the BIOS...
Let me know if you are interested. -
I got one of the 8510p units today and the GPU keeps crashing and resetting. I get a message saying the VPU graphics accelerator is resetting after my screen goes to garbage and then blanks out, and then Windows XP comes back just fine.
It seems to be running hot, though not hot enough to burn my legs, so I am not sure if the GPU is overheating. The fan kicks up high, so I know for sure it's running full out.
The question is: is it a hardware problem or a software problem, and if it's hardware, is it the GPU or the laptop design?
HP pointed me to an internal link to download new ATI drivers that came out in the last week or two, but that did not fix the problem. Has anyone heard about this 2600 GPU overheating in any laptop designs? -
Crimsonman Ex NBR member :cry:
I haven't heard of them overheating, but make sure the air vent isn't blocked. Get a cooling pad and it won't overheat.
-
Sorry I haven't been replying... I can probably run some of that stuff for you guys later. Right now I'm off to bed. BTW... GPU-Z reads the same clocks no matter what I do to it right now.
-
I really like what's going on in this thread. Can't wait to see more overclocking results.
-
Mike: I've only seen one other person on the forums with 8510p BSOD/GPU issues, so hopefully it's pretty rare. My first laptop five years back was DOA and it really soured me, but the second shipment was great, so don't worry.
But yeah, generally with a new machine the manufacturer will be pretty good about rushing a new one to you.