Just out of interest: desktop vs. mobile.
Both are 256MB GDDR3 at stock speeds, no overclocking.
As far as I can tell from reading a lot of posts, both get around 4500 in the 3DMark06 demo.
I thought the desktop 8600GT would be more powerful.
What do you think?
Thanks,
John.
-
Tinderbox (UK) BAKED BEAN KING
-
Desktop cards will always be more powerful, or at least more overclockable, simply because they're larger!
-
Isn't the 8600M GT only 128-bit, though? (Whereas the desktop version is 256-bit, I thought.)
Because that will affect performance pretty badly. -
Tinderbox (UK) BAKED BEAN KING
Hi.
Both are 128-bit.
Regards,
John. -
I guess they are comparable, both having 32 stream processors and being 128-bit. But as already mentioned by Prasad, the desktop card has better OC possibilities because desktop cooling is much more flexible than a laptop's. Plus, the desktop version already has higher clock rates to begin with.
-
Charles P. Jefferies Lead Moderator Super Moderator
The desktop 8600GT is probably closer to an 8700M-GT in performance.
-
If you check the specifications of the desktop 8600 GT and the 8700M GT, they are basically identical. So a performance comparison of the 8700M GT vs. the 8600M GT GDDR3 is the same as what you were asking for, and it makes it easier to find answers in this laptop forum.
Now to your question: I think it is powerful (4500 in 3DMark06 at ????x1024, not 1280x800). The 8600M GT GDDR3 scores around 3500 (from what I have read in this forum). -
From the looks of it, mobile cards are closing the gap in out-of-the-box performance. Granted, a desktop card will always out-OC a mobile card, but in terms of stock performance this is great news.
-
masterchef341 The guy from The Notebook
You can measure the performance difference. The only difference between the 8600M GT, 8700M GT, and desktop 8600 GT is clock speeds, which you can set manually if you so desire.
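To put a rough number on that, here is a minimal back-of-the-envelope sketch, assuming 3DMark06 scales linearly with GPU clocks (a simplification: the CPU, memory, and drivers affect the score too) and using the clocks quoted later in this thread:

```python
# Crude sanity check: if the cards differ only in clock speed, the
# 3DMark06 gap should roughly track the clock ratio.
# Linear scaling is an assumption; CPU and memory also affect the score.

desktop_score = 4500                       # reported desktop 8600 GT score
desktop_core, mobile_core = 540, 475       # core clocks, MHz
desktop_shader, mobile_shader = 1180, 950  # shader clocks, MHz

# Blend the core and shader clock ratios with a simple average.
ratio = (mobile_core / desktop_core + mobile_shader / desktop_shader) / 2
estimate = desktop_score * ratio

print(f"clock ratio ~{ratio:.2f}, estimated 8600M GT score ~{estimate:.0f}")
# -> clock ratio ~0.84, estimated 8600M GT score ~3791 (reported: ~3500)
```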
-
Tinderbox (UK) BAKED BEAN KING
Hi.
I read somewhere that the desktop version uses around 45 watts. I can't believe the mobile version uses that much power?
regards
John. -
masterchef341 The guy from The Notebook
It doesn't.
Even plugged in, the power budget of the MacBook Pro is 85 watts (or a hair more).
It only uses a few watts. The card is not physically the same; the performance specs are equivalent, but it operates on a lower thermal and power budget than the desktop part. -
Tinderbox (UK) BAKED BEAN KING
Hi.
If the 8600M GT only uses a few watts and the 8600GT uses 45-50 watts, then the difference in 3DMark06 should be greater than 1000 points (3500 vs. 4500).
The link below gives the power usage of the 8600GT and 8600GTS:
http://www.hardwareanalysis.com/content/article/1848/nvidia-8600gts-and-8600gt/
regards
John. -
Tinderbox (UK) BAKED BEAN KING
Hi.
8600M GT = 20 watts, according to the website below:
http://www.notebookcheck.net/NVIDIA-GeForce-8600M-GT.3986.0.html
8600GT = 45 watts
regards
John. -
Here is a small compilation of screenshots comparing the GeForce 8600/8700 specs:
The results are these:
GeForce 8600M GT
Core Clock: 475 MHz
Shader Clock: 950 MHz
Memory Clock: 700 MHz (GDDR3 version)
GeForce 8600 GT
Core Clock: 540 MHz
Shader Clock: 1180 MHz
Memory Clock: 700 MHz (GDDR3 version)
GeForce 8700M GT
Core Clock: 625 MHz
Shader Clock: 1250 MHz
Memory Clock: 800 MHz
GeForce 8600 GTS
Core Clock: 675 MHz
Shader Clock: 1450 MHz
Memory Clock: 1000 MHz
* All of them have 32 stream processors and a 128-bit memory bus.
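To make those numbers concrete, here is a small sketch deriving theoretical memory bandwidth and shader throughput from the specs above. Counting 2 FLOPs per stream processor per clock (one MADD) is a common simplification for these G8x-class shaders, not an official figure:

```python
# Theoretical throughput from the spec list above.
# All four cards: 32 stream processors, 128-bit memory bus.
# GDDR3 transfers data twice per memory clock.

BUS_BITS = 128
SPS = 32

cards = {
    # name: (shader clock MHz, memory clock MHz)
    "8600M GT": (950, 700),
    "8600 GT": (1180, 700),
    "8700M GT": (1250, 800),
    "8600 GTS": (1450, 1000),
}

for name, (shader_mhz, mem_mhz) in cards.items():
    bandwidth = mem_mhz * 2 * (BUS_BITS // 8) / 1000  # GB/s
    gflops = SPS * 2 * shader_mhz / 1000              # assumes 2 FLOPs/SP/clock
    print(f"{name}: {bandwidth:.1f} GB/s, ~{gflops:.0f} GFLOPS")

# 8600M GT: 22.4 GB/s, ~61 GFLOPS
# 8600 GT: 22.4 GB/s, ~76 GFLOPS
# 8700M GT: 25.6 GB/s, ~80 GFLOPS
# 8600 GTS: 32.0 GB/s, ~93 GFLOPS
```

Note that the 8600M GT and desktop 8600 GT have identical memory bandwidth; only the core and shader clocks separate them.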
-
Wow, so the 8700M GT is actually more powerful than the 8600 GT?
And if you OC enough, an 8600M GT can actually match or even surpass the power of the 8600 GT? -
The desktop 8600 GT is somewhere between the 8700M GT and the 7950GTX. Tried it myself.
-
lol, looks like some real info is streaming in... what I'm hearing now makes more sense.
Now you guys are comparing different cards, lol, straying off topic instead of just swallowing your pride...
The 8600GT is not like the 8600M GT, and if all you guys do is compare speeds, then the HD2600 is better than both... which is simply not true.
I've looked around, and from what I can tell the HD is better than the GT in games, and the GT is better at 3DMark. That's all I found from actual sources, not made-up assumptions like that dude off the bat saying the 8600GT is equal to the 8600M GT (I laughed hard when I read that). -
xps 1330
Give one link to a legit review showing a default 3DMark within 1000 points of 4700... lol, good luck. -
Dude, the XPS M1330 uses an 8400M GS.
-
BenLeonheart walk in see this wat do?
Desktop GPUs > laptop GPUs.
-
lol, an 8400M GS and he says he's got 4700 in 3DMark06... too funny.
I owned an NVIDIA 7150M before this HD2600, but I never had to lie about the results. It is what it is: I get 3600 stock, and that's good enough. Since I lowered my voltages I can now safely overclock to 600 with low enough temps. I guess that means 4100 3DMarks. One hour of tweaking and I have a card that performs like the XT, but at 600 dollars less...
I just wish I could figure out the fanboy mentality in here... people claiming they got the fastest OC, etc. Some people OC and get 700 points more... and a select few idiots seem to get a bonus 2000+, as if their GPU were a desktop one or something?
A lot of people overclock... and 99 percent get 4600 or less. Then there's the elite group of liars that get over 6000? Why lie about something so unimportant?
Also, I had a point comparing the HD2600 to the GT, and I told a kid to run the benchmark with the config to see if he got the same scores. I told him to overclock it to the max and post one photo... he sent me an empty photo link. Then another angry dude sent me a message telling me the config runs better on mine because it's custom-tailored for the HD2600? Last time I checked, Crysis was sponsored by NVIDIA... well, every time I open Crysis I'm reminded.
Next, I didn't tailor the config to my GPU's advantage; I merely said it was for the HD and the GT... assuming the GT because it's supposedly better. The settings are all low, plus the very high effects... how is that biased towards the HD2600?
lol, I'm getting irritated thinking about some of the crap people say on here. -
Though eleron, that 7950GTX eats the 8600M GT and 8700M GT for breakfast. They might be equal at lower resolutions such as 1280x800, but the higher you go in resolution, the more that 128-bit bus struggles, and the more AA/AF you run, the harder it hits the 128-bit cards due to their much lower bandwidth. Of course it hits the 256-bit cards as well, just not as heavily.
I noticed this when I had dual 8700M GTs in my M1730; at high resolutions these cards are really bad. They struggled badly at 1680x1050 and especially 1920x1200. Not in all games, of course, but these were two 8700M GTs in SLI, heavily overclocked too.
Even if the shader power is better than a Go 7950GTX's, that does not compensate for the inferior bandwidth these GPUs deliver. Now, the 7950GTX is a high-end card, but NVIDIA billed the 8700M GT as high-end too, which is not true at all in real-life usage.
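To illustrate the bandwidth gap, here is a minimal sketch. The 8700M GT numbers come from the spec list earlier in the thread; the ~1200 MT/s effective memory rate for the Go 7950 GTX is an assumption based on commonly quoted specs:

```python
# 256-bit vs. 128-bit: theoretical memory bandwidth comparison.
# NOTE: the Go 7950 GTX's 1200 MT/s effective rate is an assumed,
# commonly quoted spec; the 8700M GT figures are from this thread.

def bandwidth_gbs(effective_mts, bus_bits):
    """Theoretical bandwidth in GB/s from effective data rate (MT/s)."""
    return effective_mts * (bus_bits // 8) / 1000

print(bandwidth_gbs(1200, 256))  # Go 7950 GTX: 38.4 GB/s
print(bandwidth_gbs(1600, 128))  # 8700M GT (800 MHz GDDR3): 25.6 GB/s
```

That is roughly a 1.5x bandwidth advantage for the 256-bit card, which is why the 128-bit parts fall behind as resolution and AA go up. -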
Good point... you make more sense than most people in here. You think differently.
I think you just think... which is rare. -
Comes down to 256-bit vs. 128-bit.
-
You mean me, ryane0840? I don't want to diss any 8600M or any 2600; that's not my intention. They are really good mid-range cards and, as the benchmarks show, run games really well. I just think realistically, that's all. Imagine if the 8600 and 2600 had a 256-bit bus.
There is at least one thing the 8600 and 2600 are better at compared to any Go 7800GTX, Go 7900GTX, or Go 7950GTX, i.e. the older high-end cards: these newer mid-range cards, with their 32 SPs, actually have better shader power, because they use a unified shader architecture, unlike the older high-end cards.
So at 1280x800 an 8600M GT might beat a 7950GTX in gaming performance. Actually, when I have my XPS M170 with a Go 7800GTX up and running again, we could benchmark against each other, with both the 8600M GT and HD2600 at 1280x800. I definitely think you will beat my older high-end card; the only difference between a Go 7800GTX and any Go 7900GTX or 7950GTX is that those are clocked higher.
The only culprit with my older XPS is, as said, its single-core CPU at 2.13GHz, so that might affect some benchmarks, since most CPUs today are higher-clocked Core 2 Duos. But anyway, I will make a thread when I get that GPU up and running.
So in the end, the 8600M GT and 2600 and up should perform better in newer shader-intensive games than the older high-end cards. Games today are becoming more and more shader-intensive, and that is where the unified shader architecture of the 8600 and 2600 beats the older architecture.
Someone can correct me if I am wrong on some points.