Hey, I thought you might be interested to see some hard results for the difference between the 400MHz and 700MHz RAM versions of the 8600M GT.
Some people say it makes a HUGE difference.
Others say not so much.
I will let the results speak for themselves for you to analyze.
The one thing I will mention is that the gap will close the more you do things like lower the resolution and reduce the amount of AA and AF.
Counter Strike Source Video Stress Test (my favorite bench, way better than 3dmark)
4xMSAA, 4xAF, all settings maxed, 1440x900
Average FPS @ 700MHz RAM = 115
Average FPS @ 400MHz RAM = 83
-
ShadowoftheSun Notebook Consultant
That's pretty bad... ugh. At least at 12x8 the difference will be minimal. That's a 28 percent performance drop. Hopefully the card will be responsive to clocking to around 500 or beyond. Of course, I won't be doing this right away, but in the future if the performance becomes unacceptable, I might go for it.
-
Ouch, 700MHz is over 40% faster. I knew it made a difference, but that is big.
-
Ach...yeah, I suppose that's probably the reality of it. Oh well. At least we can hope to overclock a bit...and well, I almost never bother with AA or MSAA anyway, and I don't mind sacrificing resolution a bit. I'm fine to game at 12x8.
-
masterchef341 The guy from The Notebook
there seems to be some number confusion. "over 40% faster" and "28% faster" are both incorrect.
the hardware difference:
just so we know the numbers, 700mhz is a 75% upclock in memory frequency over 400mhz.
the difference in counter strike source:
the faster memory card is 38.6% faster than the slower memory. (approx 1.386 x 83 = 115)
or you could say the slower memory card is 72% as fast as the faster card.
keep in mind this is just one test. different games, situations, and benchmarks will have varying results.
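if you want to sanity check the arithmetic yourself, here's a tiny python sketch (nothing in it but the two averages from the first post):

    # relative performance both ways, from the two benchmark averages
    fast, slow = 115.0, 83.0  # avg fps at 700mhz and at 400mhz memory

    print((fast - slow) / slow * 100)   # ~38.6 -> the 700mhz card is 38.6% faster
    print(slow / fast * 100)            # ~72.2 -> the 400mhz card is 72.2% as fast
    print((fast - slow) / fast * 100)   # ~27.8 -> the 400mhz card is 27.8% slower

notice the two directions give different percentages (38.6% faster one way, 27.8% slower the other). that asymmetry is exactly where the number confusion comes from. -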
masterchef341 The guy from The Notebook
hah. oh osserpse. you poor math-skilled soul... you are forcing me to teach you math... sigh...
150 is 50% more than 100.
100 - 50 = 50.
therefore, according to your logic, 100 is 50% of 150...
obviously that is not correct. 100 is 66.6% of 150. two thirds.
how is it possible that 150 is 50% more than 100 AND 100 is 66% of 150?
it's just the way of things, young osserpse.
basically, you can't add and subtract percentages, because percentages are just rewritten multiplication.
another example...
2 + 50% of 2 = 3
but 3 - 50% of 3 is not 2.
it's because those percentages are fractions being multiplied by two different base numbers... you see?
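here's the same lesson as a quick python sketch, if seeing it run helps (a minimal example, just the 100-vs-150 numbers from above):

    # percentages don't cancel out, because the base number changes
    x = 100
    up = x * 1.50         # 150.0: 50% more than 100
    down = up * 0.50      # 75.0: taking 50% off 150 does NOT get you back to 100
    print(up, down)
    print(x / up * 100)   # 66.666...: 100 is two thirds of 150

go up by 50% and then down by 50% and you land below where you started, because the second 50% is a fraction of a bigger number. -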
I'm going to have to go with masterchef on this one. I was a little bit intrigued by the whole thing, as it's been a while since I've done anything dealing with percentages, but if my memory serves me, masterchef has it right.
-
The fact that 83 is 28% slower than 115 is also correct: 115 - (28% of 115 ≈ 32) = 83.
We just said it two different ways.
And just like masterchef said, 150 is a 50% increase on 100, and 100 is a 33% decrease from 150. -
Meh, whatever. I'm just using the wrong words and got my brain twisted.
-
I hate math!
-
masterchef341 The guy from The Notebook
although i guess this means that now you are using the wrong words, too
i agree. math sucks. it's good to have it right, though. -
Isn't anything above 60 FPS meaningless anyway, as that is the threshold at which humans can even detect a difference?
-
masterchef341 The guy from The Notebook
it's a benchmark. it shows the difference in performance of the two cards. you can see the change in performance from changing the memory clock.
it's a good indication of what the performance difference might be at other frame rates.
whether or not the human eye can detect the difference between 83 and 115 is sort of a moot point. it's an entirely different discussion.
most crt monitors don't display images faster than 85 frames per second anyway. most lcds have a 60hz refresh rate (about 16ms per frame - some quote even lower response times). so you absolutely won't see the difference unless your hardware can display it.
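the fps-to-frame-time conversion is just a reciprocal, if anyone wants to check it (a quick sketch using only the numbers from this thread):

    # milliseconds per frame for a given average fps
    for fps in (60, 83, 85, 115):
        print(fps, "fps =", round(1000.0 / fps, 1), "ms per frame")
    # 60 fps = 16.7 ms, 83 fps = 12.0 ms, 85 fps = 11.8 ms, 115 fps = 8.7 ms

so on a 60hz panel, everything past ~60 fps is frames the screen never actually shows. -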
That's a big difference, but honestly, until we are really pushing both cards to their limits, we won't physically observe a difference (as in, see it with your own eyes). Yes, there will be high-end games where the 700MHz card will perform far better than the 400MHz one. But for me, just being able to run games that my current desktop has problems with is prize enough. Going on a tight budget, the 700MHz was just a bit out of reach. Considering I don't game a lot, I couldn't justify the increased cost.
-
Well, if you get 300fps instead of 50, you'll have GPU power left over for future games. Well, maybe just one more wave (Crysis has eaten up the capability of an 8800GTX).
-
Out of interest, what were the machines you used? It's always important to state the environment when giving test results. (I'm sure your results are accurate and conclusive, but it'd be useful to see what they were carried out on.)
Also, any chance of running 3DMark06? -
masterchef341 The guy from The Notebook
2.4ghz core 2 duo
2 x 1 GB 667mhz ddr2 ram
800mhz fsb
8600m gt core at 475, memory at 400 and 700 to represent DDR2 and GDDR3 clocks, respectively.
no chance of 3dmark06 -
just wondering, is the 8600 gt in the Asus C90 a gddr2 or gddr3?
Thanks -
The C90 has the DDR2 version.
-
Charles P. Jefferies Lead Moderator
I'm not going to debate the benchmarks in this thread but I'll make a statement.
There is way too much emphasis being put on the DDR2 vs. GDDR3 issue. The performance with any 8600M-GT is going to be great and those getting notebooks with an 8600M-GT are going to have a great gaming experience, regardless of the memory type. Don't stress if you got an 8600M-GT with DDR2. -
masterchef341 The guy from The Notebook
agreed. average fps above 30 is a great place to be. ideally your minimum fps will be above 30, that's an even better place to be.
edit:
the post below is true. if i did the benchmark without anti-aliasing or anisotropic filtering, or if i lowered the resolution, the percentage difference between the two cards would decrease, while the performance of both would increase. in other words, the 400mhz ram card would start catching up to the 700mhz one. -
Also, high AA and AF are memory-intensive, so I believe that would somewhat exaggerate the results compared to a more realistic typical figure. I probably won't be using high AA very often, so the difference between DDR3 and DDR2 would probably be less significant in my situations.
-
What do you have against 3dmark?
-
masterchef341 The guy from The Notebook
why would you spend effort to further remove yourself from what you truly seek?
if you desire to know something, go straight to the source of it, i say. -
Please explain. I'm not really sure what you mean.
-
He means that 3DMark is an overrated method of gauging a card's performance, worthless compared to actual real-world measurements.
-
It should be noted, however, that the same can happen with very popular games (including Source, and hopefully UT3 engine games), especially ones sponsored by certain graphics card companies. -
Can you download the CS:S stress test if you don't have the game?
-
masterchef341 The guy from The Notebook
unfortunately no. the fact that 3dmark is free is the only reason people use it to compare gpus.
everyone can run it for free. -
Think of 3dmark like the SATs or other standardized tests... it's the only objective way to compare computers, by running the same tests with the same settings and getting a score... Too bad I think the SATs are Bull**** lol
-
3DMark runs a bunch of games and looks at the fps of each one. How is that worse than what you are doing - looking at the fps of a single game? How is 3DMark less of a gauge of real-world performance?
The cheating does kind of suck, but it's probably well known which 3DMark benchmarks are susceptible to cheating, and you can ignore those. Other tests are probably also susceptible to cheating, but maybe it isn't as well known? Another point to 3DMark. -
How would a 512MB DDR2 8600GT compare to a 256MB GDDR3 8600GT?
I know one may dig into system RAM... but just humor me a little bit.
Also:
Does lowering your GDDR3 clock realistically illustrate the performance of DDR2?
Wouldn't they be different architectures and work somewhat differently, or is that aspect so minimal that just testing the GDDR3 card at each version's clock speed compensates for it?
And on the issue of no AA making the gap smaller: does that mean people with larger resolutions may be able to make better use of the DDR2 versions, since they will have more pixels and less need for AA? -
masterchef341 The guy from The Notebook
2. refer to number 1
FINAL WORD: both cards are really powerful. the gddr3 obviously edges out the ddr2 by a solid margin in counter strike source, but BOTH cards are able to handle cs:s with ease. any 8600m gt is going to offer really great performance in popular source-based games, and both should perform well in the ut3 engine as well, which runs rainbow six, ut 2007, and the splinter cell games, among others. -
First off, let me say that this site is fantastic. Such a great wealth of information.
Anyway, I'm trying to decide between the Inspiron 1520 and the Sager 9020. The deciding factor right now is the video card. Which will give me better performance: the 8600GT in the Inspiron or the 8600GT in the Sager? I know the Sager 8600GT is DDR2. Is the 8600 in the Inspiron DDR2 or DDR3? -
They're the same.
-
Both are DDR2?
-
Both are DDR2. The Sager might give a minimal performance increase in very texture-heavy games, but it's unlikely, since the memory bottleneck is its 128-bit bus width, not the amount of VRAM.
-
There are only 2 or 3 companies that use the DDR3 version. Apple uses it in the MacBook Pro. I think Asus uses it in one of their notebooks.
-
If you're looking for an 8600GT with GDDR3, the Asus G1S & G2S models and the MacBook Pros are the only ones in the US I am aware of for the time being. The Dell, Acer and Sager (Compal) are all GDDR2, as pointed out already.
Edit - Cyclone beat me to the punch! -
Thanks a bunch guys! Looks like I'm going for the Sager.
As I said, this is an awesome site.
-
The big thing is that this will determine how high you can push the settings in newer games. Sure, you can play all the current games just fine with either, but when a new game pops up with some new eye candy, the GDDR3 will be able to run on high settings instead of medium.
There is quite a difference between the two. It's pretty much like moving up one step to a better card. -
Good choice!