Here are benchmarks of the 8800M GTS on the Gateway P-6831FX laptop with the stock processor (T5450) and the T7300. These benchmarks were performed on my own personal P-6831FX, purchased 3 days ago. I have done a clean install of Vista, and I am using the 167.43 drivers. I've also compared 1280x1024 resolution vs. 1440x900 resolution, stock clock speeds vs. overclocked, and no AA/AF vs. 4x/4x AA/AF.
This system is completely stock (with the exception of the T7300 that was installed), with 3GB RAM, a 250GB 5400rpm HD, a 512MB 8800M GTS graphics card, and an Intel T5450 C2D processor. The T5450 is clocked at 1.66GHz with a 2MB cache and a 667MHz FSB, whereas the T7300 is clocked at 2.00GHz with a 4MB cache and an 800MHz FSB.
AA/AF   Resolution   Processor   GPU Clocks (core/mem)   3DMark06 Score
0/0     1280x1024    T5450       500/800                 6887
0/0     1440x900     T5450       500/800                 6721
4x/4x   1440x900     T5450       500/800                 5883
0/0     1280x1024    T5450       642/942                 7133
0/0     1440x900     T5450       642/942                 7088
4x/4x   1440x900     T5450       642/942                 6410
0/0     1280x1024    T7300       500/800                 7583
0/0     1440x900     T7300       500/800                 7311
4x/4x   1440x900     T7300       500/800                 6108
0/0     1280x1024    T7300       642/942                 8263
0/0     1440x900     T7300       642/942                 8067
4x/4x   1440x900     T7300       642/942                 7010
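To put rough numbers on the processor's effect, here is a quick sketch (values taken straight from the table above) that computes the relative gains from the CPU swap:

```python
# Quick sketch: percentage gains from the 3DMark06 scores in the table above
# (0/0 AA/AF at 1280x1024; GPU clocks are core/memory in MHz).
scores = {
    ("T5450", "500/800"): 6887,
    ("T5450", "642/942"): 7133,
    ("T7300", "500/800"): 7583,
    ("T7300", "642/942"): 8263,
}

def pct_gain(old, new):
    """Relative improvement, in percent."""
    return 100.0 * (new - old) / old

# CPU swap alone (stock GPU clocks): roughly a 10% score increase.
print(f"T5450 -> T7300 at 500/800: +{pct_gain(6887, 7583):.1f}%")
# CPU swap with the GPU overclocked: the gap widens to roughly 16%.
print(f"T5450 -> T7300 at 642/942: +{pct_gain(7133, 8263):.1f}%")
```

The gap widening at higher GPU clocks is consistent with the CPU being the limiting factor once the GPU speeds up.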
-
Where did you find this?
-
I didn't clarify that in the original post. I'll edit that...
I performed these with 3dMark06 Professional. -
I guess the processor CAN make a difference.
-
The processor upgrade helped a lot!
-
It seems that the T5450 really was bottlenecking this thing majorly. Hell, it may still be bottlenecked by the T7300... who knows.
-
You should edit the first post >.>
"Here are benchmarks of the 8800M GTS on the Gateway P-6831FX laptop with the stock processor (T5450) and the T7700." But thanks for the update on the upgrade. I think this does indeed confirm that the original 3DMark results from NVIDIA were greatly affected by the CPU. -
Yeah, but 3DMark06 has a test ONLY for the CPU... so yeah, I can see how it affects the 3DMark06 score. Not too sure about gaming in general though.
-
Benchmarks don't always equate to real world performance.
-
so let me see, according to this, a faster cpu and higher clocked GPU make a difference...how unexpected
-
As GPUs get more powerful, the processor has to be improved too.
-
Awesome! Thanks!
-
I'm not too sure why this was unexpected. The processor plays a large part in 3DMark, and that's about right where it should be.
I doubt it will have a huge effect on real-world performance. -
How come I only get a 3Dmark06 score of 5629??? ...with everything stock, no fresh install and only an upgrade to 169.09. I have no programs running in the background and AA off as well. Any suggestions? I hope mine is not a lemon. Thanks!
-
Yes I have the free edition 3Dmark06 Basic edition and resolution 1280x854 and I am set to high performance. Is the free edition not reliable? Thanks Snowsurfer!
-
Ok Thanks, I'll do that now.
-
The CPU makes a big difference in 3DMark, but in real gaming it matters much less... you'll get about the same performance with a slow Core 2 or a fast Core 2 if you stay at high resolutions. But if you go to low resolutions, the CPU won't be able to keep up with the video card, so instead of having 200fps you will have 120...
Anything over 30fps is perfect anyway. -
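The bottleneck idea in the post above can be sketched with a simple model: the frame rate you see is roughly capped by whichever component is slower. The numbers here are illustrative only, not measurements:

```python
# Toy model of a CPU/GPU bottleneck: each component can deliver frames at
# some maximum rate, and the pipeline runs at the slower of the two.
def effective_fps(cpu_fps, gpu_fps):
    """The slower stage limits the whole pipeline."""
    return min(cpu_fps, gpu_fps)

# At low resolution the GPU could push 200 fps, but a slow CPU caps it at 120:
print(effective_fps(cpu_fps=120, gpu_fps=200))  # 120
# At high resolution the GPU becomes the limit, so the CPU barely matters:
print(effective_fps(cpu_fps=120, gpu_fps=60))   # 60
```

This is why CPU differences show up at low resolutions and largely disappear once the GPU is the bottleneck.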
3DMark06 has a processor-only component and biases the score by about 150-200 points per processor rank at 1280x1024 (in the C2D 7xxx family).
This scoring bias is a flaw in the otherwise mostly real-world performance measured by the 3DMark06 tests.
(The only thing synthetic about 3DMark06 is the scoring; everything else is based on FPS in very demanding DX9 rendering tests.)
Because of that flaw, this test mostly just demonstrates the flaw.
The processor WILL help in actual games, but not as much as the GPU. -
http://forum.notebookreview.com/showpost.php?p=2912065&postcount=417
Sure, people can argue all they want that the CPU doesn't matter and benchmarks don't matter, but the fact is many people are posting their benches for a reason. For real gaming performance? My 8800M GTS isn't faster than my 8800 GT when playing Crysis, but it's darn fast to be able to play Crysis at all. And other than Crysis, the 8800M GTS plays all the games I have at max settings very nicely.
The 8800M GTS is about 7% slower than the 8800M GTX by 3DMark06. -
Typo in the first post; it should have been T5450 vs. T7300 (not T7700). I assume people looked at the table and saw that it was a T7300.
Also, nowhere was it stated whether these results were expected or unexpected. Nor was it said that this would impact gaming performance by much, if at all. The point of this was to give users an idea of what kinds of scores the 8800M GTS in the P-6831FX will put out with different clocks and processors, as well as to show the effects of resolution and AA/AF. -
+1 for the initiative. Even though most expected the results, it's always nice to have the expectations confirmed.
-
...
I want to strike down people who believe the processor doesn't make a difference in games...
If you have a GREAT GPU but a weak processor, the processor is just going to bottleneck your GPU and make it run worse. My last computer was an AGP desktop (well, actually I don't have a new computer quite yet, but I've decided today to get the Gateway P-6831FX, seeing as it's upgradeable and has great specs anyway). On that AGP desktop I had pretty much the highest specs I could get: 4GB RAM, an overclocked BFG 7800, and a 3.4GHz processor. When I first got my 7800 (an upgrade from an old GeForce 5600), I had a 2.8GHz processor and was trying desperately to get Dark Messiah of Might and Magic to work. It's a poorly made game anyway, but it still bothered me that I wasn't seeing the huge FPS difference I was expecting. I contacted NVIDIA support and asked what was up; they told me I should probably upgrade my processor, because a slow one can hold your video card back by a lot. I upgraded to the 3.4GHz chip and suddenly the game ran much smoother.
A slow CPU with a great GPU isn't a good match, and upgrading helps a lot. But hey, if I'm wrong, fine; it's just personal experience. Also, a better power supply really helps a beefy GPU, though you don't have to worry about that on a laptop, lol. -
Ahh, and I really hope the laptop supports quad-core processors...
Anybody know anything about that? -
Processors actually don't make as much of a difference in games as they used to. The examples you used date back to the Pentium 3 and 4 eras; dual-core processors are a huge leap forward performance-wise, and the only reason to get a T7xxx processor with 4MB cache would be to pair it with the 8800-series cards, since those are powerful enough to shift the system bottleneck from GPU to CPU.
Very few games nowadays will stress dual-core CPUs to the limit and force buyers to go for pure processing speed, the exceptions being Supreme Commander, World in Conflict and Crysis.
Forget about quad-core CPUs for the time being; it's tough enough fitting a large dedicated GPU into a 15-17" notebook, let alone packing four cores into whatever space remains. -
Umm, 3DMark06 has a CPU component in the benchmarks - of course the scores will be sharply higher.
What did you expect??
Now test that in some real actual games like FEAR, Crysis, CoD4, WiC, UT3 or NFS, where it'll be a better indication of what effect a CPU upgrade has. And people are forgetting, it's not just the difference in CPU speed but the cache size: 2MB vs. 4MB - in some games that like cache, it'll make a difference. -
Actually, CPUs play a MASSIVE role in REAL-world performance!
Here's something I did.
I used a Core 2 Duo clocked at 1.8GHz, and it got a 120 FPS average in the CS:S stress test. I clocked the Core 2 Duo at 2.7GHz and it completed the test with a 230 FPS average on the same settings, in the exact same conditions.
It's the EXACT same with BioShock, CoD4, Crysis and EVERY other game!
If you think the CPU plays a small role, then compare a Celeron vs. a Core 2 Duo in real games... it would make a difference of about 3000% or more depending on the game, with both systems using any form of an 8800 card. -
Yeah, I agree there's a difference, but like most things there's a point of diminishing returns. If you started at 1.66GHz and jumped to 2.4GHz, the difference would be quite noticeable. I think anything above 2.2GHz you'll start seeing diminishing returns; 2.2 is about the magic spot.
That was taken from a review of C2D desktop CPUs ( http://www.neoseeker.com/Articles/Hardware/Guides/cpu_bottlenecks/8.html ), but the underlying concept shouldn't be too different for mobile CPUs. -
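The diminishing-returns point above can be illustrated with a toy frame-time model: per-frame time is CPU work (which shrinks as the clock rises) plus a fixed chunk of GPU work, so each extra GHz buys less. The work values here are made-up assumptions, purely for illustration:

```python
# Illustrative-only model of diminishing returns from CPU clock speed.
# Assumed workloads (NOT measurements): 8 ms of CPU work per frame at 1 GHz,
# plus a fixed 25 ms of GPU work per frame.
def fps(cpu_ghz, cpu_work_ms_at_1ghz=8.0, gpu_ms=25.0):
    frame_ms = cpu_work_ms_at_1ghz / cpu_ghz + gpu_ms
    return 1000.0 / frame_ms

# Each clock bump helps, but by less and less, because the fixed GPU
# portion of the frame time starts to dominate.
for ghz in (1.66, 2.0, 2.4, 2.8):
    print(f"{ghz:.2f} GHz -> {fps(ghz):.1f} fps")
```

The same shape shows up in the linked review: gains flatten out once the GPU portion of the frame dominates.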
From very extensive tests of my own, I can personally say that if you don't want your CPU robbing the performance your video card is capable of, I would recommend no less than a 2.4GHz Core 2 Duo for any 8800 card on any type of PC, and no less than a 2.7GHz Core 2 Duo for SLI.
-
Running on XP
Stock components and clocks, 169.61 GPU driver
Running 3dMark06 Basic: 7096 -
I own this laptop. I replaced the stock chip with a T9300 and added two 7200rpm Seagates in RAID 0. Very fast; loaded Vista in like 18 minutes.
3DMark06 score: 8798, no OC
Memory bandwidth: 4200
File system benchmark: 77MBps
Wondering: if I put 800MHz memory in with this T9300, will it run at 800MHz or 667MHz?
More than anything else, at this point it's memory limited. I want to either OC it or find out if 800MHz memory runs on this.
Anyone found the PLL on this yet?
Has anyone tried or successfully OC'd this board yet (CPU, not GPU)? This T9300 runs at 35C idle, and I bet it could easily clock much higher.
I feel that at this point it is memory limited, so I want to find out if it will run at 800MHz with the right memory, or OC it somehow. -
Would you mind doing a 15-minute thermal test using TAT? My T5450 leveled out at around 70C per core. Both of these CPUs are rated at 35W, so I was curious.
Not sure if we can unlock the CPU clocks; I haven't read of a way yet.
PS: did you mean 18 seconds? -
-
Can we get some Crysis benchmarks with this kind of comparison?
-
btw great review!
I see you got 27fps running Crysis at 1024x768, DX10, all settings on High, no AA. Have you tried the 174.74 drivers? They're very good.
http://www.laptopvideo2go.com/forum/index.php?showtopic=18049 -
Uhh, the CPU always has a BS score in 3DMark06; it doesn't mean anything for gaming. For example, some people with quad-cores can break 20k in 3DMark06 while others with dual cores pwn them in real gaming benchmarks...
-
Yeah, the time for quad core isn't ripe yet. A cheap dual core will do fine.
8800M GTS benches (Resolution, clock speeds, T5450/T7300 compared)
Discussion in 'Gaming (Software and Graphics Cards)' started by Canix, Jan 23, 2008.