EDIT - mystery solved, read post #23.
If your score wasn't recorded at 1280x1024, it doesn't count.
------------------------
I've been trying to get optimal performance out of the system before I settle into using it (and before I bother with overclocking), but I haven't been getting the same results as a few others. Are the people getting 4600+ marks (non-OC'd) running at default 3DMark06 settings? I am. Just want to make sure.
My base system is:
T9500 @ 2.6ghz
8600M-GT 256mb DDR3
4GB RAM
200GB 7200rpm
Anyone know what I might be doing wrong? I've checked the power profiles to make sure nothing is being restricted there, turned off PowerMizer in the nVidia Control Panel, and turned off unnecessary operating system services as well. Is there anything I've missed?
The driver I have been using is 174.31 in both Vista and XP. I found a Dell specific version of the driver for XP, which worked the best for me - but the results still aren't good enough given my specs.
-
People scoring 4500+ are overclocking; with stock cooling they're reaching around 680/790 and scoring 4.8 to 5k. Your higher-end processor will help bring out a few more FPS in Crysis though.
-
Hmm that's odd, I got 4745 marks with SP1 (no OC) with 174.31. You're getting really low marks. My laptop also came with the 174.31 driver and it works great. I have pretty much the same specs as you, so there is obviously something going on. I didn't toy with any settings, so there's a slight chance you might have checked something you weren't supposed to? That's unlikely, but you could format and test it again if you have the time.
-
Thanks ApocNarok, indeed there is something going wrong here. Another user got 4738 using XP so your result sounds correct.
First I tried the laptopvideo2go version of 174.31 (for XP, with their own modded .inf), but got better results with the official Dell version of 174.31 for XP.
But either way, my results are way below what they should be. Can anyone else help? Any ideas what might be causing this? -
Honestly Udi, I wouldn't count on 3DMark for accuracy. I really think many people put too much emphasis on 3DMark scores. I would suggest benchmarking some games instead, and comparing your results with other similar systems.
If they're lower, then it might be a good idea to try closing some processes and startup programs, and see if that helps.
If that doesn't help, a fresh install might be in order.
From a fresh boot, how many processes does Task Manager report?
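If you want a quick, repeatable way to check the power scheme and the process count (rather than eyeballing Task Manager each time), something like this rough sketch should do it - assuming the standard Windows powercfg and tasklist command-line tools are available (they are on XP Pro and Vista):

```python
# Rough sketch: print the active power scheme's settings and a rough count
# of running processes. Assumes the standard Windows powercfg and tasklist
# command-line tools are available (XP Pro / Vista).
import subprocess

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True).stdout

# Active power scheme (output format differs between XP and Vista).
print(run(["powercfg", "/query"]))

# tasklist prints one line per process after a couple of header lines.
lines = [l for l in run(["tasklist"]).splitlines() if l.strip()]
print(f"~{max(len(lines) - 2, 0)} processes running")
```

If the count is way higher than other people's from a fresh boot, that's a good place to start trimming. -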
Sorry to disagree, but I do. Plenty of users with configurations similar to mine are getting 4600-4700 in 3DMark06, and the closer their configurations get to each other (i.e. same operating system), the closer the results get. I think you'd be kidding yourself (as would I) to consider it an inaccurate indicator.
I'm really freaking out over this. I've run GPU-Z to confirm my chip is what it should be, all clocks match stock clocks, model is 8600M-GT DDR3 256mb, so it's not like I got given an 8400M accidentally or something.
When I go to My Computer > Properties though, it says 777MHz for the actual processor speed. Is that what yours says, ApocNarok? 3DMark06 says 2594MHz or something like that however, so that appears to be okay.
I'm willing to try anything so keep the suggestions rolling. I've scoured BIOS but maybe I missed a setting there somewhere? -
Do you mind sharing the three individual marks that made up your 3DMark06 score (SM2.0, HDR/SM3.0, CPU)? It might help me narrow this down to a video card or processor issue. -
If it's reporting 777MHz, try checking your power profile in the taskbar. Does it say power saver, or high performance?
-
XP doesn't have power management settings that are that involved. I did change it from portable/laptop to home/desktop (something like that) earlier on, and set it to never turn off the hard disk and never standby.
Anyway, I checked the computer properties again and it said 2.59GHz, so all must be well there (maybe I was unplugged when it said 777MHz, not sure).
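Just to be thorough, here's a rough sketch (my own, assuming Python and the standard wmic tool are available) of how to compare the CPU's current clock against its rated maximum, so any throttling shows up without guessing:

```python
# Rough sketch: compare the CPU's current clock against its rated maximum,
# to see whether SpeedStep has it throttled (e.g. that 777MHz reading).
# Assumes the standard wmic tool is available (XP Pro / Vista).
import subprocess

out = subprocess.run(
    ["wmic", "cpu", "get", "CurrentClockSpeed,MaxClockSpeed", "/format:list"],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    if "=" in line:
        name, value = line.split("=", 1)
        print(f"{name.strip()}: {value.strip()} MHz")
```

If CurrentClockSpeed sits well below MaxClockSpeed while plugged in and under load, the power settings are still throttling the CPU (though on some systems WMI only samples that value occasionally, so treat it as a rough check).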
I reset all the XP services back to defaults at the same time, restarted, and left the test running again. Will see how it goes, but somehow I'm not expecting things to be any different.
Anything else I should check? -
If you've got the model with the 1680x1050 screen, then the reason your score is lower is that 1440x900 and 1280x800 owners can't run the default resolution, so their scores are skewed because 3DMark drops them down to a lower resolution.
If you want to properly benchmark your machine comparing it to other XPS M1530 owners, I'd suggest benching on games instead, as 3DMark is just a benchmark after all!
Alternatively, you could post your 3DMark05 score and ask others to post theirs, which AFAIK has a lower default resolution. -
I'm actually running a 1920x1200 display currently, which would explain a lot. Are you sure it reduces the score (or uses that much power) to downscale though? And would it have a lower score when downscaling from 1680x1050 to the default res, compared to 1920x1200 down to the default res?
If you're right about that, I'll be very thankful. I just want to make sure there's nothing wrong with my system. -
In 3DMark06 (didn't touch ANY setting) I got 4170 in Vista using 175.70 drivers from laptopvideo2go.com. The 1440 and 1280 res M1530 models "cheat" to get a higher score because they can't run 1280x1024, which is the default resolution.
I wouldn't worry about it, it's just a benchmark after all!
Crysis runs most excellent on this laptop, even on medium settings it's surprisingly playable!
I also ran Oblivion in 1680x1050, HDR and all settings to "Ultra High" - which is also very playable (compared to my MacBook Pro that struggled to play the game at a lower res with only Medium-High settings). -
I agree that your resolution could be the main reason why your marks are low. Your machine seems to be running just fine, and your driver is working properly. You just need to try out some games and see how they perform. Check the top 10 questions thread for the M1530 - there should be a link to some game benchmarks. Good luck, and enjoy your laptop when you can. -
halkyon -
Thanks, but I was only worrying because it'd be a shame to get less performance than I paid for. If what you say is true, then running my desktop at 1280x1024 and turning off scaling in the nVidia Control Panel should net the "correct" results, right? That way no hardware performance is wasted on scaling, and no resolution change is needed between tests since the desktop is already at the same resolution. I'm skeptical of this changing the results, but I've just done it (clean restart too) and left the test running again. Let's see what happens.
ApocNarok -
I'll grab a game benchmark or two and try them out soon. Any recommendations, or anything you've run and have a result for? I'd like to compare with your system as our system configs and OS are so similar. -
I think Crysis has a benchmark you can use, if you've got that installed. There's a CPU and a GPU test that you can run separately from the Crysis executable.
I'm going to run it now to see what mine is, I'll post the results. -
Here are the games that I've played:
Crysis: 20-35fps medium settings
Call of Duty 4: 25-40fps with high settings except AA
Half-life 2: 50-60+fps max settings
Gears of War: 30-50fps medium-high settings
Bioshock: ~40-50fps max settings
All were playable and very smooth. I hope that helps you out some. -
I'm getting...
Crysis - 20-25fps / med settings @ 1024x768
Oblivion - 25-30fps / max settings @ 1680x1050
Bioshock - 20-25fps / max settings (DX10 surfaces) @ 1280x1024
Bioshock FPS varies a lot, especially when more enemies come on screen, but bumping the resolution down to 1024x768 increases the FPS to around 30-40fps on all max settings, including DX10 surfaces.
I'm very pleased with this laptop's ability to play the newest games without sacrificing quality. -
Thanks, sorry to be an ass, but the FPS ranges are a bit hard to compare/benchmark with. A specific test would be better, but unfortunately I don't think Crysis is an option for me anyway (XP).
By the way, could you guys post your individual 3dmark06 scores? Mine are:
SM2.0: 1736
SM3.0: 1618
CPU: 2368
That'd probably help me narrow my low result down to a CPU or GPU limitation. By the way halkyon, I ran it with scaling turned off and the desktop at 1280x1024... same results. The best I can get is 4385.
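For what it's worth, the overall mark is (AFAIK, going by Futuremark's published weighting) a weighted harmonic mean of the graphics and CPU sub-scores - here's a rough sketch, and plugging my own numbers in reproduces my total almost exactly:

```python
# Rough sketch of how 3DMark06 combines the sub-scores (AFAIK, per
# Futuremark's published weighting): a weighted harmonic mean where the
# graphics side counts 1.7 and the CPU side 0.3.
def total_3dmark06(sm2_score, sm3_score, cpu_score):
    graphics = (sm2_score + sm3_score) / 2.0   # combined graphics score
    return 2.5 / ((1.7 / graphics + 0.3 / cpu_score) / 2.0)

# My sub-scores from above:
print(round(total_3dmark06(1736, 1618, 2368)))   # ~4384, matches my ~4385 total
```

The 1.7 vs 0.3 weighting means the total is dominated by the graphics sub-scores, so if the CPU score is in line with other people's, the remaining gap comes from the graphics side. -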
Anyone know how well Unreal Tournament 3 runs on this bad boy of a laptop?
-
Default resolution is 1280x1024 as halkyon suggested.
What score do you get at that resolution? -
I just did the test at 1280x800 (which, I repeat, is NOT the default resolution) and scored 4950.
I'm starting to wonder whether halkyon's idea - that displays with fewer than 1024 vertical pixels (1440x900, 1280x800) never render those extra pixels - holds true.
ApocNarok - do you mind running 3dmark06 for me at 1280x800? Give it a clean restart or whatever you usually do. -
Yes. I've solved the mystery (thanks to a thread a good friend of mine showed me while I was whining about this issue). Halkyon was EXACTLY right - I actually only noticed what he said on my second read of his post.
- 3DMark06's default resolution is 1280x1024; that is the resolution all results should be based on.
- BUT, if your laptop has a screen with fewer than 1024 vertical pixels (1440x900 or 1280x800), 3DMark06 will drop to the next lowest resolution, which is NOT the default and will give a higher score than users running the genuine default setting.
- So, for a fair comparison, users with lower resolution screens either need to run 3DMark on an external monitor that supports 1280x1024, or explicitly state that their score was run at a lower resolution so others can compare at that resolution (or disregard their results - which is what I should have done, instead I freaked out! haha).
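A quick pixel-count comparison (my own back-of-the-envelope numbers) shows how big the difference actually is:

```python
# Back-of-the-envelope: pixels rendered per frame at each "default".
default_px  = 1280 * 1024   # 3DMark06's real default: 1,310,720 pixels
fallback_px = 1280 * 800    # what a 1280x800 panel falls back to: 1,024,000 pixels

print(f"{1 - fallback_px / default_px:.0%} fewer pixels per frame")   # ~22%
```

Roughly 22% less shading work per frame goes a long way toward explaining the gap between my 4950 at 1280x800 and 4385 at 1280x1024.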
Anyway, proof and some more info are in the threads below:
http://www.yougamers.com/forum/showthread.php?t=64292
http://forum.notebookreview.com/showthread.php?t=147055&page=2 -
Told ya. ;-)
The true XPS M1530 3DMark06 result should be around ~4100-4200. Anything higher and you're either overclocking, or it's downscaling the resolution because the LCD panel's native resolution doesn't support the default. -
Halkyon,
My best score is a smooth 4397.
No overclocking, and 1280x1024 res. Clean install of XP SP2, with all drivers installed as minimally as possible (no bloatware, just raw device driver where possible). Nothing else done.
I don't think it's possible to get much better. For the record, I'm using Dell's official 174.31 driver for XP - it's the best driver I've found performance-wise.
Got a screenshot. I got heaps of scores around the same as this, so it seems consistent.
I guess now that I've milked the best performance out of my stock system, I can finally go wild with rivatuner! -
Awesome result! I'm going to try some different drivers, I think the 175.70's aren't so good for performance.
I'll update this thread with my results, currently I'm running a good clean copy of Vista with SP1 installed. I thought about installing XP, but I'm going to see how Vista turns out... -
I just ran 3DMark06 again under XP on the M1530, using the 174.93 drivers. They seem fairly good - a lot smoother performance compared to the 175.70 ones, which gave jagged results.
4378 3DMarks
SM 2.0 Score 1735
SM 3.0 Score 1614
CPU Score 2372
Happy! -
I got 4612 at 1280x800 with 3DMark06 Pro.
edit:
OC at 625/1250/849 gave a 5688 score in 3DMark06 with the 174.31 driver and Vista SP1. -
3DMark06 default resolution (1280x1024)
Stock clocks - 4378
OC to 650/900 - 5671
This is a pretty impressive increase, and temperatures at the OC speed are only a few degrees higher... -
Yeah, this notebook cools so well, and the 8600M as well as the new DDR3 memory love being overclocked.
Have you done much gaming at those settings though? I'm running mine at 610/950 and it's stable after 2-3 hours of gaming. I think core clocks higher than about 620 were causing artifacts though - not in 3dmark, but after extended gaming sessions. -
I'd also be curious as to how OC'ing affects battery life.
-
It doesn't have to, because you can simply set different OC profiles. I've set mine up (via rivatuner) to have it underclocked on boot (450/650), and then have another overclocked profile that I only use when gaming (610/950 at the moment).
If you did want to run it overclocked on battery though (for some reason) then yes indeed it would drain the battery more - due to more power consumption, and possibly more fan operation. For the record (at least using rivatuner) it won't let you apply overclocked settings when on battery, and will just default to stock clocks - however if you set o/c'd settings while plugged in, and then unplug, it will maintain those settings on battery. -
That's brilliant - sounds like something i need to do since I definitely am going to be gaming quite a bit.
Also - would an OC profile look different if I wanted to accelerate 3D applications and rendering speed, versus gaming? -
Anything that's related to gaming or rendering registers as 3D ...
-
If you look in RivaTuner there's "standard 3D" and "performance 3D". I believe if you use PowerMizer, setting the clocks higher for "performance 3D" won't affect battery life in everyday use, as in theory it should stay in "standard 3D", which is at stock clocks. That lasts until you launch a 3D game, at which point the card kicks into "performance 3D" mode using the overclocked settings, and that's when the extra drain occurs.