As we continue to get into the realm of the 9800M GTS/GTX, SLI setups, the HD 3850 and HD 3870, Crossfire, the new Mobility HD 4850s, etc., 3dMark06 is long past being useful for graphics card comparisons.
I discovered this myself when I built my desktop -- I gained more points in 3dMark06 from overclocking my CPU (2,000 more points, to be precise) than I did from overclocking my GPU (~800 more). It's increasingly becoming what Aquamark is now -- a CPU benchmark, not a GPU benchmark. My T7500 scores about 2,000 on the 3dMark06 CPU test, while my overclocked Q6600 scores over 5,000 -- but that doesn't mean a Q6600 will provide 2.5x the performance of a T7500 in any modern game running on, say, a 9600M GT, because the GPU is the limitation, not the CPU.
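To put rough numbers on that (the weights below are invented for illustration -- this is not Futuremark's actual scoring formula, just a toy model of any composite score that folds in a CPU term):

```python
# Toy model only: invented weights, NOT the real 3dMark06 scoring math.
# It just shows how a composite score with a sizable CPU term can reward
# a CPU overclock more than a GPU overclock, even though in a GPU-bound
# game only the GPU overclock would actually raise the frame rate.

def composite_score(gpu_score, cpu_score, gpu_weight=0.7, cpu_weight=0.3):
    """Hypothetical weighted sum of a GPU sub-score and a CPU sub-score."""
    return gpu_weight * gpu_score + cpu_weight * cpu_score

stock  = composite_score(gpu_score=8000, cpu_score=2000)   # baseline
cpu_oc = composite_score(gpu_score=8000, cpu_score=5000)   # big CPU overclock
gpu_oc = composite_score(gpu_score=9000, cpu_score=2000)   # ~12% GPU overclock

print(stock)            # 6200.0
print(cpu_oc - stock)   # 900.0  -- gained from the CPU overclock
print(gpu_oc - stock)   # 700.0  -- gained from the GPU overclock
```

Whatever exact weighting Futuremark actually uses, the effect is the same: a notebook with a faster CPU gets a big head start on the overall number even if its GPU is identical.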
There are a couple of solutions, some better than others. One is to use in-game benchmarks like the one in Crysis, but not everybody has Crysis, so that won't work. Another is to run at higher resolutions, but that won't work for all notebooks. The alternative I think is best is to start running 3dMark06 with 2x or 4x AA and 4x or 8x AF. While this isn't as good as running at a higher resolution with anti-aliasing and filtering as well, to me it's the best option since Vantage is not yet widely usable.
3dMark06 is a good benchmark for anything under maybe 8,000 points. But to keep it useful, I think it would be a good idea for those at NBR who post 3dMark06 benchmarks for new cards to run it at the default resolution and settings (for comparability with existing results), but also to run it with 4x AA and 8x AF to work the GPU harder and make the CPU a much smaller part of the score.
Thoughts? If NBR's users get on the same page here, I think it would make some of the GPU comparisons more meaningful.
-
For sure, I completely agree -- you make a valid point. The ratio of CPU to GPU contribution in the score doesn't line up with real-world performance at all, which makes the results ultimately useless. These days it seems like the higher the score the better, regardless of inaccuracies in how it's calculated.
However, there should be a benchmark thread for 3DMark06 under different settings so that users will be on the same page. For example, a thread for running defaults, running everything low, everything maxed out, etc. -
The only problem with this is that you cannot change the settings to apply AA/AF without buying the program. I don't think many people will want to do that.
-
Users always have the option of forcing it from their GPU's control panel. I haven't tried this myself -- any idea if it works properly? -
All I've ever used 3DMark06 for is as a measuring stick for my card against others with identical GPUs, to make sure mine is performing up to spec. Plus the overall score is less important than the SM2.0 and 3.0 ratings.
3dmark Vantage is the new standard. -
-
It's not that it's a completely useless benchmark - people can still use it to gauge the impact that an overclock/new driver has on their performance - but it's losing ground as a comparative measure.
And Vantage has its own issues - GPU PhysX can muck things up greatly. -
My point is that it's losing its value as a comparative tool, not just in terms of the raw scores.
When a CPU overclock (which, based on my Crysis benchmarks, makes zero difference in FPS) yields 2,000 more points in 3dMark06, and a GPU overclock yields 800 more points in 3dMark06 (but 3 more FPS in Crysis, an 8% boost), the comparison itself is flawed.
It's only useful for benchmarking sites that run 8 GPUs on the same CPU; in notebooks with processors ranging from 1.8 GHz to 3.1 GHz, it skews the results. -
-
Why 06? It's old and outdated. Use Vantage.
-
Except XP probably still has well over 50% market share among gamers, and Vantage won't run on XP.
-
Well, XP is old and dated. Soon everyone will move to Vista or 7.
-
http://store.steampowered.com/hwsurvey/
21% of Steam gamers have a DX10 system.
Vantage, while it is the replacement for 06, will never become the de facto standard the way 3dMark06 has until that percentage climbs to around the 80% range, and that won't happen for years. -
I also have to disagree that soon everyone will be on Vista/7. XP has lost 12% market share since a year ago according to Market Share; at that rate it'll fall below 50% in H1 2010 - and that's including Macs/Linux, so it may well be H2 2010 before it's below 50% of Windows installs. And even if we assume 12.5% goes to Seven each year - a higher rate than Vista has been gaining - and that it comes out in July of next year, it'll take until H2 2011 before XP isn't the dominant Windows OS (at -12%/year, XP will have 30% in December 2011, and at +12.5%/year, 7 will have 31.25%). If Seven comes out at the end of next year, it'll be H1 2012 before XP isn't the primary Windows OS. True, cutting-edge and gaming systems are more likely to have DX10/Seven, but even so, we're looking at a good while yet before XP isn't dominant.
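A minimal sketch of that projection arithmetic, using the rough rates above (the ~66% XP share at the end of 2008 is my assumption, and everything here is back-of-the-envelope, not real data):

```python
# Back-of-the-envelope linear projection using the rough rates from the
# post: XP losing ~12 points of share per year, Windows 7 gaining ~12.5
# points per year from an assumed July 2009 launch. Illustrative only.

XP_SHARE_END_2008 = 66.0      # assumed XP share (%) in December 2008
XP_DECLINE_PER_YEAR = 12.0
SEVEN_GAIN_PER_YEAR = 12.5

def xp_share(years_after_2008):
    """XP share assuming a constant linear decline."""
    return XP_SHARE_END_2008 - XP_DECLINE_PER_YEAR * years_after_2008

def seven_share(years_after_launch):
    """Windows 7 share assuming constant linear adoption from launch."""
    return max(0.0, SEVEN_GAIN_PER_YEAR * years_after_launch)

# December 2011 is 3 years after December 2008 and 2.5 years after July 2009.
print(xp_share(3.0))      # 30.0  -> the "30% in December 2011" figure
print(seven_share(2.5))   # 31.25 -> the "31.25%" figure
```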
But back on the main point, the CPU does affect 3dmark a good deal, and that does reflect itself in games to an extent (downclock to 600 MHz and I notice a difference in most modern games; GTA4 won't run well on a 1.5 GHz Celeron), but it's certainly a valid point. I've long thought an NBR 3dmark database would be a good idea, but you would have to have several tables for different resolutions/versions/settings. I, for instance, am limited to 1280x800 for 3dmark06 because that's my screen's resolution. But if you're willing to put the table(s) together, I'm certainly up for submitting. -
Ditto here - I wear my 3D/PCMarks on my sig with pride.
Of course, I have to redo a few b/c of the jump to 64-bit... -
Was 3dmark ever a useful benchmark? I don't remember such a time.
-
3dmark was never meant to be an actual benchmark. It was mostly there as a rough tool for comparing hardware, but its score did not always translate into a performance rating. If anything, a lot of people paid more attention to 3dmark than to actual performance O_O
-
3dmark gives a rough estimate at best. Much like: "Well honey, I think most Hondas have a big 'H' on the grille, do we have a big 'H'?"
But yeah, 3dmark is pretty much a joke -- not really any different from WEI and, to be honest, somewhat a waste of time, unless you really enjoy coaxing that e-peen. -
Right... 3DMark06 is actually useful for determining relative performance between graphics cards of the same generation and company, e.g. ATI 4xxx or Nvidia 9xxx.
It does have its uses. -
Please note I said "reference" in the title, not "benchmark". I fully realize it's not useful as an absolute benchmark but as a reference relative to other cards; my point is that in the mobile world, where, as I said, CPUs range from 1.5 to 3.2+ GHz, 3dMark06 isn't useful even as a reference, because CPU scores are weighted far too heavily.
-
Au contraire. While notebooks do span that range of frequencies, most decent gaming laptops are squarely centered in a very narrow range of 2.0~2.5 GHz dual cores. If anything, that should make laptop chip comparisons even more accurate than desktops, where you can get anywhere from 2.6~4.2 GHz dual or quad cores.
But I do agree that CPU scores are weighted too heavily. It does provide a somewhat accurate benchmark for CPU-intensive games like Supreme Commander, though. -
I still disagree. Laptop CPUs are constructed slightly differently from desktop CPUs, so a direct comparison wouldn't be appropriate. And the frequencies would be nothing to worry about if there were a consistent amount of work behind them per unit of frequency. This is where 3dmark06 fails: it doesn't seem to have a consistent, correct "algorithm" for producing scores that hold up across the board.
-
Well, in a sense we can never have an accurate gaming benchmark program, as every game is coded differently -- some lean more on GPU power, others are GPU/CPU dependent, and some would benefit more from CPU than from GPU power...
Too bad that in the free 3DMark06 one can't change the resolution... as in my case I have a 4:3 monitor and have to test at 1280x800, a resolution I never use, lol. -
What's kind of funny is that my desktop gets 15,600 points in 3dmark06 at all-default settings, while at 1680x1050 I get higher framerates running the tests and my score is 18,900. It's odd that I actually get more frames per second at a higher resolution. 3DMark Vantage is just completely pointless since it's extremely buggy -- there are tons of issues with the engine -- and it isn't very accurate because of this; they also shouldn't allow PhysX-enabled GPUs to take over the CPU score.
Crytek should just release their own benchmark for new GPUs: a single small level packed with detail and physics objects -- have it blow up or something -- and make it a free download. -
-
Why not HL2:Lost Coast?
3dMark06 has basically lost its ability to provide a reference for modern GPUs
Discussion in 'Gaming (Software and Graphics Cards)' started by Jlbrightbill, Dec 26, 2008.