The Big Low-End Comparison: 9200M GS - 9300M GS - HD 3450 - HD 3470
Hello everyone!
Lots of laptops are currently shipping with a low-end dedicated card. I've been on the lookout for some nice comparisons between them, but it's hard to get a good overview. With this thread I would like to gather as much information as possible about the performance of these graphics cards (NVidia 9200M GS & 9300M GS and ATi HD 3450 & HD 3470).
So, I've searched around the forums a bit and made a list of 3DMark06 scores. 3DMark scores are a pretty good way to benchmark cards, though some scores may differ because the test was run at a different resolution. Nonetheless, here's what I've gathered so far:
3DMark06 Scores
9200M GS:
1742 Source Specs: [HP DV5] P7350,4GB
2067 Source Specs: [Samsung Q310] P8400,4GB
2574 Source Specs: [Samsung Q310] P8400,4GB, 9200M GS overclocked to Core 750MHz & Memory 750MHz
9300M GS:
2242 Source Specs: [Lenovo Thinkpad SL500] P8600,2GB
1695 Source Specs: [Lenovo Thinkpad SL300] P8600,2GB Note: This 9300M GS is a 128MB version - explains the low score?
2211 Source Specs: [Lenovo Thinkpad SL400] P8400,2GB
HD 3450:
2100 ('just under 2100') Source Specs: [HP DV5z] 2.1GHz Turion Ultra,3GB
~1800 Source* Specs: [HP DV5z] Unknown
*Note that this poster also says…
HD 3470:
2041 Source Specs: [Sony Vaio SR] P8400,2GB with Original Drivers(OEM)
2089 Source Specs: [Sony Vaio SR] P8400,2GB with Mobility Modder Catalyst 8.8 Drivers
2277 Source Specs: [Sony Vaio SR] T9400,4GB with Mobility Modder Catalyst 8.8 Drivers
2575 Source Specs: [Lenovo Thinkpad T400] T9600,2GB
2598 Source Specs: [Sony Vaio FW] T9400,4GB
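To compare the list above at a glance, here's a quick sketch that averages the stock scores per card (numbers transcribed from the list; the overclocked 9200M GS result is excluded, and the averages are only as good as these few samples):

```python
# Stock 3DMark06 scores transcribed from the list above
# (overclocked 9200M GS result excluded).
scores = {
    "9200M GS": [1742, 2067],
    "9300M GS": [2242, 1695, 2211],
    "HD 3450":  [2100, 1800],
    "HD 3470":  [2041, 2089, 2277, 2575, 2598],
}

for card, vals in scores.items():
    avg = sum(vals) / len(vals)
    print(f"{card}: avg {avg:.0f} over {len(vals)} result(s)")
```

Which puts the HD 3470 on top on average, with the 9200M GS, 9300M GS and HD 3450 clustered close together, matching the conclusion below.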
Many people have been saying the HD 3450/HD 3470 outperform the 9200M GS and 9300M GS. However, based on the above results, I am not really convinced so far. It looks like the 9200M GS performs the same as, if not slightly better than, the HD 3450 (and if overclocked can reach the performance of the HD 3470). The HD 3470 seems to be the best option in the end, though.
So! Looking for contributions!
-
Well, seeing as my thread was the source for your SR info... I'd just like to say: the 3470 is more powerful than the 9300, but the SR only comes with a 128MB version of the card.
If you compare the 128MB version of the 9300 vs. the 3470, the ATI blows it out of the water. -
Your thread is nice, btw. -
Also, if we're talking about heat, the prize goes to the 3470, which is the coolest card.
-
I think it is high time only the SM2 and SM3 scores are taken to assess the raw power of a card in 3DMark 06.
-
How about the 9300M G?
-
Since these chips are built on a 55nm process, they don't seem to be defective, and the laptops that carry them have good fans (the T400, for example). That's why I think they run cooler.
Btw, the HD 3470 with 256MB of GDDR3 is more powerful than, or at least on par with, the 8600GS with 256MB of DDR2. Also, FYI, the 3470 has DirectX 10.1 support; I don't think the NVidia cards in this comparison support it.
Nice thread btw. rep+ -
The SM2 and SM3 scores are worth something, though, as they directly compare the power of the GPU, which would show the ATI cards in your list performing better. -
mobius1aic
Just think: if Hybrid CrossFire were usable, you could expect 3DMark06 scores in the ~3000 range with an HD 3450 + HD 3200. I'd still rather have the Puma system, simply because of the possibility of hacking it. On the CPU side of things, I completely agree: the CPU score factored into the 3DMark total muddles comparisons between such different machines running different CPUs and different overall platforms.
-
If you had two laptops, both priced at $1000, with different CPUs and different video cards, the one with the higher score would be your best bet, even if it's the CPU that makes the score higher. This assumes, of course, that a higher score actually translates into more fps in games, but I think we should take that for granted; otherwise the whole exercise becomes useless anyway.
There will be a 'problem' to a certain extent when you have to choose between an ATi or NVidia card in the same laptop, but that's not very common (or perhaps doesn't happen at all? Not sure). And even then, you could check which card performs best in combination with the CPU you're going for.
So let's see some more benchmarks! -
Now, I personally think the benchmark is a great tool, but only when taking the SM2.0 and SM3.0 scores. Let's face it: even the lowest-end Mobile Celeron or Mobile Turion has MORE than enough power to feed these low-end mobile chips, and even the midrange is really 'low' in my book, though it is decent for a laptop. So, again, your lowest-end Mobile Celeron or Mobile Turion will easily keep a 9600M/3650 fed. It starts to shift when you move into the 8800M GTS range, in which case a good 2GHz C2D seems to pair with it nicely.
Anyway, it is very, very possible for one laptop (with a very fast CPU) to post the higher 3DMark score and still be far slower in games. That is a sad situation, and that is the reason the 3DMark06 overall score is misleading. -
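The point about the CPU skewing the total can be made concrete. Futuremark's 3DMark06 whitepaper describes the overall score as a weighted harmonic mean of the graphics score and the CPU score; the sketch below uses the commonly cited weights from that whitepaper (treat the exact constants, and the sample sub-scores, as assumptions for illustration):

```python
# Approximation of the 3DMark06 overall score, per the weighting
# commonly cited from Futuremark's 3DMark06 whitepaper (the exact
# constants are an assumption here, not verified against this thread):
#   graphics score GS = 0.5 * (SM2.0 score + HDR/SM3.0 score)
#   overall = 2.5 / ((1.7 / GS + 0.3 / CPU) / 2)

def overall_score(sm2: float, sm3: float, cpu: float) -> float:
    gs = 0.5 * (sm2 + sm3)
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)

# Identical hypothetical GPU sub-scores, two different CPU scores:
# the overall total moves even though graphics performance is unchanged.
slow_cpu = overall_score(700, 800, 1500)
fast_cpu = overall_score(700, 800, 2500)
print(round(slow_cpu), round(fast_cpu))
```

With these made-up inputs the faster CPU adds roughly 70 points to the total with zero change in GPU performance, which is exactly why posters here prefer comparing SM2.0/SM3.0 sub-scores directly.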
I currently have a Dell 9300 with a GeForce Go 6800 and a Pentium M @ 2GHz. My girlfriend has a Dell 1430 with an 8400M GS and a Core 2 Duo 1.6GHz. Even though my laptop scores a higher 3DMark 06 score, hers is noticeably faster in games. Where my CPU is pegged at nearly 100% at all times in every game I play, hers stays at a mild 60% on both cores.
However, any game where there is significant rendering with little to no AI, my 6800 smokes. For instance, the Dell 9300 is better in WoW and LOTRO. The 1430 SMOKES mine in World in Conflict and Sins of a Solar Empire.
Indeed, a Celeron or Turion may push a lower-end card to its limits, but that doesn't necessarily mean you will get acceptable frame rates in games on those systems, whereas that same card with a Core 2 Duo will perform fine.
There is still a considerable burden placed on CPUs in almost all video games.
Just food for thought. -
I agree. Though, speaking strictly of the ability to send wireframe data to the GPU, that doesn't require all that much power. Most games out there (this is changing now) don't really tax the CPU that much. Supreme Commander does, and Mass Effect and some of the newer RTS titles seem to hammer it, but most other games don't implement enough physics for the CPU to be a huge factor in gaming. To be clear, again, I agree the CPU does more than send wireframe data to the GPU: it also handles sound, physics, and artificial intelligence.
However, I just see some really lopsided laptops out there: huge CPU power and really low-end GPUs, bought by people who want to game. If money and size are no object, I would say get the best of both. But most don't have that luxury. So I say that if you are going to be gaming, sink the extra money into the GPU, even if it means slightly downgrading the CPU.
My quad core @ 3.6GHz gave me a few thousand more 3DMarks (3,000 more, to be exact) than my A64 X2 @ 2.6GHz. This was really cheesy, because games saw absolutely no increase, though I never did test Supreme Commander. No other game saw one lick of improvement. This was paired with an 8800GTS 512MB.
As for the OP, I think what you have done is great; I just wanted to point out that the 3DMark06 overall score is not reliable enough to base a laptop purchase on. Really, I blame Futuremark for weighting the CPU so heavily in the equation. I don't want to take away from your thread, as I think it is a great baseline to start some deeper research.
Discussion in 'Gaming (Software and Graphics Cards)' started by Joplin, Sep 13, 2008.