BENCHMARKING AMD INTEGRATED GPU RADEON 6620G:
I have always wondered, and seen questions asked, about the best RAM to buy for a laptop. The AMD Llano Radeon 6620G IGP performs excellently by the standards of previous IGPs from either AMD or Intel. Unlike a discrete GPU, the IGP has no dedicated memory of its own, so it uses system RAM, which means its performance depends directly on that RAM. Using my HP DV6z, I spent some time benchmarking the AMD Llano mobile APU (A8-3510MX), which contains the 6620G GPU, to test the effects that RAM latency, speed, and single vs. dual channel mode have on overall gaming performance.
System Configuration:
HP DV6z Quad Edition
AMD A8-3510MX APU with integrated 6620G GPU
Radeon 6750M
15.6" 1920x1080 LCD
WD SiliconEdge Blue 256GB SSD
Blu-Ray Reader / DVDRW
The CPU was clocked at 2.4GHz (stock is 1.8GHz), but the integrated 6620G GPU was running at its stock core clock of 444MHz.
Drivers were Catalyst 12.1a pre-release.
In most cases a resolution of 1280x720 was used along with default benchmark settings. In some cases graphical detail was reduced to ensure a realistic framerate and get meaningful comparison results; comparing 1 vs 2 fps isn't nearly as informative as comparing 20 vs 23 fps. Benchmark details can be found in the second post.
Typical RAM found in laptops is 1333MHz, so you can consider option (4) below the performance baseline. Significant RAM changes are highlighted in RED text. The bar color representing each configuration on the graphs is shown in (parentheses).
RAM configurations:
(1) 2x2GB Micron 1066 @ CAS 7, 1.5v dual channel ( light blue)
(2) 1x4GB G.Skill 1333 @ CAS 9, 1.5v single channel ( green)
(3) 1x2GB Hynix + 1x4GB G.Skill 1333 @ CAS 9, 1.5v asymmetrical dual channel ( orange)
(4) 2x4GB G.Skill 1333 @ CAS 9, 1.5v dual channel ( red)
(5) 2x4GB Samsung 1600 @ CAS 11, 1.35v dual channel ( blue)
(6) 2x4GB G.Skill 1600 @ CAS 9, 1.5V dual channel ( purple)
Note that the CAS number indicates the latency of the memory module. A lower number is typically better (i.e. faster performance).
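For a rough sense of what those CAS numbers mean in absolute terms, the true latency in nanoseconds can be estimated from the CAS count and the effective transfer rate. This is just a back-of-envelope sketch (kits and timings taken from the list above); it shows why 1600 CAS 11 ends up roughly even with 1333 CAS 9:

```python
# Estimate absolute CAS latency in nanoseconds for the tested kits.
# DDR transfers twice per clock, so the real clock is MT/s divided by 2.
def cas_latency_ns(cas, mts):
    """CAS cycles divided by the real clock frequency (MHz), in ns."""
    return cas / (mts / 2) * 1000

for label, cas, mts in [
    ("1066 CAS 7", 7, 1066),
    ("1333 CAS 9", 9, 1333),
    ("1600 CAS 9", 9, 1600),
    ("1600 CAS 11", 11, 1600),
]:
    print(f"{label}: {cas_latency_ns(cas, mts):.2f} ns")
```

By this estimate, 1600 CAS 11 (~13.75 ns) is nearly identical in absolute latency to 1333 CAS 9 (~13.50 ns) while offering more bandwidth, which matches the near-even results in the conclusions below.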
RAM CPU-Z Screenshots:
Benchmark Results:
3DMark06
3DMark Vantage
3DMark11
Battlefield 3 (on battery)
Average Power Drain: -38.1W with 30fps frame limiter on (DXtory app)
BF3 FPS:
Min: 25
Max: 31
Avg: 29.9
Crysis benchmark was run using medium settings
Crysis DirectX 9
Crysis DirectX 10
DiRT 2 Demo Benchmark run using medium settings
HAWX 2 Benchmark
Just Cause 2 Benchmarks run using low settings, high texture, medium water, no AA
Stalker Pripyat DirectX 9 run using medium preset and full dynamic lighting
Stalker Pripyat DirectX 10 run using medium preset and enhanced full dynamic lighting
Resident Evil 5 Benchmark
The Elder Scrolls V: Skyrim Intro (on battery)
Skyrim, 720p, Medium preset but turned off AA, limit FPS to 30, CPU @ 1.8GHz 0.9875V
Average Power Drain: -34.5W
With no FPS CAP Average Power Drain: -37.6W
Street Fighter IV Benchmark
Trackmania Nations run using Normal detail
Conclusions:
The speed of the RAM seems to have more of an effect on performance than its CAS latency. Running single channel (i.e. one stick of RAM) has a significant detrimental effect on performance, so it's best to stick with two sticks of RAM.
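The single-channel penalty makes sense from theoretical bandwidth alone. A rough sketch (real-world throughput is lower, but the ratios hold) for a 64-bit DDR3 channel:

```python
# Theoretical peak bandwidth of a 64-bit (8-byte) DDR3 channel:
# MT/s * 8 bytes per transfer, multiplied by the channel count.
def bandwidth_gb_s(mts, channels=2):
    return mts * 8 * channels / 1000  # decimal GB/s

print(bandwidth_gb_s(1333, channels=1))  # ~10.7 GB/s single channel
print(bandwidth_gb_s(1333, channels=2))  # ~21.3 GB/s dual channel
print(bandwidth_gb_s(1600, channels=2))  # ~25.6 GB/s dual channel
```

Dropping to one stick halves the peak bandwidth the IGP shares with the CPU, which is a much bigger swing than any of the timing changes tested above.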
To summarize the RAM performance, below is a chart showing % performance off the baseline RAM of 1333MHz CAS 9.
- Average is the average of the % off baseline.
- Min represents the benchmark with the lowest % off baseline
- Max represents the benchmark with the highest % off baseline
Effectively it's showing a range, so I should probably chart it differently; I'll work on that. For example, the first configuration, DDR3 2x2GB 1066MHz CAS 7, averaged -14.04%, spanning from -18.65% to -7.30%; in other words, -14.04% with a range of +6.74% / -4.61%.
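The summary figures come from a simple per-benchmark calculation. The fps values below are hypothetical, purely to illustrate how the average, min, and max percentages are derived from the chart data:

```python
# How the % off baseline summary is computed (hypothetical fps values
# for illustration; the real inputs are the benchmark results above).
def pct_off_baseline(results, baseline):
    """Per-benchmark % difference vs the 1333MHz CAS 9 baseline."""
    return [(r - b) / b * 100 for r, b in zip(results, baseline)]

baseline_fps = [40.0, 30.0, 25.0]   # 2x4GB 1333 CAS 9 dual channel
test_fps     = [34.0, 24.4, 23.2]   # e.g. a slower kit's results

pcts = pct_off_baseline(test_fps, baseline_fps)
avg = sum(pcts) / len(pcts)
print(f"avg {avg:+.2f}%, range {min(pcts):+.2f}% to {max(pcts):+.2f}%")
```

The min and max are simply the worst- and best-case benchmarks relative to the baseline, so quoting average plus a spread around it is equivalent information.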
Bottom line: performance doesn't depend much on CAS latency, but more so on speed and single vs. dual channel mode.
- Difference between 1066MHz and 1333MHz RAM was about 15-20%
- Difference between 1333MHz and 1600MHz was about 5-10%
- CAS 9 vs CAS 11 at 1600MHz differed by less than 5%, on average almost even.
- Asymmetrical dual channel (2GB + 4GB RAM) resulted in a small performance hit, average of about 3% with a range of 1-5%.
In any case, you can see the 6620G offers decent performance for an IGP. Just make sure to run 1333MHz or faster RAM in dual channel.
The 6620G is a fully capable gaming GPU at 720p, even with the latest games like Skyrim and BF3, for which I will work out a way to benchmark later.
-
System settings used for the game benchmarks:
3DMark06 - Stock
3DMark Vantage - Stock Performance settings
3DMark11 - Stock P (performance) settings
Crysis - All medium 1280x720
DiRT 2 Demo Benchmark
HAWX 2 Benchmark
Just Cause 2 Benchmarks
STALKER Pripyat Benchmark
Resident Evil 5 Benchmark
Street Fighter IV Benchmark
Trackmania Nations Benchmark
-
Great job!!
-
davidricardo86 Notebook Deity
Great job indeed!
Would there have been a more noticeable difference had you used 8GB of 1.5v CAS 9 1600MHz RAM instead of the "2x4GB (8GB total) Samsung 1.35v 1600MHz @ CAS 11, dual channel (blue)"? -
http://forum.notebookreview.com/har...g-someone-swap-ddr3-1600-me-benchmarking.html
I also ran the same benchmarks on my AMD Zacate E-350 netbook with the Radeon 6310 GPU, using both a single stick of RAM and two sticks. The E-350 only has a single-channel RAM controller and supports a maximum of 1066MHz RAM, but I thought I'd test it anyhow; I should be posting those results soon. It is also quite impressive. If it allowed dual channel, though, you would probably see an additional 20-30% gain in performance there too. -
Very interesting results indeed... so in the end 1333MHz is just the sweet spot for our IGPs. Thanks for your time doing this.
-
This is rockin'. I just upgraded from 4 to 8GB of RAM today. How much would you suggest I reserve for the hardware? Right now I've got it at about 2.5GB.
-
Meaker@Sager Company Representative
AMD did comment that Llano is sensitive to latencies too; I wonder if a lower-latency kit would help a bit... Also, what happens with overclocking?
But cheers for your effort -
-
UPDATED FIRST AND SECOND POSTS WITH NEW AND ACCURATE INFORMATION!
Thanks! -
-
Added average % of game performance relative to 2x4GB DDR3 CAS 9 RAM for reference.
-
NICE!! This widens the gap from Zacate a bit further. So it will run Crysis... sexily. Hell, this would do justice to most games for now, and even a little into the future. So if Trinity's graphics are 50% faster, that puts it literally within the margin of error of a 6750M? Unless AMD builds another radical mid-range dGPU, Trinity will no longer need a dGPU unless you pair it with something like a 6850M or faster.
-
Would be interesting to see the same results for these benchmarks with the dGPU. If you've kicked up the iGPU to within the 1.75 ratio, then CrossFire would be of some use...provided it's working properly.
AMD should pay you. -
-
Updated using Asymmetrical dual channel (2GB + 4GB SO-DIMM RAM for 6GB total).
-
(3) 1x4GB Hynix + 2x4GB G.Skill 1333 @ CAS 9, 1.5v asymmetrical dual channel ( orange)
Typo? -
-
Would love to see these same tests using the dGPU and then CrossFire. (Yeah, not asking you to do it... just would like to see the results.) Since the iGPU is so RAM-dependent, with the 1600 CAS 9 you might make CrossFire viable. (Provided that AMD sorts out any issues with the drivers... and given they are targeting budget solutions for 70% of the buying masses, they should.)
Your testing methodology is the same as [H]ardOCP's... trustworthy real-world benchmarks, something you don't see with Tom's Hardware or AnandTech. As a matter of fact, you've disproven AnandTech, who have been accused of being Intel fanboys. -
My problem with CrossFire is it's a mixed bag. Stuttering and such, though it usually doesn't affect the outcome of the benchmark too much. I'll give it a run since I'm in the mood. Would love to get hold of the new 7690M, though. Too bad it's soldered to the mobo, otherwise I might try to pick one up on eBay or something. And too bad I made so many modifications to my machine, otherwise I would have finagled a new motherboard out of HP. -
Heck, you're halfway there! Great job!
Me... I am going to wait for GCN (Kaveri), supposedly due next year after Trinity, and then go top-of-the-line. -
I should stick with my machine, actually, but I'm a sucker for new (inexpensive) tech, even if I don't really need it.
-
Hey, I found this:
https://www.sapphireselectclub.com/ssc/TriXX/TriXX.aspx
Maybe it's the answer to OC/OV'ing the discrete graphics card. Check it out and let me know.
EDIT: Looks like it's the damn BIOS all the way around that is killing the OV'ing. Shame. -
Did you check whether it is more stable than MSI Afterburner, say restoring clocks after a restart? I may just use TriXX if it can autostart/auto-overclock properly. -
If we could get this kind of BIOS, then existing packages like Trixx and Afterburner would allow us the more complex software controls so that there would be far less consuming trial-and-error approaches.
Given that this is new technology, it makes no sense to assume that AMD is "on it." On the contrary, they're listening to and are more dependent on enthusiasts and feedback more than ever, but as long as we don't get the kind of control we need, we can't help them to the best of our abilities. -
I can't get TRIXX to work at all. It balks at me and says it can't OC.
-
I was talking about the dGPU.
Seems to me it needs to prime the dGPU before applying new clocks, just like MSI Afterburner, so it's no better than MSI except you don't need to add the chain of text lol.
Why can't we get some software that works automatically? -
-
-
I'm definitely impressed with these scores. With DDR3-1600, you're coming within a few hundred 3dmark score of my 5730.
What kind of battery life do you get gaming on the iGPU only? Could you possibly benchmark BF3 on low settings at 720p? -
I haven't tested the battery gaming with iGPU to complete battery drain, but I will see if I can find a way to record actual average power drain. With the 9-cell I am certain I can get close to 3 hours on battery using the iGPU only. -
I just did a round of 64-player Caspian Border; results:
Average Power Drain (HWInfo64): -57.0W
BF3 FPS:
Min: 19
Max: 53
Avg: 32.5
So with the 96WHr battery I have I could get probably 1 hr 30 mins or so of play time. Maybe an hour with a 6-cell.
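That runtime estimate is just pack capacity divided by average drain. A quick sketch (note that real packs deliver somewhat less than their rated capacity, especially as they age, so actual play time will be a bit shorter than these numbers):

```python
# Estimate gaming runtime in minutes from pack capacity (Wh) and the
# measured average power drain (W).
def runtime_minutes(capacity_wh, drain_w):
    return capacity_wh / drain_w * 60

print(f"{runtime_minutes(96, 57.0):.0f} min")  # 96WHr 9-cell at 57.0W drain
print(f"{runtime_minutes(96, 50.1):.0f} min")  # same pack at the 1.8GHz clock
print(f"{runtime_minutes(96, 38.1):.0f} min")  # with the 30fps frame limiter
```

The ideal figure for the 57W run is around 101 minutes, so the "1 hr 30 mins or so" estimate above is sensibly conservative.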
Configuration:
- LCD backlight @ medium brightness
- Power Option @ Max Performance (otherwise will reduce game performance)
- CPU @ 2.4GHz 1.15V
The only thing I can see that might help a bit is LCD brightness, but that may only save half a watt or less; reducing CPU speed a bit would again be a marginal improvement. Even if you can save 1W, that's only 1W out of 57W, about 1.8%, or a few minutes. -
Out of sheer curiosity, was that 19 a fluke that happened for a split second or two, or were there times when it was actually unplayable?
Can't wait for the first Trinity benches to start appearing. -
At 19-20FPS it's actually playable. And those low points were typically right after I died, the fps takes a slight dip. Although I find anything above 22-23FPS very playable in this game, seems smooth.
I just did another 64 player Caspian Border again but with CPU @ 1.8GHz 1.05V
Average Power Drain: -50.1W (~7W less than at 2.4GHz)
BF3 FPS:
Min: 18
Max: 48
Avg: 33
Again, fps was typically lowest when I died. It did dip to 20fps now and again, but only very briefly.
Surprisingly, the FPS isn't much different at 1.8GHz than at 2.4GHz, and it consumes 7W less. That's pretty significant.
I am looking forward to Trinity laptops as well. I am hoping to get hold of a sample somehow, but I doubt it. My biggest issue with BF3 at the moment is playing at 720p on a 1080p screen; it just looks too... blurry. I can turn on Post AA at Medium without any effect on framerate, but MSAA drops FPS too much. If you have a 1366x768 screen you'd probably be OK running at native resolution and it would look crisp. If gaming is a primary or significant part of your laptop's use and you plan on gaming with the iGPU, I'd recommend the 1366x768 screen. Even with a dedicated GPU your FPS will be superb. -
Wow, I'm absolutely surprised how little CPU power that game takes to run, compared to BC2. Seems more and more like a small Trinity laptop will fit my usage perfectly.
-
-
BC2 has much lower graphics requirements, though.
-
Not too surprised by those battery numbers. The games I play on iGPU generally use one or two CPU cores and don't choke at 800MHz. (Including practically any Bethesda title before Skyrim, and possibly Skyrim with 1.4 patch...need to buy that game)
Frozen Synapse uses surprisingly more power than I expected, seems to be an OpenGL title too. But it's damn fun to be paranoid.
For reference HT, my battery life on stock 6-cell using dGPU in Crysis was between 1-2 hours. For Battlefield 3 to run less than that on iGPU....wow. And HT, I think your framerate looks more stable at the lower clock. Possibly there is less conflict for resources on the shared bus? -
Well, the problem is that there's no repeatability in the tests: while on the same map, they were two completely different sets of circumstances. Unless I do at least a dozen or so runs at both speeds, there's no confidence in anything other than an FPS range and average. Even though FPS was about the same at the slower clock, it just "felt" a little less smooth, but perhaps that's just placebo. A frame limiter may not help much, but I may give it a try; limit it to 30fps and see if that helps too.
I can play Skyrim on iGPU at 720p with ~ 30fps. I will do a run and test with that, at least I can get something somewhat repeatable by running the intro scene. -
I just used DXtory to limit my FPS to 30 fps, and realized I wasn't using my optimized "Battery" setting last time. 1800MHz @ 0.9875V compared with 1.05V last time. Not a huge difference, but still a factor. CPU temp never exceeded 52C and that was NOT on a cooler.
This time results were much better:
Average Power Drain: -38.1W
BF3 FPS:
Min: 25
Max: 31
Avg: 29.9
-
Sweet! Looking very nice.
-
I'll give Skyrim a spin later...
-
Skyrim, 720p, Medium preset but turned off AA, limit FPS to 30, CPU @ 1.8GHz 0.9875V
Average Power Drain: -34.5W
With no FPS CAP Average Power Drain: -37.6W
-
You could set a watch to those framerates, very consistent. And commensurate undervolt made a nice difference. That brings you closer to three hours battery in a game that is only a few months out the door. Medium presets and 720p in Skyrim on the iGPU? Stellar!! And I saw you achieved similar fps with High and 1080p on 6750M...
Only a few PC games released in the next two years will potentially be unplayable on my laptop, that makes me happy at such a nice price. -
Sorry, but I didn't see this anywhere: is there a way to overclock the RAM in the BIOS? My friend has a DV6 and I'm sure he'd like the small bump in frames.
-
Nope, the only way to get faster RAM is to use faster RAM. "M" models (i.e. A8-3500M) support a max of 1333MHz RAM, and MX models a max of 1600MHz RAM.
Plus, the A6 uses the 6520G, which is slightly slower (maybe 10%) than the 6620G because of a lower GPU clock and fewer cores. -
Yeah, one friend has the A8-3500M + 6750M, and the one I was talking about has just the A8-3510MX (bought in store only). The processor wouldn't automatically run the RAM at 1600MHz, right?
-
-
-
AMD Llano 6620G Benchmarked with various RAM configurations
Discussion in 'Gaming (Software and Graphics Cards)' started by HTWingNut, Jan 26, 2012.