AMD offers CrossfireX technology in their current generation Fusion laptops with the "Llano" APU. Basically, it allows the GPU integrated with the CPU to work in conjunction with the dedicated card, in this case the 6750m, which should result in improved performance. Also note that 7690m performance should be very similar. CrossfireX, or "Dual Graphics", is only applicable to AMD laptops designated as capable of this technology.
Also, I was curious after reading claims that RAM speed can affect gaming performance even if you're only using a dedicated card, so I also tested with two different RAM configurations: a single stick of 4GB DDR3 1600, and two sticks of 2GB (4GB total) DDR3 1066.
The test machine was an HP DV6z with an A8-3510MX APU at 2.4GHz and the 6620G integrated GPU at a core speed of 444MHz. RAM is 2x4GB (8GB total) G.Skill DDR3 1600MHz CAS9 unless otherwise specified for test comparisons. Clock speeds refer to the dedicated GPU only, whether alone or in CrossfireX mode.
The purpose of these benchmarks was to validate two things:
- the performance effect of single-channel operation and RAM speed when using the dedicated GPU
- CrossfireX vs. dedicated card overclocking
Configurations were as follows (speeds are core/vRAM):
(1) 6750m @ 600/800 (these are stock speeds) - green
(2) 6750m @ 600/800 with CrossfireX - purple
(3) 6750m @ 780/880 - teal
(4) 6750m @ 780/880 with CrossfireX - orange
(5) 6750m @ 600/800 using 2x2GB DDR3 1066MHz RAM - red
(6) 6750m @ 600/800 using 1x4GB DDR3 1600MHz RAM - blue
Benchmark results are as follows:
3DMark06
3DMark Vantage
3DMark11
Crysis
Settings 1080p Medium
Crysis 2
Crysis 2 Benchmark Performance Setting
DiRT 2 Demo in-game Benchmark
Settings 1080p High preset
Note: I did not test the 1066MHz and 1x4GB RAM configurations with Crysis 2 (too much time required, and enough other data to draw conclusions).
HAWX 2 Benchmark
Settings 1080p High settings, 4xAA, Terrain Tessellation OFF
Just Cause 2
Settings 1080p Aniso 16x
High: Textures, Water, Objects
Off: AA, v-sync, high res shadows, SSAO, Point Light Specular
On: Motion Blur, Full Screen, Decals, Soft Particles
Metro 2033
Settings 1080p Medium quality preset, AAA, Aniso 16x
Resident Evil 5
Settings 1080p, default benchmark settings
STALKER Pripyat
Settings 1080p Medium preset
DX9 Full Dynamic Lighting
DX10 Enhanced Full Dynamic Lighting
Street Fighter IV
Settings 1080p all High or ON
Trackmania United
Settings 720p and 1080p, Performance "Nicer"
Conclusion:
With regards to CrossfireX vs. just a dedicated overclock, the benchmarks show a decently overclocked dedicated GPU can keep up with a CrossfireX system running at stock speeds. There is a definite improvement in performance in many games with an overclocked card AND CrossfireX enabled. The one caveat, however, is that you're likely to see "micro stutters", a constant jump in FPS. Even though the counter may read 50FPS, the game is jerky and plays like it's running at 12FPS. This varies from game to game, but here's my assessment based on the games/benchmarks:
3DMark Vantage: slight stutter in both stock and overclocked CrossfireX
3DMark11: very apparent stutter
Just Cause 2: apparent, more so with a faster clocked dedicated GPU
STALKER Pripyat: slightly noticeable
Metro 2033: slight at stock clocks, horrible with overclock
DiRT 2: apparent
Crysis 2: apparent and worse with overclock
Other games not benched but tested with and without CrossfireX:
Deus Ex Human Revolution: slightly apparent
Skyrim: apparent
BF3: apparent
Otherwise the remaining games/benches played smoothly with no noticeable stutter or other ill effects with CrossfireX enabled. But my assessment? It's better to use an overclocked dedicated card than CrossfireX. It's a great idea, but AMD just hasn't figured out the best way to implement it for smooth performance in most games.
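For anyone who wants to put a number on the micro-stutter instead of eyeballing it: average FPS hides the problem, but per-frame times expose it. Here's a minimal sketch, assuming a hypothetical frametimes.txt with one frame duration in milliseconds per line (a per-frame log from a tool like FRAPS could be adapted to this), comparing average FPS against the slowest 1% of frames:

```python
# Minimal micro-stutter check: average FPS vs. worst-case frame times.
# Assumes "frametimes.txt" holds one frame duration (ms) per line.

def load_frame_times(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def stutter_report(times_ms):
    avg = sum(times_ms) / len(times_ms)
    ordered = sorted(times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]  # 99th percentile frame time
    print(f"Average FPS:         {1000 / avg:.1f}")
    print(f"Average frame time:  {avg:.2f} ms")
    print(f"99th pct frame time: {p99:.2f} ms (feels like {1000 / p99:.1f} FPS)")

if __name__ == "__main__":
    stutter_report(load_frame_times("frametimes.txt"))
```

A big gap between the average and the 99th percentile figure is exactly the "50FPS that plays like 12FPS" effect described above.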
And regarding RAM speed and dual vs. single channel: it had little to no effect on game performance using the dedicated card. The one exception was Just Cause 2, which seemed a bit sensitive to single channel RAM, taking about a 5-10% performance hit. Considering how cheap RAM is, though, there's no reason to run a single stick; even 2x2GB 1066 is plenty fast for most games if you're using a dedicated GPU and not one integrated with the CPU.
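For what it's worth, the theoretical numbers back this up. Each 64-bit DDR3 channel moves 8 bytes per transfer, so peak bandwidth is just the transfer rate times 8 times the channel count. A quick back-of-envelope sketch:

```python
# Peak theoretical DDR3 bandwidth: MT/s x 8 bytes x number of 64-bit channels.
def peak_bw_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000  # GB/s

print(f"1x DDR3-1600: {peak_bw_gbs(1600, 1):.1f} GB/s")  # 12.8 GB/s
print(f"2x DDR3-1066: {peak_bw_gbs(1066, 2):.1f} GB/s")  # 17.1 GB/s
print(f"2x DDR3-1600: {peak_bw_gbs(1600, 2):.1f} GB/s")  # 25.6 GB/s
```

Note the dual-channel 1066 kit actually has more total bandwidth than the single stick of 1600, and either way the dedicated GPU is mostly working out of its own GDDR5, which is why these results barely moved.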
-
GAME PERFORMANCE:
The Elder Scrolls V: Skyrim
Ran the opening scene (in the wagon) through to the point where you can actually move freely (after the dragon attack).
Run with CPU @ 2.4GHz, GPU @ 750MHz core / 850MHz vRAM.
Skyrim using "High" preset at 1080p with AA disabled
-
Crossfire has potential, it just isn't polished yet. If the iGPU had a higher clock speed, it probably wouldn't choke the 6750m.
edit: Oops, sorry, I meant to reply in the DV6z thread lol -
Yeah, AMD's Xfire, CrossfireX, Crossfire, Dual Graphics, whatever you want to call it, is more or less intended to work with similar performing GPUs. If AMD can figure out a way to divvy up the workload so that each card's share is in line with its performance, it would probably work a lot better.
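As a toy illustration of that idea (emphatically not how AMD's driver actually works), you could weight each GPU by shaders times clock and hand out work in that ratio; the shader counts below are the published specs (400 SPs on the 6620G, 480 on the 6750M):

```python
# Toy asymmetric split: weight each GPU by shaders x clock instead of 50/50.
gpus = {"6620G iGPU": (400, 444), "6750M dGPU": (480, 600)}  # (shaders, MHz)

total = sum(s * c for s, c in gpus.values())
for name, (shaders, clock) in gpus.items():
    print(f"{name}: {shaders * clock / total:.0%} of the work")
# -> roughly 38% / 62%, vs. the 50/50 that simple frame alternation implies
```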
-
I am certain that AMD would be very interested in this analysis, since they are targeting notebook performance and gameplay experience.
You might want to contact Kyle Bennett @ HardOCP as well. Your results might make Hard want to take a look at this, since they had a couple of very favorable reports regarding Fusion but didn't really take it further, and they have more contacts and resources available to them.
What I am saying is that you're definitely pushing the envelope here, and that might be something that neither AMD nor HardOCP has even considered. Between the two, Hard is definitely an OC enthusiast and has pretty much a direct line to AMD. Don't assume anyone out there is as far along as you, or farther ahead, on this.
I will write Kyle and provide him a link to this thread and the optimization guide as well, but you should write too. As far as I can tell, you're breaking new ground here.
So far, what would you say is the most stable and useful BIOS version? Given that the iGPU clocks with the CPU and memory, there's really no way to adjust it except with K10 and the like. It sounds like the inability to adjust voltage for the dGPU is a bottleneck, but one that could be overcome with a proper CrossFire/Dual Graphics driver, allowing a K10-modded clock to bring the iGPU up to speed as long as fast memory is involved.
Frankly, I'd be right up there with you on the settings and testing, but this laptop is my lifeline to the world right now (in more ways than you can imagine), and I can't afford to take any chances with it. I DID, however, build the separate travel HTPC, so that eases the load, but I've got to have something available for business and personal use at all times.
Hey... thanks for the heads-up on the Sager power supply. I needed exactly that for another non-computer-related device I've been working on for years! -
So far the ModdedBIOS site "Camilol" BIOS has been helpful, as it allows us to set fixed-mode graphics. Musho's BIOS (fixed OC in BIOS) also seems to add stability to the "Camilol" BIOS: flash the Camilol BIOS first, set fixed graphics mode, then flash one of Musho's BIOS on top of it at whatever GPU speed you'd like. All of these are based on the F.21 BIOS from HP.
This negates the need for software overclocking; however, with the latest 12.1 drivers, software overclocking is enabled again.
I sent an email to Kyle Bennett as you suggested. We'll see if we get any reply. I don't expect them to bat an eye at it, but whatever, if it'll help the community.
edit: email couldn't get delivered... -
I wrote Kyle this morning, as promised.
Honestly, with the framerates you've been able to achieve, CrossFire or not, this has been some really magnificent stuff that should map right over to Trinity and Kaveri.
AMD, for as long as I've been building computers, really has done a magnificent job with their hardware, but their software, drivers, and OS patches have always been lackluster. I don't think they're going to shine as bright as they should until they actually start concentrating on these areas. Now that the parallel hardware technology is here, they really should, and hopefully will soon.
The hybrid nature of laptops now (combining an APU and a dGPU is brand new, as opposed to a CPU and a discrete card... the way it was always done before, and primarily only in desktops) opens up a whole new level of performance, as long as AMD gets Asymmetrical CrossFire / Dual Graphics done right. For that to happen, they might have to rewrite the whole scheme and actually measure and time these suckers correctly, instead of leaving it to "catch as catch can" built on equal GPUs alternating framerates.
Even in desktops, with the right APU, motherboard, and hardware, it might soon eliminate the need for two cards for high performance. (Though I do believe nothing will equal two high-end cards in CrossFire or SLI... but you don't need that kind of performance for a satisfying experience. At some point it all just becomes numbers if you can't visually tell the difference, and chasing the fastest card and the fastest CPU eventually dead-ends, subject to the law of diminishing returns.)
I HOPE SOMEONE out there with good hardware knowledge and a vivid imagination realizes that the performance they seek isn't "potential". It's happening now... if they could just get everything it needs beyond the hardware alone right. -
As far as BIOS is concerned...yeah, it is a problem, though admittedly, you're doing a GREAT JOB for what you have to work with.
What you need is a UEFI BIOS that allows control over all the basic functions: voltage, speed, options, etc.
Beyond that, then a good OC'ing software that works with the UEFI can do all the automated heavy lifting. ASUS' OC'ing software looks pretty good and works that way with my little A4 HTPC, but I don't get much out of it because, hey, it's an A4 with no external graphics, and it's got 1066 memory in it, so there's not much headroom there either. Don't need it...but...
If I wasn't on the road all the time, I'd build an A8 system with all top-of-the line equipment just to see what it could really do with all the UEFI tweaks, optimizations, and proper cooling. I'm kind of shocked and disappointed that nobody's done it yet (that I could find). -
Thanks for all of these great numbers. I'll have to reassess my own usage patterns. If the only game I play that benefits from HCF is Shogun 2, I really should just disable Crossfire and bump the 6750M a little bit...
-
You might want to try something from here to record:
http://download.cnet.com/1770-13633_4-0.html?query=recording+software&searchtype=downloads&rpp=10&filter=os=133|licenseName=Free|platform=Windows|&filterName=os=Windows 7|licenseName=Free|platform=Windows|&tag=contentNav;pe-searchFacetsTile;navForm -
Wait, does this mean that with the OC the 6750m is faster than the GTX 260m? That's an amazing feat for a mid-range card, especially taking into consideration its power consumption.
-
To my knowledge it was at least in the same bracket as a 555M or possibly a 460M... a comparison against the 560M is possible. We start getting into specific use cases and the differing versions of DirectX. For example, the 260M can't run DirectX 11 at all, correct? So a few games with a DX11 mode *may* look better and/or run faster. For certain, Shogun 2, which I've been playing the hell out of, is unquestionably superior in every way in DX11 mode. A few other games have a minimum framerate of 30+ in DX11, compared to 18+ in DX9...
The new laptops with the 7690M have a higher base core clock (725 against 600) and base memory clock (900 against 800), and should have similar power demands. Overclocking from that point (if possible) should yield even better results. I expect the few 6750Ms that reached 800 core / 1000 mem will have far more neighbors boosted to that speed from the 7690M base. That super memory speed will be a nice feature in some applications...
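Just to quantify that head start, the quick arithmetic:

```python
# Stock-clock advantage of the 7690M over the 6750M (MHz figures from above).
core_gain = (725 - 600) / 600
mem_gain = (900 - 800) / 800
print(f"Core: +{core_gain:.1%}, Memory: +{mem_gain:.1%}")  # +20.8% / +12.5%
```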
A newcomer could get a base A6 (overclockable 50-60%) and the 7690M for less than $600USD with a typical small coupon. Such a damn good deal for gaming on the cheap. -
-
Actually... how's the battery on that G51? Good or not very good?
-
I have the stock 6-cell. I'm getting 4-5 hours on the web, 5 hours playing most games on the 6620G iGPU in Power Saver/Quietest, and 6-7 hours in light typing or document review/meetings from hell. Using the 6750M dGPU I mostly get between 1.5 and 3 hours of gaming.
I should spring for that 9-cell. From other guys who have it, it sounds like nearly double the endurance. An hour of battery for every ten dollars... amazing and really tempting. -
-
In HWinfo64 I am averaging between -9.3xx and -10.1xx W with a few spikes to -12 or deeper. My HDD has not stopped spinning and my brightness is at II. With lower power RAM, an efficient solid state disk, and lowest brightness, I may be able to extract something close to -8 or better. Also, my 6620G clock is indeed 444MHz.
I suspect these numbers are slightly inaccurate too. My estimated max battery is about 53Wh (3.7% wear), which would put me south of six hours, but I have seen more than that on a regular basis (over 7 hours if the screen shuts off periodically). This is not an exact science.
The 9-cell is what, 93Wh? But it distributes the load across more cells, so the battery will not heat up as much. It should probably last about twice as long as the 55Wh six-cell under load, going by prior experience with a ThinkPad that came with two different batteries.
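A quick sanity check on those figures, runtime being just capacity over drain (the ~9.5W and 93Wh numbers are the ones quoted above):

```python
# Runtime estimate: usable battery capacity (Wh) divided by average drain (W).
def runtime_h(capacity_wh, drain_w):
    return capacity_wh / drain_w

print(f"6-cell, 53Wh @ ~9.5W: {runtime_h(53, 9.5):.1f} h")  # ~5.6 h
print(f"9-cell, 93Wh @ ~9.5W: {runtime_h(93, 9.5):.1f} h")  # ~9.8 h
```

So roughly 1.75x on capacity alone, a bit short of double before accounting for the lighter per-cell load. -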
I'm getting about 13-14W with the CPU @ 800MHz and the screen at about 1/3 brightness. I think it's more than it was originally, but maybe my BatteryBar is actually tuned appropriately now.
-
Quite a difference, but at least partly just the screen. I think my wifi was off when I checked too.
-
Those benchmark scores seem off to me. I tried running 3DMark11 myself a while ago with an AMD A8 3500M @ 1.5GHz and an AMD Radeon HD 6740G2 (6620G + 6650M) and got a score of P1560 3DMarks with Crossfire enabled, even on outdated Summer 2011 (Acer) graphics drivers. I'd expect the 6750M to perform slightly better than a 6650M, unless the weaker APU really slows it down that much (1588 pts vs. 1560 pts). I haven't even overclocked my system; it's all vanilla.
I recently updated my drivers to the latest Mobility Catalyst 12.1 release and did a rerun with significantly better fps, but unfortunately it doesn't recognize my graphics card anymore, so it doesn't give me a score. I suspect it would be even higher, as my fps nearly doubled in most games (no kidding, I was surprised myself how big an impact new official AMD drivers had on games like Skyrim or Super Monday Night Combat). I don't seem to notice much microstuttering either; SMNC has no stuttering whatsoever, for instance, and Skyrim only microstutters in very graphics-heavy scenes.
EDIT: I also have 8GB DDR3 1333MHz RAM (2x4GB) if that makes a difference. -
The issue is that 3DMark11 and other Futuremark benchmarks aren't always indicative of true performance. I had a K53TA for a little while and its Futuremark scores were similar to the HP DV6z's, but in actual game performance the 6750m was superior because of its GDDR5 vRAM vs. the GDDR3 on the 6650m. The A8-3500M and A8-3510MX have the same integrated GPU too (6620G).
-
As far as I see it, SLI works like RAID 0, in that each GPU only takes a part of the load. The way I see CrossfireX working is that the second GPU adds its pixel and texture processing power to the first, rather than splitting frames the way SLI does.
Awesome review. -
Well, SLI and CrossfireX both share the load. It's designed more or less to work between two similar performing cards. It seems they haven't quite got all the bugs worked out for when there's a significant differential in performance.
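Both actually default to alternate-frame rendering (AFR): one GPU draws the even frames, the other the odd ones. A little toy model shows why that falls apart with mismatched cards; the 20ms/35ms render times below are made-up numbers purely for illustration:

```python
# Toy AFR model with mismatched GPUs: even frames on the fast dGPU,
# odd frames on the slow iGPU. Render times are assumed, not measured.
T_FAST, T_SLOW = 20.0, 35.0  # ms per frame

present = []             # presentation time of each frame
prev = 0.0               # last presented frame (frames must display in order)
done = {0: 0.0, 1: 0.0}  # running completion time per GPU

for i in range(12):
    gpu = i % 2
    done[gpu] += T_FAST if gpu == 0 else T_SLOW  # GPU finishes its next frame
    prev = max(done[gpu], prev)                  # can't present out of order
    present.append(prev)

gaps = [b - a for a, b in zip(present, present[1:])]
print("frame-to-frame gaps (ms):", [round(g, 1) for g in gaps])
print(f"average FPS: {1000 * len(gaps) / (present[-1] - present[0]):.0f}")
```

The average FPS comes out looking healthy (~58 here), but the gaps alternate between near-zero and the slow GPU's full render time, which is exactly the micro-stutter pattern noted in the benchmarks above.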
-
Just added Skyrim game performance to second post here: http://forum.notebookreview.com/gam...d-6750m-benchmarking-results.html#post8300413
-
Thanks for the updated numbers. I think screen brightness may have made an extra watt for you; not sure about wifi drain. Inserting my wireless mouse nub made almost a 0.9W difference (!!), which was irritating. I should document my settings somewhere.
-
No, I didn't touch the screen brightness. I've found that anywhere between lowest and 1/2 brightness makes very little difference in power drain, fractions of a watt. But the top few notches, especially the top notch, consume something like 2.5W! I use a wireless Logitech Darkfield mouse; I'm sure that has some effect too. I also have Bluetooth disabled.
-
Can I ask two stupid questions?
What does it all mean for an average user like me? Does this only work on new Llano laptops? I know how to overclock my GPU but it sometimes goes unstable; is there a way around that with Crossfire or what?
In other words, can I somehow increase FPS in games with the 6750m on my MBP? -
Sorry, I should have explained. You need a CrossfireX capable laptop in order to run dual GPUs. As of now, for AMD, only the "Fusion Llano" laptops are capable. Other machines are not set up for such operation; I believe it requires hardware as well as software to make it work.
Regarding your MBP, no, you cannot; that has an Intel IGP and an AMD GPU. There's no such solution for that, and I doubt there ever will be, considering they are direct competitors. But as you can see, overclocking the 6750m gives similar performance to a stock clocked 6750m in Crossfire with the integrated GPU. -
I see, thank you.
Did you use GPU-Z to overclock?
I tried MSI Afterburner at some point, but the notebook shut down within 30 minutes of playing Witcher 2 every time, even with a pretty average overclock >< -
GPU-Z is a reporting tool only; I use MSI Afterburner to overclock. You just have to monitor temperatures. I use HWMonitor for temperatures, or even MSI Afterburner's own built-in temperature monitoring tools that display them in-game. They show GPU temperature only, not CPU, though.
-
That just means your MacBook is probably overheating. The i7 may be running too hot, or you need a cooling fan to blow across the bottom.
Pretty sure the MBP overclock threads mention thermal monitoring software you can use, but if overclocking in Windows on your Mac, just try HWinfo32 or HWinfo64...
Well, memory speeds are more of a crapshoot than core clock. Going from 600 core to 725 is pretty reliable; overclocking GPU memory could be anywhere from a 5% safe increase to 25%.
So you need to watch your temps, and when in doubt, try lowering the memory clock again before lowering the core clock. Unless your motherboard reports thermal data for the memory too, it could be that one chip is holding the rest back even while GPU core temp looks fine..... -
I see, thanks.
Funny thing is that overclocking gave me like +5 fps in Witcher 2, while turbo boosting gave like +10-15. Anyhow, Witcher 2 was awfully optimized for AMD back then. I'll try overclocking in other games today. -
Newbie question: why do dxdiag & PerformanceTest recognize my 6620G integrated GPU and not the 7690M?
The 7690M shows up as a selectable card in GPU-Z. This just confuses me. I just got my DV6zqe and want to make sure everything is kosher. -
You have to manually select which GPU to use for each app. It defaults to the 6620G, except that some games are preset by the Catalyst Control Center to run in "High performance" mode, which uses the dedicated GPU (6750m/7690m). Right-click the desktop and choose the selectable graphics option.
-
Pretty much everything from the links in HTWingnut's signature should apply to your system... I assume you have a new HP DV6 6xxx laptop? -
-
Great analysis, +rep
-
I want to buy the DV6-6135dx laptop, but I don't know how well it runs Battlefield 3, which is what I play, at medium-high settings. The graphics are the 6620G or 6520G + 6750m. Without overclocking the processor or video card, can it play BF3 at medium-high around 30 FPS?
And if I do overclock the dedicated video card, what kind of performance could be achieved?