When all things are equal, as proven here by you crazy modders, AMD pulls ahead of Intel if you're looking for game/graphics performance vs cost. (Excluding coupons.) AMD has clearly shown that Fusion-type integrated graphics is the wave of the future. Even if it's currently unworkable/impractical, it brought CrossFireX to notebooks.
However, shortcomings on both ends are going to make me wait for Kaveri (after Trinity), partly to get the latest and greatest combo of AMD's cpu and gpu, but mainly due to software support.
The common thread in both AMD's and Intel's problems seems to be drivers; AMD is getting hammered all the way around for this, including with desktop cards. AMD has been known to lag in this area since PCI-E replaced AGP.
Also...CPU core performance. Bulldozer's hardware capabilities exceed its delivered performance, and this has been shown, at least in part, to be due to software/driver integration...though "the patch" so far has only given modest gains, not enough to fill the gap...and the patch doesn't apply to Fusion because Fusion is based on the old STARS cores.
AMD has certainly been stretched thin these last couple of years, obviously because it has been fighting a war on four fronts (CPU, GPU, APU, drivers), but the hardware success and performance (despite drivers) of Fusion is certainly a hopeful plus.
Maybe when AMD-based notebooks offer a combination of a solid UEFI BIOS, stable OC'ing software, workable Hybrid CrossFireX implementations, better-matched iGPU and dGPU combinations, and driver stability/performance, it'll be time to buy again. Given the near-term timeline of Trinity, though, vs the problems we are still seeing almost a year after Fusion, plus a couple of months of incomplete beta-only driver packages that obviously need installation/deinstallation simplification, I can't see AMD resolving all this by the time Trinity has left the building, unless it's late in the game. That makes it worthwhile to just wait for Kaveri, which will offer GCN graphics and better CPU cores.
Frankly, as far as notebook makers go, my money would be on ASUS in 2013-2014.
-
-
The only gripe I have against the AMD Fusion architecture is that the processors are clocked very low relative to what they can really deliver.
As for xFire mine runs smooth as butter with 12.1 preview and caps, I wonder if I really need to do a video showing proof of it... -
To my knowledge the 7690M is based on 6770M. It starts at....725/900? It may actually run BF3 at 1080 with tolerable performance, no OC at all.
The i5 config looks pretty good due to those freebies. A lot of games only use dual core, and some depend heavily on clock frequency. The integrated graphics will not provide a good experience, so he should not expect to do a lot of gaming on the battery. But otherwise that machine will work just fine, CPU is matched well against GPU when plugged in. -
7690m = 6750m
7690m XT = 6770m
But yes, very capable machine, just don't expect ultra detail at 1080p -
...and look at what you have had to do to get the performance that your laptop is capable of.
Given that this is all new tech, relatively, let's hope that AMD is committed to soon doing what it takes to give users this kind of performance out of the box. They themselves are the ones that have the most to gain. -
Well I like to play with my toys indeed, as somebody else said in a past post, this is not a noob laptop indeed.
BTW, I really didn't do much:
1.- Flash Musho Bios
2.- update to 12.1 preview drivers
3.- overclock
4.- better cooling
Alright... it's quite a bit but it really wasn't too bad. -
I have no doubt that, if you and HT hadn't had to do these things, you would have found plenty more! Maybe start your own AMD-based Mars program?
-
Hey guys, I would appreciate any info someone could give me on this. I am getting ready to purchase this laptop but I'm not sure about a couple of the upgrades. Not sure if they would be necessary for me.
Basically, I use it mostly for Excel-type stuff and internet, burning music CDs and the occasional DVD. I also watch some movies online occasionally and play some games. Most of the games are just typical internet-type games (Facebook, ESPN), and I also play my emulators like the NES, Atari, N64, stuff like that. My current computer sometimes slows down while I'm playing games, like it is pausing and unpausing, and when I watch videos or movies I get the same thing sometimes. I've been told this computer should easily handle this type of stuff, but I am unsure about whether some of the upgrades are worth it for me.
The first is whether the 15.6" Full HD anti-glare LED (1920x1080) is worth the extra money for what I do. My feeling is that my current screen (the BrightView 1366x768) is fine because I have never noticed a problem before, but I have also been thinking about upgrading to the Blu-ray player and wasn't sure if it mattered then. FYI, my current TV plays 720p and 1080i and I don't really see much difference anyway. All of my current movies are regular DVDs, so I'm not sure how noticeable the occasional Blu-ray watching would be anyway.
The other thing I am not sure of is the difference between the AMD Quad-Core A8-3550MX accelerated processor options versus the cost. Do I need the 2.7GHz/2.0GHz, 4MB L2 cache for what I described, or would a cheaper option do what I am using it for? There are several lower options like the AMD Quad-Core A8-3520M Accelerated Processor (2.5GHz/1.6GHz, 4MB L2 Cache), or the AMD Quad-Core A6-3430MX Accelerated Processor (2.4GHz/1.7GHz, 4MB L2 Cache). I just don't know how much influence these have.
The final option I'm stuck on is the AMD Radeon discrete-class graphics [HDMI, VGA] or the upgrade to the 1GB AMD Radeon HD 7690M GDDR5 Discrete Graphics(TM) [HDMI, VGA]. I just don't know much about this stuff and some of it seems like overkill to me.
Thanks for any help out there. -
For the use you describe, there's no use in getting anything but the minimum, with the exception of the 1GB AMD Radeon HD card.
For streaming and television, you won't find much beyond 720p, and the experts say that unless you have a REALLY big TV, you can't really see the difference between that and 1080p. At home I have a huge home theater setup, and unless you're watching on a 55" (or bigger) TV with a 5.1 surround sound setup, you don't need Blu-ray. As a matter of fact, I've watched regular DVDs on it and I like it.
Anyway, you can still output full HD resolution to your TV regardless of the laptop's resolution.
For your laptop, based upon what you described, you'd be happy with the stock screen. While I DO admit that the upgraded screen is better, it's mostly for color and viewing angle, not so much the increased resolution. The upgraded screen, based upon what people have posted, is higher quality than stock all the way around.
As far as the processor goes, sounds like you're into vintage gaming, so your emulators should be fine.
Overall, it doesn't sound like you're going to push it much at all, nor does it sound like you're part of the OCD crowd (the people that twitch just knowing it isn't the absolute best, regardless of their use; I used to be one), so it sounds like the budget stock config w/ the video adapter upgrade would be perfect for you. -
HP Pavilion dv6z Quad Edition customizable Notebook PC
•dark umber
•Genuine Windows 7 Home Premium 64-bit
•AMD Quad-Core A6-3420M Accelerated Processor (2.4GHz/1.5GHz, 4MB L2 Cache)
•1GB AMD Radeon(TM) HD 7690M GDDR5 Discrete Graphics(TM) [HDMI, VGA]
•FREE UPGRADE to 6GB DDR3 System Memory (2 Dimm)
•FREE UPGRADE to 640GB 5400 rpm Hard Drive with HP ProtectSmart Hard Drive Protection
•Microsoft(R) Office Starter: reduced-functionality Word/Excel(R) only, No PowerPoint(R)/Outlook(R)
•No additional security software
•6 Cell Lithium Ion Battery (standard)
•15.6" High Def LED HP Brightview (1366x768)
•SuperMulti 8X DVD+/-R/RW with Double Layer Support
•HP TrueVision HD Webcam with Integrated Digital Microphone and HP SimplePass Fingerprint Reader
•802.11b/g/n WLAN
•Standard Keyboard with numeric keypad
•HP Home & Home Office Store in-box envelope
With coupon code NBJ5867
$574.99 w/ tax
...but you could probably even get by w/o the upgraded video, which takes you down to almost $500 even. -
Hey guys I just slapped in a Corsair Force 3 into the DV6-6108us and I'm only getting Sata II speeds, what gives?
Trim is enabled, so the drive is already in AHCI, did HP not give us a SATA III port on the main connection? -
Thanks for the info, it has helped me rule out the better screen. I do agree that is probably overkill, as well as the Blu-ray. I can probably save money on both of those and buy a PS3 anyway. When you say "video adapter upgrade" you are referring to the 1GB Radeon, correct? It sounds like this is not affected by the processor. So for what I do there wouldn't be any difference between the
STOCK: AMD Quad-Core A6-3420M Accelerated Processor (2.4GHz/1.5GHz, 4MB L2 Cache)
UPGRADE +$20: AMD Quad-Core A6-3430MX Accelerated Processor (2.4GHz/1.7GHz, 4MB L2 Cache)
UPGRADE +$50: AMD Quad-Core A8-3520M Accelerated Processor (2.5GHz/1.6GHz, 4MB L2 Cache)
UPGRADE +$70: AMD Quad-Core A8-3550MX Accelerated Processor (2.7GHz/2.0GHz, 4MB L2 Cache)
I'm definitely closer to making my decision, so I really do appreciate it. -
OH SNAP they pulled that XT nonsense out of the rabbit hat? Dammit.
The A6-3430MX is probably fine, but the A8 has better integrated graphics. Even without the 7690M card you will still have video output ports, and the integrated graphics can decode two Blu-ray-spec video streams at the same time with basically no CPU load. The only emulator a laptop should have trouble with is Dolphin (the Wii emulator), and for emulators a fast CPU is usually more important than graphics. You may not need the 1080p screen upgrade. The battery upgrade is pretty cheap and may be worth it depending on where/how long you use the laptop. -
That's a 33% increase in base CPU speed; the Turbo speed is less important. I would say for spreadsheets and DVD burning (which may involve re-encoding video?), the 2.0GHz base clock makes more difference than a faster GPU. Better to put the $70 into the processor than graphics.
EDIT: Bumping to the A8 also gets you close to 30% faster integrated graphics than the A6. HT's excellent http://bit.ly/wKA5UA shows how impressive the integrated graphics are on the A8 APU. The iGPU can just barely manage Battlefield 3 in a 64-player map; nothing you listed could put a dent in that performance. -
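For anyone curious where that 33% figure comes from, it's just the base-clock arithmetic from the HP configurator options quoted above (a quick back-of-envelope check, nothing more):

```python
# Base clocks from the HP configurator options quoted in this thread.
a6_3420m_base = 1.5   # GHz (stock A6-3420M)
a8_3550mx_base = 2.0  # GHz (A8-3550MX upgrade)

gain = (a8_3550mx_base - a6_3420m_base) / a6_3420m_base
print(f"{gain:.0%}")  # 33%
```

Turbo clocks (2.4 vs 2.7 GHz) only differ by ~13%, which is why the base clock is the number that matters for sustained loads like encoding.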
Correct. If you get the bottom-end processor and the 1GB Radeon, and then choose the external graphics adapter for everything you run, it sounds like you'd pretty much have the best of both worlds on a budget.
I built an A4-3300-based HTPC with NO external graphics card, and it runs all the video and blu-ray I throw at it, but I doubt it would work for even vintage gaming. -
Clarkkent: Your A4 should work with all emulators up through and including ePSXe. PCSX2 and Dolphin would suffer, though. It handles all your video and Blu-ray (it's Blu-ray picture-in-picture compliant) independent of CPU speed. We really only need the 7690M for the newest DirectX games. And remember the OpenGL thing? I thought most emulators use only OpenGL, which means the iGPU is the one doing all the work anyway.
-
Thanks for all the responses guys.
The one emulator that currently gives me the most problems is the N64, so it sounds like all of these options should accomplish what I am asking for. Now I just need to decide which processor to get and if I need to upgrade that too. One thing I am confused on now (again, I know next to nothing about these things) is the quote
"If you get the bottom-end processor and the 1GB Radeon, and then choose the external graphics adapter for everything you run...
With this one, do I actually have to manually "choose the external graphics adapter for everything you run" when I play or do something? I'm sorry to say I don't know what that means. I usually start up a game or program and just play it, not knowing what all the various settings do. Or is that something the laptop does on its own?
It's bad: I have a math and computer science degree, but know very, very little about hardware. I'm more of a programmer type of guy, and unfortunately that is worthless when trying to figure this stuff out. -
I thought of that too, but it pretty much limits the ability to possibly expect more out of the graphics if he might want to up it on the games a bit. I would have made the same recommendation if no gaming had been mentioned whatsoever.
However, based upon the results y'all have gotten here, it doesn't sound like it would take that much to get the A6 up to the A8 range w/o hardware mods, just using the K10STAT OC.
Consequently, the lower end processor and higher end video seems to be the most flexible combo for the price.
Calc - Yeah, I'm hip to the OpenGL thing, but if it were me, I'd want the possibility of being able to run something a little more demanding. Besides, the Handbrake benchmarks for the stock A6 and A8 don't vary much from each other, so it doesn't seem that encoding would suffer much using the A6.
Besides, you KNOW that just about anyone who shows up here is going to be infected at least a little with the K10STAT bug...you guys just have that effect on people.
Lastly...you'll debate anything. We all know that.
-
Don't worry. Just check out HTWingnut's optimization guide (available through his sig line here) as well as graphics benchmarks (also in the sig line) and all will become clear.
-
I have an A6 and for some reason seems that xFire works wonder with me....
-
Anyone have a clue why my SSD spazzes out like this? Especially on a supposedly SATA III Controller? That's weak! -
Ok, thanks again. I guess I just need to make a decision on the A6 or A8 or I will never actually buy it. It's too bad I never learned any of this stuff like "base clock" or GPU or integrated graphics.
Does the processor have any effect on the graphics card, or vice versa, when dealing with this level of stuff? I.e., does upping to the 1GB mean something in terms of upping the processor, or are they not relevant for what I am doing? One of the previous posts seemed to indicate that the higher processor might be detrimental to the upgraded 1GB Radeon, or I might just have been misunderstanding it. -
Which games in particular? So far it has just been more or less a total dud, and since a mild GPU overclock can match stock Xfire performance, I think overclocking the dedicated GPU is the better solution if you have stutter issues. It seems the faster I OC the GPU, the worse the stutter when running in Xfire.
-
Use CrystalDiskMark and AS SSD bench
-
Sure, if you move it further out of the 1.75:1 (or less) CF ratio, you're more likely to have problems. (Since it still tries to CF even if it's useless, apparently.)
This is exactly why I wrote earlier about the need for better-matched iGPU and dGPU (hopefully with Kaveri), but I would be afraid that someone at AMD would interpret this as a worse dGPU to more closely match the iGPU...instead of vice versa.
I guess the appropriate test would be to decrease your dGPU speeds until the stuttering is gone in Xfire, and then compare your FPS with Xfire to your FPS using the OC'ed dGPU only. I wouldn't be surprised if the figures were pretty close to each other. -
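If anyone wants to actually crunch the numbers from a test like that, here's a rough sketch of how I'd compare the two captures. The frame times below are invented for illustration, and frame-time standard deviation is only a crude micro-stutter proxy, not a real metric:

```python
from statistics import mean, pstdev

def summarize(frame_times_ms):
    """Average FPS plus frame-time spread (a crude micro-stutter proxy)."""
    fps = 1000.0 / mean(frame_times_ms)
    return round(fps, 1), round(pstdev(frame_times_ms), 2)

# Invented frame-time logs (ms), e.g. from a Fraps-style capture:
xfire   = [20, 45, 18, 44, 21, 43]  # alternating fast/slow frames = stutter
dgpu_oc = [30, 31, 29, 30, 31, 29]  # steady pacing on the OC'ed dGPU alone
print(summarize(xfire))    # similar average FPS...
print(summarize(dgpu_oc))  # ...but far less frame-time jitter
```

If the two average-FPS numbers come out close while the jitter number is way higher in Xfire, that would back up the "just OC the dGPU" conclusion.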
Better yet they need to come up with a way to modify a performance ratio variable so that you can tune it in.
-
Severe performance letdown
-
I've mentioned them before O_O, with caps and 12.1 preview: Skyrim (1080p), Deus Ex HR (1080p), NFS Hot Pursuit 2011 (1080p), Crysis 1 and 2 (1366x768 not checked at 1080p), DiRT 3 (1366x768 not checked at 1080p), Battlefield 3 (1366x768, 1080p stutters), NFS Shift (1080p), GRID (1080p, but even with mods this is a relatively easy to run one)
Other games tested: Fable 3 (1366x768, not checked at 1080p), Trine, BlazBlue 1 and 2, Ys Origin... but none of them is as demanding as the previous ones.
In all the games the stuttering was only visible with Battlefield 3 at 1080p; all the others ran extremely smooth. My settings are almost everything on high, except for the shadows and post-processing, which I always set to low (and I tend to use the FXAA injector since it works better and faster than SMAA 2x). Battlefield 3 and Crysis 1 were the games where I had to put everything on low for the former, and a lot of lows and mediums for the latter. -
Corsair Force Series 3 120 GB SSD SATA 3 Review - ATTO Disk Benchmark, Crystal DiskMark and AS SSD - The SSD Review
Also do a AS SSD so we can see alignment and driver.
Edit: and read the PCMark/afterthought page -
Yup, low score. AHCI mode should be enabled by default (Can't modify it in the BIOS) and Trim is enabled. -
From your lips to AMD's ears. They'd need an active utility to measure performance and then set an optimum ratio, or constantly measure performance and dynamically adjust.
Like I said before, standard CF and SLI were originally designed for two exactly equal cards, so any tweaking of that scheme isn't the optimal solution for Hybrid. However, Hybrid's solution would be effective for standard CF. -
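Just to make the "constantly measure and dynamically adjust" idea concrete, here's a toy feedback-loop sketch. Everything here is invented for illustration (the cost model, the step size); it's nothing like AMD's actual driver internals:

```python
def rebalance(ratio, t_dgpu, t_igpu, step=0.05, tol=0.02):
    """One feedback step: nudge the dGPU frame share toward equal finish times.

    ratio: fraction of frames handed to the dGPU (clamped to 0.5..0.95)
    t_dgpu / t_igpu: measured time each GPU spends on its share of frames
    """
    if abs(t_dgpu - t_igpu) < tol:
        return ratio                 # close enough, leave it alone
    if t_dgpu > t_igpu:
        ratio -= step                # dGPU is the bottleneck: offload to iGPU
    else:
        ratio += step                # iGPU lagging: feed the dGPU more frames
    return min(0.95, max(0.5, ratio))

# Toy cost model: pretend the dGPU renders 3x faster than the iGPU.
ratio = 0.5
for _ in range(20):
    ratio = rebalance(ratio, t_dgpu=ratio / 3.0, t_igpu=(1.0 - ratio) / 1.0)
print(round(ratio, 2))  # settles at 0.75, i.e. a 3:1 frame split
```

The point being: with a measured per-GPU frame time, the "optimum ratio" falls out of a pretty simple control loop; the hard part is doing the measuring without adding its own overhead.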
It actually seems right (compare to the review), considering SSDs have a big write-performance decrease on small drives.
Also factor in: Desktop vs Laptop
Here is a review on 60GB I just found
Corsair Force 3 60GB SATA III SandForce SF-2281 SSD Review - Hi Tech Legion -
kk thanks for the information
Sly Corsair people using marketing tactics!
-
Just find a dv6-6135dx if you can still find them. Better CPU (A8-3500M), and once you get comfortable with it, you can do a mild overclock to the CPU if you really want to. Newegg.com - Refurbished: HP Pavilion dv6-6135DX Refurbished Notebook AMD A-Series A8-3500M(1.5GHz) 15.6" 6GB Memory DDR3 640GB HDD Blu-Ray Drive AMD Radeon HD 6750M
Some of us like to tweak and mod, but when it comes down to it, stock would have been just enough to do about everything. -
TD may still have the 6135dx; I have to assume HP is dumping all their unused parts to make this many "refurbished" laptops.
Most of the HDDs are probably old drives. Probably new boards, though, as this quantity is just insane. I mean, 100+ Newegg reviews? Are we looking at like 500 refurbished units...? -
Just curious, where are you getting the 1.75 ratio from?
AMD should be able to bake their drivers with 1:2 AFR instead of 1:1 AFR... Tri-Fire seems to work pretty well; dunno why they can't get Dual Graphics to work right. Maybe it's such a niche setup that they aren't putting much manpower into it. Heck, if they really wanted to, they could probably do all the post-processing on the iGPU so you pretty much get free FXAA and stuff.
Well, the application profiles in 12.1 let you choose the CF config (not the exact ratio, though), but I haven't really messed around with them, so I don't know if changing the settings does any good. -
Right. But that's for the specific game not necessarily for the card performance configuration.
-
1.75 is from HardOCP. Great source of trusted information.
Dual/Hybrid may have been a niche product at one point, but since AMD is targeting mobile processing as its main market, the Hybrid needs to be revamped since it'll affect the majority of their target clientele. -
bobapjok: A faster CPU will never slow down graphics. The 1GB dedicated video card just draws fancy 3D games better than integrated graphics, but that mostly affects modern PC games. A faster CPU makes editing large spreadsheets with complex math faster, and allows emulators to run smoother.
N64 games are easy to render (draw) but sometimes difficult to emulate (recreate the game console's CPU features). -
I wonder if AMD might try a couple things in the future. One would be to use an asymmetric frame rendering engine so that say 2 frames are rendered on the dGPU and then 1 on the iGPU. This might help on the microstuttering issue.
The other possibility, especially with the upcoming Trinity APUs would be to put two of them in a laptop and CF the two iGPUs together. You'd get 8 CPU cores and two fairly capable iGPUs working together. No dGPU just the dual iGPUs.
Lastly, I found this article on microstuttering with desktop GPUs. The methodology seems like it could prove useful for benchmarking our Llano laptops -
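The 2-frames-on-dGPU / 1-frame-on-iGPU idea is basically a weighted round-robin. As a trivial sketch (the names and shape of this are mine, obviously not anything from AMD's driver code):

```python
def afr_schedule(num_frames, dgpu_share=2, igpu_share=1):
    """Hand out frames in a repeating dgpu_share:igpu_share pattern."""
    cycle = ["dGPU"] * dgpu_share + ["iGPU"] * igpu_share
    return [cycle[i % len(cycle)] for i in range(num_frames)]

print(afr_schedule(6))
# ['dGPU', 'dGPU', 'iGPU', 'dGPU', 'dGPU', 'iGPU']
```

Whether pacing frames that way actually reduces microstutter would depend on the iGPU reliably finishing its every-third frame on time, which is the whole problem in the first place.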
Thank you. That was a big help. I think I am going to upgrade to the 1GB graphics card and also the better processor. I appreciate the help.
-
Now THAT'S an interesting thought, and one I've never had!
I guess we don't know how many GPU cores Trinity will have, nor what they'll run at. I assume at worst it'll be 400 @ 444MHz, same as the current one, but a slightly more efficient "4D" design.
Trinity will be sort of 4 integer cores, and 2 floating point cores...I can see having 4 modules as being useful...in fact that's what I was hoping we'd get in notebooks (I'm sure we would if AMD could sell higher end stuff against Intel...at this point I'm more interested in AMD even as a power user/gamer since they don't have all the driver problems!)
Anyway, I can see that being useful from the CPU side, and obviously useful from the GPU side, and if it meant the crossfire would actually work better and on more games...yeah, I can totally see that being useful!
Of course I'd RATHER have 4 Piledriver modules on one chip, and 800 (or more) GPU cores on the same chip (more efficient), but I'm sure we won't get that.
But that would be very interesting...something possibly approaching or basically at high end GPU performance when crossfire works, and the CPU portion should compete well too...and all from integrated chips... well, this ain't your daddy's integrated video LOL -
Dual Hybrid is also known as Asymmetrical CrossFireX, but given what little I have been able to find out, it seems to work in a passive "catch as catch can" method, meaning when it sees a processor as available, it uses it, regardless of speed or capability.
Consequently, an inherently slower processor is going to drag down the faster one, especially since it uses the shared DDR3 memory vs the faster processor's dedicated GDDR5.
I think this is what's causing the microstuttering issue, and, given that there are no other known issues with the drivers or hardware, it's a logical assumption.
I believe that setting a specific ratio MIGHT help, but for this to work, the system would have to dynamically and accurately measure what's going on at all times in order to adjust the ratio properly. The ratio would vary depending on the game detail settings, but also on each specific frame's graphics and detail, so you couldn't get by with an average static ratio. (For example, in a part of a game where the screen is mostly dark/black, the weaker processor would perform better than it does in a brighter, fully detailed scene.)
It seems to me that since frames are so relatively brief, the best static scheme would be for the weaker processor to render its frames in less detail than the stronger one, which, during the action, might be almost completely unnoticeable. But that would also require a huge overhaul of the drivers to allow two different, dedicated settings per processor. This would give the weaker processor a more active role, and thus help achieve a higher total frame rate.
The best solution, of course, is for AMD to utilize a better iGPU in the APU in order to better match the dGPU, which would be only a partial help due to the DDR mismatch that you'll never get around, though faster RAM and an OC'ed APU would certainly help as well.
I guess we will see what AMD has in store. Since CrossFireX in notebooks is new, I would expect it to bring challenges all its own, but since AMD is now targeting the consumers and market where the money is, I can't imagine that they wouldn't want to specifically address this.
Since they are planning to introduce GCN graphics cores with Kaveri and not Trinity, at least that will help address the hardware-matching issues, so it looks like they'll have about a year to straighten the drivers out.
Provided, of course, that AMD addresses their chronic driver problems. (They should just throw a bunch of money to Nvidia's best team members and draw them away.) -
Thermal Compound Eval Link:
Thermal Compound Roundup - February 2012 | Hardware Secrets
I use Arctic Silver Ceramique because it is both nonconductive and noncapacitive. -
You know, I had a similar thought as well. Dual CPUs would actually consume less power than a single one with a dedicated GPU. Then you could CrossFire the iGPUs, perhaps even have some shared onboard or even on-die GDDR5, and the thing would FLY. CPU cores are already managed like individual CPUs by Windows, at least as far as I know, so adding an additional chip would just require some hardware adjustments on the motherboard and appropriate drivers.
In any case I think the Trinity iGPU will be much faster than 444MHz. And I really hope they do manage to find a way to throw some dedicated GDDR5 on chip or at least on the motherboard. That's really what's holding it back atm. Even a 256MB "buffer" vRAM would probably help immensely. -
Relatively fresh tech specs for Trinity:
http://www.techpowerup.com/img/12-02-13/87a.jpg
Article @ AMD "Trinity" APU Models Further Detailed | techPowerUp -
Just saw this thing... same specs but 17", albeit the resolution is lower compared to 1080p
HP Pavilion dv7-6165us LW460UA Notebook PC - AMD Quad-Core A8-3500M 1.5GHz, 6GB DDR3, 640GB HDD, DVDRW, 17.3 Display, Windows 7 Home Premium 64-bit at TigerDirect.com -
Looks like they changed their minds and are bringing GCN to Trinity, along with much faster clocks.
So...maybe I will buy a Trinity.
Also, as far as socket compatibility, it has been definitely stated that you can't put a Llano (FM1) in a Trinity socket (FM2), but I haven't yet seen if the reverse is true. -
Wow! A few 17s are floating around? There were a couple on Newegg during the 6135dx madness.
Clarkkent: I am not sure what to make of those numbers. The A10 has 384 cores @ 760MHz against the original A8's 400 cores @ 600MHz; not a world-ending bump without architectural advances in the cores themselves. We'll have to wait and see more about the laptop models, very curious what they did there. -
Hmm...looks like they'll be actually REDUCING the number of cores slightly...well, they should be slightly more efficient, and we don't know how fast they're clocked, so that may mean nothing.
I really doubt it. If they were doing GCN they'd have had to have that in the works for years now. It's supposed to be an updated "4D" design, which is a bit more efficient than what's in there now.
*HP dv6z AMD Llano (6XXX series) Owners Lounge*
Discussion in 'HP' started by scy1192, Jun 22, 2011.