Google and YouTube are wonderful things.
-
-
I did, but all they showed were blurry internal AMD simulated benchmarks. Let me rephrase, are there any "real" (i.e. actually performed) benchmarks?
-
1. You really have to dig and extrapolate to come to logical conclusions. Yeah, the "fuzzy projections" are just that, but they're supported when you look at beta performance of Trinity vs Intel on YouTube.
2. Also, @ Passmark, look at comparative CPU/GPU performance. While "Trinity proper" isn't listed, separate components are. BD core vs "Stars" K10, GPU, iGPU...you really have to break it all down and put it together yourself to see that, theoretically, Trinity should deliver.
3. Add to that the known MS issue with BD (ironically discovered by Linux pros) and read the news there. MS has watched AMD carefully, admitted their error, and promised a fix for Windows 7, with Windows 8 working properly from the start. (They accidentally released one of the patches on its own when the fix requires two, so they quickly withdrew it.)
4. Finally, there is more than one YouTube video comparing real-world gaming performance of beta-Trinity. Unfortunately, I couldn't find one that showed actual benchmarks, but gameplay was just as smooth on both Intel and AMD laptops. (Laptops!)
Bear in mind that, this time through, AMD is more tight-lipped than they have ever been, which is an indicator, especially when Trinity is only about 2 months away from roll-out.
...and to you naysayers of Xfire on our current setups...expect that this will be fixed in Trinity by "upping" the on-die GPU to more closely match the now-optional discrete GPU. Remember...Llano was experimental in nature, and directed more towards power-saving performance, so they didn't want to incur full-blown costs when so much was at stake. Plus, experimentation had begun long before BD was mature anyway. -
I don't know why xfire doesn't seem to work well for many people; I've yet to see extreme stuttering in games. Battlefield 3 at 1080p was the only one that ran slow, but at 720p it flies...
-
Imagine it flying at 1080p with no stuttering...
-
yeah wing, was driving me crazy not being able to buy a new lappy for xmas cause I couldn't find anything more appealing 'bang for buck' than my dv6z
got it for 220.00 WITH the geek squad prep (they cleaned everything out: minimal install with the crucial drivers etc., set up in a single-button 'restore' partition). Nice feature btw on the Lenovos: you don't even have to be able to boot, just hit this little mech button above the keyboard and it goes into full restore mode, leaving your data on a sep partition.
dunno 'bout that, from what I understand it was 299 'once', and I think that was on Cyber Monday. Thing is fairly well equipped, to tell the truth. Has a VERY nice keyboard with indented keys that self-center your fingers. I'm a fast typist and this one makes me even faster.
Brief review of my seat-of-the-pants impressions so far:
1. good looking all black, generic wedgie type design.
2. screen is VERY bright, actually bothers my eyes if I put it up to max, so it's generally set at two levels darker
3. keyboard is excellent, nice feel, not loud, very surprised at quality
4. construction is all plastic but nicely done, and no discernible flexing.
5. has several very neat options. First, a trackpad-kill Fkey, which is soooo appreciated. Second, a physical wireless kill switch, a big multi-mem-chip reader that takes six or eight types of mem, and very nice sound for a lappie; big surprise here actually. Voices are very accurate and tones well rounded and full.
6. it has hdmi 1.34 and I have to tell you, out of all of my laptops, this is the only one that fits a perfect desktop onto a 1080p big screen. Laser sharp and clean, clear to the edges (Panasonic Viera 55"). WOW, looks and plays fantastic.
7. has hdmi, AND vga, and 3 usb 2.0 ports AND esata/usb combo port,
8. std Optiarc DVD/CD-ROM everything burner
9. AND, sob even has a place to plug in a sim phone card hidden under the six cell battery.
10. came with Google Chrome installed, the Lenovo user's manual in the docs file, the latest Adobe Reader, and fully licensed WEBROOT antivirus installed, up and running. Has a few useful things like facial recognition software AND a fingerprint reader. Once you set these up, they seem to work just fine and damn, it's soooo much easier and faster to jump-start the boot and call up your common proggies
11. has the Ralink wifi adapter that I like (VERY configurable) and lots of others don't; b/g/n etc.
12. hd is only a 5400rpm 320gig, but no big deal for something this cheap, and easily swapped out
13. comes with a single 4gig module in one of the two slots, so presumably operating at single-channel speed. I'll have to swap out for two fours of faster ram and see what effect it has; I suspect it will be significant.
14. the cpu is a dual core E-450 with an onchip 6320. This is in NO way comparable to the dv6's with the quads; however, it runs two instances of WoW (I dual-box a lot) just fine, and I have zero complaints about its performance in the game at "good" settings with the view distance and res punched out a bit further for pvp
I was very surprised it could do this at all, much less do it well. No raid/heroic dungeon speed issues at all.
15. installed skype and works just fine. call quality is better than my cell phone and video conferencing works fine, tho the camera is much lower rez than we have on the dv6z.
16. ok, last thing, this box is QUIET! I have yet to audibly notice the fan, even while gaming on WoW, and according to my Afterburner temp monitors, the apu hasn't gone over 59C yet, no matter what I'm doing, and that's in an ambient of about 75F.
The power brick is surprisingly small; I think the 450 pulls around 18w. With the std 6-cell, on std power usage, I can get about 4 hours surfing online etc.
not too shabby. I'll have to set on powersaver, lower the screen brightness and turn off stuff I don't need and see how much I can pinch out of this thing, but looks good so far.
Turns out, my bios has any apu overclocking locked out. Some standalone boards have built-in overclocking and apparently, given enough power, the chip can be overclocked like crazy. The built-in graphics self-overclocks around 20 percent on demand.
So, there you have it. In sum, it's a helluva box for under 300.00. The keyboard is so finger-friendly and the machine so quiet that I find I'm doing a lot of writing in near dark with it sitting in my lap.
Seer
Oh, new details emerging on the impending trinity lappies.
Don't know the pricing, but it sure looks like they are going to rock the house
I'm saving my pennies
-
how are you guys overclocking the gpu outside of a bios mod? With 12.1a drivers and MSI Afterburner, I can only underclock from the speed my modded bios is set at.
BTW I still need a laptop cooler, any recommendations? I bought a cheapo ebay LED-fan one and it sucks. Something budget-minded, or I might just make my own -
Trying to satisfy my curiosity, and yours, I won't bore you with all the details of all my cross-referencing research, but if any of you know or can find more accurate details regarding Trinity's iGPU, here's the latest I can come up with, based upon figures and approximations of what we know about the A8-3500M and what Trinity is supposed to be.
First of all, I hate that the old "mobile 6xxx GPU" is actually a "5xxx" GPU. Gives me a huge headache trying to find equivalences...but I still got it figured out (I think). Of course, your mileage may vary.
Currently, extrapolating gpu-specific information (pixel fill rate, etc), the 6620g seems to be an HD 5550 equivalent. From personal experience with my HTPC builds, the HD 5550 was GREAT at video, but not so much for any gaming...which is what y'all have found. No surprises.
However, since I scoured the web looking for some kind of official info regarding Trinity's iGPU, and found none, I was relegated to doing some algebra and more chart-searching. I promised you no boring details so I'll cut to the chase.
Looks to be that the on-die iGPU should be the desktop equivalent of an HD 6570 or 6670, if promises of performance increases of 30-50% are to be believed. (And all evidence points to fact, currently.)
Consequently, from Passmark:
Sabine
on-die: 6620G = 465 vs. HD 5550 = 612
dGPU (for dv6-6135dx): 6750M = 1330 vs. HD 5670 = 1233
Seems about the expected hit in iGPU performance given memory, cache, and Stars-core differences. So...we'll call it accurate, and look at the Passmark projection for Trinity:
Trinity
on-die: equiv HD 6570 = 1042; equiv HD 6670 = 1233
...so what this means is that if Trinity lives up to its promises, you'll get the same performance with the iGPU as you're now getting with the 6135dx's dGPU. And according to you all, it's good, and even better OC'ed. Consequently, if paired with a dGPU equivalent to the current one, you should get really good xFire performance, so no OC'ing should be necessary.
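To make the extrapolation above easy to check (or redo with better numbers), here's a quick back-of-the-envelope sketch; the scores and the "mobile penalty" idea are just the Passmark figures and assumptions from this post, not anything official:

```python
# Back-of-the-envelope Trinity iGPU projection, using only the
# Passmark figures quoted in this post (speculation, not official).

score_6620g = 465    # Llano's on-die iGPU (Sabine)
score_hd5550 = 612   # rough desktop equivalent of the 6620G

# "Mobile penalty": how much the on-die part loses vs. its desktop
# equivalent (memory bandwidth, cache, clocks, etc.)
mobile_penalty = score_6620g / score_hd5550   # ~0.76

# Desktop parts Trinity's iGPU is guessed to resemble
score_hd6570 = 1042
score_hd6670 = 1233

# Apply the same penalty to project on-die Trinity scores
trinity_low = score_hd6570 * mobile_penalty
trinity_high = score_hd6670 * mobile_penalty

print(round(trinity_low), round(trinity_high))   # 792 937
```

Interestingly, even the low end of that range lands well above the 6620G's 465, which is in line with (or a bit beyond) the promised 30-50% uplift.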
However, this doesn't take into account any improvement with the Piledriver (improved Bulldozer) core.
The Passmark CPU score for the A8-3500M = 2892
The Passmark CPU score for the FX-4100 (only quad core BD I could find) = 4331
Bear in mind that I'm pretty sure Trinity will be clocked lower, but it'll also include the MS Windows fix and it's a Piledriver core, which promises to be better than Bulldozer...so a slower clock just might be considerably offset by these improvements.
Of course, I had to extrapolate from known info and projections, so it's still anyone's call...but, hey, I did the work so you wouldn't have to. (And so you know...I'm a PhD and a former math teacher and used to meticulous research.) -
11.6 Drivers work outside of a bios mod. Find what works for you and then try to match one of musho's modded bios clocks and then flash to that. Then update Catalyst. Not sure if any other version works for OC. What are your clocks currently at?
As for Coolers... yadda yadda yadda
I chose the bamboo one because it allows for modability - bigger fans.. panels.. angles and is much cheaper in the long haul!
-
No surprising you.
Don't be short on the details. Maybe account for people who are overclocked and do a comparison to speculated Trinity performances, like this one.
-
12.1 works. In MSI Afterburner you have to make sure to enter the EULA statement in the cfg file. MSI Afterburner limits. - Overclockers Forums
Awesome, wish I had seen that deal. I would love to get ahold of a Lenovo with the E-450. My eee 1215b is a piece of junk. It has an E-350, which is fine, but the chassis is junk. -
Ahhh, that's what it was, forgot about doing that. Thanks Wing.
-
LOL! You don't want much, do you?
In a nutshell...
Your fully overclocked dv6-6135dx should perform about the same as the Trinity @ stock (using the iGPU only.)
Trinity with dGPU in xFire should leave everyone in the dust, and run 1080p with moderate-to-high graphic quality settings at a playable frame rate (30-50+).
How's that? -
Sounds like good news..
Some bad.
And the end currently, ugly... means I gotta upgrade yet again!
Thanks to the plethora of research clarkkent57 has done (all the leg work) to inform us of his possible discovery, I now think of Llano as the much uglier sister of the two .. the one we all say... 'ahhh, she'll dooooo.. - double bagger.'
What?! Gets the job done... -Llano APU
-
If it turns out that Trinity is socket and chipset compatible with the Sabine, then that would be good news for all of us dv6-6135dx owners!
-
Remember the S1g1, S1g2, S1g3 from the Turion II/X2 sockets?
S1g1 chips couldn't use S1g2 or g3 sockets but g2's could use g1 chips and then g3 sockets were backward compatible. Something like that.
Seeing as we are FS1g1 (HWiNFO64) and Trinity being FS1r2... wouldn't the same reasons apply? -
doesn't look like I can oc the gpu/mem any more than the bios setup at 740/880. Would get artifacting in 3DMark11 with the speeds I tried, that is if it could even launch the program (would get a black window when launching it). And I tried 780/880 at startup and got a bsod after a minute in Windows..
I guess 740/880 ain't bad, still a huge overclock over stock. -
Definitely and you got past 6770m speeds. So you made it to the other side.
-
is it possible to adjust the gpu voltage?
-
Not at this time, no.
-
OK. I have GPU-Z running. That's how I know my changes in Afterburner are taking effect. But according to AMD System Monitor, still only the iGPU is being worked by Kombuster. Also, on the left side of the Kombuster window, it says 6620G (in red). Will that change to the 6000 series when it's working that card? Nothing I do wants to stress the dGPU. Is there something in GPU-Z that will force the 6750?
-
Sometimes when you apply settings with Afterburner, it disconnects the dGPU. So a restart is required to bring it back. And for Kombuster, gotta use GPU burn and set DX11 or 10. Also make sure high performance is selected on switchable graphics for Kombuster.
Other than that, I can only assume a restart is required; then load Kombuster up and GPU-stress it, and see if it uses both. If it does, and you go back to Afterburner and apply an overclock and the dGPU doesn't activate during the 2nd attempt at Kombuster, you've found your problem.
Both red and green bars will display for temp and load. -
nice on the phd and math. What was your doctoral thesis about?
For myself, I'm a "hand me the thing and get out" kind of guy. Trinity can be overclocked, at least the desktop version. The clocks on the laptop version are higher than Llano's. Stock to stock, Trin is around 30-plus percent faster on the dx11 graphics proggies I've seen. In some instances, more than that.
With a suitable dedicated card, xfire/'dual graphics' is way fast. AMD has done some good work. Trin is better, much better, across the board. Frankly, I don't think they are going to be able to make enough of 'em fast enough. Unfortunately, I have no info on the socket compatibility. It *looks* similar enough externally, but...
seer
oh, one more thing, MS's 8 is optimized now for the trin cpu architecture
It is even faster under 8.
Edit: sorry Clark, just saw your earlier post where you noted the win8 advantage etc. good research and analysis and VERY close to the *mark*. salutations -
OK, applied settings in Afterburner, ran Kombuster DX11. Kombuster set to High Performance. Still just iGPU. Made sure settings were still in Afterburner, checked apply on startup. Restarted, ran Kombuster DX11, GPU Burn in, still just the iGPU. Dual graphics is selected in Catalyst software.
Now... when I redid my system, I did not install the HP drivers. I went straight from the Windows install to the 12.1 driver.
Really want to know what kind of OC I can get on this laptop!
Thanks for your help and patience!
edit: Really weird. When 6750 speeds are at default (600-800), AMD System Monitor shows 6755G2 AMD dual graphics (but still only stresses the 6620). When I apply any changes to the clock speeds, AMD System Monitor splits the cards. Is that normal? -
Ah, there's your problem! You need to install HP drivers. Use this one: AMD High-Definition Graphics Driver HP Pavilion dv6z-6100 CTO Quad Edition Entertainment Notebook PC - HP Customer Care (United States - English)
You need the HP drivers to activate switchable graphics. Then install 12.1 on top of it. Should resolve your issues. You can now overclock with 12.1 drivers too. -
Exactly why isn't it supported from AMD's APU drivers yet?
I've installed the HP's first, then installed the 12.1's, and when I run Afterburner with the overclocking set to 1 and all that, if I apply a clock that I want, it deactivates the dGPU. And everything I use afterwards will not activate it, until I restart the computer to bring the connection back. It basically means I can't use Afterburner, for some reason I am not seeing...
I'm sure I selected the right device. It shows musho's modded bios on the display, but I can't edit anything and keep it without losing connection to the 6750m somehow.. -
Turn off ULPS. Simplest way to get Afterburner to activate and recognize the dGPU. See here: http://forum.notebookreview.com/hp-...xxx-series-owners-lounge-227.html#post7861634
You need HP's drivers because it's proprietary to HP.
Also make sure that in msiafterburner.cfg you set:
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
Then you will have to save the file elsewhere and copy/paste it into your Program Files (x86)\MSI Afterburner directory, because you can't save over files in your Program Files directory in Win 7. -
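For anyone who'd rather script that edit than fight Notepad and Program Files permissions, here's a rough Python sketch. The install path in the comment is the usual default but still an assumption, and the demo below writes to a scratch copy rather than the real file:

```python
# Sketch: programmatically add the unofficial-overclocking keys to
# MSIAfterburner.cfg. Wrapped in a function so it can be pointed at
# any copy of the file.
import configparser
import os
import tempfile

EULA = ("I confirm that I am aware of unofficial overclocking "
        "limitations and fully understand that MSI will not provide "
        "me any support on it")

def enable_unofficial_oc(cfg_path):
    """Set the [ATIADLHAL] keys Afterburner checks before it will
    let you raise clocks past the official limits."""
    cfg = configparser.ConfigParser()
    cfg.optionxform = str            # preserve key capitalization
    cfg.read(cfg_path)               # missing file -> start empty
    if not cfg.has_section("ATIADLHAL"):
        cfg.add_section("ATIADLHAL")
    cfg.set("ATIADLHAL", "UnofficialOverclockingEULA", EULA)
    cfg.set("ATIADLHAL", "UnofficialOverclockingMode", "1")
    with open(cfg_path, "w") as f:
        cfg.write(f)

# Demo on a scratch copy; on a real machine you'd run this elevated
# against e.g. C:\Program Files (x86)\MSI Afterburner\
# MSIAfterburner.cfg (path is an assumption, check your install).
demo = os.path.join(tempfile.gettempdir(), "MSIAfterburner.cfg")
enable_unofficial_oc(demo)
print(open(demo).read().splitlines()[0])   # [ATIADLHAL]
```

Caveat: Afterburner's cfg is close enough to INI for this to work on the one section we touch, but back the file up first in case other sections don't round-trip cleanly.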
I honestly haven't had the need to install the HP default ones. When I wiped the drivers, even from device manager, I went straight to 12.1a and then to the AHCI drivers (they contain the USB 3 drivers and other goodies)...
and that's pretty much everything that I did. -
It gives you switchable graphics and activates 6750m from a fresh Windows 7 install (not HP recovery)?
I just did a full driver sweep with Driver Sweeper and installed 12.1 and had no switchable graphics; it just defaulted to 6620G. I did this a couple times. If you uninstall from device manager, that won't do much; it's just the video drivers, not the vision engine portion. But if you were to do a fresh install of Windows, you would not get switchable graphics if you went straight to 12.1. I guarantee it!
BTW, where did you get the AMD AHCI drivers? I ask because last time I installed AMD AHCI drivers not from HP, it caused issues.
edit: these? http://support.amd.com/us/gpudownload/windows/Pages/raid_windows.aspx -
HT, why use that one instead of the one for the 6135? (sp54362)
-
Try it and see if it works for you, but that one failed on me multiple times.
-
Lol, I forgot to disable ULPS (again). Now I can see my 6750m temperature, clocks and the whole shebang in AIDA64.
Good job Wing, quick with the responses. Thank-you.
-
No problem. That's the one thing with this laptop: there's so much tweaking to do to get it to perform, it can be frustrating sometimes. I think I will go ahead and write the guide I've been planning for a long time. If not for others, then for me to remember if/when I go to do another reinstallation. I forgot to do a lot of stuff last time too!
-
Actually I don't, and never will, use the default Windows installation of any machine. I have my own copy of Win7 Ultimate (in my old Inspiron 1501 I upgraded the ram up to 4gb, then I needed a 64-bit version, and that very same copy is now installed in the DV6)
I remember the first time I got this laptop: I turned it on... it took more than 5 minutes to boot, and when it finally entered the Windows welcome thing I turned it off, put in my disc and formatted the HDD.
But anyway, yes those ahci drivers are the ones, and yes I don't use HP default ones. I used them before, but when I knew where to get the USB 3.0 standalone I didn't have to.
Edit: Other thing, the HDMI audio is not from AMD; it's actually from Realtek, and their true drivers are here http://www.realtek.com.tw/downloads/downloadsView.aspx?Langid=1&PNid=24&PFid=24&Level=4&Conn=3&DownTypeID=3&GetDown=false -
So you NEVER installed the HP video/chipset drivers before installing 12.1a?
-
OK, about the throttling due to thermal overload: I found this pdf from some AMD guys (Denis Foley is fairly well known and published, and is an AMD Fellow): AMD's Llano Fusion APU. On page 26 it pretty clearly states that the power density multiplier is applied to limit the TDP of the chip based on the topology of the cores AND the ambient temp. So it looks like it might well be a combination of load / clock cycles / core arrangement AND temperatures that causes throttling.
-
Hot damn! I have NEVER been able to get switchable graphics with anything other than the HP drivers. Just did a full uninstall of all AMD drivers, Driver Sweeper, and a CCleaner and Glary Utilities registry sweep. Then installed ONLY the 12.1a drivers and it works! I swear I have done that a half dozen times before with various drivers and no switchable graphics showed up. +1
I may do a full regular Windows install and see how that goes. -
I did that too, after installing my SSD...installed Windows 7 from scratch, then the 12.1, then the AMD AHCI. Switchable graphics, no problem.
-
Nope.... although I arrived late
(I had to reformat recently, that's why)
-
Seems they fixed it all with 12.1 drivers. Prior to that switchable graphics and overclocking wouldn't work.
-
Tried uninstalling the 12.1: blue screen. Tried installing the HP driver: blue screen. Tried installing 12.1a: blue screen. But upon restart, when I try to reinstall the drivers, it says the driver is already installed. But I don't have the ATI options on desktop right-click. Going to try a full reinstall. HT, would love to have your guide! Install Windows, which HP updates to install, etc. Maybe then I can get the 6750 to overclock. (Well, I can overclock now, just not test.)
-
I did clean install -> 12.1 -> AMD AHCI as well.
Then you can extract the HP one and run the ATI setup; IIRC it installs one small thing (forget what, nothing important, not a driver). Just don't RUN the downloaded package itself, because it embeds a cmd that dismounts all your drivers.
edit: BTW if you don't want the Beats thing (worthless imo), download the audio driver, extract it, then install it through Windows' update-driver. -
I did some small tests today: P95 at different P-states (locked via K10), then started Kombustor.
It *seems* that lower-voltage states throttle less (not exactly sure, but meh).
Anyway, I give up. Conclusions as of now:
100% no throttle: turn off the damn Cool'n'Quiet.
Not much throttle: same P-state across the board.
Pray it doesn't show the 30-sec up/constant throttle: don't stress the laptop "too much" with CPU+GPU load ~~ (double burn / crazy HD playback with filters + 60fps).
Gaming usually is fine.
What the slides say seems to apply to the dGPU as well; I did the test with xfire off (DX11). -
Any improvement in sound performance? I'd like to keep the Beats thing since it kinda impresses people who think about those stupidly expensive headphones, thinking it's an awesome company lol
I just tried a straight install of 12.1a, doing a full install and sweep, and I get the same switching problem I was getting before, so I have to install the HP package, then 12.1a. No biggy imo -
UPDATE: I found this quote from Anandtech, from December:
" Meanwhile for integrated GPUs AMDs Trinity will be using a VLIW4 design derived from AMDs 6900 series Cayman GPU, and at the same time it stands to reason that at least some of AMDs 7000 series will be VLIW4 in order to have something to CrossFire with Trinity."
OK...this just gets weirder. Why? Because Trinity's promise is 30-50% better than Sabine. Sounds good, but...check out the Passmark scores for the 69XX series:
Radeon HD 6950 = 3050
Radeon HD 6970 = 3129
Radeon HD 6970M = 2629
Radeon HD 6990 = 3198
Radeon HD 6990M = 2610
Not a dog in the bunch, including the mobile processors (which, if they're actually 5XXX-series based, can't be Cayman, and thus can't be part of Trinity...but holding onto these scores for the mobile processors is still realistic, due to memory differences [DDR3 vs GDDR5], possible reduced memory bandwidth [64 vs 128 bit], etc. You know...all the stuff that happens to a good video core once incorporated into a mobile solution.)
OK, here's the passmark score (again) for the 6620G, embedded in the current Sabine line:
Radeon HD 6620G = 465
and the dGPU:
Radeon HD 6750M = 1130
Ummm...on the graphics rating scale...this would be, like, 600% faster, equating to a 500% improvement...without considering CrossFireX. I have a hard time believing this, unless this is all hampered by the APU's CPU section somehow. But even then...the CPU in the Sabine isn't THAT crippled, so Piledriver CAN'T be.
I mean...I HOPE AMD could deliver on this kind of performance in a laptop, and these days, there shouldn't be a reason they can't...especially when you see that, as referenced above, mobile scores like that are possible.
Maybe the "30%-50%" improvement ploy is just that. Inflate the estimates just enough to get people to take notice, remain silent, and then let the reality blow them and the competition away. (God knows that the opposite advertising approach didn't work for AMD before: promising the moon with a bunch of long-before-the-release hoopla, with the real action starting and ending with a resounding thud.)
AMD's always been David to Intel's Goliath. We WANT to see the underdog win, or at the very minimum, put up a heckuva fight. It's good for tech development, good for prices, and darn good entertainment as well.
Frankly, if the released version of Trinity was that good, that should be exactly their advertising approach, as the product catches Intel flat-footed and just sells itself through performance. Heck...never mind the cost at that point, but a relatively lower cost at that performance level would sell like hotcakes. (Geez...actually, even hotcakes would probably end up being envious.) -
Passmark doesn't look very accurate. The difference between the HD 6990 and 6970 should be a lot bigger than the difference between the 6970 and 6950. And it also rates Intel's iGPU above the 6620G.
Anyway, I doubt Trinity is going to reach 6970M performance. Judging from the die shot from SemiAccurate, it looks like it has 384 VLIW4 stream processors (compared to Llano's 400 VLIW5 SPs). Depending on the clock frequency, I would say that a 30-50% increase in iGPU performance sounds reasonable. -
Thanks for that 'shot link. I needed the info on the # of stream processors, and also to see that it's a two-core PD module, not 4. Was kind of wondering where they were going to get all the chip real estate they needed, as well as heatsinking. Good move going from the four Stars cores to the two PD ones as well. Throw in the MS fix and there should be about a 20% boost. Guess we'll see when the patch is released...looks to be next week sometime.
Given the possibility of retaining or upping the clock speed of the GPU, 1/4 of the cores should produce around 1/4 of the performance of those benchmarks. So...a 50% GPU gain isn't unreasonable at all. (By the way, I checked different benchmarks and they were all pretty much relative to themselves...just not to each other; Passmark's didn't vary that significantly, and it's a convenient place to go, too.)
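The quarter-of-the-cores estimate works out like this with the numbers quoted earlier in the thread (pure speculation, same caveats as before):

```python
# Rough check of "1/4 of the cores ~ 1/4 of the performance",
# using figures quoted earlier in the thread.

trinity_sps = 384    # VLIW4 stream processors (per the die shot)
cayman_sps = 1536    # full Cayman (HD 6970)
fraction = trinity_sps / cayman_sps          # 0.25

score_6970m = 2629   # Passmark, Radeon HD 6970M
score_6620g = 465    # Passmark, Llano's 6620G

projected = score_6970m * fraction           # ~657
gain = projected / score_6620g - 1           # ~0.41
print(round(projected), round(gain * 100))   # 657 41
```

So scaling a mobile Cayman score down to a quarter of the shaders lands around a 40% gain over the 6620G, before any clock-speed differences.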
So...a 20%/50% CPU/GPU boost is definitely nothing to sneeze at, and looks within the realm of realism (until, of course, you OC...and you all KNOW you will.)
You probably SHOULD OC the IGP in order to try to get it to as close a 1:1 ratio as you can to the dGPU, for purposes of CF. Let's hope we can tweak voltages by then. (New MSI Afterburner beta is out today...so there's hope.)
The lack of GCN and the step back to VLIW4 is kind of a bummer, but it makes it easier to match a discrete card for CF, I guess.
Well, the dream was good while it lasted. Given that 70% of PC sales are notebooks these days (and the major market niche is multimedia, not processing), I still see only good things for AMD, but I think I'll hold onto the dv6-6135dx, skip Trinity, and wait for the GCN gen. -
We could ask for a bit of iGPU overclock... why not, it would actually make xfire more efficient. My mod would gladly dissipate any heat coming out of it
-
I can't imagine they didn't take that into account, given the reaction to the gross mismatch of iGPU/dGPU in this generation. (Or the results by OC'ers as well.)
-
can someone plz post the correct steps to fully uninstall the ATI drivers and install 12.1 correctly?
I uninstalled, used Driver Sweeper, and installed 12.1a, but when I selected high performance mode I got black screens.
any suggestions?
*HP dv6z AMD Llano (6XXX series) Owners Lounge*
Discussion in 'HP' started by scy1192, Jun 22, 2011.