Here's a pic.
Installing 12.1a (that one has always worked well for me); if that doesn't work, I'll pull the CMOS battery.
Edit... No luck with 12.1a either. It's late; I'll pull the CMOS battery tomorrow and hope for the best.
-
Dynamic mode on - checked.
Your CrossFire is gone - checked.
Yeah, go ahead and pull the CMOS battery and pray. Use a non-sharp/broken pencil or a plastic thingy.
Take out the AC adapter/battery, then the CMOS battery, then press the power button for a few seconds. It should do a BIOS reset at boot-up. Don't touch any button at power-on until you see some text come up. -
You can't run dual graphics with only 1 stick of RAM. Dual channel is a requirement. So your laptop is fine.
-
Hahahaha, really?!?!? Crap, I freaking wiped that thing for nothing then. Oh well, at least I have a fresh install to play with now.
Thanks! My G.Skill 2x4GB kit will be here tomorrow too! -
Hi, I have a DV6z with the 1080p screen. My screen sometimes flickers at the top. Although it is very slight, it is still noticeable. Any suggestions on what could be the issue?
It has dual graphics with the 7690M and 6620G cards. Processor: A8 -
Tinderbox (UK) BAKED BEAN KING
Is this your problem?
As far as I have read on the HP forums, it's a widespread issue with the DV6. The only fix is to keep the screen on all of the time and use sleep to save power; do not use a display-off timeout.
I get very thin black horizontal lines at the top of the screen and through the middle. It shows up mostly when I use the Firefox browser, but I also see it on the desktop.
Crap, isn't it?
John.
-
I have talked to HP support. They are asking me to update the AMD graphics card drivers. Currently I am downloading them. Will update with the result.
-
Tinderbox (UK) BAKED BEAN KING
The problem can go away for a day or so after updating drivers, so don't jump to conclusions.
John. -
I had dual graphics on my Asus even with 1 stick of RAM.
2 sticks is not a requirement.
BTW, prelim benchmarks of the dual-core Trinity are out over at the semiaccurate.com forums.
Since the A10-4600M is clocked 12% slower per "core", you do the extrapolation.
IPC vs. Llano is lower by about 5% or more, but they bump up the base clock by 30% and the turbo by 15% vs. the A4-3320M.
The Piledriver cores with the crippled FPU suffer somewhat. Not sure how that translates into real-world everyday apps.
Just to test, I simulated an A4-3320M by running my A6 at 2GHz and running Cinebench 11.5 in dual-core mode.
The Trinity A6 dual-core runs at 2.6GHz.
I scored 1.20, compared to 1.13 for Trinity.
Trinity single "core" is a little better with the FPU all to itself: it scores 0.7 in Cinebench, vs. 0.6 for my single-core 2GHz A6.
http://www.bouweenpc.nl/wp-content/uploads/2012/04/Cinebench-11.5.jpg
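If you want to redo the extrapolation, here is a quick points-per-GHz sketch in Python using the scores above. Take it with salt: the per-clock numbers swing a lot depending on whether turbo was active during a run, and the single-thread Trinity clock below is just the quoted dual-core clock, which is an assumption.

```python
# Points-per-GHz from the Cinebench 11.5 runs above. Caveat: if turbo was
# active during a run, the effective clock was higher and these per-clock
# figures would drop accordingly.

def per_ghz(score, ghz):
    return score / ghz  # Cinebench points per GHz, a crude per-clock metric

llano_dual     = per_ghz(1.20, 2.0)  # A6 locked to 2.0GHz, dual-core mode
trinity_dual   = per_ghz(1.13, 2.6)  # Trinity A6, one module / two "cores"
llano_single   = per_ghz(0.60, 2.0)
trinity_single = per_ghz(0.70, 2.6)  # ASSUMED 2.6GHz; turbo would lower this

print(f"dual:   Llano {llano_dual:.3f} vs Trinity {trinity_dual:.3f} pts/GHz")
print(f"single: Llano {llano_single:.3f} vs Trinity {trinity_single:.3f} pts/GHz")
```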
-
I really hope trinity comes with a nice discrete GPU. Seeing as how they will probably have it paired with a 7000 series card, hopefully it will be a 7750m or 7770m (or hopefully more powerful).
I feel like putting in anything LESS than a 7750m would be a big waste.
Since the new dv6t received about a 30% bump in GPU performance, putting it right above GTX 560M levels, the 7750M or 7770M would make the most sense, as that bumps us up to just under the same levels.
I want to believe we might get a more powerful option (a 7850 would rock), but since HP's AMD lineup has always been about a BUDGET gaming laptop, putting in too high-end of a GPU would push costs above what the target market will pay.
Thoughts? -
Yeah, I wish they'd pair AMD's best CPU with one of their best GPUs...though it had better at least match what they stick in their (IMO crippled) Intel systems! Hmm, 7770m doesn't sound half bad! 512 cores versus 480 of the old ones, and I guess these are worth a lot more too? Still, I'm always up for a 1000+ core part :-D
-
Two problems:
1. Too high-end of a dGPU and you lose a lot of battery life..."Green" is the thing right now, and a major marketing strategy of AMD. They're not trying to get absolute best performance; they're going for best-bang-for-buck, another major strategy difference as compared to Intel.
2. If the APU and dGPU are too mismatched, CrossfireX either doesn't work, or causes degraded performance of the dGPU. CrossfireX is also a big selling point for AMD, because they're the first (and so far, only) to bring dual-graphics to notebooks, something Intel and Nvidia have yet to do. -
Regardless of what you hear regarding CPU performance with respect to AMD vs. Intel...AMD's not playing that game anymore; they're going for the user experience, and the majority care more about graphics than math.
Also, regarding early benchmarks, don't believe everything you read about degraded Piledriver results as compared to Llano's Stars cores. Piledriver is an improved Bulldozer, and for best performance it's going to need the BD patches for Windows 7, but Windows 8 won't need them.
Remember, the Stars-core Llano was a physical four-core device, while Trinity's four "cores" are actually two physical PD modules. -
Not everyone cares about battery life, and anyway GCN apparently idles at almost 0 regardless.
IMO that's completely irrelevant. I don't WANT CrossFire, I want the integrated graphics to just shut off if they're outclassed. In a case like this, just switch to the 1000+ core GPU. (Or better still, don't waste transistors on integrated video; put them towards a second Piledriver core.)
Not quite sure what you mean by that. Intel's a joke, and Nvidia has had SLI for years, as has AMD.
Yeah, I'm wondering what that's going to mean for performance. Seems like for integer it should be better, but for floating point? Err...I'm not sure what 2 Piledriver floating-point cores can do against 4 Stars ones. Wish it was 4 Piledriver cores with no integrated video + a high-end GPU. I know that's not happening, but oh well heehee -
Not sure what you are getting at, but this thread is looking at a versatile machine that can game and costs around $500.
If you need a high-end CPU + GPU, then you'd better look at a Clevo or Alienware. -
1) But with manual graphics switching, the power consumption of the dGPU should be ZERO when the system is running on the 7660G. Therefore there should be no difference in battery life, as long as you remember to turn off the dGPU. No one really cares about battery-life performance on a dGPU because 95% of the time the dGPU would be throttled anyway. Plus the 7660G should be more than powerful enough.
2) The 6620G (and more so the 6520G) and an overclocked 6750M were already a pretty large mismatch, and everything seems to work (for the most part). Heck, with the major improvements in the 7660G, it will probably be a lot closer to the 7750M than the 6620G was to the 6750M.
The dv6 has always been the "jack of all trades" laptop: powerful enough to play games, but also providing superior battery life due to efficient integrated computing. -
Hey, I was just wondering about the performance of my laptop compared to others. When I took the Passmark performance test, the results weren't very impressive.
Here's a picture comparing just my processor with another, and the difference in speed is huge. Can anyone help me figure out why this is, since some other things about this laptop have disappointed me too? -
I wonder if your CPU is locked to 800MHz for some reason? And possibly you are running in single-channel memory mode. Pull down MSI Afterburner + Kombustor, HWiNFO64, and Cinebench 11.5, and please post screenshots.
Also, I would not be surprised if the average for the 6135dx includes a lot of mildly overclocked machines. So if you have 800MHz locked and the rest average closer to 2.4GHz overclocked, that WOULD be close to a 3:1 gap!! -
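If you want a zero-install first pass while those download, here is a small Python sketch that shells out to the stock Windows wmic tool. It won't show the channel mode directly (CPU-Z is better for that), and Windows' reported CurrentClockSpeed is coarse, so HWiNFO is still the better source, but a stuck multiplier or an empty RAM bank should show up:

```python
import subprocess

# Query CPU clock/core info and the populated RAM banks via the stock
# Windows wmic tool (available on Windows 7 out of the box).
for query in (
    ["wmic", "cpu", "get", "Name,NumberOfCores,CurrentClockSpeed,MaxClockSpeed"],
    ["wmic", "memorychip", "get", "BankLabel,Capacity,Speed"],
):
    print(subprocess.run(query, capture_output=True, text=True).stdout)
```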
Okay, I'm downloading and installing that stuff right now, so I should have the results soon. Another thing, though: I am also overclocked to 2.9GHz according to K10Stat, but this computer has been acting up since day one, so I really don't know.
-
So I just bought a DV6z 6135dx and I'm in the process of upgrading the drivers on the machine. I'm an avid Nvidia user and am very unfamiliar with the ins and outs of ATI, but I AM familiar with the technology and what has been talked about, more or less, in this thread. My question for the fine people in this thread is: what is the difference between using ATI Catalyst 12.4 as opposed to 12.1a? And if it is possible to use 12.4, is it better? I currently have a modded BIOS with a set overclock of the 6750 @ 740/880, so can anybody offer advice on this?
-
The 8GB 1600MHz G.Skill kit is installed, and guess what? The dual graphics option is back!! Thanks for all the help. (Side note: one of my original RAM sticks must have been bad; I'm having none of the graphics glitches in games that I had before in Skyrim and GTA 4 --- the screen would get all jumbled up before.)
-
I'll just leave this here again. Extrapolate what you will; Cinebench scaling in particular should mostly apply to Trinity, unless there is a magic rabbit-hat instruction for FP??? http://www.theinquirer.net/inquirer/review/2141735/amd-interlagos-supermicro-workstation-review
Yep, HPs require both sticks for dual graphics while some other brands do not.
Is that buggy 2.9 crippled by CoolSense: Coolest or CoolSense: Quietest? That's why the additional programs help verify whether the K10Stat numbers mean anything at all. -
:/ Well, this is interesting: Cinebench said that I only had one core and was running at 1.5GHz (which is what it should be at), HWiNFO said I was running at 2.9GHz with four cores, and K10Stat says I'm running at 2.9GHz with one core.
Computer - Imgur
Here are the results from the benchmarks. -
Is it normal for CPU-Z to not show my RAM as dual channel on my dv6z? It is two identical sticks at 4GB each. On my Toshiba, CPU-Z shows it as dual channel.
-
Tinderbox (UK) BAKED BEAN KING
Sorry, foot in mouth,
John.
-
@ Wolfpup - AMD's strategy isn't all about you or what you want. They're not targeting the high-end enthusiast crowd, because that's not where the volume of sales is. They left that to Intel.
"Bang for buck" means optimization and efficiency. Includes battery life for most of the people who use notebooks as a mainstay. As far as SLI/CrossfireX in a NOTEBOOK, we're talking affordable mainstream...again...the segment AMD is targeting.
@ rmacgowa - You project too much expertise onto the segment of the population that AMD is targeting. Very few mainstream notebook users are going to take advantage of any advanced features, like manual config. Most mainstream users turn it on, use it, and turn it off. Gamers don't drive the mainstream market, but not all gamers can afford a high-end notebook, either.
As far as CrossfireX goes, historically, a problem people have had is higher framerates with stuttering (ask HT), as opposed to smooth play, without stuttering, at lower framerates. Long ago in this thread, I posted links to a couple of professional reviews about this which clearly stated the maximum performance ratio between GPUs beyond which Crossfire is useless...and apparently it doesn't turn itself off when it is, which has caused unpleasant user experiences no matter what the framerate says.
I'm not making any of this stuff up, and if you dispute it, then go scour the web for the information yourself, like I did.
@baiii Thank you. You "get it." -
Cinebench has two CPU tests, single-core and multi-core. Then it shows the multicore efficiency.
BTW, how many FPS in the OpenGL results? -
Point taken; I probably do have an over-optimistic view of the expertise of society lol. However, if AMD makes manual switching as simple as they made the "high performance / low performance" switching that was standard on our model, I do believe it would be something that the average user could figure out.
Regarding CrossfireX, the point I am trying to get across is that because the 7660G is so much more powerful than our generation's offering, we can be "allowed" to use Crossfire with a higher-end card than the relatively weak 7670M shown in benchmarks.
If you look at the benchmarks for the A10-4600M, the 7660G scores 1135 and the 7670M scores 1061 (http://www.notebookcheck.net/AMD-Radeon-HD-7670M.69483.0.html), while the 7660G2+7670M combination scores 2083.
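The scaling on those numbers is actually quite strong; here is the quick math in Python (synthetic scores only, which is exactly the caveat coming up below):

```python
# Dual Graphics scaling from the notebookcheck scores quoted above.
igpu, dgpu, dual = 1135, 1061, 2083  # 7660G alone, 7670M alone, 7660G2 combo

print(f"vs iGPU alone: {dual / igpu:.2f}x")          # ~1.84x
print(f"vs dGPU alone: {dual / dgpu:.2f}x")          # ~1.96x
print(f"of ideal sum:  {dual / (igpu + dgpu):.0%}")  # ~95% of a perfect sum
```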
It seems that they have taken a page from your book and tried to include a dGPU and iGPU of almost identical performance in order to get the most out of Crossfire.
There is a big problem with this, though, and that is compatibility. As we have shown over the course of the last year, many games have big issues with AMD dual graphics and do not even get a frame-rate boost. While we may get a big performance boost in 3DMark 11 like the laptop shown (from 1600 to 2200), we realize that there are almost no other situations where this translates into a 37.5% game performance improvement.
So let's say the new system comes equipped with a 7660G2 + 7670M. If we discover we have a game that has issues with AMD dual graphics and turn it off, we are left with the option of running on one of two weak cards, and overall graphical performance lower than the system a lot of us have already had for an entire year.
Therefore I think it is perfectly justifiable for AMD to include a higher-end card such as the 7750M in the next-gen DV6z. I do not want this system to be bleeding edge and cost >$900, because if that were the case then I would probably never buy one; however, they need to give us something a little bit more powerful than the 7660G2, at least enough to compete with a lower-end dv6t.
I am not trying to disprove any facts that you have presented (and I think it is important you have shown them). I am trying to show you that it would make far more sense for the next-generation DV6z to contain at least a mid-range GPU. Unless AMD has really worked out every possible kink in dual graphics, they should not be relying on it as a main source of performance boost; they should market it more as an added feature that helps in certain circumstances. (Notice how they always show the benefits in synthetic benchmarks when they advertise, and never actual game frame rates.) -
And that's why one of these CPUs WITHOUT integrated graphics also makes sense for the mid-range. Move the transistors from the CPU to the GPU, and you get 2x the power or whatever from the system for the same cost. Heck, you could use FEWER transistors and end up with a more powerful end result.
These integrated things are AWESOME at the low end; I've been recommending Llano left and right for months now to people looking for low-end systems, as they seem super well balanced for the price. And not bad for mid-range either, of course, but they could be better without increasing (or while actually decreasing) the costs.
Even if what I REALLY want is more Piledriver cores in a CPU with no graphics + a high-end GPU :-D -
Based upon all the reading and research I've done (since I invested money in AMD's stock, I literally have a vested interest in this), the plan for "bigger and better" seems to lie immediately after Trinity, which is itself a stepping-stone. Even Trinity is simply a "make the best of what you already have in stock" approach, and that's a beautiful approach. After that, once AMD knows what works best, it's time to inject new and/or better tech.
I commend this evolution for many reasons, to include the fact that making the GPU available to assist the CPU is a stroke of genius, and a sign of true integration to give the masses the best of both worlds through product development, while making it cost-efficient as well.
Frankly, this approach puts AMD light-years ahead of both Intel and Nvidia, for this purpose only.
Unfortunately, AMD really needs to fix their driver situation, and this is the area in which I am most uncertain as to how they should proceed. While Nvidia has far outshone ATI on "everything drivers", AMD won't really be able to "woo" Nvidia's staff away, basically because they're two different animals. It makes sense for AMD to cultivate heavily at the university level in order to build a workforce that would seamlessly integrate with them, and bypass anyone in the market who is doing the "same old thing the same old way."
Drivers not only kill them on performance, but on CrossfireX as well. Given everything I have worked with in the past, about 10 years ago I junked everything ATI because it was just too damn hard to work with, as opposed to Nvidia. I mean, if you like to build and tweak, that's a hobby all its own, but if you just want things to work because you have a use for an optimized system, then tweaking is a nightmare, not because you can't do it, but because it's a time-wasting pain in the rear.
For this reason, I absolutely agree with you that optimization controls need to be made as easy and automatic as possible for the average person; after all, this approach is simply an extension of making the OS as user-friendly as possible too.
And sorry if you thought that last comment was directed at you; it wasn't. However, I'm glad you see the problem with "projection" onto a non-techie population (AMD's biggest market). I deal with that every day when I work with medical professionals who believe themselves to be technically proficient and/or think of me as their minion/underling who should jump every time they want something. <sigh> And then there's still the large group of users that are still looking for the "any" key. Believe it or not, the bulk of my clientele STILL doesn't understand how to shut down or restart properly...no matter how many times you teach them or how many "idiot-proof" icons you make for them. NEVER overestimate your audience; otherwise Murphy will pull your pants down every time. (So says 35 years of experience in electronics and computers. People never fail to surprise me. I should have joined the Peace Corps.) -
There has been speculation (I think by one of our own members here, maybe Vect?) that the A8s will be more focused on CPU power while the A10s give up some CPU power for more integrated graphics power.
I think it sounds like a cool trade-off, but once again, all speculation. -
When people recommend that others buy a computer, they won't say it has Crossfire, scores 3DMark xxxx, etc.
They say, "It runs SC2/LoL on high and it has 4-hour battery life." That's the reality.
This is also what Intel wants to do with the HD 4000.
It is adequate power that matters in sales. People will overpay just because something looks nice, even when it has the same power as a budget machine. -
RomanTheManJohnson: AAAAAAAAAAGGGHHHHHH!!! Okay, I'm going to level with you. Either Windows is missing drivers/screwed, or your CPU is TOTALLY boned. My first suggestion would be going into Device Manager and looking for missing or unidentified hardware, error flags, etc. It looks like for some reason the other three cores are...not visible or just not accessible. Win+R, "msconfig", and check the advanced boot options for the max number of CPUs. Also consider grabbing the chipset drivers from the HP website.
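As a quick cross-check before digging into msconfig, two lines of Python show how many logical processors Windows itself is exposing. Both values come from the OS's own view, so if they say 1 while HWiNFO says 4, the msconfig processor cap or the drivers are the likely culprits:

```python
import os

# Both values reflect what Windows itself believes it has.
print("os.cpu_count():           ", os.cpu_count())
print("NUMBER_OF_PROCESSORS env: ", os.environ.get("NUMBER_OF_PROCESSORS"))
```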
TRINITY VS BULLDOZER IPC: The numbers on that Interlagos rig are enlightening. But the Cinebench score is the easiest to interpret.
16 modules/32 threads, 2.2/2.5GHz, L3 cache
Per-core performance of 0.61 with scaling of ~23x
That placed it in the range of 12 core/24 thread 3.33GHz Xeon for Cinebench. Of course the dual E5's are out now (16 core/32 thread) so Interlagos got pantsed on render speed after that article.
Back to Trinity: if a 1-module/2-thread 2.7/3.2GHz laptop chip (without L3) scores 0.7 with scaling of... 1.6x? (Was it 1.13 or 1.20? That's either 1.6x or 1.7x) ...that shows a couple of big things!!
First, it looks like IPC probably suffered a bit from the lack of L3 cache (Interlagos scores 0.244-0.277/GHz vs. Trinity's 0.219-0.259/GHz, >5% diff). Second, multicore scaling is better? By roughly ten percentage points of core ratio, in a platform that scales with almost perfect predictability.
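Spelling that parenthetical out in Python (the clock pairs are the base/turbo figures quoted above; where each chip actually sat within its range depends on how much turbo was active):

```python
# Single-thread Cinebench score divided by turbo and base clocks gives the
# per-GHz range for each chip.
def per_ghz_range(score, base_ghz, turbo_ghz):
    return score / turbo_ghz, score / base_ghz

interlagos = per_ghz_range(0.61, 2.2, 2.5)  # Bulldozer cores, with L3
trinity    = per_ghz_range(0.70, 2.7, 3.2)  # Piledriver cores, no L3
print(f"Interlagos: {interlagos[0]:.3f}-{interlagos[1]:.3f} per GHz")
print(f"Trinity:    {trinity[0]:.3f}-{trinity[1]:.3f} per GHz")
```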
Thermal envelopes are their own problem, and I doubt we'd see a low-clocked quad module laptop from AMD. So naturally the Trinity chips will not supplant i7 laptops except maybe on graphics. However slabs on desktop and server may pick up a lot of performance in this cycle. From the laptop TDP I hope Interlagos can be replaced with a 3GHz+ chip, which may greatly narrow the gap with E5 Xeons.
At least the huge clock boost for Trinity should prevent BF3 lag spikes and other problems with game-engine timing...even the 35W A10 has a 50% higher base clock than my A8-3500, so overclocking will be less vital.
rmacgowa: What I said was, within a 35W TDP the new A8 should have more headroom for CPU than the A10. The slowest 35W A8 should be a little faster than the 35W A10, while the 45W A10 will be the premium part. If the IGP is the same, those extra ten watts go into the CPU. So the 45W A10 should hit above 2.6GHz base compared to 2.3GHz base for the 35W version, with turbo 3GHz+, while a 45W dual-core (if they exist) should have a base clock at 3GHz+ and turbo above 3.5GHz. Desktop models have 90W+ TDP (more than double); some of that will go to a faster IGP and the rest to base clocks reaching 4GHz+. So in my way of thinking, the move from 2.3GHz to nearly 3GHz base is achievable with ten more watts. Moving from 3GHz to 4GHz+ may need double the wattage, but we can't be sure yet. It's just how CPUs work.
Pardon my crude figures. I have only Bulldozer/Interlagos watt numbers and a couple Trinity leaks. For certain the lack of L3 will reduce wattage, and Trinity clocks indicate significant differences in power consumption. Mark my words, it will be strange if there ISN'T a single-module laptop hitting above 3GHz and a dual-module hitting above 2.6GHz. How much it goes above that I cannot guess just yet. -
Didn't realize this MC Hammer video had an HP laptop with the Beats Audio logo that they zoomed in on, lol.
MC Hammer Better Run Run Jay Z Diss Track Official Video - YouTube -
Tinderbox (UK) BAKED BEAN KING
I wish the USB 3.0 ports were on the left instead of the right side of the notebook, as my mouse is plugged in on the right and it does not need USB 3.0.
John. -
They must have changed this for the Quad Edition; my DV6z has a pair of USB 3.0 ports on the left and a pair of USB 2.0 ports on the right.
-
Tinderbox (UK) BAKED BEAN KING
Yeah, on my DV6z the ones with the blue centers on the right are USB 3.0, and the black ones on the left are USB 2.0.
John.
-
3.2GHz. It can be overclocked to 3.8 with good thermals and a strong power supply.
A 3.8/4.2 overclocked 45W chip, BUT it doesn't appear that a 45W laptop system will be released initially by any manufacturer. HP's G6 with the A10 is available for order now in the Orient. Keep an eye out for the Asus unit.
seer
Oh, one other thing: there IS a FOUR-module (meaning 8 Piledriver cores) engineering-sample chip that was made available. I have not seen it run, but it exists. The intended market is not clear, most likely desktop, but those pesky gaming notebook companies *may* stuff it into a big-time gaming lappie. A proto was built and shown to a few by an OEM whose name rhymes with kompal.
It had two 7-series dedicated GPUs (which were unidentified) and the internal configuration wasn't clear, but I was told by someone who should know that the performance was *intriguing*.
It was also set up to run Eyefinity, i.e. multiple monitors.
One thing that has become clear: pricing is going to be very favorable, enough so that all sorts of configurations become economically feasible, especially for the enthusiast market. We'll just have to wait and see. Product intros will start next week.
seer -
One caution: not all reviewers are created "equal". There are actually few of them who have a clue how to overclock at all, or any inclination to do so.
For instance, where other than here have you ever heard one of them note that it is a relatively trivial matter for an A8-35XX to achieve 2200ish in 3DMark11? One of the biggest errors people make vis-a-vis Xfire/dual-graphics performance is not keeping in mind the limits of the CPU/iGPU sharing of resources when trying to configure for max performance. The controlling algorithm is NOT THE SAME AS INTEL'S!!! Let me say that again: it is NOT THE SAME AS INTEL'S. Accordingly, your core settings are different.
Fortunately, the combination of K10Stat and Fusion Tweaker permits you to optimize the CPU's core performance while minimizing its *perceived* capture of resources (perceived by the algorithm). Roughly put, if the algorithm perceives 'available' resources, it will direct same to the GPU. The more frames the iGPU can generate, the more it will correctly interleave with those generated by the dedicated. Xfire/dual graphics can give you roughly 175% of what the iGPU can do. (It's actually a bit higher on a finely tuned system.) Output is effectively dependent on iGPU output, so anything you can do to increase that contributes massively to total output.
I hear a lot of talk about the appropriate dedicated GPU for a Llano/Trinity system. If you are planning on building a desktop system, the *sweet* spot is going to be a dedicated one level higher than you would otherwise calculate, for two reasons:
1. If a proggie *has* to operate strictly off the dedicated, you have the power you need.
2. Once you've optimized the output of the iGPU, you have the capacity to take advantage of it with the oversized dedicated. If you cannot get enough out of the iGPU to end the *stuttering*, you can tune the dedicated "DOWN" until you get a *match*, which in *most* instances will give you clear and liquid gaming at rates still substantially higher than you could achieve with the normally configured dedicated alone.
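A toy Python sketch of that dependency (the 1.75x factor is the rough figure above; the iGPU fps inputs are invented purely for illustration):

```python
# Combined output rides on the iGPU: roughly 1.75x what the iGPU alone
# delivers on a well-tuned system (a bit more when finely tuned).
def dual_graphics_fps(igpu_fps, scaling=1.75):
    return igpu_fps * scaling

for fps in (25, 30, 35):  # hypothetical iGPU frame rates
    print(f"iGPU at {fps} fps -> ~{dual_graphics_fps(fps):.0f} fps combined")
```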
The AMD engineers have spent a LOT of effort with Trinity smoothing the rough edges off the interleaving process while trying to maximize the fps output of the internal iGPU. To my eye, they have succeeded admirably. Unfortunately, many of the pending machines are being (at least initially) configured with an insufficiently powerful dedicated GPU (to my mind at least). On the other hand, the *bang for buck* enthusiast builders look to be doing the opposite.
Trinity on the right mboard opens up a ton of tweaking possibilities. It will take us some time to sort it all out once we get our hands on it (as long as they leave the chips unlocked). Just watch out for the initial reviews. People are being very wary of stepping on Intel's toes, consent decree or no, and the manufacturers are being very cautious about putting out configurations that overpower their more costly Intel-based units. I have heard more than a few times discussions centered around "Poison Ivy"...
seer -
seer, that is some great information. You deserve many +reps.
Can you confirm any GPUs being paired with Trinity in notebooks (excluding 7670m), or have they all been "unidentified"? -
Tinderbox: My 6135dx has no blue ports, but the two on the left have "SS" printed below. That's USB 3.0, so if yours are blue and on the right...you have a *very* different motherboard.
seeratlas: I certainly hope you're right about the *potential* and wrong about the *implementation*. Perhaps our best bet will be a whitebook; prospects for those are more promising on the AMD side. Also, it would be a landmark achievement if overclocked Trinity is more than 50% faster than overclocked Llano, but we shall have to see. Pushing 4GHz on mobile would certainly reinforce the claim that desktop can touch 5, and vice versa.
I am concerned about temperature restrictions. Bulldozer cores had a low thermal ceiling irrespective of proper TDP. Granted that cooling is a big limiter for unmodded laptops, we don't want a big GPU to cook the APU with shared heat. That issue must have been dealt with in Piledriver cores. It must. -
That could easily be solved by giving the dGPU its own heatpipe that doesn't travel over the CPU, unlike the dual-GPU Llano setups today.
-
The problem is probably cost and space; with an extra pipe and possibly an extra fan, the thing can't stay sleek and "stylish". These dual-GPU Llanos are below 6 lb, IIRC.
-
Hello. Excuse me for butting in, but I've been watching this thread, along with the Asus K53TA thread. I'm looking to buy a laptop, but I am hesitating a bit. Firstly, unlike other laptop brand subforums, this place seems to have the only "will you buy an HP again" thread, which is sending off warning signals. Secondly, there is Trinity.
What I'm looking at is the HP dv6 with the A8-3500M, 6750M, 6GB RAM, and the usual 5400rpm HDD, which I believe is on the higher end for Llano CPU and GPU. Prices seem to be just around $450 for used/refurb. I'll be using it mostly for Diablo 3 and Guild Wars 2.
Should I go for the refurb higher end Llano, or wait for an entry level Trinity (at the $450-$500 price point)? Thanks in advance. -
rmac, I was *told* that at least one mfg had paired the A10 with a 77xx-series chip for testing <wink>, but I never got a follow-up on whether a production decision had been made. That combo was a good one, as at base clocks it would do 12 to 16 percent or so better than the 560M gaming machines. I would *love* to get a shot at tweaking that kind of system (and almost did).
On the CPU side, frankly, I was hoping Piledriver would do better in the *real world* than what is apparent at base clocks. The *official word* is like 26%? Well, I dunno. I saw a bit less than that on some of the apps a lot of people use. On the other hand, it's good to remember that Intel's vaunted Ivy is some 3 to 5 percent faster than Sandy? Hold me back.
The point being that AMD is claiming 26 percent in office-type performance, but at base clocks. Since Piledriver is VERY dependent on frequencies, overclocking is very productive. Fortunately, given sufficient cooling capacity, Piledriver cores can run some damned high frequencies. Some have gotten mid-to-high 4s on air with dead-solid reliability due to the SOI process on the 32nm chip. I have my fingers, toes, and incense burners crossed for an A10 with a 7770 in it. Since the 77xx series can do "zero core", i.e. shut essentially off when not needed, it wouldn't necessarily kill off the battery power for normal functions when the GPU isn't needed. When needed, though, it can really throw some pixels.
Toshiba was playing with a very strong Satellite demo unit, MSI has a gamer in the works, and there are lots of rumors about Acer's intentions; for sure they will have 'something', but as to what....
Unfortunately, HP appears to be lowballing with the G6; the top model I've heard of teamed the A10-4600 with a 7670M, but it may be good low-dollar value. I'm looking to see how much of the Asus prototype actually made it to production. It was *nice*.
And of course, there's Compal, who probably knows more about making Trinity *work* than anyone. They have a number of units in the wings to satisfy pretty much everyone. AND, you may be right about the whitebox builders. Unfortunately, though, those units won't be the *bang for buck* bargains we've seen with the various Llano boxes. Then there's Gigabyte, which has a killer desktop board that unlocks pretty much everything, especially the memory controller, which is critical to iGPU performance. Given the right memory overclock, that board can REALLY fly...AND...there was talk about them putting out a laptop under their own name... Lastly, there's Lenovo. They have a good mid/low-level lappie coming, and there was a rumored full-bore A10 dual-graphics (enthusiast) laptop in the works, but I haven't heard squat from them in some time. Lenovo has a lot of experience with the Fusion platform, and built maybe the best of the low-level laptops using the Brazos-series chips.
Unfortunately, the really *good* stuff is probably not going to make it to the announcement on the 15th, but production has been very good and AMD is shipping a ton of the chips, soooo, things might go better than I'm thinking. At least we won't have long to wait before we know.
Oh, one last thing: as I said, Piledriver is very stable under high clocks, which is good, 'cause the lack of L3 cache apparently requires elevated clocks to compensate. I really hope Compal's MXM dual-dedicated-GPU beast of a gamer makes it to the specialty market. And I hope someone calls it that, 'cause the moniker "The Beast" is well deserved.
In GPU-intensive gaming, it can reportedly match up squarely against the likes of custom-configured Alienwares etc. at a 'startlingly' lower price.
seer -
How do I find HP CoolSense? I mean, is there any way to control the fan? It gets really annoying when the fan gets loud at just 50 degrees C.
-
Seer, may I ask where you get all your info from? I stalk this thread almost every day, but I never got to figure out who you are or who you work for.
PS. Whatever news you have of Trinity is very interesting to say the least.
-
mambastik: $450 with a dedicated GPU is a pretty good price. The 3500M APU is the lowest-clocked A8, but K10Stat makes everything nearly equal in the end. The actual iGPU is still A8-class and the dGPU has served me well. It will probably play your D3 rather well; though someone here was in the beta...? If you wait for Trinity, expect to spend closer to $600, so take that as you will. My machine with the same specs plays SO many games SO very well. Upping the CPU helps with a couple of games that are totally clock-dependent (they don't understand chips at less than 2GHz). Battery is quite good also.
seer: I will be keeping watch on this matter; the previous demo of a super-thin dual-graphics ultrabook with dual cooling has me very curious about the preferred configurations. As you said about timings, the laptop chip can do more with a supportive BIOS, bringing extra bang for admittedly extra bucks.
BecauseImFD: Hit the Win key and type "cool"; that should filter down to HP CoolSense. Coolest runs the fan a lot and slows the CPU unless you have a modded BIOS, Performance is probably where you are set now, and Quietest limits both fan and CPU to reduce noise. There is no complete "fan off" state, and depending on what you mean by loud...why is your chip at 50C anyway? -
AMD Trinity Laptop - Page 19 - SemiAccurate Forums
blah-blah-blah
AMD A10-5800K, A10-5700, A8-5600K and more new APUs specs leaked
Noting purported desktop GPU clocks...
AMD Trinity APU Lineup Leaked, Specifications of Eight Desktop and Mobile APUs Detailed
I've done a little spidering. Looks to me like the new chips fit a roughly 50% clock increase going from 35W mobile to 65W desktop, with more like 75% at 100W. Some of the numbers that site stole are guesses though. Not enough info RE: MX mobile chips. I would still expect an appropriately binned 45W mobile APU to push around 3GHz... Of greater interest are leaked numbers filtered from online test databases, indicating Trinity APUs have similar floating point IPC to Bulldozer and 5-20% MORE integer IPC despite lacking an L3 cache. The real question is how much they've adjusted performance of newer instructions that Dhrystone and Whetstone cannot test. Especially the encryption accelerator will affect potential adoption in the corporate sphere.
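Going back to that clock fit: if it holds, the pattern looks something like this in Python (the 45W multiplier is my own interpolation between the leaked points; none of these are official clocks):

```python
# Sketch of the clock-vs-TDP pattern read off the leaks: roughly +50% base
# clock from 35W mobile to 65W desktop, roughly +75% at 100W. The 45W factor
# is interpolated, and every figure here is approximate.
mobile_a10_base = 2.3  # GHz, quoted base clock of the 35W A10-4600M

for tdp_w, factor in ((35, 1.00), (45, 1.25), (65, 1.50), (100, 1.75)):
    print(f"{tdp_w:>3}W -> ~{mobile_a10_base * factor:.2f} GHz base")
```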
It seems pretty broadly confirmed that they have messed with the branding, though. A8 becomes A10 and gets faster, A6 becomes A8 and gets faster. A4 becomes A6 and gets faster also, but probably not as fast as the Llano A6 due to being dual-core. Maybe the new Trinity A6 dual will have similar performance to that Llano A6 tri-core? And then there's something fiddly regarding the new A4. There seems to be some disagreement over how much the chip resembles the E series and how much it resembles the A series. It is my hope that the A4 is just a "salvage bin" for conventional Trinity APUs, or an actual single-module APU. If the new A4 is actually the E series in disguise, that would be a disservice to the branding.
MAMBASTIK: Based on these numbers you should grab the refurb for $450 while you can. Mine works great and there should still be a 90-day lemon coverage from HP in case something magical goes wrong. Getting a much faster laptop from the new generation would cost you between $600-800, probably a large enough gap to not be worthwhile. While if you catch a sale in December, your A8 system should still be worth $350-400 easily on Craigslist or similar after-market resale. -
Wanted to thank you for being so helpful, and also let you know that I performed the replacement, and documented the steps in a new thread found here:
http://forum.notebookreview.com/hp-...acement-upgrade-instructions.html#post8503818
The replacement is seamless and fairly simple for anyone interested in performing the upgrade.