Man, that is some serious work and I applaud your effort. I tested two of your settings, the green one and 625/999/1563. The latter one caused a crash in 3dMark06, the sweetspot green gave me a 9507 @69C max temp.
A few days ago, I ran what I have found to be my most stable OC of 650/925/1650. It yielded 9714 3dMark at 65C max.
Since I've used ATI Tool for finding my "stable" (is it ever, really?) OC, any time I raised Mem above 925 I would get artifacting. At 900, I could push core/shader to 675/1700 without artifacts, but lost ground in 3DMark and frames in games. Go figure.
As Comptrekkie alluded with his setup, every card is different. I would strongly advise folks that don't do this often or at all, to start at the low end of your spectrum, and use those small increments, as you suggested. Someone may even be able to run your red settings, who knows.
Again, very nice write-up, I had never heard of that method before. Just makes me wonder why my Mem ratio doesn't work.
BTW, I'm also running the 185.81
-
-
Yep, I have this disabled. It seems like there's a magic CPU temp that, once hit, freezes the fan to max regardless of the temp after that point.
-
Not that it helps them, but this rare phenomenon also occurs on my machine. (only after high stress, and not always)
I haven't been able to determine the exact culprit yet either. (I can rule out HW Monitor.)
If I find something, I will post it.
Hopefully you will too. -
Have you already tried Windows 7 RC 7100?
Does it work smoothly with the HDX? -
Works like a charm in 32 bit version.
I have already downloaded the 64-bit version and will be installing it to see the differences; I've never used a 64-bit system. -
I have installed the 64-bit Windows 7 RC and it looks like everything is working well.
Grtz -
Well, I've finally got another charger from HP, and guess what? It makes the same (or worse) hissing, electrical noise, from time to time, as the previous one!
Sometimes I plug it in and it starts making that horrid noise, but other times it doesn't make it for more than 24 hours! Pretty random it is.
What the heck is going on here? Is that how these chargers REALLY are supposed to sound? Or do I have really bad luck and this one is also broken?
In any case... what's the worst possible thing that could happen if this charger is defective? Destroy the battery? Destroy the motherboard? Nothing at all (because the charger would "auto shutdown" or something before sending straight 220v to the poor HDX)?
On the positive side, I got a free (albeit loud) charger!
-
been using win 7 since beta 1 and no problems at all with 64 bit
-
Concur with the others, the 64 bit 7100 version is excellent. On loading, unlike the 7000 beta, you don't have to load the fingerprint reader drivers... and the ones you get via update are new versions. You still have to load the remote control (sp37925) and the TV Tuner (sp38987) manually. The Ricoh card reader will load via update also. I did get a codec error when trying to look at the TV live. I had to delete the TV driver and reload it, then all was fine. I'd say, reboot after loading the TV driver before trying to setup the Media Center and you might avoid the problem I had.
-
Thanks!
I tried the 650/1650/925 but it loses ground quickly in 3Dmark06. I suppose this is part of what rickiBerlin is referring to concerning the powermizer feature.
Ratio-based OC'ing is how Nvidia does it - though more as a ratio of core to shader than of core to mem.
For instance, look at clocks for the g92 (9800GT v 9800GTX):
http://en.wikipedia.org/wiki/GeForce_9_Series
When they went from a 600MHz core (9800GT) to a 675MHz core (9800GTX) with the same chip, the shader clock moved up in the expected ratio:
600/675 = 1500/x, and solving for x gives 1688MHz.
By the same ratio, the memory clock would need to be 1013MHz, but they chose 1100. Might be a nominal figure that internally would be 1012.5MHz, much like advertised CPU clock speeds are nominal but in reality a few decimals higher or lower.
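That ratio arithmetic can be sketched in a few lines. The clock figures below are the ones from the Wikipedia table linked above (assuming the 9800GT's 900MHz memory clock from that table); the helper function is just for illustration:

```python
# Ratio-based clock scaling, as in the 9800GT -> 9800GTX step above.
# Clock figures come from the Wikipedia table linked above;
# the helper function name is mine, for illustration only.

def scaled_clock(base_clock_mhz, old_core_mhz, new_core_mhz):
    """Scale a clock by the same ratio as the core clock change."""
    return base_clock_mhz * new_core_mhz / old_core_mhz

shader = scaled_clock(1500, 600, 675)  # expected 9800GTX shader clock
memory = scaled_clock(900, 600, 675)   # expected 9800GTX memory clock

print(shader)  # 1687.5 -> Nvidia shipped 1688MHz
print(memory)  # 1012.5 -> Nvidia went with 1100MHz instead
```

So the shader moved in the exact ratio, while memory was pushed well beyond it.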
I suspect the reason why ratio based OC'ing is done is for stability reasons. Everything (computing processors) will sync according to design. After all, the Nvidia GPUs are mini computers in their own right. Hence their CUDA architecture.
I'm going to test the mem clocks a bit more as it might not need to be sync'd in a ratio to core. -
Have you tried any games.
I've read W7 is slower than xp (games like cod4, css).
Furthermore, there is a problem with PunkBuster. I don't know if it's been solved yet.
http://kfah.spirecss.org/images/Untitled-1.jpg
http://kfah.spirecss.org/images/Untitled-2.jpg -
I can't make the comparison with XP, because I don't have it anymore on my machine.
But I can make the comparison with Vista. The games run smoother on Windows 7, for sure.
Grtz -
I suppose this is what stood out/concerned me, but I tried it anyway. Hence the crash. However, I have powermizer set to off.
I remember a few years ago, when I pretty much tried to kill my XPS Gen 2 with a 7800 GTX by constantly OC'ing and tinkering, reading that it was much more harmful to OC mem in a ratio to core, basically because memory was already closer to its maxed-out clock speed. Increasing it at the same ratio could therefore fry your GPU faster. (Sorry for lack of tech speak and reference, but I can't find the ref. yet)
I know these GPUs are a different architecture now, but I have always held to that belief, which I think was reinforced by OC'ing each separately and in small increments until I got artifacts. At the time I was using ATI Tool, and I have done the same with the HDX. The results on both machines were similar, but that by no means is reason enough to prove or disprove anything... just my results.
-
Definite difference in architecture between the G7xx and G8XX series.
As for the difference in memory overclock potential, likely it's because there are two RAM (GDDR3) vendors used on the MXM cards. One is Samsung (k) and the other is Hynix (h). What's more they have different access times. Samsung has 1.25ns @800mhz and Hynix has 1.20ns @ 800mhz. I know that my two HDX's each have different RAM on the 8800M GTS cards. I just didn't bother to denote which has what. Each HDX has different overclocking ability, gets slightly different 3DMark06 scores, and has a slightly different thermal profile.
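Incidentally, those access times map straight to the rated clocks (rated clock = 1 / access time); a quick sketch, with a made-up helper name:

```python
# GDDR3 rated clock from access time: f(MHz) = 1000 / t(ns).
# Figures match the Samsung (1.25ns) and Hynix (1.20ns) parts
# mentioned above; the function name is mine, for illustration.

def max_clock_mhz(access_time_ns):
    """Rated clock in MHz for a given access time in nanoseconds."""
    return 1000.0 / access_time_ns

print(max_clock_mhz(1.25))  # 800.0 MHz -> Samsung: no headroom at 800MHz
print(max_clock_mhz(1.20))  # ~833.3 MHz -> Hynix: a little headroom
```

Which would line up with the two vendors' parts overclocking differently at the same voltage.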
As for harm in OC's Mem in ratio to core, I'll look into that. Though GDDR3, like DDR2/3, does well with overclocking. Up to 1100mhz on 800Mhz rated RAM. The amount depends on the GDDR3 vendor. It's all about voltage tolerance. One of the two vendors(Hynix or Samsung) can handle the overclock @ the set voltage.
Also, since the two RAM vendors have different access times, at higher overclock, they can lose sync with core/shader and cause a crash. When that happens, the GPU automatically downclocks and can only recover with a log off or restart. -
I wouldn't be too concerned unless you are just curious. The more I think about it, I believe it was more anecdotal forum talk based on OC experiences. This was over on the other notebook site, which at the time had a lot larger Dell following, particularly XPS Gen 2 owners when they first came out. They were the Corvette to our (Dragons) Cadillac CTS. Everyone wanted to get the most out of their rig, obviously, and we talked a lot about it back then.
-
Did you create the PerfLevelSrc DWORD, or why is Powermizer "off" for you?
Quote: BTW, I'm also running the 185.81 -
use fraps and show results please
-
@ 2.0
Quote 2.0: "One thing to note, GPU-Z clock reports can be wrong. Especially when logging."
What does that refer to?
GPU-Z logs correctly! (in terms of clocking up and down)
GPU-Z reports the wrong clock frequency. (in terms of the real frequency)
Example: my currently selected standard clocks: 215/110/465
GPU-Z display: 215/110/465
True frequency: 232/110.25/499.5
The crucial thing is that the automatic clock changes (by Powermizer) can be monitored. (which is possible with GPU-Z)
The resulting high latency (interrupt requests to the CPU) can also be observed that way. (>3000μs)
This inevitably triggers disruptions!
I wasn't looking for problems - I had problems, for example in GTA SA. It was a long road to find out what it was.
But please make a GPU-Z log while playing with Powermizer "on"! If the game is quiet, the GPU will clock down.
The rules by which this happens, however, are flawed. If you haven't noticed micro-stutter or sound dropouts during computer activities - congratulations!
However, I think I would find them.
Thanks for kicking off the topic! -
If I understand you correctly Ricki, yes I did the PerfLevelSrc registry change. It was a while ago, I would have to find the correct settings.
The reason is I wanted to make sure I don't downclock on high end OC testing through 3dMark or ATI tool stress test. I monitor my temps and such while I do this.
If not testing, I'm not OC'ing, so I see no need for powermizer at stock clocks. And I never, ever use the Dragon unplugged, so that isn't an issue either. -
Here's the odd thing...
When I was using 179.28 from Nvidia, I noticed the PerfLevelSrc in the registry. I think I didn't have any problems because I had it set at 3322 or 3333. I forget which. But I recall that I had everything the way it should be without having to change anything.
Now with the 185.81 from Nvidia, there is no entry in the reg named "PerfLevelSrc" - it's gone. Neither are there any powermizer settings in the registry.
I had a few drivers in between 179.28 and 185.81, so I'm not sure which one nixed the reg entries.
And like HovC, I never use the Dragons unplugged - only to cycle the battery once a month to keep it in good working order. -
Ok, I believe I understand now. Going to do more testing concerning this powermizer issue. Only thing, I have no reg entries for powermizer anymore and I'm using 185.81. Might have to add in the PerfLevelSrc and set it to 3333. Then test it @ 3322 & 2222.
-
I don't have it either with this driver. Strange, or I just never realized it was driver dependent and haven't checked. I did clean install a couple of months ago, and made sure to set it then, but not sure what driver I was on.
The things that make you go hmmmmmm... -
@ hovercraftdriver
@ 2.0
@ CompTrekkie
@ ?
Hovercraftdriver, this was exactly the reason for my question to you.
Yes, Powermizer is an integral part of every graphics driver.
That's why I wondered about your statements: Quote "However, I have powermizer set to off" and "BTW, I'm also running the 185.81" End Quote.
As of 185.81, the DWORD value "PerfLevelSrc" (and a few others) is no longer registered. (modified inf file)
That's why I wrote: Quote RickiBerlin "To deactivate Powermizer on 185.81, the DWORD value PerfLevelSrc must first be created!" End quote.
Procedure: the RickiBerlin method - http://forum.notebookreview.com/showpost.php?p=4637465&postcount=5469
Note: Since "PerfLevelSrc" does not exist, you have to create it. (in "0000")
Procedure in the registry editor at that location: "Edit" / "New" / DWORD value (32-bit) / name "PerfLevelSrc".
Give it the hexadecimal value "3322" (for Powermizer "off").
For the other settings ("3333" and "2222"), see the link.
Caution: This guide is only for those who understand it exactly!
Notes:
1. I have already tested 185.81, and Powermizer works as badly as ever. (pity :-( )
2. I also don't work on battery. (just like 2.0 and Hovercraftdriver)
On 2: It's also not possible to save more power than with Powermizer (PM "on"), because with PM "off" the 8800M GTS is always in Performance3D mode. (whatever the clocks) This means it always gets full voltage. Even at my standard clocks of 215/110/465 it is fully powered, and the GPU temperature is about 49C. (fan level 1) Advantage: no clocking up and down, so no high latency, thus no micro-stutter. Disadvantage: slightly elevated idle temperature (43C up to 49C), but without the higher fan noise. For higher clocks (games, Blu-ray, or ...) I have prepared NSU files (nTune profiles). Path: Root / Users / "Your Name" / AppData / Local / Nvidia Corporation / nTune / Profiles. Since these can be edited with an editor, you can get the mem clock under 199MHz. These profiles work better for me than Nvidia's.
Attention:
This text was partly machine-translated and may thus contain major errors!
Changes are at your own risk!
Everyone must decide for themselves whether they need this!
With Powermizer "off" and no startup profile, the clocks are constantly 500/799/1250. -
I've compared Windows 7 and Windows Vista, but for some reason Fraps didn't want to run on my 64-bit Windows 7.
So I've used the in-game benchmarks of GTA 4 and Tom Clancy's Hawx
Windows Vista (179.28 driver)
Tom Clancy's Hawx ==> Average FPS 43
GTA 4 ==> Average FPS 22.35
Windows 7 (185.81 driver)
Tom Clancy's Hawx ==> Average FPS 49
GTA 4 ==> Average FPS 26.11
All games were run under same settings, so I think that we can tell that Windows 7 runs the games smoother.
Grtz -
So, after much wailing and gnashing of teeth, I decided to do a clean install of Windows 7 RC (64bit).
As most of you guys have posted, the majority of drivers installed well. However, I can't seem to get Windows to detect my fingerprint sensor, despite installing the 64bit driver for Vista from the HP HDX driver page.
I also can't get the quick launch buttons to work; no great loss I know, but I'd still like them to work as they should.
Any pointers?
TIA
CJ -
Did you try the battery pull method for the quick launch buttons? Power off, pull battery, hold power button for 30 seconds, install battery, power up. Since migrating to 7, it's more likely drivers, but anything is possible.
-
Here's the thing for anyone who's interested in doing this with an overclock:
1. You will be running a full-time overclock and not an overclock based on demand. Can't advocate that as a good thing. Powermizer set to "on" (3322 or 2222) will only approach overclock speeds when the need arises. A full-time overclock (or full-time max performance) places more demand on the GPU and increases watt usage whether or not the graphics muscle is required.
2. You will raise your idle temps 5+ DegC. (While that might not be a bad thing in and of itself, depending on your overclock, your fans will run longer after playing a game. What's bad for GPUs is temp range more so than temp high. Like any metal, which GPU solder bumps are made from, the wider the gap between idle temp and max temp, the higher the stress on the metal. The HDX doesn't have to worry about this whether you do it or not, since the gap is much narrower than on its lesser-sibling 8-series chips.)
3. You will see no performance increase (3Dmark06, FPS) for doing this. I haven't noticed anything in games that benefit materially from doing it. Games will stutter at certain points more often because of automatic saves or HDD activity than they will from switching MHZ states on the GPU. I apparently haven't had powermizer off since 179.28 driver. Can't tell the difference in any game I play other than lower idle temps. Tested 185.81 with powermizer on/off and find no difference in any game I own. No difference in general browsing and internet video playing. I don't watch DVD's on the HDX and don't have blu-ray so can't advise on that.
4. The only benefit to doing this is if you are running some graphics app which needs to suddenly demand max performance from idle. Due to the latency of switching on demand from idle perf to max perf, you'd otherwise experience a slight "stutter" delay in which your graphics and audio will skip.
Up to the reader to decide. This methodology mostly benefits weaker GPUs whose idle speed is barely enough for general Vista performance apart from games.
But...
To make your life simpler, in case you want to try things out, here's a program that does all the reg entry stuff for you: http://forum.notebookreview.com/showthread.php?t=273276
Tested with 185.81 after just adding reg entry Dword 32 PerfLevelSrc and setting it to hexadecimal 3322 to turn powermizer off. Remember that your default setting is powermizer on. -
Thanks very much HC. That worked a treat for the Quick Launch buttons.
Just gotta suss out the driver issue with the fingerprint reader now.
CJ -
Right guys, I've got the fingerprint reader working by following the instructions found here.
CJ -
So wait..if we overclock using the frequencies you posted in NVIDIAs performance settings it won't scale/clock based on demand..? It will only scale perpetually at the same frequency? I'm confused..ESPECIALLY about the powermizer talk. Could you please clarify what you mean?
-
If you overclock, it will scale the frequency on demand just like your CPU does. While idling, your GPU core/mem will run @ 200/100mhz, occasionally bumping up to 275/301 and 383/301mhz while doing certain graphic tasks in Windows. While gaming, it will scale between 200/100, 275/301, 383/301, and 500/799 (or your overclock) - but mostly between just 383/301 and your top frequency. You won't notice when it's doing it.
If you set powermizer to off there will be no scaling and you will always run @ max Core and Mem. So if you overclock, your GPU will only run at that overclock frequency. If you're not overclocked, it means you will always run @ 500/799mhz core/mem.
The ones who discovered this benefited by it because when their GPU (8400M/8600M) was at idle it doesn't have all that much processing power for Vista's aero. So they wanted to set the GPU to max core/mem to maximize their performance. We don't have that problem because at idle, we have plenty of power for Vista's aero.
The other concern which pops its head up occasionally with powermizer set to default (on) is that there is some latency when switching frequency. For the most part, it goes unnoticed because not many things you do would be affected by the latency. Some things are. Some types of streamed and unbuffered web video will stutter either while the GPU switches frequency, or because it lags behind at the lower frequency when it needs to switch sooner to ensure smooth playback. I haven't noticed it on buffered streams, which most streams are.
It can also occur in some games. I don't have any games in which it occurs, because most modern games place high demands on the GPU, so it doesn't switch out of the top frequency often enough to matter. Most stutter comes @ auto saves and level loading, and is a function of hard drive bottleneck - like in Crysis when you reach a checkpoint. COD4 might stutter when there is triggered audio voiceover. RTS games would be harder to factor since their stutter is a combination of things.
There's more to consider. For instance, no two Nvidia cards are the same, as you can see from peeps' overclock potential. There are two different RAM types with different characteristics used on the cards. I have 2 HDX's, each with different RAM - Samsung on one and Hynix on the other. They both perform slightly differently depending on what they are doing. There's also HDD latency, the number and types of services running in the background, the type and amount of DDR2 RAM you have, and the type of CPU. Subtle differences all add up to a net performance range. That's why I always recommend BIG increases in performance for there to be a noticeable difference. You want to get clear outside your pre-upgrade performance range.
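To make the on-demand scaling concrete, here's a toy sketch of a PowerMizer-style state picker. The four states are the 8800M GTS steps listed above, but the load thresholds are invented for illustration; Nvidia's actual switching rules are not public:

```python
# Toy model of demand-based GPU clock scaling (PowerMizer-style).
# The four (core, mem) states are the 8800M GTS steps from the post
# above; the load thresholds below are made up for illustration.

PERF_STATES = [  # (core MHz, mem MHz), lowest to highest
    (200, 100),
    (275, 301),
    (383, 301),
    (500, 799),
]

def pick_state(gpu_load):
    """Map a 0.0-1.0 GPU load to a performance state (hypothetical thresholds)."""
    if gpu_load < 0.25:
        return PERF_STATES[0]
    elif gpu_load < 0.50:
        return PERF_STATES[1]
    elif gpu_load < 0.75:
        return PERF_STATES[2]
    return PERF_STATES[3]

def pick_state_pm_off(gpu_load):
    """With powermizer off there is no scaling: always max clocks."""
    return PERF_STATES[-1]

print(pick_state(0.1))        # (200, 100) at idle
print(pick_state(0.9))        # (500, 799) under gaming load
print(pick_state_pm_off(0.1)) # (500, 799) always
```

With powermizer off, the second function applies: one state, all the time, which is exactly the full-time-overclock caveat from the earlier post.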
The original test submitted by rickiberlin to demonstrate the lag induced when switching frequency was this video:
http://www.dailymotion.com/video/x5ilq6_unter-falscher-flagge-independent-d_news
Thing is, I have a fast, low latency internet connection, only 48 processes running and relatively fast HDD. So I don't experience any stutter while playing in attached or full screen mode. And my GPU doesn't switch out of 200/100mhz during the video.
Use this video as part of your testing to see if you would benefit from turning powermizer off. -
My last post on this topic: (I will not repeat myself)
In my post I advised against the Powermizer Switch tool. (for many reasons)
Quote: RickiBerlin - Attention: Please do not use "PowerMizer Switch"! This tool changes RegistryDwords which are not available in the drivers for the 8800M GTS! (Post http://forum.notebookreview.com/showpost.php?p=4637465&postcount=5469
Powermizer does not adjust performance 100% correctly!
Example: GTA-SA, GPU-Z text log over about 45 minutes.
It clocks up and down senselessly.
And although Powermizer clocks down in stages (500... 383... 275... 200MHz), it jumps back up immediately to 500MHz. (even from 200MHz)
This in turn generates latencies of up to 200,000 microseconds.
http://forum.notebookreview.com/showpost.php?p=4607226&postcount=5419
No matter what memory and what CPU you have, it leads to errors!
Because GTA always has music playing, you can't miss this up and down clocking. And this happens in every game, because the software isn't allowed to determine the clocks.
In a car, this would mean the automatic transmission briefly shifting into a lower gear at 200 km/h. (just to see whether it can) And always stubbornly, after exactly 30 seconds.
If you don't notice this - congratulations!
Every micro-stutter and crackle bothers me when I know it's not software related. (apart from Powermizer)
The video example again shows the false lower limit of the supply voltage to the 8800M GTS. (200MHz, PM "on")
This may of course differ from HDX to HDX. Disks or other hardware have nothing to do with it. The fault here is the attempt to minimize the energy budget.
This only leads to stuttering in very rare and special cases. (like this video, and even then not always) Pinballman, for example, had this problem with his favorite game. (with Powermizer "off", it ran fine) That it's not a question of clock speed is shown by the video playing without any problems at 116/106/236MHz with Powermizer "off". With Powermizer "on", the clock drops to 200MHz within 1 minute (30s @500MHz + 15s @383MHz + 15s @275MHz) and then it starts to stutter. (fullscreen only)
Ok, the end of another forever-letter.
My PM is "off" and stays that way. Standard clocks are 215/110/465. The HDX is quiet and cool, and nothing stutters or hitches here. IE8 scrolls without hitching and without pointlessly upclocking to 500MHz. If I need more power, I tell the computer with one click (and not this Powermizer crap!). Yes, and then there's no pointless downclocking when RickiBerlin wants to play.
I paid for this computer, not for Powermizer! -
All I can say is....priceless.
-
Probably why I never noticed that powermizer reg entries were driver dependent. I only OC when I get a new driver to test it out, and then back to stock. I have read plenty of posts where a new driver caused powermizer problems, but never correlated the issue in my brain.
A common occurrence with my brain, that.
-
All good. Just to recap...
If powermizer is turned off, you only get 1 frequency - the max frequency of the GPU at all times. Be it stock of 500mhz core/799 mhz Mem or whatever you overclocked it to. And this is irrespective of whether you are running a 2D app or 3D app.
If powermizer is on, default, the GPU scales through its available frequency steps - from its lowest of 200mhz core/100 mhz Mem to its top frequency of 500/799 or whatever you overclock it to. All depends on the graphics load the GPU experiences.
I just want to make that clear about powermizer so there's no confusion.
I'm not against turning powermizer off. Just want to make folks understand that if you overclock the GPU and you turn powermizer off, you will run at whatever you overclock to all the time. Depending on your overclock, I can't recommend that one does that. With powermizer on, you will only run at overclock when the need arises.
Anyone who's interested in benefits of turning powermizer off on the HDX has to experiment with it. For most there is no material benefit. If you're running emulators (game console variety) there would be as they do not do well with GPU frequency switching. -
LOL.
It's not exactly driver dependent in the sense of working with a driver or not. Some drivers simply do not add the entry to the registry. If one simply adds the dword 32 PerfLevelSrc and sets it to either 3322 or 2222, then uses the powermizer switcher program, it'll work. Not that the program is necessary. -
Ok, that's an elegant solution that got lost in translation:
What ricki has done is to make several profiles in Nvidia Control panel under performance/device. (could have used Riva tuner also).
This is necessary since if you disable powermizer you will lose its dynamic switching.
Here's the steps:
1. Disable powermizer.
2. Create two profiles. Name one "gaming" and set the clocks to 500/799/1250 or whatever you wish to overclock to. Name the other "standard". Set clocks to 215/115/465.
To achieve the 115 on mem clock, you will need to manually edit the "standard" profile.
Profile is found under:C:\Users\(your user name)\AppData\Local\NVIDIA Corporation\nTune\Profiles
You will need to set explorer to show hidden files and folders.
You will also need to view the file with notepad. right click it and choose "open with" then search for notepad in windows directory.
Change this portion of the file:
[GPUSettings]
GPUCOUNT=1
GPUNAME0=GeForce 8800M GTS
GPUNAMEINDEX0=0
GPUCOREMHZ0=200
GPUMEMMHZ0=115
GPUCOREMHZMIN0=215
GPUMEMMHZMIN0=115
GPUCOREMHZMAX0=1000
GPUMEMMHZMAX0=1320
GPUSHADERMHZ0=465
GPUSHADERMHZMIN0=312
GPUSHADERMHZMAX0=2500
GPUFAN3D0=0
GPUFAN3DMIN0=0
GPUFAN3DMAX0=0
Save it.
3. Use Nvidia NvProfile to switch between them in real time. Alternatively, you can make shortcuts to the profile file and launch/switch that way by double clicking on them. Even drag them into your quicklaunch bar on taskbar.
Reboot.
Use GPU-z sensor tab to verify that it works.
Have fun! -
BTW, Nvidia released 185.85. Signed and no longer a beta driver... like the 185.81 was...
-
Nice! Makes better sense to me now. Thanks for the explanasheeoon 2.0!
-
Imo it's a key to success.
And now please tell us what you've done to get down to only 48 processes running.
I have a clean install of Vista and 62 processes,
and that's after I used some tweaks. Before, it was 74.
-
thanks.
But maybe different drivers caused better scores in W7 -
I have 2 questions.
With the 185.81 from Nvidia, there is no entry in the reg named PerfLevelSrc, so how do I disable powermizer?
Do you think clocks of 215/115/465 are strong enough to play HD 720p and 1080p? -
It isn't the drivers (well, maybe a little bit). When I had Windows 7 build 7000 32-bit, I was using the same driver as on Windows Vista.
Even then, the average fps in Windows 7 were higher than in Windows Vista.
Grtz -
Long list. Some of which you may find you can't live without.
I don't use quickplay, anything. Disabled all its services. Uninstalled the software.
I don't use Media center. Disabled along with services.
I don't use fingerprint reader.
Sidebar disabled. (has to be done in registry [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Windows\Sidebar]
"TurnOffSidebar"=dword:00000001 )
Services to cut off:
Diagnostic policy services & related services to it.
Windows error reporting
Flexnet
Ready boost
Superfetch
remote registry
internet connection sharing(ICS)
media center extender service
Nvidia display service
Quickplay Task scheduler
Intel matrix storage (Manual)
Bluetooth
Distributed transactions
net.tcp port sharing
remote access
Tablet PC Input service
What to disable in startup: (can use ccleaner or jv16 to do it)
ehtray
flexnet connect (ISUSPM)
acrobat assistant
acrobat speed launcher
onScreenDisplay
Bluetooth.lnk
MC updater
Defender from startup and service
sidebar
Sunjavaupdate
HP wireless assistant
IAAnotif
HP software update
HP digital imaging monitor
Makes for a lean, mean, fighting machine. -
You have to manually add PerfLevelSrc to registry first. Just the dword 32. No hexadecimal values are needed as the program will add it. Or you can just add a hexadecimal value of 3322 to turn off powermizer while on AC.
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx}\0000
The {XXXXX-XXXX...} key is the one with the most entries that starts with _+menu.exe
3D
Add dword PerfLevelSrc here.
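For anyone who prefers a .reg file over clicking through regedit, the steps above could be captured roughly like this (the GUID is a placeholder - you must substitute your own display adapter's key under Control\Video, found as described above):

```
Windows Registry Editor Version 5.00

; Placeholder GUID - replace {xxxxxxxx-...} with your own display
; adapter's key, found as described in the post above.
; 3322 (hex) = powermizer off while on AC, per the posts above.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}\0000]
"PerfLevelSrc"=dword:00003322
```

Double-clicking the file merges it; reboot afterward and verify the clocks with GPU-Z's sensor tab as suggested elsewhere in the thread.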
It should be, in most cases. If it's not, just switch to max clocks. One thing: it will raise your idle temp about 2 to 3 degrees over your idle temp while powermizer was on. Powermizer idle clock is 200/100/400. -
Hi there
my first time on this forum, so sorry if I change the current direction a little. I would just like to finally draw a line under the issue of the apparent 4GB limit for my new HDX Dragon / 9000T. My manual says it can take a max of 4GB, the HP website says a 4GB limit, and I think my system itself also says up to a 4GB max. Yet there are people here who seem to have 8GB in their Dragon!
Can I safely assume that I can go ahead and buy extra ram and my laptop (Vista 64) will see and utilise it? If so, why would HP list a 4gb max in their material? Would I need to do anything to the motherboard, bios etc... first?
There are a few versions of the Dragon, do some have motherboards that support more than 4gb and some that don't?
Any info would be greatly appreciated,
regards -
At the time the manual and the HDX was created, 4GB was the max available even though the chipset was designed to handle 8GB RAM.
8GB works in all HDX's. -
Brilliant! That's great news - thanks for clearing that up.
Regards -
rockin 8gb here with win 7 and loving it.
-
Got 2x 4GB GSkills in my HDX as do CyberVisions, Lancorp, and Computrekkie from what I recall.
*HP HDX DRAGON Owners Lounge, Part 1*
Discussion in 'HP' started by J-Bytes, Sep 14, 2007.