Ah, by RAM I mean the VRAM on the graphics card!
There were already some pads there, placed by Dell - they seemed to cover mainly the RAM and some other components. I placed mine in the same spots. I'm having a hard time determining how important the thermal pads actually are.
I suppose the best thing for me to do is take it apart again and ensure that the thermal pads are actually in contact with the heat sink.
-
Wow, I would leave that to the rev - he is one of the experts and gives great guidance. I'm off for a good Dutch sleep.. speak soon
-
After I verified that the thermal pads were fine (they were), I re-ran the tests, and also tried 900/750/1800, and my score was P9861 3DMarks. Highest recorded temperature was 70c.
I only have the free version, so it didn't give me a specific GPU score like in your screenshot. I'm assuming that the score on the webpage is (CPU+GPU)/2.
It took me a moment to realize what the 750 part meant - at first I set the memory clock slider to 750, which yielded some pretty disappointing results.
-
*Edit*
Never mind - I realized that as I raise the shader clock, the GPU clock goes up automatically.
Nvidia Inspector won't allow me to modify the GPU clock - only Memory, Shader and Voltage. What gives? -
I see you have the built-in 1080p screen (in your signature), is that correct? That may impact your performance/score.. I have the 900p.
I ran all benches (you have to do them a couple of times to get a good score) on the native laptop screen, with no externals attached.
Not too hard to find the full version of the 3DMark tests.. Look around on the net a bit -) -
Ah, well in that case my score is probably as good as we'll get :> have you had any luck running higher than 900/750/1800?
-
See the attached picture for an overview of the clocks that passed the test, along with their scores.
Attached Files:
-
-
Judging from this I think I'm at a pretty sweet spot with this! This is so sweeeet - I'm going to game all weekend!
And you would say that upgrading the CPU wouldn't really bring any gain to my configuration as it is now? I know that running the XM series processors won't help much, since they have power issues in this laptop, but what about stuff like the 840QM or something? -
Well, you have to experiment a little - take it easy. I am eager to learn about your results. I don't have the 840, so I can't give you any advice on that. But I know it's just pointless if I turn up the 940, because then the 560M starts to throttle. Imagine you turn up the CPU and that runs just fine, even on high multipliers, but your GPU throttles.. I found further that if you run Furmark, it throttles even faster. I suspect our driver has some apps blacklisted that cause the most damage. We don't have a choice - that's all decided for us. There are ways to edit the vBIOS with tools such as NiBiTor, but I did not dare to go that way yet. Anyway, I still believe there must be some real power left under the hood that isn't visible yet. That's when I found that Nvidia (and AMD as well, though AMD still gives you some choice, +20/-20) implemented a sort of OCP - read: Over Current Protection - in the GPU driver software. It's there so that not so many people blow up their cards and RMA them, as that's bad for the name of the company.
But that as a side note.
I like the 560m very much. Its a great card.
If you raise the stereoscopic 3D depth, let's say from the standard 15% to 55%, you play your games at great depth. It's like being there yourself - awesome
-
Well, the M15x has been dancing around the 150W limit since forever. Back when I got it, the GTX 260M would throttle like a mo-fo together with an XM CPU, until a BIOS update fixed that by limiting the turbo capability of the XM series processors (that's where ThrottleStop originally came from). That's why I won't consider spending 400USD on a 940XM processor for this one.
I don't believe there is really a way to circumvent the 150W limit.
I think I might snap up an i7-840QM from eBay at 200USD with free shipping though. That's a quarter of the price of what I've been able to find in Denmark. If anyone reading this could provide any insight into whether this is worth it, I'd appreciate it.
Man, this is pretty great. I've had this laptop for 1½ years, since Dell gave it to me for free as a replacement for my XPS M1530, which was overheating constantly. Then I stopped playing WoW in November 2010, and I didn't turn it on again before Skyrim.
I think I'll go buy an OCZ Agility 3 drive to attempt to reduce some load times, and then see how far I can push this overclock. -
You might also wanna reduce any tasks running in the background, such as antivirus software - I disable that completely, as well as my network connection, when benching.
-
Alright - I just played Skyrim for two hours on this, and it wasn't stable at 900/750/1800, unfortunately. After 4 minutes of playing at these clocks, the screen went blank for a second, and then the clocks reset back to defaults.
It seems to be stable at 850/700/1700 though, and getting a good 30 FPS at High Preset with 8xAA - not shabby. My 260M could hardly do this game at low settings without sucking at it.
Is there any ratio between the Shader Clock and Memory Clock that can be applied when I try to raise them again towards 900/750/1800? -
Yes, here too the driver resets itself after some time of gameplay when overclocked - depending, of course, on how high the OC is. Anything around 900 core or more crashes all the faster.
(It's smart-crashing - it's not that your system halts or anything like that, just 1-3 secs of freezing during gameplay, and then your game just continues at stock clocks.)
Some OCs are very stable though - find the sweet spot..
Re: the ratio, I don't know, but we have to investigate that for sure, yes. Ah well, you have the whole weekend.
Who would want for more
-
Here is my log for tonight (time to go hit a bar!).
These tests are all done with Skyrim, since it seems to keep the GPU busy at around 99% most of the time.
As a temperature reference, running Skyrim at stock clocks for 40 minutes, the max recorded temperature was 68c.
Code:
900/750/1800: Unstable. Drivers reset after a few minutes of gameplay.
890/738/1780: Unstable. Drivers reset after a few minutes of gameplay.
890/725/1780: Highly unstable! Computer froze.
880/733/1770: Unstable. Drivers reset after a few minutes of gameplay.
875/725/1750: Unstable. Drivers reset after a few minutes of gameplay.
865/750/1730: Seems stable. Played Skyrim for 40 minutes. Max temp recorded was 73c, mean temp around ~71c.
Tools used are: GPU-Z for logging, nVidia Inspector to set the clocks.
I don't know if the instability in the unstable results is caused by the memory clock or something else. I feel like I have no idea what I'm doing when adjusting these frequencies.
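If you're logging with GPU-Z like this, a quick script can pull the max and mean temperature out of the log for you. Here's a rough Python sketch - the exact column header is an assumption, since GPU-Z's column names vary by version and card, so check your own log's header first:

```python
# Rough sketch: summarize GPU temperature from a GPU-Z sensor log.
# Assumes a comma-separated log with a header row containing a column
# whose name includes "GPU Temperature" -- check your own file first,
# since column names differ between GPU-Z versions and cards.

def summarize_gpu_temps(lines):
    header = [h.strip() for h in lines[0].split(",")]
    # Find the temperature column by (partial) name.
    col = next(i for i, name in enumerate(header) if "GPU Temperature" in name)
    temps = []
    for line in lines[1:]:
        fields = line.split(",")
        if len(fields) <= col:
            continue
        try:
            temps.append(float(fields[col]))
        except ValueError:
            continue  # skip malformed rows
    return max(temps), sum(temps) / len(temps)

# Usage (file name depends on where you pointed "Log to file"):
# peak, mean = summarize_gpu_temps(open("GPU-Z Sensor Log.txt").readlines())
```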
-
Try a few more.
If you're hitting a bar, we'll do it tomorrow.
-
865 stable when gaming - not bad
-
Well, it was only for 40 minutes, and only that game. But it seemed stable and played awesomely.
I just ran a Furmark test while waiting and getting ready to leave, and it shut the entire machine down after 5 minutes at stock clocks. I'm guessing that it's just drawing too much power and the BIOS or whatever is shutting it down. I hate Furmark - I'll just permanently leave that tool to the desktop rig builders. -
Furmark is blacklisted (by the driver, I suspect); 3DMark is not. Mmmm, I want to learn about the process that controls Furmark.
-
Now I've got a question for you.. I read the 740/840 TDP is 45W and the 940 TDP is 55W. The 560M TDP is 75W. When I add this up, everything is well below 150W.. either 45+75 or 55+75.
Which would be the most power hungry, you think? I can't figure out why the GPU would throttle... unless it takes much more power than the advertised 75W.
-
I don't think it's using more than is specified (though it might be when overclocked), but still, at 55+75 = 130W only 20W remain, and there still needs to be enough power for devices like the HDD, display backlight, fans, motherboard, and whatever else is in there. I'm assuming that it's the overall system usage that cannot exceed 150W, and not just CPU+GPU.
But then again, I'm not an electronics engineer (I'm a software one). -
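To make that back-of-the-envelope budget explicit, here's a tiny Python sketch using the TDP figures tossed around above. The 20W "rest of system" figure is my own placeholder guess, not a measured number:

```python
# Back-of-the-envelope adapter budget with the TDP figures from the
# thread (45/55W CPU, 75W GPU, 150W adapter). The rest-of-system draw
# is a made-up placeholder: screen, drives, fans and board vary and
# would have to be measured.

ADAPTER_W = 150

def gpu_headroom(cpu_w, gpu_w, rest_of_system_w=20):
    """Watts left over (or missing, if negative) when everything
    draws its rated power at the same time."""
    return ADAPTER_W - (cpu_w + gpu_w + rest_of_system_w)

# 940 (55W) + stock 560M (75W): 150 - 130 - 20 leaves zero slack.
print(gpu_headroom(55, 75))   # 0
# Same CPU with an overclocked GPU pulling, say, 95W instead of 75W:
print(gpu_headroom(55, 95))   # -20 -> something has to throttle
```

Which is consistent with the guess above: at rated figures the numbers barely fit, and any overclock on either side pushes the total past the adapter.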
Yeah, and I am pretty much in sales. In the US I would be a systems engineer, but here in the Dutch mountains they call it a sales engineer.
Ah well, we might attract a real electronics engineer with this.
Do you know whether ThrottleStop locks the power to the CPU? -
Hey, just wondering guys, which program do you use to OC?
-
We both use Nvidia Inspector: OrbLog Blog Archive NVIDIA Inspector 1.95 – Tool
The page and blog post is in German, but there is a download at the end of the article - and the program is in English.
I don't know if you know how to use it, but here is a short guide:
Start the program, and press the "Show overclocking"-button. It'll show four sliders:
1) GPU Clock (grayed out)
2) Memory Clock
3) Shader Clock
4) Voltage
You're only adjusting 2+3. The first one can't be adjusted, since it's slaved to the Shader Clock (GPU Clock = Shader Clock/2).
When you see a notation like 900/750/1800, it means that the sliders have been set to the following values:
Memory Clock: 1500MHz (click the "Unlock Max"-button to adjust the memory clock this high)
Shader Clock: 1800MHz
The format of the notation is GPU Clock / Memory Clock / Shader Clock.
So when you see someone saying that the memory is overclocked to 750MHz, that's 1500 in Nvidia Inspector (750*2 = 1500). The difference in notation comes from how you read the clock: DDR memory transfers data on both clock edges (double data rate), so the effective rate is twice the base clock. The Nvidia Inspector slider shows the effective (doubled) frequency - meaning that to get a 750MHz memory clock, you must set that slider to 1500.
I have good results with overclocking to 865/750/1730. But these values all vary from card to card, and from game to game.
You just move the sliders to the values you want, and click "Apply Clocks & Voltage". These values are reset after a reboot. You can always revert to the defaults by pressing "Apply default" and then "Apply clocks & voltage" again.
To track the values while I play, I use GPU-Z and activate "Log to file" in the Sensors tab (and also check the check box beneath it). When an overclock is unstable, the driver usually just switches the clocks back - but in some cases it can freeze up the computer.
Usual "use at your own risk"-warning applies to everything in this post - obviously. -
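If the doubled memory-clock business is confusing, here's a small throwaway helper (the function and names are mine, not part of any tool) that converts the forum's GPU/Memory/Shader notation into the values you'd actually set on the two Inspector sliders:

```python
# Hypothetical helper (names are mine, not from Nvidia Inspector):
# convert the forum's "GPU/Memory/Shader" notation, e.g. "900/750/1800",
# into the two slider values described in the guide above.

def notation_to_sliders(notation):
    gpu, mem, shader = (int(x) for x in notation.split("/"))
    if shader != gpu * 2:
        raise ValueError("shader clock should be 2x the GPU clock")
    return {
        "memory_slider": mem * 2,  # Inspector shows the doubled (effective) rate
        "shader_slider": shader,   # GPU clock follows automatically (shader/2)
    }

print(notation_to_sliders("900/750/1800"))
# {'memory_slider': 1500, 'shader_slider': 1800}
```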
or grab it at guru3d - description here in English
NVIDIA Inspector 1.9.5.9 download from Guru3D.com
What I always do is create an OC shortcut and save that to the desktop.
After each successful run/score I save that shortcut to a folder. This way you can nicely keep track of your positive results - see the picture.
Attached Files:
-
-
The Revelator Notebook Prophet
Remember, those power parameters are based on stock configuration or settings. No question that a 150W PSU provides sufficient power to run a 920XM (55W TDP) or 940XM (62W) with a stock 560M (75W), together with the remaining components (monitor, drives, mobo, etc.). But who buys an extreme CPU to run at stock clocks? Both a 920XM and a 940XM overclocked to 26x, for example, will draw over 90W under full load, which obviously reduces the power available to the 560M. The 560M can do fine as long as it is under partial load, but when it goes to full performance and wants the full 75W, something has to give.
And Nvidia cards are notorious for being power sinks when heavily overclocked. For example, the 580M is a 100W card, but will draw up to 140W+ under load when heavily o/c'ed. I don't know what the 560M draws when o/c'ed, but dutchess63's experience suggests it maintains the tradition, so it's probably safe to say more than 75W.
TDP limits the power available to the CPU; ThrottleStop is the means to that end, enabling the user to adjust TDP levels on extreme CPUs. On top of that, the BIOS power tables limit overall power use and, in some cases, individual component use. Power budgeting and arbitration between components for limited power resources is a dance, and part of the joy of overclocking and unauthorized upgrading. Crank 'em.
-
Thanks rev. You guys also wanna read here, check the article comments @ geeks3D
http://forum.notebookreview.com/alienware-m15x/623031-gtx560m-throttling.html#post8056984 -
Hey guys
I've been following this thread for a while; thought it was about time to make an account and contribute.
I installed my Eurocom GTX 560M tonight after flashing my BIOS to A09 and the EC to 1.18. I modified nvdm.inf, and the drivers didn't recognise my card ...
Edit: I'm running the 285.62 drivers.
From the example you provided, dutchess, I saw that there were slight differences from your outlined changes; for example, my changes are in red:
%NVIDIA_DEV.1251.03% = Section012, PCI\VEN_10DE&DEV_1251&SUBSYS_048F1028
%NVIDIA_DEV.1251.04% = Section012, PCI\VEN_10DE&DEV_1251&SUBSYS_04901028
%NVIDIA_DEV.1251.05% = Section059, PCI\VEN_10DE&DEV_1251&SUBSYS_04BA1028
%NVIDIA_DEV.1251.06% = Section060, PCI\VEN_10DE&DEV_1251&SUBSYS_02A21028
NVIDIA_DEV.1251.04 = "NVIDIA GeForce GTX 560M "
NVIDIA_DEV.1251.05 = "NVIDIA GeForce GTX 560M "
NVIDIA_DEV.1251.06 = "NVIDIA GeForce GTX 560M "
As you can see, the section numbering is different: the relevant entry in my inf file says Section059, whereas yours says Section060. Could this be the reason my card is not being recognised by the drivers?
Additional notes: the section business is within the [NVIDIA_SetA_Devices.NTamd64.6.1] area of the inf file, as opposed to the [NVIDIA_SetA_Devices.NTamd64.6.0] area.
Your help would be greatly appreciated, as I really want to get this card working and BF3 it up! -
I just re-installed my entire system on an SSD, and it worked just great. Here is a link to the driver bundle (latest version, downloaded off nvidias website, and modded less than two hours ago): http://ge.tt/9ccSSSC?c
-
irfan wikaputra Notebook Consultant
Hey guys, sorry, I am not an Alienware user,
but I am also a GTX 560M user. This card is quite lovely for its price.
-
Is it worth upgrading from a 5850 to a 560M? I don't want to bother overclocking either of the cards...
And, most importantly, would the 5850 bracket fit on the 560M so that I can mount the cooler? -
Well, it's not hard.. just make your sections match - and in both the [NVIDIA_SetA_Devices.NTamd64.6.1] and [NVIDIA_SetA_Devices.NTamd64.6.0] areas of the inf file.
Your working inf should look like this:
%NVIDIA_DEV.1251.03% = Section012, PCI\VEN_10DE&DEV_1251&SUBSYS_048F1028
%NVIDIA_DEV.1251.04% = Section012, PCI\VEN_10DE&DEV_1251&SUBSYS_04901028
%NVIDIA_DEV.1251.05% = Section059, PCI\VEN_10DE&DEV_1251&SUBSYS_04BA1028
%NVIDIA_DEV.1251.06% = Section012, PCI\VEN_10DE&DEV_1251&SUBSYS_02A21028
%NVIDIA_DEV.1251.06% = Section059, PCI\VEN_10DE&DEV_1251&SUBSYS_02A21028
NVIDIA_DEV.1251.04 = "NVIDIA GeForce GTX 560M "
NVIDIA_DEV.1251.05 = "NVIDIA GeForce GTX 560M "
NVIDIA_DEV.1251.06 = "NVIDIA GeForce GTX 560M "
Don't forget to save the modded inf file first - before running setup.exe -
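A quick way to sanity-check the edited inf before running setup.exe is to list every DEV_1251 entry and the Section it maps to, so you can eyeball that the section numbers line up. A rough Python sketch that just pattern-matches the line format shown above (it is not a real .inf parser):

```python
# Quick-and-dirty check of the edited nvdm.inf: list every DEV_1251
# device line and the Section it maps to. This only pattern-matches
# the line format shown in the example above; it is NOT an inf parser.
import re

LINE = re.compile(
    r"%NVIDIA_DEV\.1251\.(\w+)%\s*=\s*(Section\d+),\s*"
    r"PCI\\VEN_10DE&DEV_1251&SUBSYS_(\w+)"
)

def list_1251_entries(inf_text):
    """Return (entry id, section, subsystem id) for each 560M line."""
    return [(m.group(1), m.group(2), m.group(3)) for m in LINE.finditer(inf_text)]

sample = r"%NVIDIA_DEV.1251.06% = Section059, PCI\VEN_10DE&DEV_1251&SUBSYS_02A21028"
print(list_1251_entries(sample))
# [('06', 'Section059', '02A21028')]
```

Run it over the whole file and check that your card's SUBSYS id appears with the same Section as the existing 560M entries.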
It's the same heat sink (and the X-bracket has to fit the heat sink's screws), so I think it'll fit just fine.
Edit: I don't know if the 560M outperforms the Mobility 5850 for gaming purposes, but it should. -
Yes, it's possible in a Sandy Bridge system - I saw a similar Toshiba cranking the core of a 560M up to 970, stable, in the 3DMark Vantage online scores.
-
What is this EC you've updated to 1.18? (I don't know - so I'm wondering if it's something I have missed, and should update)
-
Thanks for that dutchess
However, I took a different approach. I only modified the [NVIDIA_SetA_Devices.NTamd64.6.1] area with the Section060 tag, like you previously mentioned, and then listed my GPU at the bottom of the list: NVIDIA_DEV.1251.06 = "NVIDIA GeForce GTX 560M "
Then I modified ListDevices.txt with the appropriate card ID tag in the nvdm.inf section of the text file.
I ran setup.exe, got an unverified driver warning from Windows, and told it to proceed anyway. It installed, I rebooted, and it looks like it's working - HWiNFO64 recognised it as a GTX 560M, so there's a good sign.
About to have a game with it now, just to check that everything is in tip-top shape.
Edit: what I was doing wrong was trying to install the nVidia redistributable package instead of running setup.exe. Might make a note of that somewhere; it took me a little while to figure out.
-
The EC version comes with the A09 BIOS update. After the BIOS is flashed, it asks if you want to update the EC; you type "y" (without the quotes) and it should update your EC as well.
-
Just make sure you load all the sections, the correct way as explained above; otherwise you might see bad performance from your freshly installed modded driver.
-
Would you be able to send me your inf file so I can see the differences between yours and mine?
-
irfan wikaputra Notebook Consultant
You should upgrade to at least a 6970M from the 5850M.
The 560M only scores about 1k more in 3DMark Vantage without overclocking. -
Can't you see it now? I mean, LOOK at the example above..
That's all you need to do the perfect install..
Just line up the section numbers to match the other 560M hardware IDs already in there, and done..
Oh no, not completely.. do a happy dance for 5 minutes
-
Download this one: http://ge.tt/9ccSSSC?c
I used it yesterday and it works perfectly. Just open the nvdm.inf in the zip file and compare. -
Yes, but that's only one driver, 285.62.. remember, each driver is different, as Nvidia keeps changing the section numbers per driver.
In the end it's better you learn for yourself how to apply this simple trick..
Cheers to a bit of self-engineering - we are proud M15x owners, hehe
-
I more wanted to review it out of self-interest, as my way is slightly different and seems to get the exact same result. I'll take a look at that zip file though.
(I'm a computer engineering major with electronics engineering as my minor - can never learn too much)
-
Ah, I see - sure.
In that case look here:
Drivers | laptopvideo2go.com
They also use a different approach for the .inf which works well. -
Hola, how are your benches now? Any nice scores lately?
-
It's all stable. The best bench I got in Vantage was 9900-something. I didn't really try to push those tests much - I focused more on getting to the highest stable clocks I could while gaming (Skyrim and SWTOR at the moment). The highest I got in that context was 865/750/1730.
-
The Revelator Notebook Prophet
Congrats on getting it stable and tweaked. Nice setup.
Good to see things are still active over here in my old stomping ground. I sure loved my M15x, but the mobo finally gave out, and it made no sense to replace it. -
Yeah, it's pretty nice. I was getting into the phase where I was mentally preparing myself to purchase a new machine (probably an M18x). While I do have the money to throw at a computer with that sort of price tag, it takes me a long time of mental exercises to actually do it. I like to turn over every penny before spending it.
I'm very happy that just buying this graphics card upgrade can postpone that for a year or two. -
I've been following the thread and just received my 560M; I'll be installing it tonight in my M15x with a 940XM. Just a few questions for 940XM owners:
1. Do I need to update the vBIOS of the card? If so, from where, and what's the method?
2. Are people with a 940XM facing throttling in games at 560M stock clocks, or only with an OC? -
Thanks for that link, dutchess - now to delve into the world of driver modifying, and why all these different approaches seem to end up with the same desired effect.
There must be some pros and cons to doing it different ways, or some reduced functionality ... it makes no sense to me that many different ways do the same thing. (In the programming world there are many ways to skin a cat, but there is always an optimal method.)
M15x and GTX 560M
Discussion in 'Alienware M15x' started by dutchess63, Aug 11, 2011.

