Hello,
A few weeks ago I bought an Acer Aspire 6935G laptop and noticed that it has a high-quality cooling system, many cool features and so on. Overall it is a great gadget!
The specs of this laptop:
Model: Acer Aspire 6935G
CPU: Intel Core2Duo P7350 (2GHz, 1066MHz FSB and 3MB of L2 cache)
GPU: Nvidia GeForce 9600M GT with 512MB of GDDR3 memory
Display: 16.4" with 1366x768 (HD 16:9) resolution and 8ms response time
Memory: 4GB of DDR3 memory @ 1066MHz
Hard disk: 320GB 7200rpm
Drive: Blu-Ray
Speakers: True 5.1 with Tuba bass boost
So, I decided to get everything out of this beast in terms of graphics card overclocking. First I tried nTune and RivaTuner and noticed that it was impossible to overclock this GPU the way I used to on my desktop PC. So I googled a bit and found two great little programs called NiBiTor and nvFlash. NiBiTor scans your graphics card's current BIOS settings and saves them to your hard disk, so you can modify practically every single GPU setting that is worth changing, and some that are not (like the boot text, though you can impress your friends by editing it!).
Once you have modded the BIOS on your PC, you can flash it back to the card using a tool called nvFlash. If you want to do this, I would suggest searching for some tutorials on Google like I did; you will find everything you need to perform the overclock. All you need is an empty USB flash drive with at least 16 MB of space, which won't be a problem these days!
So, about my overclocking experiences a bit...
The factory clocks of this card are:
Core: 500MHz
Shaders: 1250MHz
Memory: 1600MHz
First I clocked this card to match the GeForce 9700M GT, which has a core speed of 625MHz, shaders at 1550MHz and the same 1600MHz memory. The 9700M GT itself is exactly the same card as the 9600M GT, just with slightly higher clocks. After two hours of ATITool's stability tests with a temperature monitor running at the same time, I noticed that the temperature of my card had barely risen at all.
So, I decided to push my card a bit more. I raised the core speed to 650MHz, the shaders to 1600MHz and the memory to 1700MHz and ran the same tests. After two minutes of testing my computer crashed, so I turned my clocks back to match the 9700M GT, which ran smoothly.
After this I noticed that the memory speed was the reason for the crash, so I decided not to overclock it at all and turned the core back up to 650MHz and the shaders to 1600MHz.
Surprisingly, the test ran for over 2 hours without any errors and the peak temperature according to GPU-Z was only 66C! I was happy with these clocks for a while and played GTA IV for many hours in a row at a LAN party without noticing any errors or instability. A few days ago I decided to push my card even more.
I raised the core clock and the shader clock by 25MHz after every 30 minutes of testing. I went up to 750/1675/1600, and after around 10 minutes of testing my computer crashed. Then I went back a bit to 725/1650/1600 and ran the tests for a bit over an hour. It didn't crash, but I noticed some graphical glitches and the heat went up to 80C. That was too many problems, so I turned the card down to 700/1625/1600.
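The trial-and-error above is essentially a step search for the highest stable clock: raise by a fixed step, stress-test, and back off after the first failure. A minimal Python sketch of that idea (the `is_stable` predicate and the 800MHz safety limit are illustrative stand-ins; in practice a "stability check" is an hour of ATITool or a long gaming session):

```python
# Hypothetical sketch of the step search described above: raise the core
# clock in fixed steps, stress-test each step, stop before the first failure.
def find_max_stable(start_mhz, step, is_stable, limit=800):
    clock = start_mhz
    while clock + step <= limit and is_stable(clock + step):
        clock += step  # the next step passed the stress test, keep climbing
    return clock       # highest clock that passed

# Toy stand-in for the stress test: pretend anything above 725 MHz crashes.
print(find_max_stable(650, 25, lambda mhz: mhz <= 725))  # -> 725
```

In the thread the real "predicate" was stricter than a short test suggests: 725MHz passed an hour of testing but still showed glitches, which is why the poster settled one step lower.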
At this point I ran the same test for over 100 minutes without a single error, as you can see here: http://hosti.us/oc/loadtest.jpg
The maximum temperature recorded was only 71C and everything went just fine. So I decided to keep these clocks and ran some benchmarks.
I am amazed how much I managed to get out of my laptop!
I got a 3DMark06 score of 6783, which is far more than I expected.
Here you can see the proof: http://hosti.us/oc/markrecord.jpg
Now my laptop performs better than ones equipped with a Radeon HD 4650 and almost as well as ones with a Radeon HD 4670!
I ran 3DMark06 8 times in a row and played GTA IV, FlatOut 2, CoD4, MW2, CS:S and the other games I have installed on this laptop, and all I noticed was a HUGE performance boost compared to the factory-clocked 9600M GT. I am getting 30fps in the GTA IV benchmark on medium settings and over 100 FPS in CS:S benchmarks with everything maxed out, including 16x AA and 16x filtering (though I lowered them a bit to get 200 FPS for fluent online gaming). Overall, I played on my laptop for 8 hours non-stop and still couldn't get the temperature above 71C, which is a completely acceptable load temperature! I am extremely happy with these clocks and the stability I'm getting.
I can say that overclocking the GPU on this laptop is a real deal: you gain a HUGE performance boost while keeping the temperatures at an acceptable level.
Keep in mind, though, that this will void your warranty.
Thanks for reading, this is my first post on these forums!
I wish you happy overclocking and low temperatures!
Greetings from Finland, SanZoR
PS. My GPU-Z validation can be found here: http://www.techpowerup.com/gpuz/ebpnr/
-
Great thread!
I have this laptop myself with the same specs as yours, and after reading this guide I did the overclock myself with the same speeds as yours.
Currently stress testing at 59C as I write this!
I might be tempted to go up a little more, but to be honest it's a very nice overclock as it is.
Regards -
Just an update!
Running 700/1650/1900 stable
after 2 hours of testing! Now how insane is that - and under 70C!
I might see if I can go 2GHz on the memory, but not just yet.
http://www.techpowerup.com/gpuz/2mng/ -
Hello to all. I am proud to own one of those 6935G machines.
I have OC'd mine for months now and even modified the heat sink on my GPU. Here are two pics: one showing the result of the job and one of my GPU with the red copper plate I added.
-
I sooooo want to know if this damn card will work in an Extensa
-
Glad to hear that my thread has helped people!
Just a little update: the card becomes unstable in DX10 gaming, so I would suggest 675/1600/1600 clocks instead. Those are 100% proven to work like a dream. However, if your card works just fine with the higher clocks, no problem.
-
Meaker@Sager Company Representative
I moved to the 4650 DDR2. It may lose in 3DMark06, but in modern titles it comes out ahead, as shown by its 2500 3D score in Vantage vs the 2000 of the 9600 (both overclocked).
-
Actually the 4650 DDR2 is a lot worse than the 9600M GT DDR3... If it had been the DDR3 version, it would be okay in my opinion.
-
Meaker@Sager Company Representative
Actually I've owned both, and the 4650 DDR2 is better; same machine, same CPU. -
It seems that the newest version of nTune (which has been renamed nVidia System Tools), version 6.05, works properly at least with 32-bit Windows 7.
I was able to achieve the same overclocking result on my 6935G simply by installing nVidia System Tools 6.05, overclocking the GPU core & shader MHz, saving it to a new profile and setting that profile to load every time Windows starts.
This solution is much more convenient as you don't need to play with the BIOS at all.
However, for some reason my 3DMark06 results were a little lower. The default 3DMark06 score was 5730, and after overclocking 6509 marks, though I settled on 700/1600/1600MHz (core/memory/shader). Still impressive!
(This was with 32-bit Win7 upgraded from 32-bit Vista, 195.62 nVidia drivers and the 1.13 BIOS.)
I was just wondering whether the Core 2 Duo Penryn P7350 processor could be overclocked even a little. Tweaking it could allow hitting the 7k 3DMark06 limit.
Overclocking the CPU is in theory possible with laptops too, but I guess nobody has done it with this specific laptop - or has someone..?
Hard to believe, as the speed difference between DDR2 and DDR3 is tremendous. Usually the same GPUs are ca. 30% faster with DDR3 memory than with the much slower DDR2.
But I won't argue, as I have no first-hand experience of the speed difference between those GPUs. -
Meaker@Sager Company Representative
A bit of a difference is that the 4650 DDR2 defaults to 500MHz, not 400MHz like most. Also, with overclocking it will usually reach 600MHz.
Most Acer PM45 laptops are SERIOUSLY locked down; you have to break out the soldering iron on the PLL to overclock them unless there are BIOS options. The 6935G is locked down. -
Meaker@Sager Company Representative
Here is a picture of my set:
http://i4.photobucket.com/albums/y143/Meaks/Photo0103.jpg -
Surely yes, but I still think the bottleneck in graphics is usually the graphics memory, not so much the GPU. So even though the HD 4650 may be a bit faster than the 9600M GT as a GPU, its DDR2 memory will be the bottleneck, which makes the 9600M GT (with DDR3, of course) the faster combination.
However, if you have any proof that shows otherwise (like 3DMark06 results), I'd be glad to see it.
Also, the 6935G's 9600M GT is overclockable from the default 500MHz to 700MHz, which makes the GPU run 40% faster. Combined with DDR3 memory this easily yields a tremendous increase, from 5100-5700 to 6500-6700 3DMark06 points, which is imho not possible with the 4650 DDR2 or any DDR2-based mobile GPU, as the bottleneck is not the GPU but the slow DDR2 memory.
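For reference, the 40% figure is just the clock ratio, and the observed benchmark gain is smaller because memory and CPU also contribute. A quick sanity check using only the numbers quoted in this thread:

```python
# Relative core-clock gain from the overclock described above.
stock_mhz, oc_mhz = 500, 700            # 9600M GT core, stock vs overclocked
clock_gain = (oc_mhz - stock_mhz) / stock_mhz
print(f"core clock gain: {clock_gain:.0%}")  # -> core clock gain: 40%

# The 3DMark06 gain reported in the thread is smaller than the clock gain,
# since the benchmark is not purely core-clock bound.
score_stock, score_oc = 5700, 6700      # points, upper figures quoted above
score_gain = (score_oc - score_stock) / score_stock
print(f"3DMark06 gain: {score_gain:.0%}")    # -> 3DMark06 gain: 18%
```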
I am sure not even the difference in CPUs (Intel Core 2 Duo P7350 @ 2.0GHz vs. Core 2 Duo P8700 @ 2.53GHz) has such a dramatic effect on the score, though it may give a few extra hundred 3DMark06 points to the HD 4650 version of the 6935G.
However, if you can show me otherwise with a screenshot of 3DMark06 results, I'd gladly see it.
So overclocking the CPU is simply not possible? Not even by manipulating the BIOS with some tool like NiBiTor etc.?
However, does overclocking the CPU make any difference to the gaming experience, or is the bottleneck usually the GPU?
I would also argue with this older statement of yours. I believe that 3DMark06 shows pretty well what kind of performance a rig has, as it runs the kinds of tests that most games playable on these machines use.
If you're referring to some newer DX10/11 techniques etc., I guess those are simply irrelevant, as these laptops cannot run any of the "high-end games coming within 2 years" anyway. So it simply doesn't matter whether the HD 4650 DDR2 gives 4 frames per second where the 9600M GT DDR3 gives only 3.5, as you cannot play the game at those frame rates at all. -
Meaker@Sager Company Representative
It's true even in 3DMark06; the score is in 3 parts:
SM2 (DX9)
SM3 (DX9c; this is what most modern games use)
CPU
With the same CPU I got the following scores:
SM2: (9600 = 2870, HD4650 = 2166)
SM3: (9600 = 2545, HD4650 = 2653)
CPU: (roughly the same)
As you can see, while it loses rather handily, it's at least not worse in SM3. Most modern titles now use SM3.
As for screenshots, here are the compares (I own 3dmark06)
HD4650:
http://service.futuremark.com/compare?3dm06=11772001
9600m:
http://service.futuremark.com/compare?3dm06=11603740
I also own vantage,
HD4650:
http://service.futuremark.com/compare?3dmv=1341933
9600m:
http://service.futuremark.com/compare?3dmv=1284100
PhysX was enabled so ignore the CPU differences. -
Quote (Meaker): "As you can see, while it loses rather handily, it's at least not worse in SM3. Most modern titles now use SM3."
As you can see, my 9600M GT gets over 3000 points in SM3, and I don't believe the ATI 4650 can achieve the same, at least not the ones with DDR2 memory. -
I started to wonder why I am getting about a 200-point lower 3DMark06 score than others in this thread.
I mean, it's reported that you get over 6700 marks in 3DMark with clocks of 700/1600/1625.
For some reason I am getting only about 6550 with the same clocks.
Could the reason be that my Windows 7 is 32-bit (and also upgraded from Vista), compared to others who have 64-bit Windows 7?
The difference is not much, but I'd still like to squeeze out those 200 points too (and after that maybe try to overclock the memory to reach a 7k score).
Can 32-bit Win7 be slower than 64-bit in 3DMark06? Or can an upgraded Win7 be slower than a fresh install?
I also switched to the same older nVidia 186.81 drivers as the others (as 195.62 has a problem where it disables the brightness control), but this had no effect on the score. -
To answer my own question:
http://blog.tune-up.com/windows-ins...mance-check-upgrade-install-vs-clean-install/
It seems that there are some differences between an upgrade and a clean install.
To get those missing 200 3DMark06 points I should format C: and try a fresh install.
However, does anyone know whether I can use the same upgrade license (original Vista license + upgrade ordered from Acer) for a clean install too? Can I find a proper Windows key somewhere on this machine? -
It seems that I was getting a little fixated on this :=)
I did a clean install of Win7 64-bit (surprisingly my Acer upgrade key from Vista to Win7 also worked with a cleanly installed 64-bit Win7).
After customizing Windows I installed and ran 3DMark06 and surprisingly got slightly better results:
On default settings:
3DMark Score 5800
SM 2.0 Score 2522
SM 3.0 Score 2368
CPU Score 1799
Then I set a 700MHz GPU clock, 1650MHz shaders and 850MHz (1700MHz effective) memory:
3DMark Score 6820
SM 2.0 Score 3150
SM 3.0 Score 2884
CPU Score 1768
Wow! I'm only 180 points short of the 7k mark I am trying to reach.
It seems that older 6935Gs may have better-clocking GPU DDR3 memory, as I have no trouble setting 850MHz (the newer ones seem not to overclock at all). Once I set 900MHz, 3DMark06 unfortunately crashed.
How the heck did TechMeh get the memory running stable at 1900MHz? Did you have to set 950MHz there? What was the 3DMark06 score? Did you go over 7000?
I hope I can somehow tweak this rig to reach 7000 marks stable.
However, getting a 3DMark06 score over 1000 points better already is not bad either and should add a lot of playability in GTA4, which is the title I was struggling with a little on this machine lately.
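A side note on the memory numbers flying around in this thread: GDDR3 is double data rate, so the "effective" figure is twice the chip clock you actually set in the tools. A tiny sketch of the conversion, using the values quoted above:

```python
# GDDR3 transfers data on both clock edges, so effective MHz = 2 x real MHz.
def effective_mhz(real_mhz):
    return 2 * real_mhz

for real in (800, 850, 950):
    print(f"{real} MHz real -> {effective_mhz(real)} MHz effective")
# 800 -> 1600 (stock), 850 -> 1700, and a "1900 MHz" result
# would indeed mean setting 950 MHz in the overclocking tool.
```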
-
Hello people,
Glad to hear that you guys have succesfully OC'd the laptop!
However, I had a terrible accident a while ago, and managed to brick the GPU of my laptop. Anyway, I just ordered a new 9600M GT for 170 USD incl. shipping, and should arrive within this week. Remember, DO NOT change the model or brand of this card in nibitor! It will cause your BIOS to screw up, it wont even post. Therefore, It cannot be fixed.
I am also eager to OC my CPU to 2.4GHz, which would give me some extra FPS point in gaming, and should be quite safe OC aswell.
To gain the best 3dmark06 score, remember to use the newest BIOS from acer and a fresh 64bit Win7 installation.
My SM 3.0 score is 2900 and SM 2.0 3115. This does beat the 4650 DDR2 card in any matter, so I would say that this card is much better for gaming.
Also, this runs suprisingly cool and stable under those relatively high clocks. -
Sad that you screwed up the GPU with NiBiTor. I guess it's just much safer to overclock with nVidia System Tools 6.05, as it seems to work properly with Win7 (both versions, 32-bit as well as 64-bit - I can confirm). I got the thing stable at 700/850/1650, with a max temp of 70C (max 68C after I vacuumed the bottom vents). (The P7350, by the way, runs very cool, max 54C, and I was able to undervolt it too after I found the ThrottleStop tool on this forum, as for some reason, at least with the 1.13 BIOS, the CPU started to throttle after 10 minutes of playing GTA4.)
I have to say that I looked into this too - the thing is that the CPU probably cannot be overclocked without soldering the motherboard or making a pin mod, which I have no experience with and have found no instructions for.
However, what I did was order a Q9000 from Verkkokauppa.com (as I'm from Finland too, btw) and I shall report here how it works (or whether it does at all). I managed to make a deal that I can return the CPU if it doesn't work. It seems that at least the Acer 8930s are sold with the Q9000 processor, so it may also work in the 6935G.
At least the P8700 should be compatible, but it's just not worth upgrading from the 2.0GHz P7350 to the 2.53GHz P8700 for 250 euros.
I guess even if the Q9000 works, the improvement in 3DMark won't be much, but e.g. GTA4 may benefit more from four cores. I'll let you know what happens.
I would be interested in knowing whether the GPU is replaceable, as e.g. a 260M or 280M would give much more "torque"...?
Then again, this is quite crazy, as purchasing a new Asus G60VX for 1200 euros would give me the 260M - if I replace any more parts I will have spent the same sum on this old rig.
I updated to 1.13; should I downgrade back? How much was the 3DMark06 difference between these BIOSes? -
Hehe, I think I saw you on murobbs some time ago.
Anyway, I am very interested to know whether that CPU works in this laptop! Also, I found this CPU on eBay: http://cgi.ebay.com/NEW-INTEL-CORE-...52QQcmdZViewItemQQptZCPUs?hash=item335999ccc8
It is 3x cheaper, has higher clocks and more cache, and the TDP is the same. If the quad core you bought works, this should work as well (?).
I hadn't noticed that the 1.13 BIOS was already released, as I have had my laptop unused for a few months now due to my GPU breakdown. I would suggest you keep the newest BIOS on it. I'll correct my last post as well.
[Translated from Finnish: Do you have MSN Messenger? It would be interesting to chat more about this laptop!] -
Dude, if it works, that's awesome... I'm also upgrading to a Q9000 CPU...
But keep us informed, man, I'd like to hear about a successful upgrade!
Masterleaf out -
Too bad the Q9000 didn't work in this laptop.
You should buy an X9100 instead; it is even more powerful in games, and cheaper. And it works, unlike the quad.
-
Yes, unfortunately the 6935G's BIOS didn't recognize the Q9000 when I tested it (blank screen, no boot).
However, the best upgrade options should be the X9100, T9900 or T9800 - whichever of those you get cheapest.
Also, the maximum stable clocks at default voltage for the 9600M GT were 680MHz GPU / 800 (1600) MHz memory / 1610MHz shaders. (Higher clocks also work in DirectX 9.0/9.0c tests like 3DMark06, but if you try to run anything DX10, like the Call of Pripyat benchmark, you'll get a blank screen.)
So stay with 675/800/1600 (or 680/800/1610 tops) if you want a 100% stable GPU. This should result in something like 6600 3DMark06 points, so it's not bad either.
I am also planning to volt-mod the BIOS and try to achieve slightly higher clocks. The plan is to find the maximum settings that are still 100% stable (ones that run the Stalker - Call of Pripyat test through properly).
In addition, I shall try to achieve a 100% stable 7000-7500 3DMark06 score by fitting an X9100 some day in the future (which should be possible, as the X9100 should give something like 2800 3DMark CPU points instead of the P7350's roughly 1800). -
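The DX9-vs-DX10 distinction in this post suggests that "stable" should mean passing both kinds of test, since some clocks survive 3DMark06 but fail Call of Pripyat. A toy Python sketch of that acceptance rule (the thresholds are the ones reported in this post, used purely as illustration, not as general limits for this card):

```python
# Toy model of the observation above: DX9 stress tests tolerate higher
# clocks than DX10, so a clock set is only "truly stable" if both pass.
def dx9_ok(core, mem, shader):   # limits seen in 3DMark06-style testing
    return core <= 700 and mem <= 850 and shader <= 1675

def dx10_ok(core, mem, shader):  # stricter limits seen in Call of Pripyat
    return core <= 680 and mem <= 800 and shader <= 1610

def truly_stable(core, mem, shader):
    return dx9_ok(core, mem, shader) and dx10_ok(core, mem, shader)

print(truly_stable(700, 850, 1675))  # DX9-only stable -> False
print(truly_stable(680, 800, 1610))  # passes both -> True
```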
Hello everyone.
After verrrrryyyy extensive work on my 6935G with the 9600M GT GDDR3 card, I can tell you not to try volt modding on it, especially not with NiBiTor. My stable clocks are 620/992/1550. My GPU can also run 740/900/1600 stable, but there is not really much improvement, so I just use the first set, which is proportionally correct.
-
Could you please explain more precisely why that is?
Do you mean that there isn't much improvement achievable - or that the risk of bricking the GPU is too high? (Especially with NiBiTor? Why?)
The thing is that my stable GPU clock at default voltage is 680MHz (700MHz also works with DX9, but not with DX10).
Also, I cannot get my GDDR3 memory to 992MHz, not even 805MHz, as only 800MHz seems to be stable in DX10 (850MHz works fine in DX9).
Also, your shaders seem to be lower, as I am able to get 1610MHz without problems, 100% stable in DX9/10. With DX9-only stability the shaders also go to 1675MHz.
So - it seems to me - there's some difference between our systems: you may have a GDDR3 chip version that clocks higher, as mine won't go past 800MHz.
Btw, did you also run the DX10 benchmark (S.T.A.L.K.E.R. - Call of Pripyat)? It seems to be the ultimate test for GPU stability. (Even if 3DMark06 runs fine, Pripyat crashes at some clocks.)
For me, 680/800/1610 seemed to be the maximum for Pripyat to run fine. This gives something like 6600 points in 3DMark06 (with the P7350 CPU, of course).
700/850/1675 also went fine in 3DMark06, but Pripyat crashed, so those were not stable settings. They would have given an unusable 6850 points in 3DMark06 (usable with DX9 games, but not with DX10 games).
I suggest everybody download the S.T.A.L.K.E.R. - Call of Pripyat benchmark for stability testing. It seems to be a better stability tester than anything else.
But why should I not volt mod? If you managed to squeeze out the clocks you mentioned with stable settings, I guess the improvement is worth it. Or what was the reason not to volt mod with NiBiTor? Any other tools, then? -
Also, as mentioned before, I am planning to replace the P7350 CPU (2.0GHz / 3M cache) with an X9100 (3.06GHz / 6M cache) some day in the future. This would improve the 3DMark06 CPU score from 1800 points to 2800 points, which, with stable GPU clocks of 680/800/1610, would give something like 7000-7400 points in 3DMark06 (by my estimation) instead of the current stable 6600 points.
However, in case the volt mod helps squeeze a few more stable MHz out of the GPU, GDDR3 and shaders, I guess with the X9100 even 8000 points in 3DMark06 wouldn't be very far away (possibly not achievable with 100% stability, but something like 7500-7700 could be).
We'll see as soon as I find a suitable X9100 or T9900 on eBay.
Even the X9100 should run pretty cool after lowering the CPU voltage. -
Hello again. OK, I will try to explain my experiences with this machine. My config is 4GB of RAM and a P8400; the GPU is known from the post above. If you look at my first post in this thread, you should understand that I have modded the heat sink of my GPU, which gives me more OC stability, especially on the memory. Now, why I am sceptical about volt modding: I have learned that if I change the voltage to 1.15V with NiBiTor and reflash the vBIOS, the card gets very unstable - after a Windows restart it will either get into a black-screen loop (black screen for 5-10 sec, Windows screen for 1 sec, ...) or show strange lines on the screen, even with standard memory clocks.
This somehow makes me think that the card is very unstable for volt modding, or that NiBiTor applies more power than it should. I don't know, but maybe there is a way to increase from 1.05 to 1.07 or 1.08, max 1.10 - too many maybes here.
Hope this gives you a little more to go on with your OCing.
-
Okay - and thanks for your explanation! It's much appreciated.
So I guess there is nothing to gain by volt modding if the GPU goes unstable.
Did your GPU suffer any damage from volt modding, or did the stability come back after restoring the original voltage?
What I am wondering is how modifying the heat sink would give any more stability, as imho the temperature isn't the problem here (or the reason for instability): at my maximum stable clocks the temp goes up to 68C but no higher. (I guess the GPU should work even at 90-95C.)
So in my opinion, modifying the heat sink should have absolutely no effect on stability, as the temperature isn't the issue at all in this case; the GPU, memory and shaders just won't go past the MHz I mentioned in my previous post.
Could you also explain what kind of modification you made? In the picture you posted before, it seems you added some kind of copper plate between the heatsink and the GPU? How would this affect the temperature, given that the current default heatsink also has a copper bottom? -
Hello. No, there were no issues after reverting to the original voltages. I guess there are other things to adjust if one sets the voltage higher, like timings and so on, but I'm not sure at all. And the original heat sink doesn't have any contact with the memory modules, so I connected the memory modules with the copper plate visible in the pic, which in turn gives me a little overclocking room for the memory too. And yes, this baby runs pretty cool anyway. I didn't try your Call of Pripyat benchmark tool, but I play Crysis every day and that game can push the GPU. With 700/1750/900 clocks I achieved 3017 SM 2.0 points, and that's the best so far on my PC.
-
Okay, so it's plausible that the memory modules get hot and crash, and there may be a little room for overclocking them if the modules are cooled. In my tests, however, the modules crashed instantly if the clock was raised at all, so that didn't really support this theory.
The problem is that from low to medium-high settings Crysis runs in DX9 mode. Only the highest settings activate DX10, so it doesn't actually tell you anything about DX10 stability even if it runs stable on medium settings.
If you are interested enough, I suggest you download the Stalker Call of Pripyat Benchmark, as it runs completely in DX10 and gives a very exact result on stability.
It seems that even though 3DMark06, GTA4, Crysis or any other DX9 game will run for hours without problems, activating the DX10 features on this GPU causes an instant crash if it's overclocked too much.
So I believe that if you run Call of Pripyat you will need to lower your settings a little to achieve DX10 stability, as it will probably crash at some point. The free benchmark utility can be downloaded e.g. from here:
http://downloads.guru3d.com/S.T.A.L.K.E.R-Call-of-Pripyat-benchmark-download-2433.html
I would really like to know what kind of settings are achievable with DX10 stability when the memory modules are cooled with a copper plate.
Stability should be 100% certain if the test runs through (4 different weathers). If it crashes (= blank screen, need to reboot) before giving the results, then it's not stable. -
I have to correct myself on my last post: in 3DMark06 SM 3.0, which is DX10 I believe, I got 3017 points at 1024x768. And you have made me experiment some more today; here is what I got with the new clocks in 3DMark06 at the native 1366x768 resolution. I will try to push it to 3000 points in SM 3.0 at the native screen resolution.
And I will try your benchmarking tool too. Also, if I'm correct, Crysis shows me DX10 mode no matter what settings I use; I will come back on this.
-
Please don't change the default resolution of 3DMark06 (which should be 1280x1024), as the results are then no longer comparable (most of us only have the free version of 3DMark06, which doesn't allow changing the resolution).
However, the most interesting part would be if you downloaded the Stalker Call of Pripyat Benchmark and tried to run it. If it goes through with your clocks (especially with memory over 900MHz) I will be really impressed and would immediately like more information about your copper-plate memory heatsink mod.
Until you have run the Pripyat Benchmark through, I remain sceptical about the stability of your clocks. For me it seems that the GPU won't go past 680MHz, the memory over 800MHz or the shaders over 1610MHz without Pripyat simply crashing to a blank screen.
Please try it to see whether your system is stable at the higher clocks.
-
As you can see, for example, here (see the four bottom images):
http://www.incrysis.com/index.php?option=com_content&task=view&id=511
DX10 mode is used only when the Crysis settings are on "Very High". Otherwise (Low, Medium, High) they're DX9. -
Hello again. I have downloaded your STALKER benchmark tool and haven't changed any of the settings there; here is the result.
-
And don't think that Crysis runs in DX10 only on Very High - not after what I found on Google.
-
Here is the 3DMark06 result with standard settings.
-
Well, I stand corrected. It really seems that your clocks are stable. Wow, amazing - over 7200 points.
Still, the weirdest thing is how your memory goes to 922MHz but the GPU "only" 620MHz. It seems that overclocking the memory gives the best performance. Edit: I just noticed that you had changed the GPU to 660MHz and the shaders to 1650MHz - and the test ran through fine. Did this extra stability come after the heatsink modification? Weren't you able to go past 800MHz without it?
Could you tell me about the materials - how thick a copper plate did you use? Is it simply put between the standard heatsink and the memory chips & GPU? Was there no contact between the memory chips and the standard heatsink?
If I could somehow get the memory over 900MHz, the GPU to 680MHz and the shaders to 1610MHz, I think with an X9100 CPU even 8000 3DMarks wouldn't be very far away.
-
I am afraid you will come to suffer some throttling problems if you upgrade your CPU to one with a higher TDP than the original your PC came with. Hope I am wrong, anyway.
-
Hey guys, I'm having problems with overclocking my 6935G. I can't even run 3DMark06 with 625/1550/1600 clocks - I just get a black screen during the second test. And heat is not the problem: the temperature doesn't even get over 75 degrees.
I have the 6935G-944G32Bn model, which has a T9400 @ 2.53GHz and a 1920x1080 screen. Currently I'm using Dox's customized ForceWare 195.62 driver.
Any ideas? I really want to play GTA IV. [Translated from Finnish: And I'm from Finland too.]
-
Sksm, there are many factors that could hinder your ability to OC your video card; excessive heat is just one of them. Some cards will simply crash if you push the clocks too far, even if your temp is perfectly fine.
You can't compare your OC to another's, since video cards are not made equal.
Drivers can also play a big part in your OC: if you can't OC well with one driver, you might as well try another and see if it works better for you. -
I have tried other drivers too. I know video cards are not made equal, but it's weird how others can push this card quite far while I can't even get it stable at the same clocks as a 9700M GT.
-
Well, those who can push the card very far got a quality piece of silicon. I have seen people with a 9800M GS get to 9800M GTS speeds easily at the default 9800M GS voltage, whereas I had to raise the voltage a bit to support the GTS clocks, unlike most 9800M GS owners.
If your video card can't OC as well as others, consider yourself unlucky. It's nothing abnormal. -
For some unknown reason even my P7350 started to throttle after playing 10 minutes of GTA4. (This wasn't because of heat, which was 46C max on load; I guess there is just a faulty thermal sensor and/or the newest BIOS 1.13 simply has an extremely low throttling temperature.)
However, I read this forum and found the excellent tool called ThrottleStop, which, as its name suggests, stops the CPU from throttling. (This shouldn't be a big problem, as it seems the P7350 won't go any higher than 46C.) ThrottleStop also allows the CPU to be undervolted, so temps shouldn't even get that high.
Okay, as for the X9100: it naturally creates more heat, but from what I have read on this forum, some people have swapped in this CPU just fine, and with a little undervolting the temp shouldn't go any higher than 55C.
So I cannot see much of a throttling/heat problem on the CPU side.
But what I would be interested to know is what kind of modification you did for the GPU. I found some copper plate (thin, 0.5mm foil actually) at my local hardware store and wasn't sure whether it has enough thermal capacity, as it's so thin.
How thick was the copper you used? What were its measurements? Does it go over the memory chips as well as the GPU (or only the memory)? Did you need to remove the aluminium cover around the GPU completely?
I think I could add this copper sheet to see whether the memory chips are any more overclockable; however I think it's not about the heat but simply the memory specification, which could allow yours to go higher.
Were you able to overclock the memory at all before the modification? (My memory goes to 850MHz, but Pripyat crashes. Only the 800MHz setting seems to keep Pripyat from crashing.) -
Hello. I could not overclock the memory past 825 before the mod, so there is a clear improvement with it. I used a 1.5mm thick plate; I don't know how well it will work if you use 0.5mm. I removed the thermal pads from the memory modules, applied a little AC5, and used two copper plates which I cut into an L profile so they cover all 4 modules, with the bigger copper plate (visible in the picture) on top, over the core and the rest of the MXM module. I didn't remove the original bracket that covered the memory modules and thermal pads; I just cut a hole in the copper plate. I wish you luck with your modding and I'm sure you are going to improve your GPU performance with it.
-
Thanks for the info, though I already got into this project before receiving your reply.
I have to say that at first I was very sceptical about this - I thought no dramatic improvement would be achievable by modifying the memory heatspreaders. But the results amazed me!
What I bought was a 0.5 mm thick copper plate (as no thicker ones were available today - Sunday - at the local store), and I also used 30% silver paste from Nexus.
I removed the original sticky pads on the GPU card - possibly not a very good thermally conducting material - and cut two small plates to replace them, each covering two memory chips. (Because the plate was so thin, I had to double the plates up with some silver paste in between, just like making a sandwich.)
Anyway, this way the thermal conductivity should show some improvement, but I had no idea whether it would actually gain anything. When assembling the thing back together I also noticed that the contact between the GPU memory heatspreader (the aluminium piece around the GPU that is screwed onto the card itself) and the GPU heatspreader (the part the copper heatpipe goes to) may be bad or nonexistent. I decided to put a thin layer of silver paste in this area as well.
After starting the machine I ran the Call of Pripyat benchmark with the GDDR3 memory at 810 MHz and was amazed. Wow! Some improvement.
Then I went to 820 MHz. That ran fine too. 830 MHz also. What the heck? Then I went to 850 MHz and was surprised that it finished properly.
What the heck is going on? Of course I went further and put 900 MHz on! Now it's gonna crash for sure, I thought!
Heck! The Pripyat benchmark ran just fine with the memory clocked at 900 MHz.
So far I haven't tried any further, but I guess the limit could be somewhere between 900 and 920 MHz.
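The stepping above (810 → 820 → 830 → 850 → 900) is essentially a linear search for the highest stable clock. As a sketch, with `find_max_stable` and its stability-test callback standing in for the manual flash-and-benchmark cycle (hypothetical names, not any real tool's API):

```python
# Linear search for the highest stable memory clock, mirroring the
# manual 810 -> 820 -> ... -> 900 MHz stepping described above.
# The is_stable callback stands in for the manual cycle of flashing
# the modded BIOS and running the Pripyat benchmark at each clock.

def find_max_stable(start_mhz, stop_mhz, step_mhz, is_stable):
    """Raise the clock in fixed steps until the stability test fails;
    return the last clock that passed (or None if none passed)."""
    best = None
    clock = start_mhz
    while clock <= stop_mhz:
        if not is_stable(clock):
            break
        best = clock
        clock += step_mhz
    return best

# Example with a fake stability test that fails above 900 MHz:
print(find_max_stable(810, 960, 10, lambda mhz: mhz <= 900))  # -> 900
```

With a real (slow) stability test, a binary search over the same range would need far fewer flash-and-test cycles than fixed steps.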
I couldn't believe that modding the heatspreader with small copper plates would give such amazing results.
I guess 3DMark06 runs just fine at over 7000 points with my stable clocks of 680 MHz GPU / 900 MHz GDDR3 / 1610 MHz shaders.
(Unfortunately the Futuremark ORB was down when I ran the test, so I couldn't see the results.)
The amazing thing was also that I wasn't able to get my GPU temp over 64 °C after the modification, even though I ran several tests and played almost an hour of GTA4. Previously it had been 68-70 °C.
PS. Before running the tests, SanZoR and I speculated that there are at least two types of memory on these cards: one is Samsung's and the other is Elixir's. I have the first kind (which wouldn't go even a MHz over 800 MHz stable), and the second one was supposed to be the overclockable one.
However, I am very glad to see that even the Samsungs go crazy when you mod their spreaders with proper copper plates. The improvement from 800 (1600 effective) to 900 (1800) MHz is just amazing!
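For scale, that overclock's bandwidth gain on the 9600M GT's 128-bit memory bus works out like this (128-bit is the card's published bus width; the formula is the standard one for double-data-rate memory):

```python
# Memory bandwidth for the 9600M GT's 128-bit GDDR3 bus:
# bandwidth = effective (DDR) transfer rate * bus width in bytes.
BUS_BITS = 128

def bandwidth_gbs(effective_mhz):
    """Effective (double-data-rate) clock in MHz -> bandwidth in GB/s."""
    return effective_mhz * 1e6 * (BUS_BITS / 8) / 1e9

stock = bandwidth_gbs(1600)   # 800 MHz real, 1600 MT/s effective
oc    = bandwidth_gbs(1800)   # 900 MHz real after the copper mod
print(f"{stock:.1f} GB/s -> {oc:.1f} GB/s (+{100 * (oc / stock - 1):.1f}%)")
# -> 25.6 GB/s -> 28.8 GB/s (+12.5%)
```

A 12.5% jump in memory bandwidth is a big deal on a card this bandwidth-limited.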
When I have the X9100 there's the magical 8000-point 3DMark06 barrier to overcome after that.
(Well, not really, but 7600-7800 points could be possible.)
-
Nice to hear that.
I could never have expected this either when I started thinking about it a year ago, but there you go.
I don't remember what type of memory I have; I didn't pay attention when I modded it. But anyway, I'm happy for you. Now I'm investigating how to modify the fan RPM steps, because as you say, mine never gets hotter than 64 °C either, and at that temp the fan is only at the second of its 5 speed steps. I am trying to get it to at least the third or fourth step, or even max (the fan is not that loud at max anyway); then we could overclock even more than we can now.
-
Well, I am quite happy that the fan is not loud - and I have no interest in trying to push it to run faster.
Still, I shall try to push the memory a bit higher; possibly, like yours, 920 MHz could be the stable maximum.
(However, my fan seems to use all of the steps from first to fifth depending on the load on the CPU & GPU. It will go up to the 4th or 5th when running 3DMark06 or playing GTA4, but I think that's normal - it's just trying to cool the CPU & GPU down by spinning faster. And to me it's not super loud at the 5th step, but I wouldn't want it running there always/by default.)
But what I am going to do next is swap in the X9100: my new CPU, purchased from eBay, was shipped today and should arrive within a couple of days. I am pretty sure I'll break at least 7500 3DMark06 points quite easily now, possibly even 7800.
I will post the results as soon as I get it. Hopefully it works (and runs cool & stable undervolted).
I should also get a new 8-cell 4800 mAh battery to replace the default 6-cell 4400 mAh one (which is already dead after 8 months of use). In my tests it seems to give a lot more battery life: over 3 hours overclocked and running desktop applications. In power-save mode I think 4 to 4.5 hours are achievable. -
Well, I am happy you are satisfied with your results. I don't think I would gain much more from a little extra cooling, but it would probably ensure even more system stability. Even though it's very difficult to control the fan on this model, I will try to mod the DSDT table so it kicks the fan to full when the temp goes over 60 °C. My max stable memory clock is 992 MHz and I don't expect more there. I have tried everything from voltage to other mods, but immediately above 1000 MHz the GPU starts behaving weirdly. Waiting for your scores with the new CPU.
my Acer Aspire 6935G graphic card overclocking experiences
Discussion in 'Gaming (Software and Graphics Cards)' started by SanZoR, Nov 26, 2009.