Yeah, I opened my default BIOS in NiBiTor and it would only let me go to 1.03v, so I flashed both cards. I don't really want to go higher because of the damage I might do.
And my GPUs' temps are always below 80°C; idle is about 50°C.
-
moral hazard Notebook Nobel Laureate
Did you open the notebook and check what the VRAM is really rated for?
You might get lucky and find that it's OK to go higher than 1GHz. -
According to this article that would be a 20% improvement, pretty impressive for a 12% OC. The CPU they used is the same (i7-920XM); did you OC the CPU? Sadly they don't have any GPU score result, but you can always rerun the test at stock clocks to have a reference value.
-
An update:
I put my clocks as follows:
GPU - OC voltage to 1.03
Stock: 600/1000/1500
OC: 672/1090/1695 - as of right now that's a 12%/9%/13% increase.
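(Those percentages check out; here's a quick sketch in plain Python, using only the stock/OC numbers quoted above, for anyone who wants to verify them:)

# Overclock gain per clock domain, values taken from the post above (MHz).
stock = {"core": 600, "memory": 1000, "shader": 1500}
oc = {"core": 672, "memory": 1090, "shader": 1695}

for domain in stock:
    gain_pct = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain}: {stock[domain]} -> {oc[domain]} MHz (+{gain_pct:.0f}%)")
# core: 600 -> 672 MHz (+12%)
# memory: 1000 -> 1090 MHz (+9%)
# shader: 1500 -> 1695 MHz (+13%)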
I just did a 3DMark Vantage run; it passed, but I can't see my latest scores because it was a trial version and you can only see one result. That sucks, but hey, they've got to market the thing.
So you can see my previous scores above, but I don't know the latest, unless I go on Pirate Bay and get a copy from there.
CPU - i7-920XM
Used RealTemp to alter multipliers and TDP/TDC.
All 4 cores are set to x25, and TDP = 75 / TDC = 62.
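(For reference, the multiplier maps to core speed through the 920XM's roughly 133 MHz base clock; that base-clock figure is taken from the chip's published specs, not from this post:)

# Effective core clock = multiplier x base clock (i7-920XM BCLK is ~133 MHz).
BCLK_MHZ = 133.33  # assumed from Intel's specs for this chip
for mult, label in [(15, "stock"), (24, "max stock turbo"), (25, "the all-core setting above")]:
    print(f"x{mult}: {BCLK_MHZ * mult / 1000:.2f} GHz ({label})")
# x15: 2.00 GHz (stock)
# x24: 3.20 GHz (max stock turbo)
# x25: 3.33 GHz (the all-core setting above)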
**EDIT**
I just read that post/article, Daniel, and I can't believe it only got an 80% score. I know it gets poor results for battery life and weight, but you can't class a Sager/Clevo with an HP/Dell/Sony or others; even Dell's Alienware is finding it hard to keep up (hope I didn't offend anyone). OK, it's a bit on the heavy side and the battery lasts less than an hour, but this is a proper gaming machine and media viewer, better than an Xbox/PlayStation. This is without a doubt a gaming machine with an LCD screen. All I'm saying is, an 80% score? What a crock of sh.it -
It seems that the FX 3800M's stock core is already 675MHz.
Wondering what its 3DMark result would be... XD -
-
Hi all, can anybody help me out here?
http://forum.notebookreview.com/sager-clevo/554761-help-overclock-my-np8760-2.html
Trying to OC my GTX 280M and it didn't go very well.
Thanks in advance -
Hi
I'm wondering how safe it is to overclock the memory on a GTX 285M chip?
When overvolting to 1.03, the farthest I dared to push it was 1100 MHz, 10% over stock. The gain in benchmarks was good, but seeing that the memory chips of the 285M are rated at 1000 MHz, this might not be a good idea?
I've seen people pushing the memory up to 1200 MHz, though...
I guess what I'm asking is what would be within safe limits. -
I can give you some principles to follow:
There are two types of overclock. The first is the overclock for benchmarks, in which you push the components to the limits of stability, run some benchmarks, post the results, and then revert back to the stock settings. This is usually the case for people who OC, for instance, up to 1200 MHz. Usually this will not negatively affect your components because it's only done for very short periods of time, with pauses between benchmark tests. However, if you want to run an OC 24/7, that's a different story and you should try to stay within some safety limits. -
-
And I want to know: what is a safe voltage to overvolt to?
-
A safe voltage to overvolt to would be 1.03v. NOT 1.3v!! Just remember to compensate for the extra heat; temps can be 10-15 degrees higher than stock depending on the game and the overclocks.
-
J.P.@XoticPC Company Representative
Those temps look a little too hot for my tastes. Even under stress your GPU shouldn't be going above 90°C. Have you ever repasted your GPU to see if you can bring down temps? Or tried simply clearing away dust from the exhaust ports?
-
Thanks, I'll open it up and take a look
-
How's Skyrim working for everyone?
-
Skyrim? ....
-
Can anyone explain why overclocking my video card seems to make it run slower?
I'm using nVidia System Tools, and GPU-Z and 3DMark06 report the clock speeds correctly, but I'm getting P5552 at stock clocks and P2632 at 750/910/1550. Yeah, the clock speeds seem high, but I haven't been getting any artifacts or other glitches. -
-
GPU max 79°C, CPU max 92°C.
-
I could explain to you... but it's complicated.
In short:
Any CPU or GPU has built-in functions that correct the errors it creates when processing information. When you overclock the card, the number of errors increases, and the GPU has to compensate by allocating more resources to correcting itself. If the performance gained from overclocking is not enough to offset the extra resources the GPU needs to correct itself, then you will get diminished performance. -
Artifacts can show up even before high temperatures do; sometimes you get good temperatures, but the artifacts indicate that the video card can't handle such an overclock. -
moral hazard Notebook Nobel Laureate
If you set it to a ridiculous number like 750, the driver will crash for a second and recover, but it will then stick to throttled clocks (monitor with GPU-Z: use the monitoring tab and click on the clocks so that they show you the average).
That's why your score is so low.
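(GPU-Z's average-clock readout is the tool actually being recommended here; purely to illustrate the same idea, here's a rough sketch of logging clocks and comparing min/avg/max. It assumes NVIDIA's NVML Python bindings, pynvml, which only exist for cards and drivers far newer than a GTX 280M, so treat it as a modern-day equivalent rather than something that would run on this machine:)

import time
import pynvml  # assumed: pip install pynvml; needs a card/driver new enough for NVML

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Sample the core clock once a second for a minute while the game/benchmark runs.
samples = []
for _ in range(60):
    samples.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS))
    time.sleep(1)
pynvml.nvmlShutdown()

avg = sum(samples) / len(samples)
print(f"core clock: min {min(samples)} / avg {avg:.0f} / max {max(samples)} MHz")
# If the average (or the minimum) sits far below the clock you set, the driver
# has recovered from a crash and is holding the card at throttled clocks.
-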
I thought it was a little too good to be true.
But I've been going up incrementally and just game testing without running benchmarks, and it either works or hard-locks. Is there an actual way to tell how high you can go before the driver crashes, without running endless loops of 3DMark? -
Yes:
Download -
Great app, thanks. But is the GPU stress test supposed to make the GPU core temperature keep rising? I mean, I moved the safe temp cutoff all the way up to 95°C and it still got there within a couple of minutes. This is just on stock voltage.
-
Yes, it will get the core hot as hell. Generally I let it run for 10 minutes.
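(If you'd rather keep your own eye on temps during a run instead of trusting the app's cutoff, the same kind of NVML sketch as above can act as a watchdog; again, this assumes pynvml and a card new enough to support it:)

import time
import pynvml  # same assumption as the earlier sketch: NVML bindings on a supported card

TEMP_CUTOFF_C = 95    # the safe-temp cutoff mentioned above
DURATION_S = 10 * 60  # the ten-minute run suggested above

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
start = time.time()
try:
    while time.time() - start < DURATION_S:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{time.time() - start:5.0f}s  {temp} C")
        if temp >= TEMP_CUTOFF_C:
            print("Cutoff reached - stop the stress test now.")
            break
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()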
-
How hot?
Also, does GPU-Z correctly report the vGPU? Because I flashed a modded BIOS but it still only maxes at 1.000V -
When stress testing, don't go above 100°C. My current card has been as high as 103°C. -
Is there another way besides GPU-Z to check the vGPU?
Never mind, it shows up correctly as 1.05v now. But how do you get OCCT to run for ten minutes? My GPU went from 50°C to 101°C in 3:11!