GT 650M, MSI GE60
I ran my overclocks almost half a year ago now. I hit, I think, around 1160 MHz @ 60°C max, LOL. The cooling in the GE60 seems to be really good. When I tried pushing it further my screen would go black, probably because it wasn't getting enough voltage. A pity, because I felt there was room left for a higher clock.
I usually only use the +135 MHz OC. Temps are fine and never go higher than 60°C.
-
-
Now that I finally figured out how to go above the +135 MHz overclock, thanks to some advice from daCM, here are my numbers:
Card Maker: HP
Model: GT 650M
Bios Version: Stock 80.07.35.00.2B
Stock Core speed: 745 MHz / 835 MHz with boost
Stock Memory speed: 2,000 MHz
Overclocked Core speed: 1,100 MHz
Max Overclocked Core speed: 1,150 MHz
Overclocked Memory speed: 3,100 MHz
Max Overclocked Memory speed: 3,135 MHz
Are these clocks your every day use clocks?: When running games
Are these your Max Clocks?: No
System: HP dv6t 7000
Max Temps: 85°C under load -
-
1195 MHz core, 3000 MHz memory. Did the PassMark 3D test (struggling with 3DMark with my OC'd RAM).
This score is higher than the average 670M scores. -
Did some fine tuning, and the max I can run without any mods is core @ +135 MHz and memory @ +280 MHz; if I go to +300 MHz I get artifacts (the offset arithmetic is sketched below).
I get 1840 3D points in PerformanceTest 8.0 with it.
Driver 331.82 WHQL -
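For readers trying the same thing, here is a minimal sketch, in plain Python, of how offsets like the ones quoted above simply add onto the stock clocks. The stock figures are the GT 650M numbers reported earlier in this thread, and the helper name is purely illustrative (not any tool's API):

```python
# Minimal sketch: a "+135 core / +280 memory" offset is just added on top of
# the stock clocks reported by the overclocking tool.
# Stock values below are the GT 650M figures quoted earlier in this thread.

STOCK_BOOST_CORE_MHZ = 835   # GT 650M boost clock mentioned above
STOCK_MEMORY_MHZ = 2000      # memory clock as listed in the thread

def apply_offsets(core_offset_mhz, mem_offset_mhz):
    """Return (core, memory) clocks after applying the given offsets."""
    return (STOCK_BOOST_CORE_MHZ + core_offset_mhz,
            STOCK_MEMORY_MHZ + mem_offset_mhz)

core, mem = apply_offsets(135, 280)   # the stable setting from the post above
print(f"Core: {core} MHz, Memory: {mem} MHz")   # Core: 970 MHz, Memory: 2280 MHz
```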
On edit:
James D,
I didn't mean to sound as if I don't believe or trust your opinion. I totally believe and understand what you are saying, as it looks like you have far more knowledge in this area than my random poking around has given me. I just meant that it's a risk I was willing to take, and if/when things do go belly up, I will be prepared to accept it. As it is, this overclock on the GPU is giving this laptop some extra life in my eyes anyway. If I wasn't able to do it, I would be replacing it, as things were getting too slow in some newer games. This gave it that extra little oomph that makes me OK with keeping this laptop around for another 6+ months. It has been a great (but very warm) laptop after all, and personally I am bummed that HP doesn't have a comparable replacement to take its place. I will just have to move on to a different company. -
Then realize that your GPU diode shows the die's (core) temperature, not the memory chips', which can go far, far above 100°C.
A nice thing to Google is "fried vRAM". -
3DMARK 11 score: 4234 (http://www.3dmark.com/3dm11/7683861)
GPU-Z: http://gpuz.techpowerup.com/13/12/19/nn.png
Card Maker: Lenovo
Model: GTX 660M
Bios Version: 2.07 (unlocked by svl7), which was then modified again to increase the 660M voltage from 1.100 to 1.150 (by ErYani)

Stock Core speed: 835 MHz
Stock Memory speed: 2500 (1250) MHz
Overclocked Core speed: 1300 MHz
Overclocked Memory speed: 3050 (1525) MHz
Are these clocks your every day use clocks?: Hell YEAH!!
Are these your Max Clocks?: Can get up to 1350/3100 (1550) with minor artifacts, which only appear in 3DMARK 11/VANTAGE. I mainly play DOTA 2 and get none in there, except for rare screen flickering caused by a combination of my screen resolution (emulated 1920x1080 on my 720p TV) and anti-aliasing settings. However, just to be safe, I toned it down to the above for every day usage.
System: Lenovo Y580
Max Temps: 72°C maximum during 3DMARK 11, measured by NVIDIA Inspector sensor monitoring.
Invalid result, see: http://forum.notebookreview.com/gam...erclocking-result-database-8.html#post9589503
Let me keep the Like XD -
I knew that Lenovo was a monster, I knew it!
-
-
-
EDIT: Although, I'd be interested in knowing how to do that on my Y580, if it's possible (which I don't think it is). -
What does GPU-Z say about the quality of your chip? -
Initially, I had an HDD that I moved from the standard SATA slot to the optical drive bay via a caddy, but then I bought an mSATA SSD (SX300), 256 GB, so that's the only storage/drive in my laptop, and I'm using the HDD as an external drive via USB. So I never put the optical drive back in and now the bay is just empty. Does that answer your question?
And I'm not sure what you mean by "quality of chip."
EDIT: Wait, are you saying I CAN put a second GPU in the optical drive bay?? Also, I just noticed that there is a third mSATA/PCIe slot which isn't being used (excluding the mSATA and Wi-Fi card slots), and after looking at some pictures of what mobile GPUs look like, the slot seems similar... so are they the same? How would you put a heat sink on it, and since it would be using more power, would a higher-wattage power adapter be needed, or further adjustments? -
The Y580 can't be modded with a second GPU in the optical bay;
only the Y500 and the Y510p can. -
Really? Then I gave him false hope.
@blowntaha Read ASIC quality. -
lol, I kinda already figured I couldn't. Where is ASIC quality? Isn't that a type of Bitcoin miner?
-
-
Mine reads 77.5%. What exactly does this number mean?
-
Nice chip. It's the quality of your GPU; the higher it is, the better.
-
So an ASIC reading of 77.5% is a nice one? Would it have been 100% when I first received it, or is that number static and never changing? Do different voltage settings (via BIOS flashing) affect this number?
Also, sort of an off-topic question, but I was looking to buy a new charger for my laptop (to keep in two different locations in case I forget to bring it) and I came across an ASUS G75 charger which is 150 W. Standard chargers for the Y580 are rated at 120 W. I know it fits because a G75 charger fits the ASUS K-series, and a K-series charger fits the Y580. So my question is: if I get this one, will it be able to provide better stability at higher voltages? I was also thinking about bumping my voltage to 1.175 or 1.15 for everyday use (I brought it down to 1.125 because of some weird glitches).
Here is the link for the charger: Asus G75 G75VW TS71 G75VW TS72 G75VW RS72 19 5V 7 7A 150W AC Adapter Cord | eBay -
If you got glitches, try lowering your memory clocks.
And the number is static; ASIC quality is pure chance.
The charger should work, but be careful, you can't overvolt as much as you want to. -
They're both 19.5 V, but the ASUS one is 7.7 A, whereas the Y580 one is 6.15 A. That makes the ASUS one 19.5 × 7.7 ≈ 150 W. From what I've researched, whenever you use a different charger you have to make sure the voltage is the same and the amperage is the same or higher, since the system will only pull what it needs (see the sketch after this post).
But what I'm asking is: would it contribute to better system stability if I were to increase the voltage from 1.125 V to 1.175 V AND use ThrottleStop to lock the CPU multiplier to stay in turbo mode (or rather keep it in turbo mode for a longer duration and more frequent bursts)?
What I noticed at 1.175 V and 1.15 V was that whenever I'd use ThrottleStop for Skyrim/benchmarking/etc. in addition to light OCing (e.g. 1250-1300 core, 2800-3100 mem), I'd get driver crashes and a few BSODs (all of which were the ones where something timed out for the GPU; I forget the exact message). -
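To put the adapter math above in one place, here is a quick sketch in Python using the figures from this post; the rule of thumb (matching voltage, equal-or-higher amperage) is the one described above:

```python
# Adapter wattage and compatibility check, using the figures quoted above.

def wattage(volts, amps):
    return volts * amps

def compatible(stock_v, stock_a, candidate_v, candidate_a):
    # Rule of thumb from the post: same voltage, equal-or-higher amperage,
    # since the laptop only draws the current it needs.
    return candidate_v == stock_v and candidate_a >= stock_a

y580_v, y580_a = 19.5, 6.15   # stock Lenovo Y580 adapter
g75_v, g75_a = 19.5, 7.7      # ASUS G75 adapter from the eBay listing

print(f"Y580 adapter: {wattage(y580_v, y580_a):.0f} W")          # ~120 W
print(f"G75 adapter:  {wattage(g75_v, g75_a):.0f} W")            # ~150 W
print("Compatible:", compatible(y580_v, y580_a, g75_v, g75_a))   # True
```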
Just an update, yes it does provide better stability, xD
-
3DMARK 11 Graphic score: 3267
Card Maker: Lenovo
Model: GTX 660M
Bios Version: 2.07 (Unlocked, by svl7), overvolted using erYani mod to 1.125V
Stock Core speed: 835 MHz
Stock Memory speed: 2500 (1250) MHz
Overclocked Core speed: 1290 MHz
Overclocked Memory speed: 2850 (1425) MHz
Are these clocks your every day use clocks?: If it isn't above 80°C during games; I might lower it a bit.
Are these your Max Clocks?: Yes, it has some artifacts during 3DMark 11 but no problems in-game (Borderlands 2).
System: Lenovo Y580
Max Temps: 84°C maximum during 3DMARK 11, 78°C maximum during Borderlands 2 (±80% GPU load)
ASIC Quality 82%
I love this laptop and GPU. Thanks to the modders, we can OC this card to its maximum potential.
I wish I could reach as far as Blowntaha (1300/3050), but my temps are getting outside of my comfort zone, so I think I will stop at 1.125 V and be grateful for what I got. -
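A small aside on the memory figures written as "2500 (1250) MHz", "2850 (1425) MHz" and so on in these posts: the first number is simply double the second, and this tiny Python helper just reproduces the notation used in the thread (the function name is illustrative only):

```python
# The posts in this thread write memory clocks as "doubled (real) MHz".
def as_thread_notation(real_clock_mhz):
    return f"{real_clock_mhz * 2} ({real_clock_mhz}) MHz"

for real in (1250, 1425, 1525):
    print(as_thread_notation(real))
# 2500 (1250) MHz
# 2850 (1425) MHz
# 3050 (1525) MHz
```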
3DMARK 11 score: 3983.
GPU-Z:
System: Lenovo Y580
Card Maker: Lenovo.
Model: NVIDIA GeForce GTX 660M.
Bios Version: I modified svl7's 2.07 unlocked BIOS by increasing the VBIOS voltage by ~10.23%, e.g. 1.10 V ==> 1.2125 V (reads as 1.2120 V in GPU-Z and NV Inspector, which I assume is the actual value; the arithmetic is sketched after this post).
Stock Core speed: 835 MHz (changed base clock to 1200 MHz in VBIOS).
Stock Memory speed: 2500 MHz (changed base clock to 2800 MHz in VBIOS).
Overclocked Core speed (Benchmarking/Everyday): 1400 / 1385-1390 MHz
Overclocked Memory speed (Benchmarking/Everyday): 2800 / 2800 MHz
Are these clocks your every day use clocks?: See above ^.
Are these your Max Clocks?: See above ^.
Temps for these applications:
- Idle: 42°-50° C (if my window is open)
- 3DMARK 11: 68°-70° C.
- Skyrim: 74°-75° C.
- Crysis 3: 76° C.
- Dota 2: ~73° C.
- Absolute max temp I saw was 82° C, but this was because: (A) my window was closed; (B) I forgot to take my laptop's bottom cover off; (C) it was a hot day.
- TL;DR: 42°-82° C.
Please ignore my earlier benchmark of 4234. It should, by all means, be considered false due to Virtu MVP. When I ran that benchmark, "HyperFormance" and "Virtual V-Sync" were both OFF, but Virtu MVP 2.0 itself was still enabled. For those of you who don't know, this program actually changes how graphics are rendered, sort of combining Optimus and SLI, and it works great for the 3% of games it actually supports (don't believe their "supported games" list).
Anyway, I wanted to try out Virtu MVP once again to (A) give it another chance and (B) see if it influenced my 3DMark 11 score. I turned off both HyperFormance and Virtual V-Sync, then turned off Virtu MVP 2.0 itself via the control panel, restarted, and my score was really crappy. Then I repeated those steps, only this time I didn't disable Virtu MVP itself.
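Spelling out the ~10.23% voltage bump from the BIOS line above, with the values from this post (the 1.2120 V shown by GPU-Z is just the rounded reading the card reports):

```python
# The VBIOS voltage increase described above: stock 1.100 V raised by ~10.23%.
stock_v = 1.100
increase = 0.1023                 # ~10.23 %
target_v = stock_v * (1 + increase)
print(f"{target_v:.4f} V")        # 1.2125 V; GPU-Z / NV Inspector show ~1.2120 V
```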