There is a lot of information about this GPU spread out across many different threads, so I thought it would be nice to consolidate it here for 580M owners, like the 570M thread. Feel free to brag about benches, clocks, etc. For those of you running an alternative vBIOS, feel free to post a copy to share with others.
So, I am copying some of the same questions from that thread here:
Those who have overclocked their 580M, can you please post the following details:
1. What are your OC frequencies and voltages?
2. Which laptop are you using?
3. Max temperatures under load?
4. What tool are you using for OC?
5. Are you using anything apart from stock config for cooling?
6. Which VBIOS are you using?
7. Are you having issues with throttling?
8. Any other tips? And post your benches!
As I said, I know this has been discussed, I just thought it would be great to have a one-stop thread for this information.
Many thanks!
To start us off:
740/1480/1600 in a MS-16F2
85-87°C in FurMark, 75°C max in any game
I use NVIDIA Inspector to OC, and Afterburner to monitor temps
Stock cooling
Stock vBios
No throttle problems
I can hit 750MHz without any noticeable artifacts in FurMark or ATITool, but 3DMark11 will randomly crash if I go over 740MHz. All my games seem stable, but I don't want to push the card too hard right now. At default settings with CPU physics I scored 3990, which seems a bit low to me.
Generic VGA video card benchmark result - Intel Core i7-2670QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P3990 3DMarks
-
Very nice thread, I'll be keeping an eye on it. Just ordered some PK-1 thermal compound to reapply; afterwards I'll be trying my hand at overclocking.
Have been running some slight OCs this afternoon, not really pushing anything.
Some results
3DMark Score 16547 3DMarks
Graphics Score 15908
CPU Score 18817
740/1480/1600
3DMark 11
740/1480/1600
P3974
Granted, I didn't push it hard, but it was completely stable. Running cool. -
Can either of you check the ASIC quality on the 580Ms you've got?
Download GPU-Z 0.5.9 to check that for us. It'll let us gauge what kind of % we need for the same overclock.
Also, you both know that you can flash the bios to allow a higher voltage right? -
My ASIC quality is 79.2%.
But it still overclocks to those clocks easily. I get high 70s for my temps. Could push it further, but I'm going to wait. I'm pretty happy with what I've gotten. -
Oh... Well at least you're not on a 460M with a quality rating of 74.6%.
-
Mine is 84.9%.
-
As for the bios, yes I know about the higher voltage, but not ready to try it out just yet. I think I will wait a bit for that... -
The stock voltage used may or may not be higher due to this rating. But even if it's the same as most other GPUs of the same line it may not be making full use of that voltage it's been running at. For example, I'll give the two examples of the first GTX 460 and second 460 I got after RMA:
(Voltages were read via GPUz and altered by MSI Afterburner)
-Original GPU-
GTX 460 #1 has a stock voltage of 0.962v, which is fairly normal for a generic GPU; this is the factory setting you will see for a reference GPU at minimum. The ASIC quality, though, is around 82.7%. So at stock clocks, 675/1350/3600, and at the stock voltage of 0.962v, we can assume that the GPU is running stably at the stock clocks while only utilizing a true voltage of 0.788v. Mind you, this GPU could overclock to 850/1750/4000 @ 1.087v, yet with the ASIC quality in mind, the GPU was only managing 0.891v, lower than the originally rated stock voltage.
-Replacement GPU- (Home was struck by lightning and the surge was enough to fry the power strip it was connected to. The PSU, GPU, and motherboard were fried in the process. This was recent, within the last two weeks. I paid for next-day shipping.)
GTX 460 #2 has a stock voltage of 0.962v just like the previous GPU. I check the ASIC quality and it's rated at 96.3%. Since the rating was considerably higher, I go to overclock the GPU shortly afterwards. I'm still hindered by the voltage cap of 1.087v; however, I try something different this time. Instead of pushing the voltage to the maximum, I try to push the GPU as far as it will go without increasing the voltage. I manage an overclock that is fairly decent in my book for stock voltage, 780/1560/3800, and then decide to push the GPU further with voltage. I end up with an overclock, hardly stable for 24/7 use, of 925/1850/4400 @ 1.087v. Vantage completes a single run fine, yet I don't care to keep running it since artifacts were clearly visible through the entire test. I eventually backed down to 890/1780/4200 @ 1.087v.
The voltage being regulated through the second card would have remained at 0.926v for the stock settings, considerably higher than the previous card, and the overclock would have been managing 1.046v rather than less than 0.9v from the previous GPU.
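To make the arithmetic above explicit, here is a rough sketch of that speculation (my own illustration, not anything GPU-Z documents; the figures for card #1 only line up if the ASIC is rounded to about 82%):

```python
# Rough sketch of the "effective voltage" speculation above:
# simply scale the voltage read in GPU-Z/Afterburner by the ASIC quality.
def effective_voltage(set_voltage, asic_quality_pct):
    return set_voltage * asic_quality_pct / 100

# GTX 460 #1 (ASIC rounded to ~82%)
print(effective_voltage(0.962, 82))    # ~0.789v at the 0.962v stock setting
print(effective_voltage(1.087, 82))    # ~0.891v at the 1.087v cap

# GTX 460 #2 (ASIC 96.3%)
print(effective_voltage(0.962, 96.3))  # ~0.926v at stock
print(effective_voltage(1.087, 96.3))  # ~1.047v at the cap
```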
This is all just speculation, so take it with a grain of salt. In principle, two flawlessly perfect GPUs at the same voltage should be able to overclock equally to one another. Using this logic, if one card has a lower voltage leak through the circuit board than a sister card, it should be capable of reaching higher clocks than the hindered card. -
Meaker@Sager Company Representative
I'm not sure you can apply the percentage to the voltage and come up with a number that means anything; it's just that the higher the leakage, the more current you waste and the quicker the chip gets hot at the same voltage.
-
That thread looks like it was all speculation, because the ASIC quality reading was brand new at the time. -
If that's the case, mine falls right in the middle, lol.
C'mon 580M owners, let's get some overclock feedback in here! -
Meaker@Sager Company Representative
Well, I have 92.6% and one of THE highest-clocking 570Ms in existence.
It may be different for mobile chips since they run at lower voltages, so the profile shifts might be skewed. -
Insufficiency of the PSU?
Like the GTX 470M vs the 480M, the GTX 570M has better cost/performance than the GTX 580M.
MSI-GX660
i7-920XM @ 3.5GHz
GTX 570M @ 840MHz / 914MHz
3DV P15400+
3D11 P4200+
I'm pretty sure the OCed 570M will outperform an OCed 6970M
-
Meaker@Sager Company Representative
Imagine that each chip has a different curve where the clock you get at a certain voltage rises then falls; higher-% chips rise and fall more quickly. While the lower-percentage chip may clock better at higher volts, it will use MUCH more power at that point.
At mobile voltages (at a guess) all chips are still on the upward slope. -
@mitsuhide this thread is to compare GTX580m scores. We can all go see the 570m thread to be impressed by your score. Thanks.
-
-
I just used Svet's tool, upped the voltage to 0.92v, and changed the clocks to 800/1600/1600, and it worked great.
3dmark 11 4200+
Generic VGA video card benchmark result - Intel Core i7-2670QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P4239 3DMarks
I'm gonna see if I can push it a little higher, this was just my first run. -
Looks like 820 is my max. Any higher and the machine simply shuts off under load. I have a 180 watt PSU so maybe I need a larger one, or my card simply can't go higher.
Generic VGA video card benchmark result - Intel Core i7-2670QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P4338 3DMarks -
Interesting. I'm thinking about trying it too... but I'm not even sure there's a need to, other than the "want to" portion.
-
-
yep pretty hard to resist...
Guys, have you tried gaming with an 800+ core clock? I know some 180W adapters didn't really like it, especially with heavy games like BF3; that's why I ended up swapping for a rock-solid 220W LiShin.
Meaker, seems like you've got someone to compete with on the 570Ms -
-
This appears to be my max:
-
My old 180W adapter (LiShin) suddenly shut down during a Vantage test @ 840MHz, so I got a bit frightened that the 580M was killing my adapter. In addition, weirdly, while I was playing BF3 at 800/1600 on the overvolted vBIOS, the game seemed less smooth than at 750/1500 on the lower voltage, due to the larger power draw required by the o/v.
As I also wanted to o/c my CPU with the bclk, I thought a 220W adapter would be more appropriate for my use, and now, as long as temps are well handled (less than 85°C), I can play/game safely at an even higher o/c than 822, though it might be a bit overkill.
Also, my machine never shut itself down, just the PSU.
edit: when I get back home, I will definitely try the new 29x drivers, as from what I've heard they give a performance boost in some games and 3DMark Vantage -
-
83 seems all right to me; I got 82°C max GPU temps while running Vantage at 835 core clocks without the fan mod, and if I remember right, last time I got 78 or 79 with the fan spinning at 100% from 65°C.
However, my CPU runs cooler than his, since his max power limit is 65W versus 55W for my QM.
Also, I don't recommend running FurMark with the 0.92v vBIOS at any clocks, since many Alienware users reported their 580M dying after too much benchmarking, FurMark, and the 0.92v vBIOS.
Shaun, bear in mind we also drilled holes in the back cover and added RAM heatsinks to help lower temps. -
Did this earlier:
Max after 5 minutes in furmark was 96°C.
EDIT: 3DMark11 score at the same clocks:
http://3dmark.com/3dm11/2887583 -
So you guys OC using MSI Afterburner?
May I know if I can OC while gaming and switch back to default while not gaming?
Is it safe to always do that? -
There is also a tool called Svet's vBIOS tool, which lets you program clocks directly into the vBIOS of your card. Then the card runs at the clocks you specify for a given power state all the time, without needing software.
I use the overclock all the time; when the GPU isn't under load it clocks itself down anyway. For normal web browsing and other light use, 90% of the time it runs at 50-70MHz. -
I also use NVIDIA Inspector. I use a bat file to overclock, run ThrottleStop, max out the fan, and set the power plan to "High Performance" when benchmarking or gaming.
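Roughly, that bat file does what the sketch below does (rewritten in Python for clarity). The install paths are assumptions, and I've left the actual overclock arguments out since the exact Inspector switches vary by version; Inspector's "Create Clocks Shortcut" button will generate the right command line for your card:

```python
# Sketch of the pre-benching routine described above (paths are assumptions).
import subprocess

INSPECTOR = r"C:\Tools\nvidiaInspector\nvidiaInspector.exe"  # assumed install path
THROTTLESTOP = r"C:\Tools\ThrottleStop\ThrottleStop.exe"     # assumed install path

# Switch Windows to the built-in "High performance" plan (SCHEME_MIN is its alias).
subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)

# Start ThrottleStop so the CPU multiplier stays pinned during the run.
subprocess.Popen([THROTTLESTOP])

# Launch NVIDIA Inspector; append the clock/fan arguments generated by its
# "Create Clocks Shortcut" button here to apply the overclock automatically.
subprocess.Popen([INSPECTOR])
```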
Did this before the earlier 3DMark11 run:
I don't think I can go much higher. -
Where can I download NVIDIA Inspector? -
I thought I got keylogged -
I drilled holes in the bottom of my case as well and it improved temps, but I haven't added heatsinks to the RAM. Unless you are talking about the RAM on the 580M, in which case mine was done with thermal pads.
-
-
I was talking about small aluminium RAM heatsinks placed on the copper sinks of the GPU heatsink. I am a bit scared to run FurMark, but what are your max GPU temps after Vantage and gaming? Do you have the turbo fan on?
-
May I know how come MSI-based notebooks cannot OC past 740 for gaming or benchmarks? They crash or freeze.
But I see people here with barebones going up to 800, which is insane -
As I expected, new drivers give a small boost on Vantage graphics score:
Generic VGA video card benchmark result - Intel Core i7-2720QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P18959 3DMarks
Max GPU temps 83°C, CPU 79°C
CPU score decreased though..
3dmark11 score a bit lower, max GPU temps 74°C, CPU 71°C.
Generic VGA video card benchmark result - Intel Core i7-2720QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P4613 3DMarks -
I'm not aware of MSI notebooks crashing or freezing above 740; that's news to me, but maybe someone else can answer. -
I could go over 740MHz with the default vBIOS, but it crashed above 750MHz in 3DMark11 or Vantage, can't remember which.
I used aluminium heatsinks not to replace the thermal pads (I use 0.5mm Phobya 5W/mK pads and AS5) but to dissipate more heat from the GPU and CPU copper sinks rather than leaving it all to the fan.
http://forum.notebookreview.com/msi/639104-ms16f2-along-gtx580m-dell-rev3.html
Re-ran 3DMark Vantage without the heater on next to my laptop: max GPU and CPU temps = 79°C and:
Generic VGA video card benchmark result - Intel Core i7-2720QM Processor,Micro-Star International Co., Ltd. MS-16F2 score: P19036 3DMarks
Did your GPU die?? What happened? -
Ah, I see what you mean now. A picture is worth a thousand words.
I got some 5W/mK pads as well. How come you're not using IC Diamond? Yes, my GPU died... at least that's the indication. I have another one coming, hopefully here soon, then I will know for sure if it's really the GPU or the motherboard. It just decided to die during a FurMark run. It was only at 72°C and had been running for some 30-60 seconds. A bit strange.
-
Shaun, many GTX 580Ms have died because of excessive use of FurMark and benchmarking. It started happening when the 29x drivers came out, and with the use of the 0.92v overvolted vBIOS. About 15-20 M17x R3 owners reported the death of their 580M under odd conditions: playing BF3 at 800/900 clocks, running FurMark at stock clocks but 0.92v, etc.
I do not advise running FurMark, then. -
Also, why does OCing at 740MHz cause more lag spikes?
I tried the default clocks; although it's 5-10fps lower in game, it stays green instead of spiking -
I would have to agree it's a killer. I was running at stock voltage. I must admit I had run FurMark countless times, perhaps too many. I got another GPU yesterday, and let me say a massive "THANK YOU" to Ken from GenTechPC, who has been really very helpful throughout this entire process; my system is now back up and running perfectly. Will I ever run FurMark again? No way... not even once. I'll stick to the regular benching from now on. We're all here because we're enthusiasts; we like to get the most we can out of our systems and we use the tools the industry hands down to us. But did I ever think a program could kill a GPU that has a good (air) cooling system? Perhaps I'm ignorant, but I didn't think that. Learnt the hard way though I did!!! I'm certainly interested to find out what component failed on the GPU, so if I get more info I'll certainly share it.
-
Glad to hear you got your new 580M.
Let's bench that thing again now that you're fully equipped, and forget about FurMark -
-
How did yours get damaged?
Please tell me so that I can avoid doing the same -