OK, I will help you overvolt your card. Be patient with me, my right arm is in a cast and it's hard for me to type; I'm also using the mouse left-handed, har har. Let me gather the resources necessary for you to overvolt.
@nissangtr786 please be quiet, no one cares about what you have to say about power consumption. Also, have you considered how much power you are wasting by trolling this thread, and all other threads, with your bad opinion on the 675M?
-
It's not a bad opinion though, is it? I've just seen YouTube videos of the 650M and it's amazing. The graphics look great in lots of games, as Kepler has more advanced features than Fermi. By the way, I know for a fact that someone who games for 1 hour on their 675M will use more power than if I used my laptop 8 hours a day trolling, or I could game for 12 hours in 1 day and it would take similar power consumption to a 675M user doing a 4-hour gaming session.
-
OK, here is the stuff you will need. Just extract the folder to your desktop or wherever for now.
Download the attached .zip.
Download nvflash from here and put it in the same folder as the .bat files from the .zip: Download NVFlash 5.118 for Windows | techPowerUp
Download NiBiTor from here and put it in the same folder as the .bat files from the .zip: Nvidia BIOS Editor (NiBiTor) 6.06 - TechSpot Downloads
-
-
Make sure you monitor temps. OV'ing a card would generally increase temps more than OC'ing it would.
-
Would the increase be large? I'm not sure about it then, because it's summer in California and temperatures will only be increasing. I would really like to keep my temps at or below 85°C.
-
It increases from whatever stock is to 0.92 V, and it's easily reversible.
Remember to run all of these as administrator.
Basically:
1. Once you have extracted everything I gave you into the same folder, run backup.bat.
2. Open backup.rom in NiBiTor. If it says it can't determine which device or something, click the 580M in the list, as it is the same card.
3. Click Tools, then Fermi Voltage Editor. Once that opens, click the list farthest to the right and move it from whatever stock voltage is to 0.92 V.
4. Save it as vbios.rom and run flash_vbios.bat. If it gives you an error and beeps, message me back; it means you have a mismatched board ID of sorts (this will probably happen). The mismatch can be fixed in NiBiTor.
If the mismatch happens, try to get a print screen of what nvflash is telling you, as it goes away very quickly; this could take a few tries.
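Before flashing, it may also be worth sanity-checking the edited ROM. This is a rough, optional sketch (a hypothetical Python helper, not part of the .bat workflow): a legacy PCI option ROM starts with the 0x55 0xAA signature, carries its size at offset 2 in 512-byte units, and its bytes sum to 0 mod 256. NiBiTor normally fixes the checksum when it saves, so treat this as a belt-and-braces check; the vbios.rom filename is the one from the steps above.

```python
# Sanity-check a vBIOS image before flashing (legacy PCI option ROM rules:
# 0x55 0xAA signature, size byte at offset 2 in 512-byte units, and all
# bytes of the image summing to 0 mod 256).

def check_rom(data: bytes) -> bool:
    if len(data) < 3 or data[0] != 0x55 or data[1] != 0xAA:
        return False                      # missing option-ROM signature
    size = data[2] * 512                  # advertised image size
    if size == 0 or size > len(data):
        return False                      # size field is nonsense
    return sum(data[:size]) % 256 == 0    # checksum must come out to zero

# usage: check_rom(open("vbios.rom", "rb").read())
```

If this returns False on a ROM NiBiTor just saved, don't flash it; re-save and check again.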
Post back with results! -
I expected it to have more steps; shouldn't be difficult. I'll be back in a bit to test out Battlefield 3 and Alan Wake.
-
OK cool, yes it's easy, but people make it hard. Also I can't play right now since my arm is broken, but what's your BF3 username? Add me, mine is bulletdropped.
-
Sure, I'll add you. Haven't played much of BF3 since I played it a ton on PS3 before trying PC gaming. Damn, the difference between PC and PS3 BF3 is huge. I was pretty damn good on the PS3, but PC is a different story. I'll probably start playing again once I purchase Premium, but I'm not sure if Premium is worth $50. Anyway, I'll add you right now.
-
I am just asking you to be cautious because you said that you were already in the high 70s/80s with stock voltage and OC, right? Plus we have a heat wave here (not sure about CA or whether you have AC, though).
If you can monitor them and make sure the GPU isn't going crazy, you should be fine. To answer your actual question, I have no idea, sorry. I don't have any hands-on experience with a 45nm card, so I don't know how much it will increase by. -
It's a 40nm card, and IMO there's no point overclocking.
-
Yeah, I'm going to monitor the temps with MSI Afterburner in-game. I will also have HWMonitor in the background to check the temps of everything else.
-
The minute you reference YouTube videos to call out how good Kepler looks compared to Fermi, or to judge image quality at all, is the moment you cease to have any credibility, if you had it to begin with. Welcome to my ignore list.
-
Dude, let it go. What is your problem? Are you being serious or are you trying to troll? This forum is going to the dogs because people like you exist and are allowed to randomly spew their nonsense unchecked.
-
Out of all the forums I'm on, I've seriously never seen someone be allowed to run amok for so long. Stating personal opinion as fact in a derogatory manner, while claiming a superlative position based on nothing more than flawed logic and hopes and dreams, is no way to go through life.
On topic, I really need to download MSI Afterburner and see what my 675M can do. Even though, you know, there's no point to it. I can barely afford my card's insatiable appetite for juice as is! I can't afford the projected $6,248 a month extra it'll cost if I up the speed. -
Let me know how that works out for you. I'm not really aching for any more performance, seeing how my GPU is around 84-86C under load depending on if I have the AC cranking or not (IT'S SO HOT IN NYC RIGHT NOW IT'S CRANKING).
-
lol, you say that a card is 45nm when I state the correct 40nm process, and I am the one talking nonsense.
Here is my issue with 670m and 675m buyers:
1. New technology like the 7970M was out, and the 680M and 660M are out as well now; the 680M came out like 2 months later.
2. The 660M performs virtually the same as a 670M, so why does anyone get a card that draws double the power of a 660M just to get a 670M?
3. Buying a 675M is like buying a 65nm QX6850 when something like the QX9770 was already out. 675M users were buying at a similar top-end price when I saw you could get a 7970M cheaper in some places.
4. Overclocking a 675M: the performance increase will not be much, but power consumption will rocket. It's like buying a Q6600, where power consumption doubles if you OC it slightly, while the newer 45nm chips OC the same and draw 100 W less.
5. In 2010, a 100 W card for a given performance level; in 2012, a 50 W card for the same performance; by 2014, probably less than 25 W for the same performance as a top-end Fermi.
6. Nvidia's CEO's main goal is to increase performance per watt, and Fermi is one of the worst generations for performance-per-watt improvement over their 55nm or 65nm cards. AMD at 40nm were miles ahead. -
I tried to run those out of curiosity but just got an error.
-
Finally I've got my BIOS updated...
This is what I needed to do...
First you need the latest version of nvflash; put the files on your DOS USB or DVD.
Then download these files.
Create a new folder on your DOS USB or DVD named "dpmi" and add the BIN and MANIFEST folders to that newly created folder (dpmi).
When booting DOS, type:
"\dpmi\bin\cwsdpmi -p" and hit enter (do this before you start nvflash).
"nvflash -b backup.rom" makes a backup of your current BIOS.
"nvflash -i0 -4 -5 -6 (romname.rom)" flashes your new .rom BIOS.
I hope it will help some people.
I'm not responsible for any damage or errors -
Dude you need to calm down. Not everyone has the same opinions as you.
-
The reason it's not working is an ID mismatch; you need to match the ID in NiBiTor. Just take a print screen of the error, then go into NiBiTor and change it so that they match. I'm not sure why you need to do this, but it does need to be done.
-
Thanks, but that is so small. Any chance you have a bigger version so I can read it?
-
No need for this; that's an error with nvflash. It happens even if you don't touch the ID, and I doubt he touched the subsystem ID.
Just force the flash and confirm. -
Hey mate, attempting to give this a go with my GTX 675M, however I'm having one or two problems. The first is that after loading the backup.rom I get this:
I left it on GeForce8 Series anyway; was that correct?
The next problem: you say to go to the Tools menu and Fermi Voltage. That appears to be greyed out, no?
Also, the device and vendor drop-down boxes are empty by default. I can find GTX 580M in the device box and I just chose Nvidia as vendor. Is that the correct thing to do? And as the Fermi Voltage option is greyed out, where do I go from here?
Thanks! -
Well, you need to try and make it think it's a 580M somehow, so I'm not sure, although I think the new NiBiTor will come out soon. Could you zip the ROM file and attach it so I can see? Thanks. Also, try the foil tape mod if you can, it really improved my temps.
-
View attachment backup.zip
There you go mate.
I've managed to find a thread with some modified ones already:
http://forum.notebookreview.com/alienware-m17x/656685-guide-flash-your-gtx-580m-gtx-675m-r3.html
What's this 'foil tape' mod? Please elaborate! Thanks! -
Here, this is it. I'm not sure how you would do it on an M17x, but basically you use aluminum foil tape so the air goes through the heatsink fins and doesn't re-enter the intake, by sealing the holes in the vent with foil tape (not the exit/entrance hole, just everything else).
http://forum.notebookreview.com/sag...0em-7970m-gpu-cooling-performance-foiled.html -
OK, everyone that has an M17x R4 can just use this BIOS: 675m92v.rom. Not for Clevo, as I'm not sure it's the same! Use the flashing method I outlined before.
-
I've left it for now though. I was slightly put off by the warning from the author of that other thread I linked to, saying not to run a benchmarking tool as it'll kill your card!
I think for now I'll just see how far I can push the clocks without too much heat gain.
Sent from my iPhone using Tapatalk -
Haha. Be wary, don't turn on your computer; eventually it will die...
-
Oh, I'm not concerned about overclocking's effect on the longevity of the graphics card, but it is more concerning when somebody says "overvolt, but don't run this application because it will kill your card".
Sent from my iPhone using Tapatalk -
Been toying with MSI Afterburner. I can push the Core Clock from 620 to 770 MHz and not see any problems in Furmark.
However, the second I increase my Memory Clock from 1500 MHz to anything above, I start to see artefacts and tearing etc.
Is Furmark the best way to see how your overclock is performing? Because I could load up Skyrim with memory clocks higher than 1500 MHz without seeing the artefacts and tearing etc.
Temperatures peaked at 67°C. -
Just out of curiosity, what is your ambient? Do you use a cooler or have you raised the back of your laptop or anything? I'm curious because 67 is really good.
Furmark is a good way to check the stability of your GPU under stress. To see how much of an increase you get in terms of performance if your temps stay low in Furmark, you can run other benchmarks. 3DMark11 is one of the most common ones.
To be honest, I don't think you (or anyone else for that matter) needs to keep a GPU or any other component overclocked while playing games, unless you cannot play a certain game at stock.
To check for performance, by all means, OC until you get artifacts or even crashes. That will tell you the limit to which, two years from now, you might want to OC to play games. If you can play all games at stock, keep those clocks. -
Heya, my ambient is around 40°C to 45°C. I live in the UK and we've forgotten what heat is this year, lol.
The M17x is sitting on a CoolerMaster U3 with the three fans placed under the laptop's exhausts. To the touch, the exhausted air is quite cool. However, when it gets up to 67°C the top middle portion on the keyboard side gets quite hot, yet the exhausted air remains quite cool, when I expected it to be quite warm.
I agree with your thoughts on overclocking though, and to that end I've left MSI Afterburner running at startup. When I want to overclock I can easily do so by right clicking the icon and choosing the OC profile.
Sent from my iPhone using Tapatalk -
You are getting 67 degrees in FURMARK with an ambient of 40 on a 675m?????
-
Yup. I used three applications to monitor temperatures: MSI Afterburner, SpeedFan, and the readout that FURMARK gives. It's evening now and temps are lower still.
Unless I'm doing something wrong, I'm reporting what it's telling me!
Overclocked using MSI Afterburner 730/900 (stock 620/750).
EDIT: I appreciate the burn-in test had not been running long in the screenshot above. It took me several attempts to get the screenshot, so I had Furmark running several times consecutively; the temperatures were always <65°C regardless of how long it had been running. -
Is this at that stage at the beginning or towards the end? I am not trying to question you. I am just surprised that the card is running so cool at that ambient. It obviously means that the cooler is doing a great job. I am just surprised that it is doing THAT good of a job.
-
The screenshot shows the benchmark in its early stages, but this is only because I had to go in and out of Furmark a few times to get a screenshot (F9 didn't seem to work, had to use Print Screen). Just prior to the screenshot it had been running for about 5 minutes and still didn't creep above 65°C.
I don't know what else to say
-
MSI Afterburner should create logs telling you the min/current/max temp. The temp will go down extremely quickly; it's the max temp that is relevant.
-
Yup. Min 32°C, max 65°C during that session. I'll do another benchmark later and log the temps to a file so you can see for yourself.
-
Here's the log from MSI Afterburner. Log from start to finish of Furmark's 1080 pre-set benchmark which runs for 15 minutes.
View attachment HardwareMonitoring.txt
One thing I am curious about though, which I'm sure someone can probably answer, is why the core and memory clocks drop periodically from the overclocked values of 730/900 to an underclocked value of 400/750? Is this some form of throttling? -
THERE we go. I KNEW the 675M wouldn't maintain 65ish with an ambient of 20 degrees, let alone 40 degrees. Yes, your GPU is throttling. Notice how it goes up to about 63-64 degrees with OC'ed clocks and then drops to 2D clocks, only to clock back up? That is why your card never heats up beyond 65. Have you tried gaming on that?
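If you'd rather see the throttling in the log than eyeball it, a quick script can flag the dips. This is a rough sketch with a hypothetical helper, assuming you've already pulled (seconds, core MHz) pairs out of the Afterburner log (the real log file needs a little parsing first); the 730 MHz OC clock and 400 MHz 2D fallback are the values reported in this thread.

```python
# Flag samples where the core clock falls back toward 2D clocks.
# Input: (time_s, core_mhz) pairs extracted from an Afterburner log.

def throttle_events(samples, oc_clock=730, tolerance=50):
    """Return timestamps where the core ran well below the OC clock."""
    return [t for t, mhz in samples if mhz < oc_clock - tolerance]

samples = [(0, 730), (5, 730), (10, 400), (15, 730), (20, 405)]
print(throttle_events(samples))  # -> [10, 20]
```

Any timestamps it prints are moments the card was throttled down, which matches the 730/900 to 400/750 drops in the log.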
-
Yeah gaming is fine for the games I run, which granted, with the exception of Skyrim, aren't that intensive. The Secret World, Diablo 3, World of Warcraft, Starcraft 2, Skyrim and Mass Effect 3 all run at highest settings comfortably. I've not yet tried Witcher 2 which I've yet to install.
I assume there is a method of unlocking the throttle? -
Haha, my 485M is overclocked more than your 580M, now that is funny. Seriously, you should overvolt it; Nvidia throttles its cards in Furmark anyway.
-
*tosses a cookie*
Marginally. Your core is 3 MHz higher. And you've had to do more than simply adjust clocks, i.e. thermal paste, custom cooler, foil tape, overvolting.
Like I said previously, when it comes to a point where I can't play games comfortably, I'll start tinkering more. What's the point in potentially shortening the card's lifespan when the games I'm playing at present run fine? It really doesn't matter if X game is running at 200 FPS or at 50 FPS; you're really not going to notice that 150 FPS difference. It's just bragging rights, which don't interest me in the slightest. -
Well, I run my screen at 90 and 120 Hz for 3D, so it does make a little bit of difference. But the other thing is, my card is running so cool that I have actually extended its life. Besides, when the GTX 780M comes out or whatever, I will upgrade at that point, so I only need it to last till then, not for 10 years.
-
Just as an FYI, overvolting itself does not reduce the GPU's lifespan. Overvolting allows the clocks to be pushed much higher, which increases heat even more (power increases as the square of voltage and linearly with frequency), so in all you have added a lot more heat to your system.
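To put rough numbers on that, dynamic power scales roughly as P ∝ f·V². With the 620 to 730 MHz clocks and 0.92 V from this thread, and a purely assumed 0.87 V stock voltage (check what your card actually reports), the combined bump works out to about a third more dynamic power:

```python
# Rough dynamic-power scaling: P ~ f * V^2.
# 0.92 V and the 620 -> 730 MHz clocks come from this thread;
# the 0.87 V stock voltage is an assumption for illustration only.

def power_ratio(f_old, v_old, f_new, v_new):
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = power_ratio(620, 0.87, 730, 0.92)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power")  # -> ~32% more dynamic power
```

This ignores leakage and static power, so treat it as a lower bound on the extra heat the cooler has to move.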
-
Hence the extra cooling via IC Diamond, the foil tape mod, and max fans. Also, I have ordered some extra blowers to add in and see if that will make any difference.
Sent from my DROID RAZR using Tapatalk 2 -
Oh yeah, I'm aware that the degradation in life expectancy is not the result of overvolting per se, but rather of the increases in temperature it causes.
Sent from my iPhone using Tapatalk -
Sorry, I should have quoted; that wasn't for you. Many times people read that increasing voltage will damage components. That is true if you increase it by a lot, beyond its max.
Overclocking a 675m?
Discussion in 'Gaming (Software and Graphics Cards)' started by esq, Jun 24, 2012.