i have a couple OCing questions that i've been sitting on for a while, probably because i don't really know how to articulate them, so i apologize in advance if my questions come out convoluted.
i'm playing BL2 right now at 770/1650, and i'd imagine a ~28% core OC is fairly significant, but i could be wrong. i've actually been able to bench at 780/1950 without too much artifacting, but it definitely wasn't stable. if my limited understanding serves me, the only way to get a more stable OC is with an overvolt (OV), and the only way to OV is by flashing a new vBIOS. but i'm not sure where to even find a modded vBIOS. techpowerup, maybe? and i'm guessing what they host is stock vBIOSes, yea?
anyway, it seems like certain titles respond to OCing better than others. an OC that's stable in one game causes serious artifacting in another, and of course the temp climbs accordingly. the highest temp i've ever seen is 83C, but i don't remember in which game or at what clocks. in any event, it all seems very random to me, and i'm constantly tinkering with the clocks, +/-30-50 MHz depending on the game. i'd almost be happier if i could maintain 99% utilization and let the temp climb into the 90s+, because then i'd feel like i was actually pushing the card. i understand that sometimes everything in a given scene/section finishes processing and the card gets a break, but doesn't that mean i should be able to increase my settings?
i guess my biggest question has to do with utilization. one would think that if a game's graphics settings are set higher than the stock clocks can handle, utilization under an OC would be a steady 99-100%. but that's usually not the case; it's the minority occurrence, in fact. playing BL2 with said OC, peak utilization is ~78%. minimal artifacting, e.g. the occasional tear here and there during firefights, but i'm maintaining playable FPS (i.e. 40+, which is good enough for me). ironically, when i minimize the game to add to this post, utilization goes up to and holds at 99%. more confusing still.
can anybody help me with some conceptual knowledge? does all of this simply come down to how a given title is optimized for PC hardware? i know i've seen posts where meaker talked about pushing a 570m to a stable 800+ core clock. i assume that's with an OV, so can anybody point me to a vBIOS that'd allow me to OV? also, i've read all the various answers to this question, but what's the correlation between core clock and memory clock? where should one be set in relation to the other? and where does the shader clock fit into that picture? really looking for general knowledge here more than anything; even relevant wikipedia links are welcome.
-
HaloGod2012 Notebook Virtuoso
Check out the techinferno forums for a bunch of modded vBIOSes. There are different vBIOSes with different voltages; check to see if one is available for your GPU there. If not, you can request one.
Also, the OC does depend heavily on the game. Play some Crysis 3; it's the only game that truly tests my overclocks, more so than any benchmark. Optimization matters too. Bad console ports tend to hammer the CPU more than the GPU, whereas a well-coded game uses the GPU to the max. I've seen some games go from 99 percent usage to 60 percent in certain scenes, which looks like a CPU bottleneck or a poorly optimized section of the game; it happens in Tomb Raider a lot. If you get 60 percent usage in a game at stock, overclocking will actually lower that usage and still give you an FPS boost, but it means something else is bottlenecking the GPU or keeping it from being used 100% (could be vsync in a non-intense game, or the CPU being asked to calculate something it can't handle).
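A quick way to see the usage dips described above (99 percent dropping to 60 in certain scenes) is to log utilization and temperature while you play. This is a minimal sketch, assuming an NVIDIA card where `nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv,noheader,nounits` is available; the sample readings here are invented for illustration, not real measurements.

```python
def parse_gpu_log(csv_lines):
    """Turn 'util, temp' CSV lines into (utilization %, temperature C) tuples."""
    samples = []
    for line in csv_lines:
        util, temp = (int(field.strip()) for field in line.split(","))
        samples.append((util, temp))
    return samples

def looks_gpu_bound(samples, threshold=95):
    """If utilization sits pinned near 100%, the GPU is the limiter.
    Sustained dips well below that while FPS also drops point to a CPU
    or engine bottleneck instead."""
    return all(util >= threshold for util, _ in samples)

# made-up sample: two GPU-bound readings, then a dip like the Tomb Raider case
log = ["99, 78", "98, 79", "62, 74", "60, 73"]
print(looks_gpu_bound(parse_gpu_log(log)))  # False: something else is limiting
```

The threshold of 95% is an arbitrary cutoff; the point is only that a card which never reaches full utilization won't gain much from a higher clock.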
This is frustrating, and I always have to alter my clocks depending on the game. Crysis 3 handles a high OC on my 680m, but BioShock Infinite needs lower clocks to be stable. Every game engine handles the OC differently. I use Crysis 3 as my test since it makes the most use of any GPU and stresses the shaders heavily. It's all trial and error.
For your other question: always OC the core to its max first, get it stable, then play with the memory. Sometimes a memory OC can hurt a core OC, so don't overdo it. -
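The core-first, memory-second routine above is really just a schedule of trial steps. Actually setting clocks is tool-specific (NVIDIA Inspector, Afterburner, a flashed vBIOS), so this sketch only builds the list of (core, memory) pairs one might try; the start and end values loosely mirror the OP's 670M clocks, and the 50 MHz step is an assumption, not a recommendation.

```python
def oc_schedule(core_start, core_max, mem_start, mem_max, step=50):
    """Build a trial-and-error plan: raise the core alone until its max,
    then hold the core and raise the memory. At the first unstable
    setting in each phase, you'd back off one step and stay there."""
    steps = []
    core = core_start
    while core + step <= core_max:
        core += step
        steps.append((core, mem_start))  # phase 1: core only, stock memory
    mem = mem_start
    while mem + step <= mem_max:
        mem += step
        steps.append((core, mem))        # phase 2: memory, core held at max
    return steps

# e.g. stepping from an illustrative 620/1500 toward the OP's 770/1650
plan = oc_schedule(620, 770, 1500, 1650)
# plan == [(670, 1500), (720, 1500), (770, 1500),
#          (770, 1550), (770, 1600), (770, 1650)]
```

Keeping the two phases separate is what makes a crash unambiguous: if a phase-2 step artifacts, you know the memory clock did it, since the core setting was already proven stable.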
You may wish to try NVIDIA Inspector or MSI Afterburner to see about OVing before flashing a vBIOS. Secondly, I see you have a 670M, which is Fermi. I'm not sure how overvolting that card might work, but good luck with it. Just try not to keep your card at 90 degrees. Also, screen tearing isn't artifacting. Artifacting is when you get random crap on your screen that isn't supposed to be there. At all. Like the one time I installed a beta driver for my 280M and the screen went psychedelic on me. Or the time my card crashed and videos played back as green screens. Screen tearing isn't really much to be afraid of. As for your OCing ventures, I wish you luck, and keep your card cool =D
-
failwheeldrive Notebook Deity
Screen tearing is a product of the framerate falling out of sync with your display's refresh rate, not instability. Artifacts are typically little triangles or lines that flash across the screen. Like this: What happens when you overclock your GPU too far - YouTube
-
I know this is subjective, but unless you're stuck hovering around a constant 30-35 fps, I wouldn't overclock at all. -
OC help
Discussion in 'Gaming (Software and Graphics Cards)' started by mattcheau, Apr 15, 2013.