Always, except when MOAARRR money is involved, and then I have to ask permission, be on my best behavior for a few weeks, and do quite a few special favors.
-
-
Hello, what is going on with the 960M GPU now? 350.12 is the first driver where the 960M is included as a supported GPU in the Nvidia datasheet. Will the 347.88 driver recognize the GPU?
-
It might. If not, you'll need to mod the .inf to install it.
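For reference, the usual .inf mod is just adding your GPU's PCI hardware ID (copied from Device Manager) to the display driver's INF if it isn't already listed. Here's a rough sketch of checking for it; the device ID and the extraction path are placeholders, not real values:

```python
# Sketch: check whether an extracted Nvidia driver package already lists a
# given PCI hardware ID in its display INFs. Both the ID and the path below
# are placeholders - copy yours from Device Manager (Details > Hardware Ids)
# and from wherever the installer extracted itself.
from pathlib import Path

HWID = "PCI\\VEN_10DE&DEV_XXXX"                       # placeholder device ID
DRIVER_DIR = Path(r"C:\NVIDIA\DisplayDriver\350.12\Display.Driver")

for inf in DRIVER_DIR.glob("nv*.inf"):
    text = inf.read_text(errors="ignore")
    if HWID.lower() in text.lower():
        print(f"{inf.name}: ID already listed, no mod needed")
    else:
        print(f"{inf.name}: ID missing - this is where you'd add your card")
```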
-
@LunaP How big are your crevices? You always make my day funnier!
Is it bendable? lol -
-
Coool, may I ask which .inf and which part of it I need to change?
-
Besides not having any cooling problems, we've already had all of the fox kits we're going to have, and we've both already been fixed by our veterinarian. So, there's nothing to worry about in terms of accidental sterilization.
-
-
Do you mean because it'll run hot, or it's thin enough to cause mechanical instead of thermal damage?
-
Perhaps both
I dare not use this sterilization tool. NEVER... Although I have 3 kids, I don't want to be sterilized anyway... Perhaps this Razer laptop can become a substitute for US states that lack the poison for executions. It can be used to cut off the head, heat the crematorium oven, and you can write the death certificate on it. 2 birds with one stone. Or 3.
-
-
Robbo99999 Notebook Prophet
Is that a quote from someone with an A17 R2? They should be alright with a 240W brick, especially if they use one of the later system BIOSes, because the earliest system BIOS didn't allow the PSU to be used to 100% - it would make the system throttle when getting to about 85% PSU capacity (from my memory). He should use the latest system BIOS and that 240W brick - that should result in zero throttling at stock settings. -
Yes... But the newer BIOS versions have a poor fan profile. Do you remember which BIOS it was in the Alienware 17R2 Mr Fox tested? Mr Fox ran the AW17R2 all the way up to 280W during benchmark tests.
-
Robbo99999 Notebook Prophet
No, I don't remember unfortunately, but if I was that guy I'd just use the latest version of the system BIOS on Dell's website. The first (earliest) system BIOS is definitely not the one to use though. -
But most AW17R2 users have temperature problems with all the other BIOS versions. Poor fan profiles, like the previous AW17 and 18. You can also get throttling problems with the AW17R2 because of high temperatures.
-
Robbo99999 Notebook Prophet
Haha, I see! Looks like they'll just have to test different BIOS' to see which are best for them. If I had the A17R2 though, I'd repaste to maximise cooling & then choose the BIOS that didn't limit power from the PSU. -
@Mr.Fox, I guess it was the 1st BIOS right? A00??
-
Please continue conversation about BGA/PGA here - http://forum.notebookreview.com/threads/bga-vs-pga-continuation.775786/
I split it as best I could. Now let's keep this thread for nVidia and their vbios overclock blocking, thanks.
As of now, 347.88 ignores the bit and 350.12 does not. -
Thanks, this topic was way off track; it was difficult to find and read what is actually useful to me. As I am a new Nvidia GPU user with the 350.12 driver installed, I wonder if others can undervolt with Nvidia Inspector (with an earlier driver)? I only have the option for a memory overclock, but undervolting would be more interesting.
-
Maxwell cannot undervolt because the vBIOS already has huge fluctuations in its voltage tables (meant to keep power draw in check), so undervolting at all would cause SERIOUS crashes.
-
Yep. First thing I recommend doing when you get Maxwell is to go into the voltage table in the vbios and remove the dynamic voltage adjustment nonsense.
-
I will go ahead and check if I can extract the vBIOS from the main BIOS. Hopefully, once those dynamic nonsense entries are removed, I'm good to go for undervolting.
-
Well... technically... if you remove the dynamics and force it to a high voltage then proceed to undervolt, the benefit is lost AND you will draw more power/run hotter all the time. It'll definitely be stable, but if you are looking for say... a straight improvement in battery life or power draw or heat output, keeping it stock would likely work best. Just be wary of the tradeoffs.
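To put rough numbers on that: dynamic power scales roughly with frequency times voltage squared, so pinning the card at its top voltage hurts most at idle. A quick back-of-the-envelope sketch with made-up clocks and voltages (not measured values):

```python
# Rough P ~ f * V^2 comparison of a stock dynamic voltage table vs. a
# vBIOS forced to one high voltage. All numbers are illustrative only.
def rel_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

stock_idle  = rel_power(135, 0.85)    # hypothetical low P-state voltage
forced_idle = rel_power(135, 1.00)    # same clock, pinned at max voltage

print(f"Idle power penalty from pinning voltage: "
      f"{forced_idle / stock_idle - 1:.0%}")   # ~38% more with these numbers
```

With those made-up numbers the pinned card burns roughly 38% more at idle for zero performance gain, which is exactly the tradeoff above.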
-
Hi guys. I was told by the OP that my video card, an NVIDIA 620M in my UX32VD, is not affected by this "clockblocking".
I seem to have a separate but maybe related problem. The core clock slider is still available to me (tried on 347.88 and 350.12), but when I hit apply, the clock does NOT actually change to the new value. The memory clock adjustment works fine. Thoughts? This is on Windows 8.1 Pro. OC was previously working fine on Windows 7, but with a much older version of MSI Afterburner (not sure which). -
Dump Afterburner and get Nvidia Inspector. It is the only tool that works reliably on mobile cards.
-
I should have mentioned that I also tried that, and the same thing happens. I can adjust it (the shader clock in this case; the GPU clock, which is grayed out, goes up with the shader clock), but it doesn't apply. The "Current Clock" remains at 624/625MHz. The memory, on the other hand, can be adjusted.
-
Downgrade the driver. If the problem goes away then nVidia has in fact blocked your card. If it remains then your vbios doesn't support core frequency adjustment.
Actually that's a Fermi card. I'm not sure how those worked for overclocking.
A quick search said to use Afterburner 3.0.0 - you can get it here at the bottom of the page.
Keep in mind that overclocking is going to give a very minimal increase in actual performance with such a low-end card, and Fermi already runs hot, so watch your temperatures.
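If you do try it, it's easy to keep an eye on the temps from a script while you test clocks. A minimal sketch, assuming nvidia-smi came with your driver and actually reports temperature on this low-end mobile chip (not guaranteed):

```python
# Poll GPU temperature once a second via nvidia-smi while a benchmark runs.
import subprocess
import time

for _ in range(600):                                  # about ten minutes of samples
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True)
    temp = out.stdout.strip()
    print(f"GPU temp: {temp} C")
    if temp.isdigit() and int(temp) >= 90:
        print("Too hot - back the overclock off!")
        break
    time.sleep(1)
```
-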
I thought it was a Kepler card? http://en.wikipedia.org/wiki/GeForce_600_series
-
Nope, it's a Fermi GPU. GF117 (28nm die shrink of GF108): http://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_600M_.286xxM.29_Series
A lot of the low-end 600M, 700M, and 800M cards are still Fermi. -
The GTX 850M with the 350.12 driver on an HP Envy 15 isn't blocked, I think.
http://imgur.com/gYn07JP -
Overclocking is not driver-blocked by 347.88 or 350.12. The difference is that systems that shipped with a locked vBIOS can indeed overclock in 347.88, as it ignores the vBIOS overclock bit, whereas nVidia either left that override out of 350.12 or intentionally reinstated the check.
It is the older 347 drivers that completely blocked Maxwell overclocking, even with a modified vBIOS. -
675M is also Fermi.
-
As is 670M
-
Wow, noted. OK, so Afterburner 3.0.0 indeed works at setting the core clock! Why would the newer version not support the older cards? Looks like the card also auto-boosts on stock clocks from 625MHz to 716MHz (+91MHz). Overclocked +100, it automatically boosts from 725MHz to 810MHz (+85MHz). Is this normal?
Is there an older version of NVIDIA Inspector that would also work? Afterburner limits the GPU clock to +162MHz but it seems to be perfectly stable there, and I noticed there is no limit in NVIDIA Inspector. -
Afterburner is functioning as intended. Your vbios is indeed limited to +162MHz. I am not familiar with Fermi boost table behavior but it doesn't sound abnormal. What are your temps like though? Fermi has the nickname "Thermi" because of how hot it can run so keep an eye on it.
As for why it isn't supported in a newer version, I'd imagine it was just deprecated. Fermi is an old architecture. It goes all the way back to the GeForce 400 series. nVidia just insists on using them for their lowest end cards still. -
Not exactly, here's what I've found:
1) I was finally able to set the GPU clock with NVIDIA Inspector if I first run the command line: nvidiainspector.exe -forcepstate:0,0 (see the sketch at the end of this post). This lets me go past the +162MHz limit at the end of the slider in MSI Afterburner. Boosting no longer happens; whatever is set is what's used.
2) Once pstate 0 is forced by Inspector, Afterburner ceases to be able to apply the core clock (reverse of original problem)
3) The voltage automatically adjusts itself with higher clocks (from 0.925V stock to 0.975V at 890MHz).
4) The temps are about 85C. I'm guessing 90C is the bare limit. It's also about 27C room temperature here right now.
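In case anyone wants to script step 1 instead of clicking around, here's a rough wrapper around that exact command, plus an nvidia-smi readback to confirm the clock stuck. The Inspector path is just a placeholder, and whether nvidia-smi reports clocks/temps on a GF117 is an assumption:

```python
# Force P-state 0 with Nvidia Inspector (the command quoted above), then read
# back the current core clock and temperature to confirm it took effect.
import subprocess

INSPECTOR = r"C:\Tools\NvidiaInspector\nvidiaInspector.exe"  # adjust to your install

subprocess.run([INSPECTOR, "-forcepstate:0,0"], check=True)

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.gr,temperature.gpu", "--format=csv,noheader"],
    capture_output=True, text=True)
print("Current core clock, temperature:", out.stdout.strip())
```
-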
Congratulations on figuring it out. I was going by what I read online.
Your temps seem okay. Fermi desktop cards with ridiculous fans were known for 90s under load so yeah. -
The problem is that even with the GPU at stock clocks and with 0 GPU load, since the heatpipes are shared, the GPU will quickly bring the CPU to 90C if the CPU is under max load. Without the GPU active at all, the CPU tops out at 85-87C. This is with the TDP limit removed.
-
I would keep the lowest frequency state near the highest and undervolt these. Minimizing power draw, heat, and fan noise is my goal. The 15W TDP CPU is already undervolted and I save like 1.5-2.8W under full load, which is pretty cool. I think a ~50W GPU could run 5-10W cooler and it would make a real difference.
-
What clock speed does a new i7 need to maintain in order not to bottleneck a 980M in any game? I'm underclocking and undervolting my 4720HQ in my P35X v3 in order to get the temps down.
Sent from my GT-I9300 -
Robbo99999 Notebook Prophet
I guess you're aiming for 60fps gaming; if it were 120fps then the CPU would make a big difference. I can only talk from my own experience with my old Sandy Bridge CPU that maintains 2.6GHz across all 4 cores (8 threads). None of the games I play are limited by my CPU at 60fps: F1 2012, Titanfall, Metro Last Light, Metro 2033 Redux, Batman Arkham Origins, Grid 2, Bioshock Infinite, Elite Dangerous, Far Cry 4, Far Cry 3, Tomb Raider. The Metro games are the ones that hit my CPU the hardest, but they don't limit my fps. I might find my CPU would limit me in Battlefield 4, but I don't have that one. So given my CPU is OK at 2.6GHz, and given that per-clock performance has increased in modern CPUs since then, I think we can say for 60fps you'll be fine if you can maintain 2.6GHz, and probably OK with 2.2GHz even. (I've heard GTA V hits the CPU hard, so you might need higher clocks for that one.) -
I'm aiming for 60fps at 3K resolution, will that make a difference? In War Thunder, my fps seems to be lower when the clock speed goes under 3.2GHz...
Sent from my GT-I9300 -
Robbo99999 Notebook Prophet
Well, you might need to maintain 3.2GHz then. You'd have to monitor your CPU usage and clocks on each thread & also your GPU usage & clocks to be sure. (The 3K resolution shouldn't make a difference, I don't think; it's the fps that you achieve that determines your CPU usage - 60fps is 60fps.)
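If it helps, here's a rough way to log exactly that while you play - psutil for the per-core CPU side and nvidia-smi for GPU utilization. Treat it as a sketch: it assumes psutil is installed (pip install psutil) and that nvidia-smi reports utilization on your card:

```python
# Log per-core CPU load, CPU clock, and GPU utilization once a second while
# you play; if the GPU sits near 100% while every core has headroom, the CPU
# isn't the bottleneck at that framerate.
import subprocess
import psutil

for _ in range(60):                                   # roughly one minute of samples
    cores = psutil.cpu_percent(percpu=True, interval=1)
    freq = psutil.cpu_freq()
    mhz = f"{freq.current:.0f} MHz" if freq else "n/a"
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True).stdout.strip()
    print(f"CPU cores %: {cores} @ {mhz} | GPU: {gpu}")
```
-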
Just note: higher resolution forces a GPU bottleneck more, but it isn't a guarantee. If a game is single-thread limited, you can still hit a bottleneck. Games like GTA V and BF4 are rather prone to CPU bottlenecking, but at 3K you might not run into them before you hit a GPU bottleneck.
Honestly? As of GTA V's release, and considering 1080p, I'd estimate you'd need somewhere around 4.3GHz or higher to never go below 60fps. -
I want to read GPU there, else I'm lost in your statement.
-
Fixed, thanks for that. I must have been distracted when I typed that XD.
-
Well, if the new 352.84 Win 10 driver is any indication, OC won't come back anytime soon for OC-blocked vBIOSes:
http://forum.techinferno.com/genera...t-overclocking-mobile-gpus-28.html#post133276 -
We lost the fight. It's pretty obvious that nVidia is going to do what they are going to do and we either take it or stop gaming on laptops since AMD can't compete at all.
-
Robbo99999 Notebook Prophet
I'm gonna be upgrading to Win10 when it's officially released - good to have DX12, but I'm concerned that the OS is gonna be more locked down than Win7 in terms of GPU overclocking perhaps, like this new 352.84 driver for instance. I'm wondering if even Kepler GPUs will become restricted too. Given the overclock on my Kepler GPU, that's my concern. Anyone think Win10 is gonna be more locked down for overclocking than Win7 somehow?