I also wanted to add that my 880M is very finicky. I had it running great at one time at 1170 on the GPU for my daily overclock.
Then I got the bright idea that I should upgrade drivers. Bad idea. It has been a struggle to get it back stable even at stock clocks.
-
-
Yep, that is why I cap it at 60. In MechWarrior Online the stupid game would cause a very high FPS on the main menu and while loading a map. The GPU was cooking itself for no reason.
-
Yeah, I was using the same hack on an HD 6990M with RadeonPro. Set the FPS cap at 45 and enjoy fully fluid gameplay without cooking your GPU.
-
How do you guys cap the GPU? I've got dual 880M's.
-
What? I have dual 880M's too. Fluid AC Unity at ultra settings on the stock BIOS reduced to 850 MHz. I don't know if you mean underclocking by "cap". If so, dump the vBIOS, edit it using Kepler Bios Editor and flash it to the cards; it's easy.
EDIT::
Ah, sorry, I've just understood what you are asking for. Go to the Nvidia Control Panel, choose the game and force VSync to "Adaptive". FPS will be capped at 60.
-
Another way to do this is to install RivaTuner Statistics Server. Any value above 0 in the "Framerate limit" field will limit your frame rate to that value, so it's a little more flexible in that respect. Set the value to 0 to remove the limit.
-
I've tried that. It stutters really badly. I prefer Nvidia's adaptive VSync.
-
Meaker@Sager Company Representative
I look forward to when the freesync/gsync combo finally hits notebooks properly. It's the way it really should be done.
-
G-Sync already works in hacked Nvidia drivers on the Alienware M18xR1, AFAIK.
-
I found recently that when I went back to the stock vBIOS, the driver crashes on *everything*. Is the silicon in my 880M already degrading?
-
Even after DDU/clean driver install? Did you try more than one driver?
-
No way am I messing with an 880M that is working. I just went back to the modded vBIOS.
-
Sounds like a better plan anyway. I hate stock NVIDIA vBIOSes. I think the last time I used an NVIDIA GPU with a stock vBIOS was back in the day of 8800M and 9800M. Funny that the most unreliable NVIDIA products I have ever owned are those that I ran at stock clock speeds with no firmware mods, LOL.
-
Dell had better remove the 330W limit on the 18 before it's too late.
Imagine the load with DX12.
I imagine a 4940MX + 880M SLI won't even work with any DX12 game and will shut the system off immediately when the power limit trips.
-
Don't release a BIOS that allows you to fully use the computer. The Dellienware 2015 way.
-
Meaker@Sager Company Representative
At least a dual 240W brick option would have been nice and still pretty portable.
-
I would love to see one massive 700W AC adapter for the Alienware and Clevo SLI beasts. It would be more convenient (and take up less space) than fumbling around with two smaller ones and a converter box. I would not even be interested in owning the smaller-capacity ones if I could get a monster AC adapter with output equivalent to two.
-
Something like a desktop PSU wired with the right plug and the ID chip (whatever it is called) from the 330W PSU would be sweet!
-
I'm getting about 115 FPS in BF4 on ultra. Two 880Ms @ 850 MHz plus an i7-2920XM (4 cores, 3.4 GHz).
No problems so far; I'm very satisfied with this upgrade. Total upgrade cost with the two cards was about $750. -
You have a problem. The stock clock for those cards is 954 MHz and the boost is 993 MHz. You're throttling like mad. As long as you're happy, though.
-
Meaker@Sager Company Representative
Yes, have a look at the temperatures (BF4 will always get things toasty).
-
But I've manually lowered the clock to ensure a long lifespan for the GPU.
I don't need more power than I currently have; it's already overkill.
Max temperatures are 77°C. -
My temperatures in BF4 on the high preset are 91°C. I have the vBIOS mod and I do not throttle, but I have a serious temperature problem. I'll have to repaste the GPU and CPU, but I believe that will help only a little.
Lowering the clock to extend the useful life of my GPU is an interesting option. -
Another option is to cap the FPS. I cap mine at 60 FPS. This way the GPU can run at full speed when it needs to and will clock itself down when it doesn't. I seldom see more than the mid-70s (°C) in the games I play by doing this.
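For what it's worth, the mechanics of a frame cap are simple: the limiter sleeps away whatever is left of each frame's time budget, so the GPU idles instead of rendering frames a 60 Hz panel can't show anyway. Here's a minimal sketch of the idea (not RivaTuner's actual implementation; the function names are made up for illustration):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame

def render_frame():
    # Stand-in for the game's real render work.
    pass

def run_capped(frames):
    """Render `frames` frames, sleeping off any leftover time each frame."""
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += FRAME_TIME
        leftover = deadline - time.perf_counter()
        if leftover > 0:
            # The GPU/CPU idle here instead of drawing extra frames.
            time.sleep(leftover)
        else:
            # Fell behind the budget; reset the schedule rather than rushing.
            deadline = time.perf_counter()
```

With an empty render function, `run_capped(60)` takes very close to one second; the heavier the real frame, the less sleeping happens. That's why a fast card stops cooking itself on menus and loading screens once a cap is in place.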
-
Yeah, I did the same thing. I was running 850 at 0.887 V if I remember correctly. It may have been 830, because one of the cards was weaker than the other. I wasn't happy with that; I bought a system that was supposed to go much faster. I'm much happier now that I have 980M SLI.
-
Repaste with some Gelid GC Extreme. Get some new Fujipoly or Phobya XT pads; Phobya are much easier to get.
Also, another solution: I personally don't cap because I can't be bothered, but if you have a 60 Hz monitor, anything more than 60 FPS is useless in effect, lol. -
I always cap the FPS at 60. I only turned it off to benchmark some games.
-
Best to cap at 62. Capping at exactly 60 FPS on a 60 Hz screen causes more tearing. For me it's always a slow tear at the bottom of the screen which is really distracting.
-
I really appreciate not having to cap anything, not having issues like screen tearing, and no serious challenges with GPU thermal management. Just set the clocks, bump the fans if/when necessary in a warm environment, and let it rip... all fun, no hassle. I haven't really found myself faced with a need to take any extra steps for an amazing gaming experience since 6970/6990 CF and 580M SLI... maybe that's called being spoiled, but I sure do like it, LOL.
-
Nope, and it is not likely that I will ever spend money on that kind of eye-candy stuff. I don't feel any need for it and it's way too expensive. I'd rather spend money on stuff that makes the machine go faster. Don't get me wrong... I'd like to have it, but I am just not willing to pay extra for it.
-
Sorry, I asked you which of the two you used and all I got was an ambiguous "nope."
-
Sorry, I should have said "nope" twice. I ignored your first question by accident. So... to be clear, nope, I don't use vsync either... don't like it. The exception was back when I was still playing Skyrim. Without vsync it was unplayable due to horrible glitches with screen flashing and spastic/convulsive people and animals. Hopefully, that is less ambiguous.
-
So how do you get no tearing w/o V-Sync? It's not technically possible.
-
There might be some now and then, but almost always none apparent or too minor to care... rarely ever severe enough to qualify as an "issue" to me. Once in a great while I do run into what I would call an "issue with tearing" that can't go unnoticed. Besides Skyrim, at one point it was really horrible with Batman: Arkham Asylum. That got fixed with drivers. Maybe I am just not sensitive to it. It has to be more than a minor glitch before I notice or care. At least I don't have to worry about having seizures from a flashing screen or anything like that, LOL.
I always have vsync disabled, with only a few rare exceptions like the two mentioned. I enable it when I have a serious problem to resolve. But I know I am not the only person that hates vsync; I've seen many others post the same opinion. It's all a matter of personal preference unless it's really bad. Those that have it always enabled, on every game, may end up being more sensitive to really minor tearing or other artifacts that can occur when it is not enabled.
Last edited: Mar 2, 2015 -
Yeah, same here. I'll never play any game with V-Sync. The input lag is just too unbearable, esp. since I play a lot of FPS. Like you said, tearing is a more acceptable compromise, and there are ways of reducing it such as capping FPS or running a high enough uncapped FPS, but I can't wait for the day where I'll get a no-compromises solution like G-Sync or FreeSync.
-
If they had these technologies supported native in the laptop display I might be tempted to toss a few extra bucks their way for it. Not a lot extra, but a little would be OK. Mainly because I don't really consider that I have a problem that needs to be fixed. It would be a "nice to have" enhancement as long as it did not cost me much.
Since I virtually never play at a desk, connected to a monitor, that's the only way I would have it. Maybe running a really high uncapped framerate is the reason I seldom see major issues. If you're always pushing between 75 to 150 FPS there's probably not much to complain about.
Yes, it sounds like we both play lots of FPS. With very few exceptions (usually TPS games like Max Payne or Ground Zeroes) I play almost exclusively FPS. I play Strategy, RPG and MMORPG very close to never... probably less than 20 hours in the past 20 years, simply because I don't care for those genres. I like watching the marketing trailers for those games, but playing them is like watching paint dry to me. And the distant camera angle with small characters on a large map makes me angry for some reason, LOL. Don't know why... my boys think it's funny. Can't say it's because I'm an old dude, because I hated it just as badly when I was a young buck. Start adding in dialogue, looking for clues and solving puzzles in any kind of game, and I start counting sheep about as fast as I do trying to watch an afternoon soap opera... BORING. -
Meaker@Sager Company Representative
I think people are more or less sensitive to each issue. Like you, I can take some tearing but can't stand input lag. G-Sync on my external monitor, though, is very nice.
-
Personally I don't find tearing annoying. I hardly see it. Maybe that's because I still remember the Riva TNT and Pentium III days.
-
On Sunday I ordered the Gelid GC Extreme online; I believe I'll receive it tomorrow. I'm still looking for the thermal pads. Thanks for the advice.
-
Hey all, would you recommend upgrading the vBIOS on this card? I have this card with 8 GB of memory in an MSI GT70 Dominator Pro.
If so, is it a difficult process? Or is the risk not worth it? -
Yeah, I pretty much play with VSync turned on exclusively. It's the one thing about the Nvidia drivers for Linux that really pisses me off: even with VSync enabled you still get tearing. Nothing major, mind you, but my eyes are very sensitive and I notice when the screen tears.
-
Ok, after tens of requests I'm uploading this vBIOS again. If someone thinks that svl7's is better, I don't give a damn; just stop spamming.
GTX 880M vBIOS Dell 80.04.F5.00.07 (latest) for the latest UEFI 880M batch mounted in the AW18, but it works in any. It doesn't throttle in my system and never has. The svl7 vBIOS gives a bad flash and bricks the latest 880Ms.
Stock clocks: BIOS.zip
780M clocks (Reduced heat, longer lifespan): BIOS780Mclocks.zip
I'm currently using the 780M clocks but I force normal clocks when playing The Witcher 3. Choose what you want. I've never observed downclocking.
The Witcher 3 bottlenecked by the CPU:
Last edited: Oct 31, 2015 -
-
I noticed that The Witcher 3 is very GPU intensive as well. It fully loads a single GTX 970M even with a (relatively) lowly i7-920XM. The CPU is working at about 50%.
Thanks for sharing. -
hummm, looking at these clocks kind of defeats the purpose.
-
@deadsmiley I don't know how it can load only 50% of an i7-920XM. Maybe on the lowest settings. This is ultra and it's a nearly constant 100% at 3900 MHz.
EDIT: Ah, ok. I know why it showed 50% for you: you have hyperthreading enabled, but this game does not use hyperthreading.
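The arithmetic behind that reading, as a quick sanity check (the only inputs are the i7-920XM's 4-core/8-thread layout, not measured data):

```python
# Why a hyperthreaded quad-core reads ~50% overall when a game
# keeps only one thread busy per physical core.
physical_cores = 4
logical_cpus = physical_cores * 2   # hyperthreading exposes 8 logical CPUs
busy_logical = physical_cores       # the game saturates 4 of them
overall_usage = 100 * busy_logical / logical_cpus
print(f"{overall_usage:.0f}%")  # → 50%
```

Task Manager averages across all logical CPUs, so 4 fully loaded cores out of 8 logical ones reports as 50% even though the physical cores are pegged.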
@johnksss Sorry, but I don't understand. -
@GodlikeRU I can't run at Ultra with a single 970M. Low or Medium at best if I want to see 30ish fps.
-
Dropping clocks to be stable.
-
@deadsmiley It may be your CPU; as you can see, the game is pretty CPU intensive.
@johnksss Still don't understand... -
@GodlikeRU did that on purpose because the 880M had some weirdness and the 780M was not for sale anywhere in the country he lives in. He wanted a 780M and could not get one anywhere, and he is having better results with the 880M running at 780M clock speeds. He also could not buy the correct GPU heat sinks he needed in his country, so he modded his own (I think it was a 6990M heat sink, if I remember correctly) by soldering on extra heat sinks to cover the extra components that need cooling on the Kepler cards. He had a thread with photos of the heat sink mod some time ago, but it's probably buried a bunch of pages back in the M18 sub-forum by now.
My Nvidia GTX 880M Test Run Review
Discussion in 'Alienware' started by Johnksss, Feb 26, 2014.