AMD isn't any better. They had a ton of Enduro and Crossfire problems with their 7970M cards, while the GTX 680M was a plug-and-play card that ran cooler and just worked. Not to mention the notable number of dead 7970M cards posted on this forum by owners.
AMD have pushed out three near-identical cards, 7970M, 8970M and R9 M290X, where the last one, the M290X, got a tiny 50MHz bump. At least Nvidia gave us something new with the GTX 780M: more cores and more aggressive clocks. But they should have stopped there instead of getting greedy and pushing out a broken rebrand card operating on the edge of what's possible in a notebook.
pidge said he bought an Alienware notebook with the 880M to investigate, but I find it strange that Nvidia has to go out and buy notebooks instead of doing testing inside their own facilities, where they surely have piles of these GK104 chips lying around.
Unless they have more or less given up on spending effort and energy fixing a 2.5-year-old architecture. There's nothing wrong with the Kepler architecture, but like I said in my previous posts, the GTX 880M, aka GK104 @ 1GHz, should never have been launched for the average Joe to buy expecting it to run all cool and stable. For enthusiasts who know the pros and cons of such clocks, maybe, but I believe it's a thin line between GPU Boost working at 990MHz and throttling inside a little notebook, which seems to be the case for many people.
People should be warned not to buy the GTX 880M. OEMs should refuse to put them inside their notebooks. Not gonna happen though; Nvidia trick people with their rebrand marketing, and OEMs are there to make money. It's sad, because that pressure could have pushed forward the launch of the Maxwell GTX 880MX/980M, which will probably be as stable and cool as a GTX 680M was.
Instead we are waiting. Waiting for AMD to make a move, hopefully to push Nvidia to release the damn chip, because sure as hell Nvidia won't lead, but rather follow and one-up AMD like the cowards they are. I don't know when it is launching, and it sucks to wait while Nvidia have the chip ready to go inside a notebook.
-
Here is an update from Nvidia after I linked this post and some other posts about 880M overheating problems: [Your case is being escalated to our Level 2 Technical Support group for further attention. The Level 2 agents will review the case notes to troubleshoot the issue and find a solution or workaround. As this process may take some time and require a good deal of testing and research, we ask that you be patient. A Level 2 tech will contact you as soon as they can to assist or point you in the right direction].
They seem to be taking this a bit more seriously now, but again I ask you guys with an 880M to post your problems to Nvidia directly through the Support Login link below. I will keep you guys updated. -
lol I wonder what they would say about my card puking on itself then magically coming back to life again.
Thanks for sharing, please keep us posted.
Just submitted my own inquiry. -
It's still not hitting 99% GPU utilization and still struggling with FPS in some games (e.g. Bioshock), but in others, Crysis 3 for one, it stays maxed out and I'm getting a comfortable 50-60FPS on all High at 1080p, which is astoundingly good.
Mixed feelings about the card so far. Two afternoons wasted fixing what I assume to be driver issues where the GPU utilization would yo-yo from 48% to 75% for no reason. Everything works for now, minus BF4, which still does that. On the other hand, my performance in DOTA, Bioshock, Crysis 3 and Titanfall is stellar: 60 FPS at 1080p on Ultra, or High at the very least, with decent AA. The 8GB of RAM on the card may be a marketing gimmick, but it's certainly nice to always have the option to toggle a decent amount of AA, as it seems to take no performance hit up to a point. I've never had that with a card before (I was pretty much stuck with FXAA at best). -
As for the utilization, that's generally a software problem, not a hardware one. BF4 is among the worst when it comes to optimization, along with Watch Dogs. I don't play BF4, but I have a 780 Ti in my desktop, and comparing utilization of my 880M SLI vs my 780 Ti in Watch Dogs, they both fluctuate between 85 and 95%, and Bioshock Infinite goes between 95 and 99%.
I also received a response from nVidia... Telling me to update my system BIOS and do a clean driver install... Obviously I'm not going to get anywhere with them, but let's see... -
I just ordered an Alienware 17 with a GTX 880m.
I'm scared. -
-
I'll ask xotic about it though when I get the RMA info how long it would generally take to get a repair and see what works best. Getting the materials to ship just the card back would be a bit more of a pain than sending it back in the box it came in for sure though...
Sent from my HTC One_M8 using Tapatalk -
Contacting them was an epic waste of time
"Regarding the drivers of a laptop: the NVIDIA driver is just the basic chipset driver. For a notebook, to get its full functions, we always recommend running the OEM drivers (which are provided by the laptop manufacturers). Please be informed that we always recommend laptop users get the compatible driver from their respective laptop manufacturers, as we have limited support for laptop drivers. Notebook GPUs use drivers that have been customised by the notebook manufacturers to support hot key functions, power management functions, lid close and suspend/resume behaviour. NVIDIA has worked with some notebook manufacturers to provide notebook-specific driver updates; however, most notebook driver updates must come from the notebook manufacturer.
Regarding the clock speed: as you know, in notebooks the manufacturer will make adequate changes to the clock speeds and other specs, especially in SLI setups. If you think the hardware malfunctions, you can report the issue to the OEM and get it repaired/replaced as per their warranty policy."
Gee, tell me something I didn't already know... -_-
I like how he just blew off my question about the clocks and said to go to the OEM... Again passing the blame while offering nothing about what these cards should be doing, and not even addressing whether or not it's normal for them to not even run at stock clocks!!!
Sent from my HTC One_M8 using Tapatalk -
And FWIW, before I quit desktops in 2004, I bought ATi exclusively and they all worked like a charm without me having to fiddle to get things working. Even my now 9-year-old Compaq laptop with a Radeon Xpress 200M gave me no trouble at all. Kind of ironic that all the problems began when AMD took over ATi -
Most of AMD's issues were due to Clevo cards, not the 7970m itself. That GPU was great from the get-go from any vendor that did not deal with Enduro. Enduro simply sucked, and I think it still causes problems for some users. That's what happens when you release unfinished new software haha
But issues with GPUs are normal. The 680m had driver issues that would often lock the core to a certain MHz, and you had to go back to a previous driver or use a newer one.
As for the identical cards, yeah, AMD basically re-released the same GPU three times (the M290x was the shameful one), with only a single 50MHz bump in speed. nVidia did it too: the 680mx, 780m and 880m are the same GPU; the main difference is the actual clockspeeds. The 780m would be the winner in the end, since they didn't push it as hard as the 880m.
Which is weird, since I run my 780m at similar clocks to the 880m and I don't have the same issues. I wonder what happened there? -
-
Oh well. Hopefully a new GPU will come in the next few months to fix this bad reputation! -
The 880M is broken because of nVidia's greed, plain and simple. It is 780M silicon binned and pushed to the absolute limit. I bet if the stock voltage was bumped by 10 or 20mV most of the stability issues would go away, but of course then you'd have a Haswell-esque GPU on your hands, which would be hard to cool properly.
-
*on with the tinfoil hat*
GTX 860M (MXM), GTX 870M and GTX 880M are all GK104. Three rebrands. Nvidia didn't bother to put the Maxwell GM107 on an MXM card (a shame really) but instead used GK104 for three rebrand SKUs. Why?
Because they are trying to clear out GK104 inventory. Not all the chips are bad, but I wager that some of them don't meet GTX 780M standards in terms of silicon quality. Just go back to when the GTX 780M was released: May 2013.
At that time, Maxwell was faaar away. Kepler was their priority, especially mobile Kepler, aka the GTX 780M, since Nvidia had been doing great in mobile with a growing share of the market. So naturally they wanted no problems and wanted top-notch chips for those cards. GK104 is manufactured for a wide selection of cards, both desktop and mobile. Some GK104 chips don't meet the strict criteria to be put on a mobile card; off to desktop cards they go. The finest chips, the binned ones, go to the mobile bin and end up as GTX 780Ms.
GTX 780M owners see the benefit of this: great overclocking capability and reduced leakage, which in turn means lower temps than the not-so-stellar GK104 chips that didn't meet those criteria.
So in summary:
The GTX 880M is made from whatever GK104 inventory Nvidia have lying around. This is why opinion on the card is so mixed. Why a GTX 780M at the same clocks as a GTX 880M runs cooler (less leakage due to better silicon) and more stable (better silicon). Why some have no issues (they got lucky in the GK104 inventory lottery).
The GTX 880M was made to keep Nvidia's stockholders at bay, since Nvidia had not released any new graphics cards for a while: to keep the income flowing while the engineers had long since moved on to Maxwell. Less detail and effort went into the GTX 880M, including fixing problems (which is why we have had such a poor response from Nvidia on the 880M issues).
That's my view on the current situation, and that is why I feel Nvidia need a swift kick to the head, why customers should avoid the GTX 880M, why OEMs should not offer the card in their notebooks, and why we need GM204 Maxwell aka GTX 980M right now.
In April 2012, 2 months before the official launch of the GTX 680M, our first high-end Kepler, we had GTX 680M benchmark leaks.
We are almost at August 2014 and have still not seen any leaks regarding the GTX 980M. They are still using the 28nm node, as they have been for almost 3 years now, and are taking much longer than with Kepler, which was based on a brand-new node (28nm instead of 40nm).
Screw this. Screw Nvidia. -
Everyone who is planning on buying an 880M must read this.
In fact I think I'm going to make it part of my signature to warn people about the dangers of the dog poop known as 880M. -
*puts on tinfoil hat*
I'm with cloud on this one. There is absolutely no way in the universe that the following list makes logical sense:
780M versus 880M
1536 cores || 1536 cores
120W power drain || 120W power drain
1.012-1.025v (Ov'd) || 1.00v
950-1000Mhz core || 954-993MHz core
5800-6000MHz memory || 5000MHz memory
Good heatsink || better heatsink
~83 degrees max for most without cooling mods || easily hits 90+ degrees & throttles in the same games
^ the above doesn't make a lick of sense. Not even close. There has to be some other insane variable at work; the raw statistics just don't add up. If silicon binning is the deal, then sure, that makes perfect sense to me, because nothing else does. There is no way you take less voltage, better cooling, and lower clock speeds across the SAME underlying hardware and get MORE heat. -
Silicon binning is real: both of Ethrem's 880Ms have an ASIC quality of over 80%, while both my 780Ms hover around the 72% mark. So better silicon = crappier results makes no goddamn sense
-
ASIC quality works like this: the higher the ASIC %, the lower the voltage required to run the chip.
So we can look at it two ways:
GK104 inventory being sold off: all sorts of chips of varying quality going to customers. It's a lucky lottery to receive a good one with less leakage and solid silicon that runs stable.
GTX 880Ms being made, but few chips having the silicon quality (the ASIC quality needed) to run those clocks stable at 1.0V. Throttling, leakage and high temperatures occur.
With D2 Ultima's excellent overview of the GTX 780M against the GTX 880M, you can't help but conclude that the GTX 780M is the only one that cut it: the right voltage and less overall leakage, combined with the right core clocks for the right temperatures. I'd wager again that if Nvidia increased the voltage on the GTX 880M (to compensate for bad silicon quality), the temperature would skyrocket as well, leaving an already hot chip even hotter....
I believe Ethrem tried out a vbios from SVL with higher-than-stock voltage, which led to pretty high temps...
Either way you look at it, the GTX 880M should never have been made. Nvidia are stretching what is possible inside a notebook and walking a thin line, and customers are the ones who lose out, stuck with a chip that doesn't work like it should -
ASICs are usually binned into several categories, as far as I've gathered from the Hawaii binning process, so I'd imagine the NVIDIA ones are similar:
1. Low leakage: These are desirable for laptop chips, as most of the floating-gate and insulator structures are well built, so there is less loss during operation. This bin is typified by the ability to run at the desired clockspeed with a nominal voltage (i.e. near the switching voltage). Ironically enough, these don't really respond well to overvolting, needing much more voltage to attain clockspeeds beyond the normal operating range.
2. High performance: These tend to be the leakier chips, though not always; their transistors are capable of much higher switching speeds at a given voltage. These are the ones that tend to get put in desktop chips. You tend to find that they respond great to overvolting (i.e. they need less voltage per clockspeed step). The very best of these are the ASICs of choice for extreme overclockers.
3. Broken units: These have units that are not functional, but the chip as a whole is not compromised (e.g. Hawaii Pro chips salvaged from Hawaii XT chips).
Thus when a chip is "binned" into one of the 3 categories above, it does not necessarily mean it lacks the characteristics of the other bins. The bins are more of a continuous envelope than distinct categories. For example, it is quite possible to have a "broken" ASIC with one less functional unit that otherwise behaves like category 1 (these are the desktop Hawaii Pro cards that can get to 1150MHz with no voltage increase).
Likewise, a good overclocker on the laptop side would be a low-leakage part that also has high-performance characteristics. A terrible laptop card would be one pulled from the high-performance bin instead of the low-leakage bin. -
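If it helps to make those three bins concrete, here's a toy classifier in Python. To be clear, every name and threshold in it is invented purely for illustration (real fab test limits are proprietary), and it uses a strict if-chain even though, as said above, real bins are a continuous envelope:

```python
def classify_chip(functional_units, total_units, leakage, max_clock_mhz):
    """Toy model of the three ASIC bins described above.
    leakage is a made-up 0..1 score; thresholds are invented."""
    if functional_units < total_units:
        return "broken/salvage"            # e.g. cut-down salvage parts
    if leakage < 0.5:
        return "low leakage (mobile)"      # less loss -> laptop-friendly
    if max_clock_mhz > 1100:
        return "high performance (desktop)"  # fast switching -> desktop/OC
    return "standard"

print(classify_chip(1536, 1536, 0.3, 950))   # low leakage (mobile)
print(classify_chip(1536, 1536, 0.8, 1200))  # high performance (desktop)
print(classify_chip(1408, 1536, 0.3, 1150))  # broken/salvage
```

The last call also shows the overlap point: a salvage chip can still have category-1 leakage characteristics, which this strict classifier deliberately ignores.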
-
With that being said, my recent testing has been with max fans: the slave card doesn't pass 80C after the GC Extreme repaste I did today, and the primary hits 79C with the old IC Diamond paste job it's had since about 2 weeks after I got it.
As for the ASIC, my problematic slave card is 86.3% and my master card with no issues is 80.5%
I thought an ASIC over 80% was not a good thing, or does that just apply to desktop cards?
I wish there was an explanation for the heat from these cards, but the 780M and 880M use the same heatsink in Clevo machines, so... These cards are just plain broken.
Sent from my HTC One_M8 using Tapatalk -
OK, how the hell do I view my vBios number? Google is not helping.
-
See the BIOS version?
nVidia Inspector will also show you the BIOS version. -
Most of my games still seem to run fine: The Witcher 2, Crysis 3 and Titanfall all keep a near-locked 60FPS at 1080p on Ultra (or High for C3). That being said, Watch Dogs has that awful stuttering I always heard about but never experienced on my 680m (even with textures set to Ultra). There it seems to do the BF4 thing of having GPU utilization drop for a second, which is really, really noticeable. We're talking a drop from 80% (with the 30FPS lock on) to 45% for no reason. The graph in MSI Afterburner is all over the place. -
Try setting adaptive vsync in the nVidia control panel instead of using the game vsync and see if the behavior changes. It has less overhead than regular vsync. -
so did anyone try the new 340.52 driver and see if there are any changes?
-
For me it didn't seem to improve anything; it seems to concentrate on improving GeForce Experience, ShadowPlay, Nvidia Shield and some other stuff.
-
Seems to be working fine on my 780M SLI setup, so I am crossing my fingers and hoping for the best for 880M, too.
-
transphasic Notebook Consultant
Agreed for the most part. What some people do not understand, and are being unrealistic about, is that every now and then GPUs just go out unexpectedly.
Sometimes, **** happens.
In my 15 years of owning gaming laptops, I have had an Nvidia card go out after many years of loyal service, and I have had an AMD card go out after a year.
I would personally rather put my stock in Nvidia than AMD, and for obvious reasons. I have seen literally dozens of people have their AMD cards (7970m) die rather quickly, and it has been costly for many of us when out of warranty. The 7970m cards were bad to start with, and they stayed that way.
I have had many people tell me how bad AMD's driver support is, including many people in high positions with Laptop reselling companies.
AMD has always had bad driver support, and having left AMD to go back to Nvidia, I have made the right choice.
My 880m has been great, with no problems at all, and I expect it to stay this way for years to come. -
transphasic Notebook Consultant
Well said, Cloud. You nailed it right on the head in your analysis.
Nvidia wanted to delay, and squeeze more blood from the turnip, and the 880m was what we got as a result. It was done strictly for financial reasons, for the shareholders, to get every last ounce of money out of us gamers, mostly because AMD has not been keeping up the pace of innovation and releases.
The 780m is the same as the 880m in every way but one: an extra $200 of profit for Nvidia at our expense. -
Sent from my HTC One_M8 using Tapatalk -
A new BIOS, A13, just got released by Alienware for the Alienware 17. Let's see if it fixes the problem.
-
As for the utilization, I expect it to drop when it hits the V-Sync threshold (30 or 60FPS), but not when the card is struggling to hit 25FPS and is not even at 50% usage. That's just weird. Watch Dogs is also the only game I used the lock for. I do tend to use V-sync, as I hate tearing, but I'll try adaptive anyway just in case.
Again though, it's just that game and BF4 that have the framerate halved periodically for no reason. I can't imagine that's a V-sync issue as I use that with every other title to no detrimental effect. Still, it's worth checking out. -
-
But AMD not keeping up with the pace of innovation? Yeesh, some people are quick to ignore history and how the battle between the two has always brought out the best. If it wasn't for AMD and the 7970m, we wouldn't have gotten the 680m we got. nVidia played it safe with a small increment in power, and got that swift kick to the head. Afterwards we got the 680m and quickly the 780m. But we are at the current limit for these GPUs. Neither can bring something out right this moment, so they are preparing an upgrade. Meanwhile, it's easier to squeeze customers out of their money by selling names like 880m.
Both AMD and nVidia are preparing their next upgrade. While it won't be as large as a die-shrink upgrade, the new architecture will bring a decent increase in performance in the next couple of months.
Personally I am much more in favor of AMD, but in laptops their presence is much reduced, considering they have less than half the available products to compete with. I use both nvidia and AMD extensively, in both laptops and desktops, and I hold both in high regard. Both have failed me hard, but both have also offered me excellent products over the last 10 years.
I am very much looking forward to M295x and 980m -
-
Would like to remind everyone that Ubersampling is a 20x sampling bonus. At a 1080p base, 4x supersampling = 4K res, and 16x SS = 8K res (7680 x 4320). 20x Ubersampling is beyond even that, so yeah, don't expect it to run well on pretty much any single GPU out there XD.
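To put that sampling math in numbers, here's a quick Python sketch. The only assumption is the usual convention that the SS factor is total samples per pixel, so the per-axis scale is its square root:

```python
import math

def supersampled_resolution(width, height, ss_factor):
    """Effective internally-rendered resolution for a given
    supersampling factor (total samples per pixel)."""
    scale = math.sqrt(ss_factor)
    return round(width * scale), round(height * scale)

base = (1920, 1080)
for factor in (4, 16):
    w, h = supersampled_resolution(*base, factor)
    print(f"{factor}x SS -> {w} x {h}")
# 4x SS  -> 3840 x 2160 (4K)
# 16x SS -> 7680 x 4320 (8K)
```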
As far as Witcher 2 goes, 40-60fps @ 1080p maxed on one card seems right, because with my two 780Ms at stock, that's what I get in 3D (half the fps or less) -
-
Oh well, you learn something new every day. Might as well force it to 1440p and play in 3D -
AA is still just a gimmick for the most part. It's an artificial softening of the image with very mixed results. Unless you're running non-native resolutions, there's really not much need for AA, other than 2x MSAA just to smooth things out a bit. Looking at a still screenshot you might be able to pick out the apparent improvements, but in actual play it's hard to discern, and it all comes down to personal preference.
-
I can't recommend Ubersampling though because it is bugged. Turning it on disables the in-game MLAA (which is really really nice and sharp, among the best PPAA I've ever seen) as well as the sharpening filter, which means some textures are actually sharper with Ubersampling off. Also, while Ubersampling does a terrific job of smoothing out most jaggies, there are some (ex. HDR lighting) that strangely aren't AA'ed at all.
TBH, I think the best combo of IQ+performance is in-game MLAA with 16x AF forced in driver (Witcher 2 has no AF by default). If you must have your graphics p0rn, forgo the Ubersampling and use driver downsampling, although this will shrink the HUD. And I take back what I said earlier about SGSSAA as it doesn't work properly in this game due to causing excessive blurring. Neither does MSAA, which causes artifacts. Damn deferred shading. -
The only AA that "artificially softens the image" are the post-processing techniques (e.g. FXAA) which are nothing more than a simple blur filter. "Fake" AA, if you will. "Real" sampling-based AA such as MSAA and SSAA actually increase detail.
And in reality, AA is most noticeable not in static screenshots but in actual gameplay due to the crawling/shimmering while in motion caused by temporal sub-pixel aliasing.
I personally can't stand playing any game without AA unless it's on an extremely dense PPI screen, and there's really no excuse not to use AA in this day and age considering the cheap PPAA techniques that work with every game and cost almost 0 FPS. -
THEY DON'T WORK WITH THE 880MS, IN SLI OR OTHERWISE.. base score of about 18k in SLI with those drivers.. the driver errors out every 2 seconds, and with these drivers the notebook seems to just hang and be unresponsive. -
Also, it's Friday, pidge, and we are all awaiting this update from you!
-
So July 24th was the last time pidge was here.
Is the problem fixed, or did he run away once the silicon quality speculation was brought up?
Maybe it was more of a "to hell with these tinfoil guys. Go f yourselves. Im done with this &¤#%!#"
You are free to explain that we are wrong, pidge. I'm sure you know a ton more about these details than any of us do -
Hmmm .. Silence can't be good.
880m owners are getting bent over big time on this one.
-1 Rep Nvidia
Pidge from Nvidia has asked that users experiencing problems with the 880m list them here.
Discussion in 'Gaming (Software and Graphics Cards)' started by DumbDumb, Jul 16, 2014.