The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Pidge from Nvidia has asked that users experiencing problems with the 880M list them here.

    Discussion in 'Gaming (Software and Graphics Cards)' started by DumbDumb, Jul 16, 2014.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    AMD isn't any better. They had a ton of Enduro and Crossfire problems with their 7970M cards, while the GTX 680M was a plug-and-play card that ran cooler and just worked. Not to mention the worrying number of dead 7970M cards posted on this forum by owners.
    AMD has pushed out 3 identical cards, 7970M - 8970M - R9 M290X, where the last one, the M290X, got a tiny 50MHz bump. At least Nvidia gave us something new with the GTX 780M, with more cores and more aggressive clocks. But they should have stopped there instead of getting greedy and pushing out a broken rebrand card operating on the edge of what's possible in a notebook.

    Pidge said he bought an Alienware notebook with the 880M to try to investigate, but I find it strange that Nvidia has to go out and buy notebooks instead of doing testing inside their own facility, where they surely have a ton of these dead old GK104 chips lying around in piles.
    Unless they have more or less given up on spending much effort and energy on fixing a 2.5-year-old architecture. There's nothing wrong with the Kepler architecture, but like I said in my previous posts, the GTX 880M, aka GK104 @ 1GHz, should never have been launched for the Average Joe to buy and expect to run cool and stable. For enthusiasts who know the pros and cons of such clocks, maybe, but I believe it's a thin line between GPU Boost working at 990MHz and throttling inside a little notebook, which seems to be the case for many people.

    People should be warned not to buy the GTX 880M. OEMs should refuse to put it inside their notebooks. Not gonna happen though; Nvidia tricks people with their rebrand marketing, and OEMs are there to make money. It's sad, because that could have pushed forward the launch of a Maxwell GTX 880MX/980M, which again will probably be as stable and cool as the GTX 680M.
    Instead we are waiting. Waiting for AMD to make a move, hopefully to push Nvidia to release the damn chip, because sure as hell Nvidia won't lead, but rather follow and one-up AMD like the cowards they are. I know when it is launching, and it sucks to wait while Nvidia has the chip ready to go inside a notebook.
     
    D2 Ultima, Ethrem, TBoneSan and 2 others like this.
  2. kamlesh

    kamlesh Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    3
    Trophy Points:
    6
    Here is an update from Nvidia after I linked this post and some other posts about 880M overheating problems: [Your case is being escalated to our Level 2 Technical Support group for further attention. The Level 2 agents will review the case notes to troubleshoot the issue and find a solution or workaround. As this process may take some time and require a good deal of testing and research, we ask that you be patient. A Level 2 tech will contact you as soon as they can to assist or point you in the right direction.]

    They seem to be taking this a bit more seriously now, but again I request that you guys with an 880M post your problems to Nvidia directly through the link below (Support Login). Will keep you guys updated.
     
    Ethrem likes this.
  3. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    lol I wonder what they would say about my card puking on itself then magically coming back to life again.

    Thanks for sharing, please keep us posted.

    Just submitted my own inquiry.
     
    kamlesh likes this.
  4. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    Oh, well, lucky me I guess. Assumed this was the card throttling but guess not.

    It's still not hitting 99% GPU utilization while still struggling with FPS in some games (i.e. Bioshock) but, in others, Crysis 3 for one, it stays maxed out and I'm getting a comfortable 50-60FPS on all High at 1080 which is astoundingly good.

    Mixed feelings about the card so far. Had two afternoons wasted fixing what I assume to be driver issues where the GPU utilization would yo-yo from 48% to 75% for no reason. Everything works for now, minus BF4 which still does that. On the other hand, my performance on DOTA, Bioshock, Crysis 3 and Titanfall are all stellar. 60 FPS at 1080 on Ultra or High at the very least with decent AA. The 8GB of RAM on the card may be a marketing gimmick but it's certainly nice to have the option to always toggle a decent amount of AA as it seems to take no performance hit up to a point. Have never had that with a card before (and was pretty much stuck with FXAA at best).
     
  5. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I'm curious what vBIOS version you have in your machine - I wonder if they updated it somewhere along the way. I'm on 80.04.F8.02.04 and I seriously wish I could get even 954MHz out of my machine; 980MHz would be a dream come true.

    As for the utilization, that's generally a software problem, not a hardware one. BF4 is among the worst when it comes to optimization, along with Watch Dogs. I don't play BF4, but I have a 780 Ti in my desktop, and I compared utilization of my 880M SLI vs my 780 Ti in Watch Dogs: they both fluctuate between 85 and 95%, and Bioshock Infinite goes between 95 and 99%.

    I also received a response from nVidia... telling me to update my system BIOS and do a clean driver install... Obviously I'm not going to get anywhere with them, but let's see...
     
  6. ThisIsBrutus

    ThisIsBrutus Notebook Consultant

    Reputations:
    20
    Messages:
    251
    Likes Received:
    30
    Trophy Points:
    41
    I just ordered an Alienware 17 with a GTX 880m.

    I'm scared.
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    If your slave card is the only dead card, you could probably run it with a single GPU while you get it warranty-replaced. Sager can do that if I remember right; simply replace the broken piece.
     
  8. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I would rather just send them the machine; besides, I'm going to have them fix my trackpad while they have it. I can live without the laptop for a while and just pull out the old Inspiron 17R.

    I'll ask xotic about it when I get the RMA info, though - how long a repair would generally take - and see what works best. Getting the materials to ship just the card back would be a bit more of a pain than sending it back in the box it came in, for sure.

    Sent from my HTC One_M8 using Tapatalk
     
  9. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Contacting them was an epic waste of time

    " Regarding the drivers of a Laptop : NVIDIA driver is just the basic chipset driver, for a Notebook and to get its full functions we always recommend to run the OEM drivers (which provided by the Laptop manufacturers). Please be informed that we always recommend the Laptop users to get the compatible driver from their respective laptop manufacturers as we have limited support for laptop drivers. Notebook GPUs use drivers that have been customised by the notebook manufacturers to support hot key functions, power management functions, lid close and suspend/resume behaviour. NVIDIA has worked with some notebook manufacturers to provide notebook-specific driver updates, however, most notebook driver updates must come from the notebook manufacturer.

    Regarding the clock speed : As you know in Notebooks the manufacturer will make adequate changes in the clock speeds and other specs especially in SLI set up. If you think that the hardware malfunctions then you can report the issue to the OEM and can get it repaired/replaced as per their warranty policy."

    Gee, tell me something I didn't already know... -_-

    I like how he just blew off my question about the clocks and said to go to the OEM... again passing the blame, but offering nothing about the guidelines for what the cards should be doing and not even addressing whether or not it's normal for these cards to not even run at stock clocks!!!

    Sent from my HTC One_M8 using Tapatalk
     
  10. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Intel systematically crippled AMD's R&D infrastructure through dirty and backhanded tactics. Short of AMD receiving a $20 billion R&D grant, AMD is forever relegated to playing second fiddle.

    And FWIW, before I quit desktops in 2004, I bought ATi exclusively, and they all worked like a charm without me having to fiddle with anything to get things to work. Even my now 9-year-old Compaq laptop with a Radeon Xpress 200M gave me no trouble at all. Kind of ironic that all the problems began when AMD took over ATi :rolleyes:
     
  11. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431

    Most of AMD's issues were due to Clevo cards, not the 7970M itself. That GPU was great from the get-go from any vendor that didn't deal with Enduro. Enduro simply sucked, and I think it still causes some problems for users. That's what happens when you release unfinished new software haha :p

    But issues with GPUs are normal. The 680M had driver issues that often locked the core to certain MHz, and you had to go back to a previous driver or use a newer one.

    As for the identical cards, yeah, AMD basically re-released the same GPU three times (the M290X was the shameful one) - only a single 50MHz bump in speed. nVidia did it too: the 680MX, 780M and 880M are the same GPU. The main difference is the actual clock speeds. The 780M would be the winner in the end, since they don't push it as hard as the 880M.

    Which is weird, since I run my 780M at similar clocks to the 880M and I don't have the same issues. I wonder what happened there?
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The 680MX can't be counted, as it was made for iMacs only and WAS a 100W card, due to only having a 720MHz core clock and no boost-type software ingrained into it. The 780M was essentially a re-done 680MX, sure, but the substantial clock boosts plus the added SLI ability (which the 680MX didn't have, as far as I remember, most likely due to its iMac-only status) make it a substantially different card. The 880M should be the same deal, really. But I don't know. The fact that a 780M OC'd and OV'd to 880M levels (which has less voltage) is purely stable and runs cooler (on worse heatsinks in Clevos, mind you) means that nVidia somehow broke the 880Ms. I tell everyone to ask for 780Ms if they can. But in theory, the 880M is a solid step up over a 780M, and if it worked right and ran cool, people would be super happy. It's sad it doesn't.
     
    Cloudfire likes this.
  13. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    I'll google how to check my vBIOS and let you know! Mine stays at 993MHz pretty consistently, so whatever it is, it seems to be working. Plus, temps staying below 86C seems to be an anomaly for most 880M users. I mean, I did pay for a repaste with a better solution, but this is still a Sager, and those are known for having less-than-ideal cooling systems.
     
  14. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    Yeah, I noticed the 880M gets slammed a lot on these forums and around the web. It's a shame, because the 780M is still a fantastic card and, as you said, the 880M is a solid bonus over the 780M.

    Oh well. Hopefully a new GPU will come in the next few months to fix this bad reputation!
     
    transphasic likes this.
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The 880M is broken because of nVidia's greed, plain and simple. It is 780M silicon binned and pushed to the absolute limit. I bet if the stock voltage was bumped by 10 or 20mV most of the stability issues would go away, but of course then you'd have a Haswell-esque GPU on your hands, which would be hard to cool properly.
     
    Cloudfire likes this.
  16. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I have a theory:
    *on with the tinfoil hat*

    The GTX 860M (MXM), GTX 870M and GTX 880M are all GK104. 3 rebrands. Nvidia didn't bother to put the Maxwell GM107 on an MXM card (a shame really) but instead used GK104 for 3 rebrand SKUs. Why?
    Because they are trying to clear out GK104 inventory. Not all chips are bad, but I wager that some of them do not meet GTX 780M standards in terms of silicon quality. Just go back to when the GTX 780M was released: May 2013.
    At that time, Maxwell was faaar away. Kepler was their priority, especially mobile Kepler, aka the GTX 780M, since Nvidia has been doing great in mobile with a growing % of market share. So naturally they wanted no problems and wanted top-notch chips for those cards. GK104 chips are manufactured for a wide selection of cards, both desktop and mobile. Some GK104 chips don't meet the strict criteria to be put on a mobile card; off to desktop cards they go. The finest chips, the chips that are binned, go to the mobile bin and end up as the GTX 780M.
    GTX 780M owners see the benefit of this in great overclocking capability and reduced leakage, which in turn means lower temps than a not-so-stellar GK104 chip that didn't meet those criteria.

    So in summary:
    GTX 880Ms are made from whatever GK104 inventory Nvidia has lying around. This is why opinion on the card is so mixed. Why a GTX 780M @ the same clocks as a GTX 880M runs cooler (less leakage due to better silicon) and more stable (better silicon). Why some have no issues (got lucky in the GK104 inventory lottery).

    The GTX 880M was made to keep the stockholders at Nvidia at bay, since Nvidia had not released any new graphics cards for a while - to keep the company running with an income while the engineers had long since moved on to Maxwell chips. Less detail and effort is put into the GTX 880M chips, including fixing problems (which is why we have had such a poor response from Nvidia on the 880M issues).

    That's my view on the current situation, and that is why I feel Nvidia needs a swift kick to the head, why customers should avoid the GTX 880M, why OEMs should not offer the card in their notebooks, and why we need GM204 Maxwell aka the GTX 980M right now.

    In April 2012, 2 months before the official GTX 680M launch - our first high-end Kepler - we had GTX 680M benchmark leaks.
    We are soon at August 2014 and have still not seen any leaks regarding the GTX 980M. They are still on the 28nm node like they have been for almost 3 years now, and are taking much longer than they did with Kepler, which was based on a brand new node (28nm instead of 40nm).

    Screw this. Screw Nvidia.
     
    D2 Ultima, transphasic and DDDenniZZZ like this.
  17. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Everyone who is planning on buying an 880M must read this.

    In fact I think I'm going to make it part of my signature to warn people about the dangers of the dog poop known as 880M.
     
  18. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    *puts on tinfoil hat*
    I'm with Cloud on this one. There is absolutely no way that, logically, the following list makes sense:

    780M versus 880M
    1536 cores || 1536 cores
    120W power drain || 120W power drain
    1.012-1.025v (Ov'd) || 1.00v
    950-1000MHz core || 954-993MHz core
    5800-6000MHz memory || 5000MHz memory
    Good heatsink || better heatsink
    ~83 degrees max for most without cooling mods || easily hits 90+ degrees & throttles in the same games

    ^ The above doesn't make a lick of sense. Not even close. There has to be some other insane variable at work, because the raw statistics don't add up. If silicon binning is the deal, then sure - that makes perfect sense to me, because nothing else does. There is no way you take less voltage, better cooling, and lower clock speeds on some fronts across the SAME underlying hardware and get MORE heat.
     
    Cloudfire likes this.
  19. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Silicon binning is real: both of Ethrem's 880Ms have an ASIC quality of over 80%, while both my 780Ms hover around the 72% mark. So better silicon = crappier results makes no goddam sense.
     
  20. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, honestly, I've no idea. It could be that Ethrem simply has bad heatsinks and also refuses to use max fans, causing his stuff to overheat like crazy. But even so, the number of other people with 880Ms broken in the same fashion is so high that that can't be it. Unless the extra vRAM is somehow the problem? I don't know how that could be it... but it's the only real variable left if silicon binning is in favour of the 880Ms.
     
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    ASIC quality means that the higher the ASIC %, the lower the voltage required to run the chip.

    So we can look at it two ways:
    GK104 inventory being sold off - all sorts of chips of varying quality being sold to customers. It's a lottery whether you receive a good one with less leakage and good silicon that runs stable.
    GTX 880Ms being made, but few chips having the silicon quality (ASIC quality) needed to run the 880M's clocks stably at 1.0V. Throttling, leakage and high temperatures occur.

    With D2 Ultima's excellent overview of the GTX 780M against the GTX 880M, you can't help but wonder, either way, whether the GTX 780M is the only one that cut it in terms of the right voltage and low overall leakage combined with the right core clocks for the right temperatures. I'd wager again that if Nvidia increased the voltage on the GTX 880M (to compensate for bad silicon quality), the temperature would skyrocket as well, leaving an already hot chip hotter...

    I believe Ethrem tried out a vBIOS from SVL with higher-than-stock voltage, which led to pretty high temps...

    Either way you look at it, the GTX 880M should never have been made. Nvidia is stretching what is possible inside a notebook and playing on a thin line, where customers are the ones who fall short and are stuck with a chip that doesn't work like it should.
     
  22. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    1.25V is really high, this is desktop territory (the stock voltage for the GTX 770).

    ASICs are usually binned into several categories, as far as I've gathered from the Hawaii binning process, so I'd imagine the NVIDIA ones are similar:
    1. Low leakage: These are desirable for laptop chips, as most of the floating-gate and insulator structures are well built, so there is less loss during operation. This bin is typified by an ability to run at a desired clock speed with a nominal voltage (i.e. near the switching voltage). Ironically enough, these don't really respond well to overvolting, needing much more voltage to attain clock speeds beyond the normal operating range.
    2. High performance: These tend to be the leakier chips, though not always, because their transistors are capable of a much higher switching speed at a given voltage. These are the ones that tend to get put in desktop chips. You tend to find that these respond great to overvolting (i.e. they need less voltage per clock-speed step). The very best of these are the ASICs of choice for extreme overclockers.
    3. Broken units: These are the ones that have non-functional units but where the chip as a whole is not compromised (e.g. Hawaii Pro chips salvaged from Hawaii XT chips).

    Thus, when a chip is binned into one of the 3 categories above, it does not necessarily mean it lacks the characteristics of the other bins. The bins are more of a continuous envelope than distinct categories. For example, it is quite possible to have a "broken" ASIC with one less functional unit that otherwise belongs in category 1 (these are the desktop Hawaii Pro cards that can get to 1150MHz with no voltage increase).
    Likewise, a good overclocker on the laptop side would be a low-leakage part with high-performance characteristics. A terrible card on the laptop side would be one pulled from the high-performance bin instead of the low-leakage bin.
     
    heibk201 likes this.
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    WHOOPS. I meant 1.012 to 1.025; I keep forgetting that 1.1v is the max these things take. I'll go edit my original post.
     
  24. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    If you download GPU-Z, it will tell you as soon as you open it. I'm also curious what your ASIC quality is. When GPU-Z is open, click the little card icon in the top-left corner of the window and click "Read ASIC quality". Note the first one, then change the drop-down box in the lower left to your other card and grab that ASIC as well, please.
    Max fans aren't necessary with the stock vBIOS; temperatures don't go north of 87C before the fan kicks in, and then they hover around 81-82C with much less noise than max fans.

    With that being said, my testing lately has been with max fans, and the slave card doesn't pass 80C with the new GC Extreme repaste I did today, while the primary hits 79C with the old IC Diamond paste job it's had since about 2 weeks after I got it.

    As for the ASIC, my problematic slave card is 86.3% and my master card with no issues is 80.5%.

    I thought an ASIC over 80% was not a good thing - or does that just apply to desktop cards?

    I wish there was an explanation for the heat from these cards, but the 780M and 880M use the same heatsink in Clevo machines, so... these cards are just plain broken.

    Sent from my HTC One_M8 using Tapatalk
     
  25. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    OK, how the hell do I view my vBIOS number? Google is not helping.
     
  26. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Download GPU-Z and run it. It will look like this, here's mine:

    [IMG]

    See the BIOS version?

    nVidia Inspector will also show you the BIOS version.
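
    If you don't want to install GPU-Z, the nvidia-smi tool that ships with the NVIDIA driver can usually report the vBIOS version too. A minimal sketch (whether the `vbios_version` query is available depends on your driver version, and the sample string below is made up for illustration, not real output from any machine in this thread):

    ```python
    import subprocess

    def parse_smi_csv(output):
        """Parse one value per GPU from `--format=csv,noheader` output."""
        return [line.strip() for line in output.splitlines() if line.strip()]

    def query_vbios():
        """Ask the driver for each GPU's vBIOS version (needs the NVIDIA driver installed)."""
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=vbios_version", "--format=csv,noheader"],
            text=True,
        )
        return parse_smi_csv(out)

    # Made-up sample resembling what an SLI system might report:
    sample = "80.04.F8.02.04\n80.04.F8.02.04\n"
    print(parse_smi_csv(sample))  # ['80.04.F8.02.04', '80.04.F8.02.04']
    ```

    On an SLI machine you should get one line per card, so you can compare master and slave vBIOS versions at a glance.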
     
  27. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    Ah, I've used that before. That's the one problem with getting a new laptop every year or two - I forget which programs I need to reinstall... Will have a look later tonight.

    Most of my games seem to be running fine still - The Witcher 2, Crysis 3 and Titanfall all keep a near-locked 60FPS at 1080p on Ultra (or High for C3). That being said, Watch Dogs has that awful stuttering I always heard about but never experienced on my 680M (even with textures set to Ultra). There it seems to be doing the BF4 thing of having GPU utilization drop for a second, which is really, really noticeable. We're talking a drop from 80% (with the 30FPS lock on) to 45% for no reason. The graph in MSI Afterburner is all over the place.
     
  28. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Don't those framerate locks cause that kind of behavior? These cards are beastly. Witcher 2 maxed out minus ubersampling runs at over 100FPS on mine, and Bioshock Infinite maxed out is over 200FPS in some places! Utilization is obviously going to go down with vsync because it pulls back to keep the framerate stable.

    Try setting adaptive vsync in the nVidia control panel instead of using the in-game vsync and see if the behavior changes. It has less overhead than regular vsync.
     
  29. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    So, did anyone try the new 340.52 driver and see if there are any changes?
     
    Mr. Fox likes this.
  30. kamlesh

    kamlesh Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    3
    Trophy Points:
    6
    For me it didn't seem to improve anything; it seems to concentrate only on improving GeForce Experience with ShadowPlay and Nvidia Shield and some other stuff.
     
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,784
    Trophy Points:
    931
    I was wondering the same thing... Wonder if more people can try it and report back how it is behaving?

    Seems to be working fine on my 780M SLI setup, so I am crossing my fingers and hoping for the best for 880M, too.

     
  32. transphasic

    transphasic Notebook Consultant

    Reputations:
    195
    Messages:
    225
    Likes Received:
    128
    Trophy Points:
    56

    Agreed for the most part. What some people do not understand and are being unrealistic about is that every now and then, GPUs go out unexpectedly.
    Sometimes, **** happens.
    In my 15 years of owning gaming laptops, I have had an Nvidia card go out after many years of loyal service, and I have had an AMD card go out after a year.
    I would personally rather put my stock in Nvidia than AMD, and for obvious reasons. I have seen literally dozens of people have their AMD cards (7970M) go out rather quickly, and it has been costly for many of us when out of warranty. The 7970M cards were bad to start with, and they stayed that way.
    I have had many people tell me how bad AMD's driver support is, including many people in high positions at laptop reselling companies.
    AMD has always had bad driver support, and having left AMD to go back to Nvidia, I have made the right choice.
    My 880M has been great, with no problems at all, and I expect it to stay that way for years to come.
     
  33. transphasic

    transphasic Notebook Consultant

    Reputations:
    195
    Messages:
    225
    Likes Received:
    128
    Trophy Points:
    56

    Well said, Cloud. You hit the nail right on the head in your analysis.
    Nvidia wanted to delay and squeeze more blood from the turnip, and the 880M was what we got as a result. It was done strictly for financial reasons - for the shareholders to get every last ounce of money out of us gamers - mostly because AMD has not been keeping up with the pace of innovation.
    The 780M is the same as the 880M in every way but one: an extra $200 of profit for Nvidia at our expense.
     
  34. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I won't be able to test for a bit, my machine is going back to have its slave card replaced.

    Sent from my HTC One_M8 using Tapatalk
     
    Mr. Fox likes this.
  35. kamlesh

    kamlesh Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    3
    Trophy Points:
    6
    A new BIOS, A13, was just released by Alienware for the Alienware 17; let's see if it fixed the problem.
     
  36. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    Weird, Witcher doesn't run that well for me. I know you have SLI, but it still struggles to stay at a solid 60 on my end, maxed out at 1080p. I mean, it's more than playable, I just do see drops to the low 50s at points.

    As for the utilization, I expect it to drop when it hits the V-Sync threshold (30 or 60FPS), but not when the card is struggling to hit 25FPS and is not even at 50% usage. That's just weird. Watch Dogs is also the only one I used the lock for. I do tend to use V-Sync as I hate tearing, but I'll try adaptive anyway just in case.

    Again though, it's just that game and BF4 that have the framerate halved periodically for no reason. I can't imagine that's a V-Sync issue, as I use it with every other title to no detrimental effect. Still, it's worth checking out.
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Ubersampling on?
     
  38. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    Of course nVidia wanted to delay. Laptops have much harsher requirements in power and temperatures compared to desktops. These high end GPUs toy with those limits every gen, so unless a new architecture comes out, or a new dieshrink is achieved, you will see rebrands with small increments.

    But AMD not keeping up with the pace of innovation? Yessh some people are quick to ignore history and how the battle between both always brought the best. If it wasn't for AMD and the 7970m, we wouldn't have gotten the 680m we got. nVidia played it safe with a small increment in power, and got that swift kick to the head. Afterwards we got 680m and quickly the 780m. But we are at the current limit for these GPUs. Neither can bring something out right this moment, so they are preparing an upgrade. But it's easier to squeeze the customers out of their money by selling names like 880m.

    Both AMD and nVidia are preparing their next upgrade. While it won't be as large as a die-shrink upgrade, the new architecture will bring a decent increase in performance in the next couple of months.

    Personally I am much more in favor of AMD, but in laptops their presence is much smaller, considering they have less than half the available products to compete with. I use both nVidia and AMD extensively, in both laptops and desktops, and I hold both in high regard. Both have failed me hard, but both have also offered me excellent products over the last 10 years.

    I am very much looking forward to M295x and 980m :D
     
  39. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    I doubt it. Even a single overclocked 780M can't maintain 30 FPS with Ubersampling on, and SLI can't give more than a 100% increase in performance. Maybe SLI isn't working and the game is being carried by only a single GPU? Or maybe scaling is not as good in Witcher.
     
  40. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Would like to remind everyone that Ubersampling is a 20x sampling bonus. At a 1080p base, 4x supersampling = 4K res, and 16x SS = 8K res (7680x4320). 20x Ubersampling is beyond even that, so yeah, don't expect it to run well on pretty much any single GPU out there XD.

    As far as Witcher 2 goes, one card between 40-60 FPS @ 1080p maxed seems right, because with my two 780Ms at stock, that's what I get in 3D (half FPS or less).
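    The resolution arithmetic above can be sketched as a quick check (an illustrative snippet, not from the thread, assuming an Nx supersample scales each axis by the square root of N):

```python
import math

def supersampled_resolution(width, height, factor):
    """Internal render resolution for a `factor`x supersample.

    An N x supersample renders N times as many pixels as the display
    resolution, so each axis is scaled by sqrt(N) before the image is
    downsampled back to the display size.
    """
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# 1080p base, matching the figures in the post:
print(supersampled_resolution(1920, 1080, 4))   # 4x SS -> (3840, 2160), i.e. 4K
print(supersampled_resolution(1920, 1080, 16))  # 16x SS -> (7680, 4320), i.e. 8K
```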
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    LOL what are you talking about? Ubersampling is somewhere around 3x (1.73x1.73) SSAA based on IQ and performance. Driver-forced 4x SGSSAA is better-looking and better-performing than Ubersampling.
     
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Ubersampling was supposed to be a 20x improvement. If it's only 3x, then they really suck at optimization for everything, lawl.

    Oh well, you learn something new every day. Might as well force it to 1440p and play in 3D.
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    AA is still just a gimmick for the most part. It's an artificial softening of the image with very mixed results. Unless you're running non-native resolutions, there's really not much need for AA, other than 2x MSAA just to smooth things out a bit. If you're looking at a still screenshot you might be able to pick apart the apparent improvements, but in reality it's hard to discern, and it all comes down to personal preference.
     
    n=1 likes this.
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    20x?! That's insane. So with Ubersampling enabled you get 1/20 the frame rate? For me, Ubersampling only cuts FPS by 1/2 to 2/3. And it is well-optimized for what it does (4x, not 3x, supersampling; I was wrong before) and doesn't use any additional VRAM, unlike driver-forced AA, which can skyrocket VRAM usage depending on the sampling rate.

    I can't recommend Ubersampling though because it is bugged. Turning it on disables the in-game MLAA (which is really really nice and sharp, among the best PPAA I've ever seen) as well as the sharpening filter, which means some textures are actually sharper with Ubersampling off. Also, while Ubersampling does a terrific job of smoothing out most jaggies, there are some (ex. HDR lighting) that strangely aren't AA'ed at all.

    TBH, I think the best combo of IQ + performance is in-game MLAA with 16x AF forced in the driver (Witcher 2 has no AF by default). If you must have your graphics p0rn, forgo the Ubersampling and use driver downsampling, although this will shrink the HUD. And I take back what I said earlier about SGSSAA, as it doesn't work properly in this game because it causes excessive blurring. Neither does MSAA, which causes artifacts. Damn deferred shading. :mad:
     
  45. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You are entitled to your opinion, but IMO antialiasing is always needed and is not a gimmick at all because it addresses one of the fundamental flaws of computer graphics.

    The only AA that "artificially softens the image" is the post-processing family (e.g. FXAA), which is nothing more than a simple blur filter. "Fake" AA, if you will. "Real" sampling-based AA such as MSAA and SSAA actually increases detail.

    And in reality, AA is most noticeable not in static screenshots but in actual gameplay due to the crawling/shimmering while in motion caused by temporal sub-pixel aliasing.

    I personally can't stand playing any game without AA unless it's on an extremely dense PPI screen, and there's really no excuse not to use AA in this day and age considering the cheap PPAA techniques that work with every game and cost almost 0 FPS.
     
  46. DumbDumb

    DumbDumb Alienware !Wish money wasn't the problem.

    Reputations:
    1,583
    Messages:
    1,649
    Likes Received:
    259
    Trophy Points:
    101
    THEY DON'T WORK WITH THE 880Ms IN SLI OR OTHERWISE. Base score of about 18k in SLI with those drivers. The driver errors out every 2 seconds, and with these drivers the notebook seems to just hang and be unresponsive.
     
  47. DumbDumb

    DumbDumb Alienware !Wish money wasn't the problem.

    Reputations:
    1,583
    Messages:
    1,649
    Likes Received:
    259
    Trophy Points:
    101
    Also, it's Friday, pidge, and we are all awaiting this update from you!
     
    Cloudfire and TBoneSan like this.
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So July 24th was the last time pidge was here.

    Is the problem fixed, or did he run away once the silicon quality speculation etc. was brought up?
    Maybe it was more of a "to hell with these tinfoil guys. Go f yourselves. I'm done with this &¤#%!#"

    You are free to explain that we are wrong, pidge. I'm sure you know a ton more about these details than any of us do :p
     
    TBoneSan likes this.
  49. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Hmmm... silence can't be good.
    880M owners are getting bent over big time on this one.

    -1 Rep Nvidia
     
    Cloudfire likes this.
  50. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Dude more like -9000 rep
     
    Cloudfire, TBoneSan and kamlesh like this.
← Previous pageNext page →