The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to make sure the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    Nvidia clockblock: vBIOS (unblocked in 353.00)

    Discussion in 'Gaming (Software and Graphics Cards)' started by octiceps, Feb 23, 2015.

  1. octiceps

    octiceps Nimrod

    KF2 is also UE3
     
    Mr. Fox likes this.
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Oh yeah... 780M SLI EATS IT UP with Alternate Frame Rendering 2 (AFR2)... well over 100 FPS running stock clocks.

    [Video: https://www.youtube.com/embed/U7QHdh6iGXQ]
     
    Last edited by a moderator: May 6, 2015
    D2 Ultima likes this.
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Sounds great to me! I manage ~120 FPS at 1058/6000 in Toxikk, so SLI should love it. Maybe I'll force it to 125 FPS and OC my cards a bit; hope I don't hit a CPU bottleneck.
     
  4. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Tested the 350.12 driver earlier in 3DMark Fire Strike at 1072/1490. Got the same score as with 344.75 (P6360), and experienced no throttling with this overclock on my GTX 780M. It is strange that all your AW models at 1045/1500, and some others, experienced throttling in Fire Strike with this driver. This indicates that the driver behaves differently on different laptops, even within the same brand.

    I have now also tested with the 3DMark 11 benchmark and Nvidia driver 350.12, since many have complained about throttling problems when overclocking the GTX 780M. Ran at 1058/1485 (no throttling) on my GTX 780M and got a score of P9456. Does anyone have a comparable score with a GTX 780M (1058/1485) on other drivers?
    In any case, it works with my AW17R1.
    [Attached screenshot: Skjermbilde (773).png]
    http://www.3dmark.com/3dm11/9745494
     
    Last edited: Apr 28, 2015
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    ^ It is possible that it simply hates SLI... remember how the 880M had huge problems in SLI that didn't show up in single-GPU? And the 980M seems to be OK in single-GPU AW machines, much more so than in the SLI models.
     
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Mmmm, forced good SLI into Killing Floor 2. Good good. Some nice nVidia Inspector magic here.
    Toxikk was dealt with using some good old NCP (Nvidia Control Panel) magic.

    I feel good tonight.
     
    Mr. Fox likes this.
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Using @j95's mod for 350.12, my 780M SLI throttles even at stock clocks. I cleaned up with DDU and reinstalled my 345.20 DT driver mod, and the throttling goes away. It behaves as though 350.12 is imposing a clock throttle to reduce power consumption, very much like what I experienced with 980M SLI in the M18xR2. When overclocked, the throttle occurs sooner (almost immediately under load).

    Here's an example video... [ LINK]
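    For anyone who wants numbers to go along with the video: a minimal logging sketch (assuming nvidia-smi ships with your driver and is on PATH; the query fields are the standard --query-gpu ones, though some, like power.draw, may just read "N/A" on these mobile cards). It polls once a second and writes a CSV, so a clock drop under load shows up in the log instead of only on camera.

    Code:
    # Log P-state, core clock, and temperature once a second to gpu_log.csv
    # so driver-imposed throttling is visible in the numbers. Ctrl+C to stop.
    import subprocess, time

    FIELDS = "timestamp,pstate,clocks.gr,temperature.gpu,power.draw"

    with open("gpu_log.csv", "w") as log:
        log.write(FIELDS + "\n")
        while True:
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            )
            log.write(out.stdout)  # one line per GPU, already newline-terminated
            log.flush()
            time.sleep(1)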
     
  8. D2 Ultima

    D2 Ultima Livestreaming Master

    @Mr. Fox Does this happen when forcing power management to "Prefer maximum performance" in the global settings in NCP? And, just in case, with "PCI Express link state power management" in the Windows power options set to "Off" when plugged in?

    I can say without a shadow of a doubt that this is something I've never even *SEEN* before on this computer, far less with this driver package or any program.
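    If you'd rather script the second of those settings than dig through the Power Options dialog, here is a minimal sketch. It assumes Windows and an elevated prompt, and uses powercfg's documented aliases for the PCI Express subgroup (SUB_PCIEXPRESS) and its Link State Power Management setting (ASPM); index 0 means "Off". The NCP "Prefer maximum performance" toggle has no public command line, so that one still has to be set in the control panel.

    Code:
    # Force "PCI Express -> Link State Power Management" to Off on the active
    # power plan, for both AC (plugged in) and DC (battery). Run elevated.
    import subprocess

    for verb in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", verb, "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "0"],
            check=True,
        )
    # Re-apply the current scheme so the change takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)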
     
  9. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Oh yes, absolutely... in fact, that is the ONLY way I configure all of my computers, 100% of the time. I never allow any kind of power-conserving settings. I even disable everything in the BIOS related to power saving (Link Power Management, ASPM, you name it... all turned off) because I want my systems to suck every ounce of power they possibly can and run balls to the wall 24/7, with no regard whatsoever for "efficiency" crap. I hold nothing but contempt for "energy efficiency" and power management technology. Just for kicks, I did try balanced mode in Windows and the "let the driver decide" crap options in NVCP, and the throttling is much worse that way.

    [Video: https://www.youtube.com/embed/aPxSge4NMlQ]
     
    Last edited by a moderator: May 6, 2015
    D2 Ultima likes this.
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Oh god no take it away *shudders at horrible throttle-monster*
     
    TomJGX and Mr. Fox like this.
  11. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    LOL... the solution is simple... I just don't use this driver and life is good. :vbthumbsup:

    Edit: And, yeah, this is very similar to how 980M SLI malfunctioned in my M18xR2, just not quite as severe as 980M SLI throttling was.

    Strange (and most disturbing) that NVIDIA can screw up performance so badly like this with a simple driver update.
     
    Last edited: Apr 28, 2015
    D2 Ultima likes this.
  12. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    A horror video :eek:. Maybe the new direction from Nvidia... What will the next driver bring?
     
    Mr. Fox likes this.
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Yup, scary stuff from the Green Goblin. These guys at NVIDIA need a lobotomy, but beating them for several hours with a Cat o' Nine Tails embedded with glass fragments would be more fun right about now.
     
    ajc9988 likes this.
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Yes, I remember that because it affected me for a short time; then I upgraded. Although, now it seems more random. Some 780M users are not having issues with 350.12. Some 680M owners are reporting this driver works fine. Some Maxwell GPU owners (mobile and desktop) are having problems with it, but others are not. When I modded the NVDMI.INF there was no throttling at stock clocks, only when overclocked. Using your NV_DISPI.INF mod, the 780M in my machines throttles at stock and when overclocking (as the video demonstrates). Using 345.20, my problems disappear. Those distinct changes in behavior seem to confirm the problem is the fault of NVIDIA's driver, as does the fact that your mod fixes throttling for the 980M in Alienware machines. It is NOT OK for NVIDIA to be doing this sort of crap.

    It is always good to complain in forums, because bad publicity sometimes produces greater accountability, if only because companies want to avoid more bad press. But it is also useful to poop on the carpet in the GeForce forums as much as possible and tell them how bad they suck in their own house. NVIDIA has a bad habit of only taking action to fix bad publicity and sweep the dirt under the rug, but they NEVER seem willing to admit their mistakes... they always seem to have a convenient excuse for their mistake and/or bad judgment.

    Official NVIDIA 350.12 WHQL Game Ready Display Driver Feedback Thread (Released 4/13/15)

    Which GPUs? Kepler or Maxwell? If you are asking about 980M SLI users, I don't know. There are not many upgraded Alienware machines to look at, and fewer with SLI. I think the problems have scared most of the potential upgraders away, and they are looking at sticking with what they have or buying another brand. I'm half tempted to pull the 980M cards from my Clevo and test your driver mod in my M18xR2 and Alienware 18, but it's way more hassle and time consuming than I feel like dealing with right now. I'd have to reinstall Windows 8 again and screw up a bunch of stuff on my machine simply to convert back to pure UEFI because of Alienware's screwed-up BIOS compatibility problem. NVIDIA just needs to step up to the plate and do the right thing by fixing their engineering abortion to make things right for everyone. Their incompetence has never been this amazing.
     
  15. n=1

    n=1 YEAH SCIENCE!

    Eh, same story on the desktop side, with nVidia no longer optimizing for Kepler cards (Source).

    Now before everyone calls me a mad (tinfoil) hatter, I encourage you to look at TechPowerUp's very nice performance summary charts, and compare the 780/Titan against reference 970 from when Maxwell was first released, and then 4 months later.

    Specifically, lay these charts side by side:

    1920x1080 Performance [chart images: Sep 2014 vs. Jan 2015]

    2560x1600/1440 Performance [chart images: Sep 2014 vs. Jan 2015]

    From the 970 review (Sep 2014):
    At 1080p: 780 = 88% 970; Titan = 95% 970
    At 1600p: 780 = 90% 970; Titan = 98% 970

    Now skip forward 4 months to Jan 2015 and look at the 960 review:
    At 1080p: 780 = 85% 970; Titan = 92% 970
    At 1440p: 780 = 85% 970; Titan = 92% 970

    So in just 4 short months, both the 780 and Titan lost 3 points of relative performance against the 970 at 1080p, while the loss grew to 5-6 points at 1600/1440p.

    To address the 800-lb gorilla in the room: yes, quite a few of the tested games changed and the 960 review added more, but there are still 10 games common to the two reviews, so it's a decent sample size.

    =====================================================================

    Slightly OT (stop reading if AMD isn't your thing)

    I'm amazed at how much the 7970 GHz Ed closed the gap with its big-die Kepler (GK110) rivals in just 4 short months. Seriously, if you do the same comparison as above, except against the Titan/780 instead of the 970, you'll see:

    Sep 2014 970 review:
    At 1080p: 7970 GHz Ed = 76% Titan, 83% 780
    At 1600p: 7970 GHz Ed = 78% Titan, 85% 780

    Jan 2015 960 review:
    At 1080p: 7970 GHz Ed = 82% Titan, 88% 780
    At 1440p: 7970 GHz Ed = 85% Titan, 92% 780

    So again, in 4 months the 7970 GHz Ed closed the gap by 5-6 points at 1080p, and by a pretty good 7 points at 1600/1440p.

    If we compare against its original rival, the GTX 680, you'll see that it went from on par to just a hair faster at 1080p, but at 1600/1440p the advantage increased from 12% to 18%.

    Adding insult to injury, if you compare against the reference 970 over those 4 months, the gains are much smaller: 3 points at 1080p and only 1.8 points at 1600/1440p. Further evidence that nVidia simply abandons older hardware owners once it has released a new lineup.

    Oh and before someone screams drivers:
    - AMD only had 2 new drivers out between Sep 18 2014 and Jan 22 2015 - Catalyst 14.9 (Sep 29 2014), and Catalyst 14.12 (Dec 9 2014)
    - nVidia had at least 6 new drivers out over the same period
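    To make the arithmetic explicit, here is a throwaway script with the chart values hard-coded as quoted above (each number is a card's performance as a percentage of a reference 970). It just prints how many points each Kepler card lost between the two reviews:

    Code:
    # Relative performance vs. reference GTX 970, read off TechPowerUp's
    # summary charts in the 970 review (Sep 2014) and 960 review (Jan 2015).
    sep2014 = {"780 @ 1080p": 88, "Titan @ 1080p": 95,
               "780 @ 1600p": 90, "Titan @ 1600p": 98}
    jan2015 = {"780 @ 1080p": 85, "Titan @ 1080p": 92,
               "780 @ 1440p": 85, "Titan @ 1440p": 92}  # TPU moved to 1440p

    pairs = [("780 @ 1080p", "780 @ 1080p"), ("Titan @ 1080p", "Titan @ 1080p"),
             ("780 @ 1600p", "780 @ 1440p"), ("Titan @ 1600p", "Titan @ 1440p")]
    for old, new in pairs:
        print(new + ":", sep2014[old] - jan2015[new], "points lost in 4 months")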
     
    Last edited: Apr 29, 2015
    TBoneSan, D2 Ultima and Mr. Fox like this.
  16. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    I don't think you're mad (nuts) at all. Those are valid observations.

    Although optimization is really nice to have, I'd be happy enough if they would hold their ground on performance and knock it off with breaking things. Just make drivers that work, stop trying to control what I can do with my own property, and stop trying to impress me with stupid feature gimmicks that don't make my beast run faster. That won't distract me from what's broken, even if it is shiny or has sparkles. It makes sense that they would focus on getting Maxwell right, since it is their latest and greatest. Why not just set the expectation up front and announce, "This driver is designed for Maxwell only, and might even make your Kepler run like crap... so don't use it." LOL.
     
    D2 Ultima and n=1 like this.
  17. TR2N

    TR2N Notebook Deity

    I guess the moral of the story is that you can never satisfy all gamers new and old.
     
    Mr. Fox likes this.
  18. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    That's true. But if I were one of the guys with a new 970 or 980 GPU that malfunctioned with the newest drivers, I wouldn't be too thrilled about that either. 350.12 is all over the board. Works great for some, but not for others running the same GPU. Really strange... especially, since drivers used to be something NVIDIA was consistently good at. Maybe they're just tired, LOL.
     
  19. n=1

    n=1 YEAH SCIENCE!

    Well, let's just say me and a few others have been saying (on other forums) that Maxwell drivers have really gone down the gutter since a few weeks after launch. Needless to say we were harassed by the nVidia Gestapo loyalists, but it seems like we were right all along, and I feel vindicated.

    I'm curious whether this is nVidia starting to become Intel, because they can afford to just not care anymore due to the lack of competition, or if there is something genuinely wrong with Maxwell. Not long ago I wrote a book ranting about Maxwell's fake efficiency, which mainly comes from a very aggressive throttling algorithm and can cause instability problems even at stock. Perhaps nVidia is reaping what they sowed with this fake efficiency...

    I also think honesty has become a very rare commodity these days, especially when it comes to any for-profit organization. Hell just in the context of tech companies alone, I'm pretty sure if I dig deep enough, I can dig up horror stories on just about every company in the field.
     
    Last edited: Apr 29, 2015
    TBoneSan, TomJGX, Papusan and 2 others like this.
  20. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    I think a lot of that is probably accurate... maybe all of it. It certainly appears to play out that way if you're paying attention. And, yeah, honesty is a rare commodity... so is competence.

    I think part of the problem with inconsistent results is they are preoccupied with asinine "features" and trying too hard to be slick with a level of system integration that is totally inappropriate and unrealistic. What I mean by that is there are too many variables, dependencies and contingencies that should not even be a consideration for NVIDIA. They need to focus on making the stinking GPU... period. They should say, "OEMs, this is a 125W MXM module... it will draw 125W under normal use, and this GPU requires your motherboard have X in order for it to work right... Now, shut up and deal with it accordingly." Instead, it's "Well, this GPU can use 125W if there is a full moon, unless it's cloudy with a chance of rain, in which case it is going to draw 135W... but we're going to make it only use 100W if the battery charge is below 70% because that equals an extra 3.675% of battery run time." Or: "This GPU can sense how much output the PSU is rated for, and if the imbeciles that built it cut corners to save a few bucks, we can still make it work fairly well by blocking overclocking through the drivers, except for when the relative humidity in Chicago divided by the ambient temperature in Warsaw on Monday is greater than the numeric value 20, in which case it will not overclock, but if it is less than 20 it will still overclock, but it will throttle... except for when it doesn't, in which case there will be a BSOD unless the CPU TDP is below 40% of TDP capacity."

    I am being ridiculous to make a point. It used to be more like that "this is how it is, now shut up and deal with it" approach and I think that worked out a lot better. Fancy does not mean better, and sometimes it actually means worse. They need to build the GPU, make it awesome and let the chips fall where they may with the OEMs. A 970, 980, Titan, or any given MXM card should run the same on every machine with the same core components assuming that it isn't messed up or misconfigured by an incompetent OEM or end user.
     
    Ashtrix, ajc9988, TBoneSan and 6 others like this.
  21. Robbo99999

    Robbo99999 Notebook Prophet

    That's just NVidia optimising the newer drivers for the new Maxwell generation; it doesn't necessarily mean they're decreasing the efficiency of Kepler cards, just that they're improving the Maxwell side while Kepler stays the same. It's normal for a new-generation architecture to have gradually improving drivers, right?
     
  22. n=1

    n=1 YEAH SCIENCE!

    I think you might be onto something there, the "trying to cater to everybody while somehow maintaining efficiency (puke)" part. I'll also add "making sure the GPUs don't overheat even if the OEMs decide to challenge the laws of physics by fitting a 75W or 100W part inside a 0.5" thick machine" to the list.

    Oh, I'm not cynical enough to think that nVidia is somehow sabotaging their own cards. But what I am suggesting is that Kepler cards appear to receive less optimization than Maxwell, or none at all, which you seem to be saying as well. This is what I meant by nVidia abandoning old hardware owners -- as soon as a new arch is out, optimization for the old arch is either heavily deprioritized or stopped entirely.
     
    TBoneSan, Robbo99999 and Mr. Fox like this.
  23. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    EXACTLY. And, that is an excellent add-on example.

    The GPU should be a "dumb" component, as in system-oblivious and agnostic. I would MUCH PREFER my system to turn off from lack of power if I exceed the capacity of the PSU than have to deal with erratic behavior. I can deal with functional limits by backing off... on my terms, not theirs. When I was trying to get 980M SLI to work right in my M18xR2, it was not possible to overclock to the point of tripping the breaker on the AC adapter or having it turn off like it does with 680M SLI or 780M SLI when I push the overclock too far. It should have been, but it throttled as soon as it hit a certain power consumption level. That level was eerily similar to the nominal rating of the AC adapter (~325W), and with 350.12 drivers making 780M SLI exhibit behavior similar to what 980M SLI did, I'm ready to bust some skulls. Not because I care about a random messed-up driver--which is forgivable and forgettable--but because they may have taken a liberty that I feel should not be theirs to take. That makes me feel violated, and without any lubrication to lessen the trauma.

    Looking back on things, going back to Fermi days, the nonsense started with unnecessarily complex performance states and the GPU Boost marketing gimmick. Remember 580M throttling, and matching P0 clocks with P1 clocks magically fixing things? There should only be two performance states for a GPU... full-on 3D and no-demand... on and off. Adding the different performance levels is just retarded. When you examine that rat's nest in the vBIOS with tools like NiBiTor, Kepler BIOS Tweaker or Maxwell BIOS Tweaker it's enough to make a guy want to puke. It does not need to be that complex, and making it so adds no value, IMHO. All it does is enhance the opportunity for malfunction and instability to surface.

    I cleaned up my previous post a bit. I had a few typos from falling asleep when I was typing my thoughts.
     
    TBoneSan likes this.
  24. HTWingNut

    HTWingNut Potato

    I just think that Kepler had likely been optimized to the point that it was difficult to eke out a whole lot more. With Maxwell being a newer architecture, it's still maturing and they're finding ways to improve on it. Like with (most) any product, the more mature it is, the better the quality (usually) and the less you can improve on it or make it more efficient.

    But I agree with Mr. Fox on the throttling. There is really no need at all for that to happen. Laptop makers should be able to artificially limit the clock speeds for the ultrathins if the cooling or power requirements aren't up to snuff, but make it a special SKU and call it something different, like L970M or 968M, to designate its lower performance.

    Problem is, Intel has been doing this for years; artificial throttle limits have impeded laptop performance for a long time. I think it would be OK if they did this, as long as they offered an "Extreme" CPU or GPU option that gave users ultimate control over the clocks.
     
    Papusan, Robbo99999, Ethrem and 2 others like this.
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Buuut we're not likely to get that, because the market has spoken, and the same thing I called is happening, especially on other forums. People are hunting for stuff that'll last them 4 years playing BF4 at good graphics settings, and "5 pounds" is too heavy. It's either a cheap-out with a Y50, or people are hunting Razer Blades and Aorus laptops, all of which have issues that don't need to be there. When people come asking for video render machines, or anything that's worth its salt in the CPU department because that's what they actually want to do and gaming is secondary, and I tell them "well, your choices are limited unless you want to deal with a throttle-happy CPU, as gaming is, despite what many think, a 'medium-low' CPU load at best, and you're asking for 'medium' or 'medium-high' loads", I get JUMPED ON. Legit bashed, fought down, attacked, what have you, with a ridiculous number of people saying "I have <insert crap machine> and it plays games fine!" or "I don't notice any issues!" or even the people who will

    prepare yourself Mr. Fox

    DEFEND the fact that machines aren't going to keep their speeds/do their job properly, by saying the following:
    - Manually lowering the turbo bins so it won't throttle is a good alternative
    - Allowing it to throttle is fine, as it "won't affect the performance all that much"
    - The user can't seem to scrounge up more than $900-$1000 for a render/game machine, therefore what they can buy is fine (never suggest saving up more, no, that's terrible; yes, I understand some people need it NOW, but those are rare)
    - It's "just a laptop" and doesn't need to work so well; if anybody wanted to do something serious, they'd buy a desktop

    And possibly an even larger issue is that people don't seem to get that things are broken/deteriorating. As long as a laptop is thin, nothing else matters (because Maxwell's unnatural coolness automagically means all tech, now and forevermore, will be just as cool or even cooler, and we will soon have flagship mobile GPUs in sub-1"-thick notebooks without any heating or power issues at all), totally uncaring about any lack of storage options or soldered anything.

    Or we've got the situation where people have the "I don't notice this so it must not exist" issue with desktop cards, because heaven forbid a user who does more with their system than playing CS:GO, BF4, and every title Ubisoft churns out with "Assassin's Creed" on it (sheepishly hidden from your circle of friends who openly bash Ubisoft) has experienced and proved issues that you probably don't notice. It's like the 970 issue. Any time I see someone looking for a 970, I tell them to buy a 290X or save up and get a 980. People are still harping that the 970 is a great card. Anybody who experiences the issues with them (notably users with two or more monitors on Win 8.1 who like to multitask) is instantly bashed/shamed for even thinking about complaining about a LEGITIMATE defect in a card that hampers the way they use their PC. I said it once and I'll say it again: I cross 3GB of vRAM so often it's a joke; if I had 970s I would have noticed something was wrong within about a week of launch.

    What we have is a lot of people with money who are pretty spoiled and have an "if it doesn't affect me, it might as well not exist" attitude, and the anti-consumer crap we're speaking out about right now might as well not exist for all they care. It's why I keep saying I hope Intel/nVidia REALLY screw over the desktop people the way the laptop users are getting it. I hope all of them need to hack their system BIOSes and vBIOSes to get what was once considered "normal" functionality, and that they all begin to complain. While I spend my time laughing at them and giving them a taste of their own medicine, maybe things might change for the better.

    But there's one thing that's certain: nothing will happen without competition. Intel could solder every single desktop chip to the board for Skylake, and if AMD's new CPU line has worse IPC than Bulldozer and doesn't come clocked at 8GHz out of the box, you can bet your next paycheck that people are gonna be buying Skylake for gaming and anything that isn't pure productivity.
     
    Mr. Fox and Ethrem like this.
  26. octiceps

    octiceps Nimrod

    Writing books again
     
    D2 Ultima likes this.
  27. Robbo99999

    Robbo99999 Notebook Prophet

    Yep, I agree, but I do think that idle low clocks are useful for decreasing temperatures & power consumption during 2D stuff like desktop use & web browsing, so I don't think they should do away with those - I wouldn't want it running full blast at max clocks all the time.
     
  28. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    The growing severity of retardedness among the spoiled adolescent and yuppie gamer-brat crowd really is taking its toll. When stupidity exists in large numbers, the results can be truly devastating. It would be very painful to watch, but tragedies like the clock-blocking and BGA filth that we are putting up with in the laptop enthusiast world do need to happen to our desktop jockey brothers to get them to wake up and stop being self-centered laptop-hating idiots. Most of them don't have a clue about how bad things are getting, and a sip of the nastiness could be very useful to get their attention. Hopefully, they can start to work with us instead of against us. If we can rally their support and get them to stop being part of the problem, that would be awesome. Working together, perhaps we can exorcise the thin-and-light demons from the souls of all of the bubblegum gamers and put the industry back on a path to excellence. You don't need to know how to overclock, or even be interested in being an overclocking nut, to understand that killing that sport is eventually going to destroy everything that everyone values in the PC world. If the ODMs and OEMs have to make stuff that satisfies us to remain in business, everyone is going to be better off because of it.

    The pantywaists that got us into this terrible mess, with their pathetic sniveling and whining about 5-pound 15-inch and 7-pound 17-inch laptops being too big, thick and heavy, should be spanked severely and sent to bed early, with no dinner. But the evil OEM witches that thought it would be OK to stop paying attention to mobile extreme performance enthusiasts should be burned at the stake. Bring me my belt and a book of matches, LOL.
    I agree, which is what I meant by "There should only be two performance states for a GPU... full-on 3D and no-demand... on and off." Full blast when a 3D load exists, low-power state when it does not, with absolutely nothing in between... the high-performance state is either on or off. Sorry if that wasn't entirely clear with the "on and off" part. What would be really awesome is having a manual toggle switch for that.
     
    Exec360, Ashtrix, TBoneSan and 3 others like this.
  29. octiceps

    octiceps Nimrod

    There need to be more than just 2 P-states (highest and lowest). The in-between ones are for lower-demand tasks like video acceleration, and in case the GPU is running too hot at P0 and throttles, you don't want it falling all the way down to idle clocks. That was the arrangement for years and it caused no issues; it was only with the introduction of an additional Boost state in Kepler that all the trouble started.
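    You can check which of those states the card is sitting in at any given moment; a one-shot sketch, assuming nvidia-smi is available (its "-q -d PERFORMANCE" report includes the current performance state and, on newer drivers, the reasons the clocks are being held down):

    Code:
    # Print the driver's performance-state report for all installed GPUs.
    import subprocess

    print(subprocess.run(["nvidia-smi", "-q", "-d", "PERFORMANCE"],
                         capture_output=True, text=True, check=True).stdout)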
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Yeah, my 280M had three states. I still remember the clocks: 120/240/400, 336/400/786, and 583/950/1450. That worked just fine.
     
    Mr. Fox likes this.
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    My GPUs already run at P0 doing video acceleration (like YouTube in a browser). I don't want my GPUs to have any option to throttle from being "too hot" because I want to be the one that defines what that means. I'd rather fix my own thermal problems and just have it run full blast until it shuts down my laptop once it reaches the thermal trip point. I have my CPU thermal management settings disabled or set to their maximum available values in the BIOS on my machines that support it. The thermal trip point is hard-coded to protect the CPU and cannot be modified, but throttling clocks to cool down is largely averted. It works perfectly like that, and I would do the same with the GPUs if I could. I also prefer driving a car with a stick shift instead of a slush-box tranny.

    Having one in the middle like the old days would be OK, but I would still want to have absolute control over it rather than letting the GPU/vBIOS decide the best course of action on my behalf.
     
  32. octiceps

    octiceps Nimrod

    That's cool Mr. Fox. Just remember that not everyone uses their laptop as a desktop like you do where battery life/heat/fan noise aren't considerations. ;)
     
    Ionising_Radiation likes this.
  33. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Yes, I understand that. And, I agree there is a need for that flexibility. The main thing is I want everyone to have the option of exercising absolute control over their hardware if they want to. There's no reason both cannot coexist. The GPU does not need to make all of the decisions. Letting it have unilateral control over its own behavior without our input into the situation is part of a huge problem we are all faced with.
     
  34. octiceps

    octiceps Nimrod

    That's why I hate GPU Boost :mad:
     
    TBoneSan and Mr. Fox like this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Me, too... it blows... real bad.
     
  36. n=1

    n=1 YEAH SCIENCE!

    Possibly. We probably need to go back and look at the relative performance of the cards before Maxwell, but we may never know for sure. The timing just seemed a bit too coincidental. I also brought up AMD's example because I was amazed that, 3 years later, the 7970 GHz Ed was still gaining ground against Kepler cards, although there's probably a joke about terrible AMD drivers somewhere in there. :p

    Well we do have (or used to have) Extreme MX/XM CPUs that let you do whatever you wanted with them, although they obviously went for ridiculous amounts of money. On the desktop side, technically "extreme edition" GPUs exist, in the form of custom models like MSI's Lightning cards and EVGA's Classified cards.

    But with Intel going full retard BGA on mobile CPUs and nVidia trying to block overclocking, I doubt things are going to change for the better anytime soon.
     
    D2 Ultima likes this.
  37. j95

    j95 Notebook Deity

    Meaning they optimize without taking the previous generation into account (excluding critical issues). Risky enough, aka 320.18 WHQL. Keep upgrading... :newpalm:

     
    Mr. Fox and TomJGX like this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    It's also not clear whether it was really gaining ground over those 3 years, or whether AMD's driver team is so incompetent that it took them 3 years to figure out how to make their own products work correctly. Small tweaks and gains make sense. Major gains do not. I have to wonder why they didn't deliver the goods on performance 3 years ago. (I'm not trying to be cute or joking around. I really do have reservations about their overall technical competence and what seems to be a long history of lackluster driver support.)
     
  39. octiceps

    octiceps Nimrod

    Or, you know, it could be that GCN is so radically different from what they'd done in the 5 years prior that it has taken them time to figure things out and, as a result, they're still squeezing optimizations out of it.
     
    Mr. Fox likes this.
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Really, that's almost as crazy as what Pelosi said about not being able to know what was included in the Obamacare package until after the legislation was passed. How could AMD engineer something so radically different that they had to figure it out later? I guess that could happen by accident, which still means it took them too many years to figure out how to fix their products. That doesn't give me any warm fuzzies, even if they got lucky and figured out how to correct their own mistakes after many years of effort.
     
  41. octiceps

    octiceps Nimrod

    Fix? GCN was never broken. Software development is a gradual process. GCN was AMD's biggest architectural change since TeraScale in 2007, while Nvidia took a more gradual path from Tesla, to adding double precision in Fermi, to nerfing DP in Kepler, to killing DP in Maxwell. I'm just thankful that AMD is still improving GCN 1.0 while Nvidia seems to have abandoned Kepler long ago in terms of meaningful optimization. Which is why they were evenly matched at release, but now GCN is clearly faster, as the numbers posted by n=1 showed.

    As far as "mobile products still as pathetic as the day they were released," didn't 7970M make laptop enthusiasts all wet including yourself?
     
  42. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    GCN in a desktop has always been fairly decent, but that's not very relevant in a discussion about notebooks.

    Their mobile GCN products have been plagued with issues from day one. Gradual software development isn't worth a lot if products frequently fail within their first 12 to 18 months of use. None of the 7970M rebrands have been anything to be excited about in terms of performance. Here we are, several years later, and the best we can muster is a "meh" for mobile GCN.
     
    Ethrem likes this.
  43. octiceps

    octiceps Nimrod

    I think it's pretty clear by now that GCN just isn't made for mobile
     
    TomJGX and Mr. Fox like this.
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    There, I fixed that for you, bro. :vbwink:
     
    TomJGX likes this.
  45. octiceps

    octiceps Nimrod

    What's the difference? :p

    I was trying to convey that Maxwell, and Kepler to a lesser extent, are mobile-focused--built from the bottom up. Nvidia has made this very clear. GCN is built from the top down. Opposing design philosophies.
     
    Robbo99999, TomJGX and Mr. Fox like this.
  46. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    @octiceps - I agree with you 100%. My only point is that improvements to GCN are not very relevant in the mobile realm, where GCN continues to leave a lot to be desired even though they have had ample time and opportunity to do something amazing with it.
     
    Robbo99999 likes this.
  47. octiceps

    octiceps Nimrod

    That's because with GCN, AMD never set out to make a great high-end mobile dGPU. There's a long list of reasons and history behind it including AMD's reasons for buying ATi in the first place, why Bulldozer was designed the way it was, and AMD's business strategy. Which I won't bore you with. :)
     
  48. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    Fair enough, bro. Chances are, I wouldn't care what their reasons are anyway, so good call. + Rep for your patience.
     
    octiceps likes this.
  49. octiceps

    octiceps Nimrod

    Yeah you definitely wouldn't care LOL :D
     
    D2 Ultima likes this.
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Ol' Foxy is a results-focused man. The "why" it doesn't work doesn't help much XD.
     
    Mr. Fox likes this.