The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    nVIDIA GeForce Drivers v411.63 WHQL Findings & Fixes

    Discussion in 'Gaming (Software and Graphics Cards)' started by Papusan, Sep 19, 2018.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    nVIDIA GeForce Drivers v411.63 WHQL Windows 10 - (64-bit)

    nVIDIA GeForce Drivers v411.63 WHQL Windows 7, Windows 8.1, Windows 8 (64-bit)


    @Ultra Male Are yoo ready, bruh? :D You're all welcome to the new barbeque party. And the Nvidia driver packages continue to blow up in size (File Size: 517.11 MB). But will it blow your graphics card to the hardware graveyard/heaven? Please test it. The new Turing graphics cards are out. Let's see how much this driver will break for Pascal owners :p


    Game Ready

    Software Module Versions
    • nView - 149.34
    • HD Audio Driver - 1.3.37.5
    • NVIDIA PhysX System Software - 9.17.0907
    • GeForce Experience - 3.15.0.164
    • CUDA - 10.0


    Provides the optimal gaming experience for Assassin's Creed Odyssey, Forza Horizon 4, and FIFA 19.

    Gaming Technology

    Includes support for NVIDIA GeForce RTX 2080 and RTX 2080 Ti graphics cards.

    New Features

    • Added support for CUDA 10.0
    • NVIDIA RTX Technology
      NVIDIA RTX supports the Microsoft DirectX Raytracing (DXR) API on NVIDIA Volta and Turing GPUs.

      In order to get started with developing DirectX Raytracing applications accelerated by RTX, you'll need the following (a minimal capability-check sketch follows this feature list):
      • NVIDIA Volta or Turing GPU
      • Windows 10 RS4
      • Microsoft's DXR developer package, consisting of DXR-enabled D3D runtimes, HLSL compiler, and headers
    • Vulkan 1.1
      • This driver release provides full support for the new Vulkan 1.1 API and passes the Vulkan Conformance Test Suite (CTS) version 1.1.1.2.
      • Includes interoperability with CUDA 10.0.
      • New extensions for Turing GPUs (probed at runtime in the second sketch after this list):
        • VK_NVX_raytracing (also available for Pascal GPUs with 8GB or more video memory, and Volta GPUs)
        • VK_NV_compute_shader_derivatives
        • VK_NV_corner_sampled_image
        • VK_NV_fragment_shader_barycentric
        • VK_NV_mesh_shader
        • VK_NV_representative_fragment_test
        • VK_NV_scissor_exclusive
        • VK_NV_shader_image_footprint
        • VK_NV_shading_rate_image
      • See https://www.khronos.org/registry/vulkan/ for the full specification.
    • Vulkan HDR for Windows
      This driver release supports the Vulkan VK_EXT_swapchain_colorspace and VK_EXT_hdr_metadata extensions allowing applications to output HDR content to HDR displays via the Vulkan APIs.
    • OpenGL extensions for Turing GPUs
      • GL_NV_compute_shader_derivatives
      • GL_NV_fragment_shader_barycentric
      • GL_NV_mesh_shader
      • GL_NV_representative_fragment_test
      • GL_NV_scissor_exclusive
      • GL_NV_shading_rate_image
      • GL_NV_shader_texture_footprint
    • See https://www.khronos.org/registry/OpenGL/index_gl.php for the full specification.
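
    A quick way to check whether a given system actually exposes DXR, per the getting-started requirements above, is to ask D3D12 for its raytracing tier. This is a minimal sketch, assuming a Windows 10 SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS5 (the shipping DXR query, rather than the earlier experimental developer package these notes date from):

```cpp
// Minimal DXR capability probe (sketch; assumes d3d12.h from a recent
// Windows 10 SDK). Creates a device on the default adapter and asks the
// driver which raytracing tier, if any, it exposes.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Feature level 11_0 is the minimum D3D12 accepts for device creation.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("DXR supported (tier %d).\n", (int)opts5.RaytracingTier);
    } else {
        std::printf("DXR not exposed by this GPU/driver/OS combination.\n");
    }
    return 0;
}
```

    On the Volta and Turing hardware these notes cover, the tier query is the reliable test; on older GPUs the call still succeeds but reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED.
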
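    Similarly, the Vulkan extensions listed above can be probed at runtime with the standard enumeration call rather than guessed from the driver version. A sketch assuming the LunarG Vulkan SDK headers and loader are installed; the `wanted` list below samples a few of the names from these notes:

```cpp
// Enumerates device extensions on the first physical device and reports
// which of the extensions named in these release notes are exposed.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkInstanceCreateInfo ici = {VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    if (gpuCount == 0) return 1;
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    uint32_t extCount = 0;
    vkEnumerateDeviceExtensionProperties(gpus[0], nullptr, &extCount, nullptr);
    std::vector<VkExtensionProperties> exts(extCount);
    vkEnumerateDeviceExtensionProperties(gpus[0], nullptr, &extCount, exts.data());

    const char* wanted[] = {"VK_NVX_raytracing", "VK_NV_mesh_shader",
                            "VK_NV_shading_rate_image", "VK_EXT_hdr_metadata"};
    for (const char* name : wanted) {
        bool found = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, name) == 0) { found = true; break; }
        std::printf("%-28s %s\n", name, found ? "present" : "absent");
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

    Per the note above, a Pascal card with 8 GB or more of video memory should report VK_NVX_raytracing as present, while the mesh-shader and shading-rate extensions stay absent until Turing.
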
    NVIDIA Control Panel

    System Info shows Boost Clock values (instead of Base Clock) for Turing and later GPUs.

    Application SLI Profiles

    Added or updated the following SLI profiles:

    • HOB
    • Lake Ridden
    • NieR:Automata
    • Northgard
    • Pure Farming 2018
    • Raid: World War II
    • Star Wars: Battlefront II (2017)
    • TT Isle of Man
    3D Vision Profiles

    Added or updated the following 3D Vision profiles:

    • Elder Scrolls: Online - Good
    • Assassin's Creed: Odyssey - Not recommended
    Discontinued Support

    • 32-bit Operating Systems
      Beginning with Release 396, NVIDIA is no longer releasing Game Ready drivers for 32-bit operating systems for any GPU architecture.
    • NVIDIA Fermi GPUs
      Beginning with Release 396, the NVIDIA Game Ready driver no longer supports NVIDIA GPUs based on the Fermi architecture.
    Limitations in This Release

    The following features are not currently supported or have limited support in this driver release:

    • Turing GPU Driver Installation on Windows 10
      Drivers for Turing GPUs will not be installed on systems with Windows 10 RS2 or earlier. This includes Windows 10 Threshold 1, Threshold 2, Redstone 1, and Redstone 2 operating systems.
    Fixed Issues in this Release

    • Using power monitoring in GPU monitor tools causes micro stutter. [2110289/2049879]
    • [Monster Hunter World]: Low frame rate in the game. [2335958]
    • [Tom Clancy's The Division]: Graphics corruption occurs when using NVIDIA Gameworks settings. [2005096]
    • [Call of Duty WWII][1x3 Surround]: The center Surround display renders a black screen. [200370257]
    • [Planetside 2][G-SYNC]: G-SYNC does not work with the game. [2221050]
    • [ARCHICAD][OpenGL]: The OpenGL driver crashes the application. [2093819]
    • [GeForce GTX 1080Ti]: Random DPC watchdog violation error when using multiple GPUs on motherboards with PLX chips. [2079538]
    • [YouTube][Mosaic with Sync]: Secondary GPU doesn't render video content on full-screen YouTube video. [200402117]
    As usual... New is always better:D


    Official 411.63 Game Ready WHQL Display Driver Feedback Thread (Released 9/19/18)
    Display Driver Uninstaller v18.0.0.0
     
    Last edited: Sep 19, 2018
    jaybee83, j95, Mr. Fox and 2 others like this.
  2. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    guinea pig time!
     
    jaybee83, Mr. Fox and Vistar Shook like this.
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    Let the performance and stability damaging driver games begin! :vbbiggrin:
     
    Awhispersecho, jaybee83, j95 and 3 others like this.
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    Left for dead by the NVIDIOTS... again. I guess if Fermi is good enough, those customers don't deserve new drivers.

    There's just no excuse for this bull crap. More collusion between the Santa Clara Mafia and the Redmond Nazis. Gotta love the strong-arm tactics... gangsta style product support.
     
    j95, hacktrix2006 and Papusan like this.
  5. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    yeah you see, and y'all blame me for being on the latest Windows 10 build which is the suckiest, it's because of crap like this :rolleyes: :rolleyes:
     
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    No blame. It's a matter of personal preference.

    As far as OS build compatibility, I'm sure Brother @j95 can fix that with one of his awesome driver mods.

    But, it really boils down to whether or not having new GeFarts drivers is important enough to compromise everything else by having the latest piece of crap OS. Unless there is a compelling game that requires the new driver to run, there is still no point in updating the GPU driver if everything is working correctly. It's a total waste of time and energy if you don't have a legitimate and compelling reason to do it. I'd venture a guess that 99% of the time, there is no reason for doing it other than end users having a goofy fetish and being OCD about having the latest drivers. It's pretty silly IMHO. The best way to have a stable system is the "if it ain't broke, don't fix it" approach. Coincidentally, it also requires the least amount of effort.
     
    Papusan and Spartan@HIDevolution like this.
  7. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    Well, the latest games usually run like crap on older drivers, if they even launch at all.

    I tried one of the older drivers from MSI one time and Doom wouldn't even launch, d00d. Now these drivers say they support FIFA 19, which I did pre-purchase; I'm 100% sure it wouldn't run on older drivers, or if it did it would perform like garbage.

    So I am forced to drink the Micro$h4ft c00l 4!D

    These morons have really squeezed us into a corner.

    Then come the bloody OEMs who are shifting their apps to the Windows Garbage Store. Example: MSI stopped offering the Nahimic setup installer and now it's exclusively through the Store. If I don't install it, my laptop sounds like the audio is coming from a crappy phone.
     
    j95, Papusan and Mr. Fox like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    1. Everything should be better with DirectX 12... Said Micro$h4ft

    DirectX 11 vs DirectX 12 – Radeon RX Vega 64 and GeForce GTX 1080 Sweclockers.com | Use translator


    The picture the results paint is very clear. Of the six titles tested, the AMD Radeon RX Vega 64 performs better under DirectX 12 in five. For the Nvidia GeForce GTX 1080 it is the other way around: only one game shows better DX12 results, and in many cases DX11 is considerably faster.

    There is more to it than the numbers show
    So far, so good. DirectX 12 delivers better performance with an AMD Radeon and Nvidia comes off worst, right? If only it were that simple.
    Although hugely better than before, DirectX 12 still brings some uncertainty, regardless of graphics vendor: a few more graphics bugs, a few more crashes, occasionally unexpected performance, and the odd strange menu. A little more hassle, simply put.
    The big problem child, for both AMD and Nvidia, is Total War: Warhammer II. Its DirectX 12 mode is clearly labeled as beta, and with good reason: the game crashes, misbehaves with some cards, delivers at times erratic performance numbers, and at times more than doubles loading times.

    But should I run DirectX 12 or not?
    A rough rule of thumb: if an AMD Radeon is under the hood, test it by all means. You may well gain a few frames per second without actually losing much, especially with a somewhat slower processor.
    Those who instead sit on an Nvidia GeForce can with a clear conscience leave the switch set to DirectX 11, where the choice exists. In almost every case performance is equivalent or better, and a whole host of small problems is avoided.
    The two points above summarize pretty well the status of DirectX 11 and DirectX 12 in early autumn 2018. The challenges around the new API are many, and the gains are sometimes unclear, especially for developers with limited resources. It may take many years before we can consign DirectX 11 to history.

    ------------------------------------------------

    AMD Radeon R9 290X vs. Nvidia Geforce GTX 780 Ti - five years later Sweclockers.com

    There has been much talk about AMD aging significantly better over time ("fine wine"), and SweClockers' tests show that in some contexts it actually holds: the R9 290X has withstood the test of time significantly better than the GTX 780 Ti. The difference was not as large in the previous article, which pitted the Radeon R9 Fury X against the GeForce GTX 980 Ti, although AMD managed to gain some ground on its rival there as well.

    As you can see... Nvidia doesn't give a **** once they have released new models!! The same can be said about Micro$h4ft.

    MSI just follows Alienware :D What is better than downloading AW's (useless) overclocking tool from the Windows Store? :p Alienware followed MSI with Battery Turbo boost. As you see, they are like brothers and sisters :biggrin:
     
    j95 and Spartan@HIDevolution like this.
  9. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    So Mr. Clifford and Mr. Azor are friends?

     
    Papusan likes this.
  10. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Spartan@HIDevolution likes this.
  11. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    For the record, 388.31 works fine for me on everything I have played recently, including Overwatch (not my favorite) and Doom and New Colossus (both among my favorites) and Battlefield 1. There have not been any newer releases that interest me.
    Well, if they said it then it must be true. They never lie except when their lips move.
    To be fair, AMD kind of/sort of has to provide driver support for their antique GPUs because they don't have any new GPU that is worth wasting any money on compared to their antique GPUs. No innovation or progress for team red on the GPU front for a long time now. Most GeFarts GPU owners upgrade because they actually have an upgrade path that provides a tangible benefit.

    It kind of sucks that Fermi owners won't have new drivers any more, but they're kind of beating a dead horse after this many years. Anyone content with a Fermi GPU probably should be content with the most current driver available for it. If they were serious about performance or gaming they would not still be using a Fermi GPU.
    If Micro$lop doesn't care, then why should the NVIDIOTS? Monkey see, monkey do. Two wrongs must make it right.
     
    Last edited: Sep 19, 2018
    Awhispersecho and Papusan like this.
  12. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    I know this, but there is no reason not to follow in AMD's footsteps on driver optimization. But as we know, Nvidia prefers to force people over to newer graphics cards if they want some more FPS. No money in old, already-sold graphics cards. Oh well.

    At least THIS is some good news, if correct... Intel is opening up for older OSes on its coming Z390 lineup. But I don't know what the Redmond Morons aka Micro$h4ft will say to this :D

    "Our sources indicate that vanilla H310 motherboards will continue to be offered at retail locations, but they fully expect the H310C motherboards, which will be branded with either an H310C or H310 R2.0 branding, to replace the existing SKUs eventually. The new chipsets will also support Windows 7, as reported by our sister site AnandTech, which may signal that Intel will restore compatibility with the older OS on its newer motherboards, such as the forthcoming Z390 lineup. That's an abrupt about-face from the decision to stop supporting older versions of Windows with the Kaby Lake processors."
     
    Mr. Fox likes this.
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    NVIDIA has always only cared about money. AMD hasn't made any real money from GPUs for a long time because they suck at that. So, we are status quo.
    Yes, that is really awesome news. Nice to see that decision-makers at Intel are not nearly as stupid as the Redmond Retards.
     
    Papusan likes this.
  14. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Fermi was released in 2010. Exactly how long do you expect a company in the fastest evolving sector of computer hardware to support outdated products?
     
    Awhispersecho and Mr. Fox like this.
  15. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
  16. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    Let's be honest, Fermi was a crap architecture, and it's a miracle they've supported it for as long as they have. Do you remember how garbage the 580M and 675M were? AMD was handily spanking them with the 6990M/7970M until nVidia answered with Kepler.
     
  17. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    No, 580M was much better than 6970M and 6990M, and 7970M was only barely an improvement. I distinctly remember wishing I had kept my 580M SLI setup because the 7970M CrossFried that replaced it was one of the most goofed up pieces of garbage that I had ever owned. It never functioned correctly and 4 out of 6 7970M GPUs that I got from Dell were defective trash.

    But, it did not take long before 680M came to the rescue. I wasted no time in switching back to the green team and kicking the 7970M crap to the curb. I had almost exactly the same crappy experience with 6970M and 6990M. I was 100% an ATI fanboy at that point in time, but AMD totally ruined everything. It was then I switched sides and had no regrets with 580M SLI.

    The last GPUs I owned from Team Red that were actually worth a damn were 4870 and 5870 and Radeon All-In-Wonder Pro, and these were all ATI products, not AMD. Everything turned to crap when AMD bought out ATI.

    However, I do agree that 675M was crap. It had some reliability issues that 580M did not, and it was a stupid gimmick rebadged 580M with a new name to extort totally worthless upgrade money from 580M owners. Not much different than the 880M gimmick GPU released to sucker 780M owners into a worthless upgrade. But, I never had a lick of trouble from 580M cards. They consistently outlasted 7970M by several years.

    Don't you remember all the Clevo and Alienware owners with 7970M cards dying just outside of the 1 year warranty and all of the in-warranty drama with the 7970M trash cards? We rarely saw any 580M cards dying until at least 2 or 3 years later. And, 680M with an unlocked vBIOS overclocked like a banshee and absolutely annihilated 7970M... a total bloodbath for 7970M.

    I was smack in the middle of all that action and remember the good and bad like it was yesterday. For me, the 7970M trash was the final nail in the coffin for AMD. I haven't had anything nice to say about AMD GPUs since then. They haven't given us a reason to yet.

    The text you quoted was sarcasm, along with the statement before it. Maybe it was not obvious, but I was making light of concern for Fermi driver support being relevant today. I agree with you, LOL. The "again" part was also sarcasm, alluding to the digital genocide we saw when the Green Goblin used cancer drivers to gut 780M performance in perfect timing with Maxwell's release.

    ^^^^
    So, what do we learn from recent history? Intel and NVIDIA and Micro$loth are all shifty, dishonest bastards that can never be trusted under any circumstances to do the right thing even though they generally release respectable products, and AMD are established losers who are finally, genuinely, trying very hard (with CPUs) to pull out of their decade-long tailspin into the abyss.
     
    Last edited: Sep 19, 2018
    DreDre and Papusan like this.
  18. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    I don't know, man, maybe your memory is a little fuzzy. The 580M/675M was for sure a terrible card; I had 5 fail in one year, many died just by looking at them, plus Nvidia made them throttle so easily. I always had stellar performance from my 6990M and 7970M setups; of course, neither lasted very long before giving out from poor production, but later 7970Ms were much better built. Of course, they couldn't hold a candle when Nvidia came out swinging with the full fat core that went into the 780M. Down the road, AMD improved GCN performance quite a bit, while Nvidia chose to gimp and neglect Kepler, not to mention AMD built those cards with DX12/Vulkan in mind and didn't neuter professional performance. The 7970M can crush a 980M in almost any professional application, since Nvidia decided to gimp GeForce for pro apps so they can profit off Quadro.

    BUT BACK ON TOPIC
    Nvidia supported Fermi long enough IMO; it's a miracle any of those cards still work. FWIW, Nvidia supported Fermi a good 2 years longer than AMD supported TeraScale 2, which is quite extraordinary in my book considering they released around the same time.
     
    Mr. Fox and Papusan like this.
  19. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    No way, not fuzzy at all. I remember living in AMD hell for way too long. In fact, that is why I am still an AMD hater to this day. I think I may still have the benchmark links to prove that 580M performance totally destroyed the 6970M/6990M. I had two pairs of 6970M and two pairs of 6990M cards that were defective and eventually replaced under warranty with 580Ms by Dell. Up until that time I was an ATI fanboy. After the switch to Team Green, I never had any trouble with the 580M whatsoever. Then I gave AMD a second chance, and 4 out of the 6 7970M cards that I got from Dell were absolute trash. When I finally got 2 good 7970M cards, they were so totally outclassed by 680M performance that the 7970M was no longer a desirable product. I have never since had any respect whatsoever for AMD graphics cards. During that same time frame I had a couple of AMD desktop graphics cards in systems I had built for my sons, and they were as much garbage as the mobile GPUs that were ruining my experience with laptops.
     
    Papusan likes this.
  20. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    M18x, 580M, 6990M are cheap now. Please pick one up for ****s and giggles and let's have a boxing match, 580M vs 6990M CF lol
     
    Mr. Fox and Papusan like this.
  21. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    I would not touch an Alienware again at any price. My current contempt for their worthless garbage eclipses all of the fond memories of how great their products used to be.

    Let me see if I still have the links from my old benchmarks in a zip file in one of my external hard drives. 580M SLI left my 6900M CF benchmarks in the dust... very decisive win... not even close. I bet johnksss still has some of his old links to benchmarks showing the same thing. We were both having a good time overclocking the snot out of 580M SLI and 680M SLI while the sad owners of the junky AMD GPUs were watching all of the excitement from the sidelines.

    Remember the conference call we sponsored here with Dell/Alienware about all of the 6970M and 6990M that were malfunctioning? Alienware's lead Graphics Engineer, Louis Bruno, and Bill Biven (our social media rep) hosted that conference call for us.

    The only problem 580M had was Dell's special cancer vBIOS. Once svl7 exorcised the throttling filth from the cancerous Dell firmware, it was a bloodbath for AMD cards. With the cancer vBIOS their performance was nearly identical.
     
    Last edited: Sep 19, 2018
    Papusan likes this.
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    I dug these up... 680M SLI versus 7970M CrossFried... I had forgotten that the 7970M also hindered CPU performance besides being much weaker than the 680M.

    https://www.3dmark.com/compare/3dm11/4272954/3dm11/5171076#

     
    Papusan likes this.
  23. j95

    j95 Notebook Deity

    Reputations:
    2,461
    Messages:
    1,475
    Likes Received:
    1,308
    Trophy Points:
    181

    Build 17763.1
    The OEM INF(s) still lack a required entry when it comes to Windows 10.

    A new *NVIDIA_Devices.NTamd64.10.0...16299* section will probably come with the next version.

    Switching to MSI mode? It seems not to be the case, for the time being ;) give it a shot :D
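
    For anyone unfamiliar with the "MSI mode" mentioned above: whether a PCI device uses message-signaled interrupts is controlled by the MSISupported registry value under the device's Interrupt Management key, which is what the community tools toggle. A read-only C++ sketch; the device-instance path below is a hypothetical placeholder, so substitute your GPU's actual instance ID (visible in Device Manager) before running it:

```cpp
// Sketch only: reads the MSISupported flag for a GPU's PCI device key.
// The VEN/DEV segment below is a HYPOTHETICAL placeholder -- look up your
// card's real instance ID under HKLM\SYSTEM\CurrentControlSet\Enum\PCI.
#include <windows.h>
#include <cstdio>

#pragma comment(lib, "advapi32.lib")

int main() {
    // Placeholder instance path -- substitute your GPU's actual key.
    const char* key =
        "SYSTEM\\CurrentControlSet\\Enum\\PCI\\VEN_10DE&DEV_XXXX\\<instance>"
        "\\Device Parameters\\Interrupt Management\\MessageSignaledInterruptProperties";
    DWORD value = 0, size = sizeof(value);
    LSTATUS rc = RegGetValueA(HKEY_LOCAL_MACHINE, key, "MSISupported",
                              RRF_RT_REG_DWORD, nullptr, &value, &size);
    if (rc == ERROR_SUCCESS)
        std::printf("MSISupported = %lu (%s)\n", value,
                    value ? "MSI mode on" : "line-based interrupts");
    else
        std::printf("Value not present (rc=%ld); likely line-based mode.\n", rc);
    return 0;
}
```

    Reading the value is harmless; changing it requires elevation and takes effect after a reboot.
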
     
    Last edited: Sep 20, 2018
    0lok, Prema, Vasudev and 2 others like this.
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    Looks like it took from 2010 to 2016 for AMD to get their Catalyst drivers right and by then I was on 1080 SLI, LOL. After six years 6990M CrossFire almost caught up to 580M SLI. These are the #1 results from 3DMark with dual GPU and 2720QM (which is the CPU I had at that point in time).

    https://www.3dmark.com/compare/3dm11/11665724/3dm11/4599113#
     
    Last edited: Sep 19, 2018
    j95 and Papusan like this.
  25. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    REALLY old run lol, I can beat that 7970M score on a tired M17x R2 with barely overclocked and significantly undervolted 8970M:
    https://www.3dmark.com/3dm11/12683533

    Like I said, AMD made big strides later in life; Nvidia not so much when it came to Kepler, as Kepler did not age well, never mind Fermi.
     
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    With AMD, it's too little, too late. I mean, yeah... it's nice they finally caught up with the 21st century and all that, but you would never have achieved that graphics performance when the 7970M and 8970M GPUs were actually still relevant. We're in 2018 and they are just now getting their drivers to deliver 2012 performance compared to 2012 NVIDIA performance. That's pretty damned sad if you ask me, LOL. Hardly anyone is using 2012 NVIDIA GPUs any more, so you're comparing apples to oranges. 680M performance no longer matters, and neither should 7970M or 8970M performance, unless you are still using legacy hardware. I can see where it might matter in that scenario. But everyone else has moved on, and those legacy GPUs are not on par with 2017-2018 flagship product performance. It's a shame they were not able to deliver that performance back in 2012 when it actually mattered. When Kepler and Fermi were still relevant, AMD was lagging way behind and getting their butt kicked real bad. If you compare the current generation AMD flagship to the current generation NVIDIA flagship, they're still getting beaten badly by NVIDIA.
     
    Last edited: Sep 20, 2018
  27. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Fermi was a turd, AMD were running circles around Nvidia back then.
     
  28. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Did you make a driver mod for the Nvidia 399.32 developer driver? https://developer.nvidia.com/39932-win-10
    Also, any recommendation for an overlay that looks modern? I tried NZXT CAM and uninstalled it due to data mining/telemetry! Now I have MSI AB and RTSS, but RTSS causes some old DX9 and some DX12 games to crash.
     
  29. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,486
    Trophy Points:
    681
    Are you girls done whining now? @j95 (THX!) driver is out so let's see some actual results... :D
     
    0lok, Vaeron, Vasudev and 2 others like this.
  30. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    False equivalence. You don't need a separate driver in order for a CPU to maintain compatibility with an operating system and other software. If you're talking about Intel's graphics drivers, the HD 3000, which was released with Sandy Bridge in 2010/2011 so roughly the same time as Fermi's launch, stopped receiving updates in 2016.
     
    jeremyshaw likes this.
  31. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    This isn't a driver update for compatibility with an operating system and other software, or an Intel graphics driver. As you can see... Intel pushed microcode for older chips. They could have not given a damn and done it Nvidia's way.
     
  32. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    Circling the drain maybe. Their GPUs were failing left and right and Fermi marked the end of AMD winning at anything for roughly a decade. I know because I was a victim of their incompetence. After dealing with 8 or 9 consecutive defective AMD products under different brand names it seemed pretty clear they were circling the drain. After jumping ship to the greener side of the fence there have been no regrets.

    Don't misinterpret what I am saying. NVIDIA sucks, too. Just not as bad as AMD.
     
    Last edited: Sep 20, 2018
  33. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Again, you're making a false equivalence. 1) Microcode updates are released to fix serious flaws in hardware that could potentially have devastating effects on business customers, which make up the majority of Intel's, well, business. If such a flaw existed within a GPU, then it would surely receive the same treatment. 2) Businesses update their IT infrastructures far less frequently than end users/enthusiasts, so it makes sense for Intel to continue to support legacy hardware. Although it is less true today, at the time Fermi was released, NVIDIA's primary customers were end users and enthusiasts who change their hardware with far more regularity.

    Anyway, back to the topic of these drivers. These drivers are a no-go. Every time I plug in my Xbox controller, the gamma of my displays reverts to default, as if something gets triggered to ignore what I set. I reinstalled version 398.98, and all is well again.
     
    Spartan@HIDevolution likes this.
  34. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    They suck for benching as well. Less stable and lower scores. Even seems to somewhat impair CPU performance. I went back to 388.31. That one seems to be the best for my purposes.
     
  35. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    How could a GPU Driver impair CPU Performance?
     
    Papusan and Mr. Fox like this.
  36. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    By changing something that adversely affects the CPU. @Papusan found the same to be true.

    It shouldn't, but we can ask the same question about why Windoze OS X impairs CPU performance. The short answer is because it sucks, and the people that made it suck at their jobs.
     
    Spartan@HIDevolution and Papusan like this.
  37. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    The HD 4670 was a damn good mid-range card. We need the Radeon of 2008 to come back swinging with Navi... I'd love to have a card that could put up GTX 1080 numbers with a lower TDP.
     
    saturnotaku, Mr. Fox and Vasudev like this.
  38. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Well, everything needs supervision by the CPU.
    Think of it this way: before anything is displayed on screen, you ask the CPU to allocate memory, data, and all the other stuff you need up front (or dynamically, if you want to keep everything as constrained as possible). Once those requirements are met, you call the graphics API to draw, move data back and forth through buffers, submit workloads as batches, and get the results on screen. If along the way the PC runs out of memory, the OS must push non-essential drivers/apps to the page file, go back to allocating memory, and then resume the control flow of the graphics API, causing little overheads here and there.
    I hope you get the gist of it. The GPU, SSD, and other peripherals are simply slaves of the CPU.
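
    To make that concrete, here is a self-contained toy in C++ (not real driver code; every number and function name is invented for illustration). It times the same simulated 1000-draw-call frame under two per-call CPU costs, which is roughly how a heavier driver shows up as lost CPU time even though "only the GPU driver" changed:

```cpp
// Toy model of CPU-side driver overhead. The "driver" here is a stand-in
// loop, not any real API: the point is only that per-draw-call CPU work
// scales with driver efficiency, so a slower driver costs frame time.
#include <chrono>
#include <cstdio>

static volatile unsigned sink = 0;  // defeat the optimizer

// Pretend per-frame work: command building/validation for N draw calls.
void build_and_validate_commands(int drawCalls, int costPerCall) {
    for (int d = 0; d < drawCalls; ++d)
        for (int i = 0; i < costPerCall; ++i)
            sink = sink + static_cast<unsigned>(i);
}

double frame_cpu_ms(int drawCalls, int costPerCall) {
    auto t0 = std::chrono::steady_clock::now();
    build_and_validate_commands(drawCalls, costPerCall);
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    // Same scene, two hypothetical drivers differing only in per-call cost.
    std::printf("lean driver:  %.2f ms CPU per frame\n", frame_cpu_ms(1000, 2000));
    std::printf("heavy driver: %.2f ms CPU per frame\n", frame_cpu_ms(1000, 6000));
    return 0;
}
```
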
     
    saturnotaku, Papusan and Mr. Fox like this.
  39. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    The Radeon Mobility 5870 and 6750 were fantastic cards. Maybe not as fast as their NVIDIA counterparts, but they had much better thermal performance, at least in my experience.

    That weird gamma issue I mentioned in my previous post was not a result of the GeForce drivers but of NOD32, of all things. Since rebooting my machine after the latest build of it installed, the problem hasn't cropped up again.
     
  40. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Very different from what I saw. The G73Jh had beefy cooling but a lot of 5870M cards in that system still ran at close to 100C before repasting.
     
  41. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    I had an MSI GX740 with the 5870 and was never anywhere close to overheating with it. Maybe I hit the silicon lottery, but I only saw temperatures above 85°C under synthetic loads, on both the CPU and GPU.
     
  42. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Test MSI GX740 - 102°C on the HD 5870 graphics card :D
     
    yrekabakery likes this.
  43. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    That's rather surprising, as the GX740 has much worse cooling than the G73Jh.

    And as mentioned in the NotebookCheck review that @Papusan linked, the GX740 peaked at 86C/102C on the CPU/GPU in the stress test, while the G73Jh peaked at 80C/93C on the CPU/GPU in the same test.
     
    Papusan likes this.
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    I never had any issues at all with ATI Radeon Mobility 4870 CrossFried or 5870 CrossFried. I had both in two different Alienware M17xR2s. In fact, my youngest boy still has his M17xR2 and the ATI 4870 CrossFried is still kicking after all these years. I was not as fortunate with the AMD Radeon 6XXX series on notebooks or desktops. I went through multiple defective desktop and laptop cards before I finally threw in the towel and said to hell with the notion of running AMD graphics. I loved the old ATI GPUs so much that I found it in my heart to forgive them and made the terrible mistake of giving them a second chance with the 7970M, regretted it almost immediately (after 4 defective GPUs), and again threw in the towel. I have never looked back, and AMD hasn't given us any reason to look at their GPUs since then.
     
  45. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    To be fair, AMD bought ATi in 2006, but it wasn't until 2010 that they retired the ATi brand name. So you were still using AMD products, just under the old name.
     
    Papusan and Mr. Fox like this.
  46. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    And, 2009-2010 was about the time their products started getting extra crappy, LOL. 2007 was when 5870 was released if memory serves me correctly, so apparently they hadn't had time to destroy the product yet.
     
    Papusan likes this.
  47. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    2010, actually.
     
  48. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Nah, the 5870 came out late 2009/early 2010 depending on the desktop or laptop version (the laptop version was actually an underclocked desktop 5770). AMD was on the comeback trail in '09-'10, having finally recovered from its disasters of the preceding years (the Radeon HD 2000 series in '07 and Phenom in '08). The 5000 series was the first DX11 GPU line on the market and an efficient architecture, while Fermi ("Thermi") was delayed and stumbled out of the gate. Who remembers the dreadful GTX 480M? Phenom II offered more cores than Core 2 and Nehalem for the money while staying competitive in clocks and IPC.
     
  49. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,785
    Trophy Points:
    931
    I guess I should have googled it rather than trying to remember. Oh well, ancient history now. I believe I got my first Alienware M17xR2 with 5870M CrossFried in late 2009, so it must have been one of the first with 5870M CrossFried. It was an in-warranty replacement for an XPS M1730 with 8800M SLI that I purchased in late 2007-early 2008 and it had been nothing but headaches. The second M17xR2 with 4870M CrossFried was purchased second-hand from a member of this forum, and that is the one my youngest son still has.
     
    Last edited: Sep 22, 2018
    Vasudev likes this.
  50. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    What settings do you set the drivers to in the NVIDIA Control Panel? I always left it at the default quality setting, but I'm trying high performance now.
     