nVIDIA GeForce Drivers v411.63 WHQL Windows 10 - (64-bit)
nVIDIA GeForce Drivers v411.63 WHQL Windows 7, Windows 8.1, Windows 8 (64-bit)
@Ultra Male Are you ready, bruh? You're all welcome to the new barbecue party. And the Nvidia driver packages continue to blow up in size (file size: 517.11 MB). But will it blow your graphics card to the hardware graveyard/heaven? Please test it. The new Turing graphics cards are out. Let's see how much this driver will break for Pascal owners.
Game Ready
Software Module Versions
• nView - 149.34
• HD Audio Driver - 1.3.37.5
• NVIDIA PhysX System Software - 9.17.0907
• GeForce Experience - 3.15.0.164
• CUDA - 10.0
Provides the optimal gaming experience for Assassin's Creed Odyssey, Forza Horizon 4, and FIFA 19.
Gaming Technology
Includes support for NVIDIA GeForce RTX 2080 and RTX 2080 Ti graphics cards.
New Features
- Added support for CUDA 10.0
- NVIDIA RTX Technology
NVIDIA RTX supports the Microsoft DirectX Raytracing (DXR) API on NVIDIA Volta and Turing GPUs.
In order to get started with developing DirectX Raytracing applications accelerated by RTX, you'll need the following:
- NVIDIA Volta or Turing GPU
- Windows 10 RS4
- Microsoft's DXR developer package, consisting of DXR-enabled D3D runtimes, HLSL compiler, and headers
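The list above targets the original RS4 developer preview. For context: once DXR shipped in the OS proper (Windows 10 RS5, October 2018 Update), raytracing support could be queried through the regular D3D12 feature-check path. A hedged sketch, not taken from the driver notes, assuming an RS5-era Windows SDK (17763 or later):

```cpp
// Sketch: query DXR (raytracing) tier support on the default adapter.
// Requires d3d12.h from a Windows 10 SDK >= 17763; link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool SupportsDXR()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at feature level 12.0.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;  // pre-RS5 runtime: OPTIONS5 is not recognized

    // TIER_1_0 or higher means the DXR API is backed by the GPU and driver.
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

On the RS4 preview the equivalent capability instead came through Microsoft's experimental DXR developer package, as listed above.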
- Vulkan 1.1
- This driver release provides full support for the new Vulkan 1.1 API and passes the Vulkan Conformance Test Suite (CTS) version 1.1.1.2.
- Includes interoperability with CUDA 10.0.
- New extensions for Turing GPUs:
- VK_NVX_raytracing (also available for Pascal GPUs with 8GB or more video memory, and Volta GPUs)
- VK_NV_compute_shader_derivatives
- VK_NV_corner_sampled_image
- VK_NV_fragment_shader_barycentric
- VK_NV_mesh_shader
- VK_NV_representative_fragment_test
- VK_NV_scissor_exclusive
- VK_NV_shader_image_footprint
- VK_NV_shading_rate_image
- See https://www.khronos.org/registry/vulkan/ for the full specification.
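Whether a given driver/GPU combination actually exposes one of these extensions can be checked at runtime. A minimal sketch (assumes the Vulkan SDK headers and an already-created `VkPhysicalDevice` handle; the helper name is our own):

```cpp
// Sketch: check whether a physical device advertises a named extension,
// e.g. "VK_NVX_raytracing". Assumes a valid VkPhysicalDevice obtained
// earlier via vkEnumeratePhysicalDevices.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool HasExtension(VkPhysicalDevice gpu, const char* name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    for (const VkExtensionProperties& e : exts)
        if (std::strcmp(e.extensionName, name) == 0)
            return true;
    return false;
}
```

Usage would be something like `HasExtension(gpu, "VK_NVX_raytracing")` before enabling the extension at device creation.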
NVIDIA Control Panel
- Vulkan HDR for Windows
This driver release supports the Vulkan VK_EXT_swapchain_colorspace and VK_EXT_hdr_metadata extensions, allowing applications to output HDR content to HDR displays via the Vulkan APIs.
- OpenGL extensions for Turing GPUs
- GL_NV_compute_shader_derivatives
- GL_NV_fragment_shader_barycentric
- GL_NV_mesh_shader
- GL_NV_representative_fragment_test
- GL_NV_scissor_exclusive
- GL_NV_shading_rate_image
- GL_NV_shader_texture_footprint
- See https://www.khronos.org/registry/OpenGL/index_gl.php for the full specification.
System Info shows Boost Clock values (instead of Base Clock) for Turing and later GPUs.
Application SLI Profiles
Added or updated the following SLI profiles:
- HOB
- Lake Ridden
- NieR:Automata
- Northgard
- Pure Farming 2018
- Raid: World War II
- Star Wars: Battlefront II (2017)
- TT Isle of Man
3D Vision Profiles
Added or updated the following 3D Vision profiles:
- Elder Scrolls: Online - Good
- Assassin's Creed: Odyssey - Not recommended
Discontinued Support
- 32-bit Operating Systems
Beginning with Release 396, NVIDIA is no longer releasing Game Ready drivers for 32-bit operating systems for any GPU architecture.
- NVIDIA Fermi GPUs
Beginning with Release 396, the NVIDIA Game Ready driver no longer supports NVIDIA GPUs based on the Fermi architecture.
Limitations in This Release
The following features are not currently supported or have limited support in this driver release:
- Turing GPU Driver Installation on Windows 10
Drivers for Turing GPUs will not be installed on systems running Windows 10 RS2 or earlier. This includes Windows 10 Threshold 1, Threshold 2, Redstone 1, and Redstone 2.
As usual... New is always better
Fixed Issues in this Release
- Using power monitoring in GPU monitor tools causes micro stutter. [2110289/2049879]
- [Monster Hunter World]: Low frame rate in the game. [2335958]
- [Tom Clancy's The Division]: Graphics corruption occurs when using NVIDIA Gameworks settings. [2005096]
- [Call of Duty: WWII][1x3 Surround]: The center Surround display renders a black screen. [200370257]
- [Planetside 2][G-SYNC]: G-SYNC does not work with the game. [2221050]
- [ARCHICAD][OpenGL]: The OpenGL driver crashes the application. [2093819]
- [GeForce GTX 1080Ti]: Random DPC watchdog violation error when using multiple GPUs on motherboards with PLX chips. [2079538]
- [YouTube][Mosaic with Sync]: Secondary GPU doesn't render video content on full-screen YouTube video. [200402117]
Official 411.63 Game Ready WHQL Display Driver Feedback Thread (Released 9/19/18)
Display Driver Uninstaller v18.0.0.0
Let the performance and stability damaging driver games begin!
Spartan@HIDevolution Company Representative
As far as OS build compatibility, I'm sure Brother @j95 can fix that with one of his awesome driver mods.
But, it really boils down to whether or not having new GeFarts drivers is important enough to compromise everything else by having the latest piece of crap OS. Unless there is a compelling game that requires the new driver to run, there is still no point in updating the GPU driver if everything is working correctly. It's a total waste of time and energy if you don't have a legitimate and compelling reason to do it. I'd venture a guess that 99% of the time, there is no reason for doing it other than end users having a goofy fetish and being OCD about having the latest drivers. It's pretty silly IMHO. The best way to have a stable system is the "if it ain't broke, don't fix it" approach. Coincidentally, it also requires the least amount of effort.
Spartan@HIDevolution Company Representative
I tried one of the older drivers one time from MSI and Doom wouldn't even launch, d00d. Now these drivers say they support FIFA 19, which I did pre-purchase. I'm 100% sure it wouldn't run on older drivers, or if it did, it would perform like garbage.
So I am forced to drink the Micro$h4ft c00l 4!D
These morons have really squeezed us into a corner.
Then come the bloody OEMs who are shifting their apps to the Windows Garbage Store. Example: MSI stopped offering the Nahimic setup installer and now it's exclusively through the Store. If I don't install it, my laptop sounds like the audio is coming from a crappy phone.
DirectX 11 vs. DirectX 12 – Radeon RX Vega 64 and GeForce GTX 1080 Sweclockers.com | Use translator
The picture the results paint is very clear. Of the six titles tested, the AMD Radeon RX Vega 64 performs better under DirectX 12 in five. For the Nvidia GeForce GTX 1080 it's the other way around: only one game shows better DX12 results, and in many cases DX11 is considerably faster.
There is more than the numbers show
So far, so good: DirectX 12 delivers better performance with AMD Radeon, and Nvidia fares worst. Or does it? If only it were that simple.
Although vastly better than before, DirectX 12 still brings a degree of uncertainty, regardless of graphics vendor. There are a few more graphics bugs, a few more crashes, a little more unexpected performance behavior, the odd strange menu. A little rougher around the edges, simply put.
The big problem child, for both AMD and Nvidia, is Total War: Warhammer II. Its DirectX 12 mode is clearly labeled as beta, and with good reason: the game crashes, does not work as it should on some cards, delivers occasionally erratic performance, and sometimes more than doubles loading times.
But should I run DirectX 12 or not?
As a rule of thumb: if an AMD Radeon is under the hood, test it by all means. You may well gain a few frames per second without losing much, especially with a somewhat slower processor.
Those on an Nvidia GeForce, on the other hand, can with a clear conscience leave the switch set to DirectX 11 wherever the choice is available. In almost every case performance is equivalent or better, and you avoid a whole host of small problems.
The above sums up the status of DirectX 11 and DirectX 12 in early autumn 2018 pretty well. The challenges around the new API are many, and the gains are sometimes unclear, especially for developers with limited resources. It may take many years before we can consign DirectX 11 to history.
------------------------------------------------
AMD Radeon R9 290X vs. Nvidia Geforce GTX 780 Ti - five years later Sweclockers.com
There has been much talk that AMD ages significantly better over time ("fine wine"), and SweClockers' tests show that in some contexts it actually holds: the R9 290X has withstood the test of time significantly better than the GTX 780 Ti. The difference was not as large in the previous article, which pitted the Radeon R9 Fury X against the GeForce GTX 980 Ti, although AMD managed to gain some ground on its rival there as well.
As you can see... Nvidia doesn't give a ****y once they have released new models!! The same can be said about Micro$h4ft.
What is better than downloading AW's (useless) overclocking tool from the Windows Store?
Alienware followed MSI with Battery Turbo Boost. As you see, they are like brothers and sisters.
Spartan@HIDevolution Company Representative
At least they share the same stupid ideas.
It kind of sucks that Fermi owners won't have new drivers any more, but they're kind of beating a dead horse after this many years. Anyone content with a Fermi GPU probably should be content with the most current driver available for it. If they were serious about performance or gaming they would not still be using a Fermi GPU.
Last edited: Sep 19, 2018
At least THIS is some good news, if it's correct... Intel is opening up to older OSes on their coming Z390 lineup. But I don't know what the Redmond Morons aka Micro$h4ft will say to this.
"Our sources indicate that vanilla H310 motherboards will continue to be offered at retail locations, but they fully expect the H310C motherboards, which will be branded with either an H310C or H310 R2.0 branding, to replace the existing SKUs eventually. The new chipsets will also support Windows 7, as reported by our sister site AnandTech, which may signal that Intel will restore compatibility with the older OS on its newer motherboards, such as the forthcoming Z390 lineup. That's an abrupt about-face from the decision to stop supporting older versions of Windows with the Kaby Lake processors."
saturnotaku Notebook Nobel Laureate
But, it did not take long before 680M came to the rescue. I wasted no time in switching back to the green team and kicking the 7970M crap to the curb. I had almost exactly the same crappy experience with 6970M and 6990M. I was 100% an ATI fanboy at that point in time, but AMD totally ruined everything. It was then I switched sides and had no regrets with 580M SLI.
The last GPUs I owned from Team Red that were actually worth a damn were 4870 and 5870 and Radeon All-In-Wonder Pro, and these were all ATI products, not AMD. Everything turned to crap when AMD bought out ATI.
However, I do agree that 675M was crap. It had some reliability issues that 580M did not, and it was a stupid gimmick rebadged 580M with a new name to extort totally worthless upgrade money from 580M owners. Not much different than the 880M gimmick GPU released to sucker 780M owners into a worthless upgrade. But, I never had a lick of trouble from 580M cards. They consistently outlasted 7970M by several years.
Don't you remember all the Clevo and Alienware owners with 7970M cards dying just outside of the 1 year warranty and all of the in-warranty drama with the 7970M trash cards? We rarely saw any 580M cards dying until at least 2 or 3 years later. And, 680M with an unlocked vBIOS overclocked like a banshee and absolutely annihilated 7970M... a total bloodbath for 7970M.
I was smack in the middle of all that action and remember the good and bad like it was yesterday. For me, the 7970M trash was the final nail in the coffin for AMD. I haven't had anything nice to say about AMD GPUs since then. They haven't given us a reason to yet.
^^^^
So, what do we learn from recent history? Intel and NVIDIA and Micro$loth are all shifty, dishonest bastards that can never be trusted under any circumstances to do the right thing even though they generally release respectable products, and AMD are established losers who are finally, genuinely, trying very hard (with CPUs) to pull out of their decade-long tailspin into the abyss.
Last edited: Sep 19, 2018
BUT BACK ON TOPIC
Nvidia supported Fermi long enough IMO; it's a miracle any of those cards still work. FWIW, Nvidia supported Fermi a good two years longer than AMD supported TeraScale 2, which is quite extraordinary in my book considering they released around the same time.
Let me see if I still have the links from my old benchmarks in a zip file in one of my external hard drives. 580M SLI left my 6900M CF benchmarks in the dust... very decisive win... not even close. I bet johnksss still has some of his old links to benchmarks showing the same thing. We were both having a good time overclocking the snot out of 580M SLI and 680M SLI while the sad owners of the junky AMD GPUs were watching all of the excitement from the sidelines.
Remember the conference call we sponsored here with Dell/Alienware about all of the 6970M and 6990M that were malfunctioning? Alienware's lead Graphics Engineer, Louis Bruno, and Bill Biven (our social media rep) hosted that conference call for us.
The only problem 580M had was Dell's special cancer vBIOS. Once svl7 exorcised the throttling filth from the cancerous Dell firmware, it was a bloodbath for AMD cards. With the cancer vBIOS their performance was nearly identical.
Last edited: Sep 19, 2018
https://www.3dmark.com/compare/3dm11/4272954/3dm11/5171076#
Build 17763.1
The OEM INF(s) still lack a required entry when it comes to Windows 10.
A new *NVIDIA_Devices.NTamd64.10.0...16299* section will probably come with the next version.
Switching to MSI mode.
Give it a shot.
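For context on the MSI-mode tip above: on Windows this is typically toggled per device with a registry value. A sketch only; the device instance path below is a placeholder you must replace with your GPU's actual path (Device Manager > display adapter > Details > Device instance path), and you should back up the registry first:

```
Windows Registry Editor Version 5.00

; Enable message-signaled interrupts (MSI) for the GPU.
; <your-gpu-instance-path> is a placeholder for the real device path.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\<your-gpu-instance-path>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
"MSISupported"=dword:00000001
```

The value takes effect after a reboot; deleting it (or setting it to 0) reverts the device to line-based interrupts.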
Last edited: Sep 20, 2018
Looks like it took from 2010 to 2016 for AMD to get their Catalyst drivers right and by then I was on 1080 SLI, LOL. After six years 6990M CrossFire almost caught up to 580M SLI. These are the #1 results from 3DMark with dual GPU and 2720QM (which is the CPU I had at that point in time).
https://www.3dmark.com/compare/3dm11/11665724/3dm11/4599113#
Last edited: Sep 19, 2018
https://www.3dmark.com/3dm11/12683533
Like I said, AMD made big strides later in life; Nvidia not so much when it came to Kepler, as Kepler did not age well, never mind Fermi.
Last edited: Sep 20, 2018
So, any recommendations for an overlay that looks modern? I tried NZXT CAM and uninstalled it due to data mining/telemetry! Now I have MSI AB and RTSS, but RTSS causes some old DX9 and some DX12 games to crash.
saturnotaku Notebook Nobel Laureate
Don't misinterpret what I am saying. NVIDIA sucks, too. Just not as bad as AMD.
Last edited: Sep 20, 2018
saturnotaku Notebook Nobel Laureate
Anyway, back to the topic of these drivers. These drivers are a no-go. Every time I plug in my Xbox controller, the gamma of my displays reverts to default, as if something gets triggered to ignore what I set. I reinstalled version 398.98, and all is well again.
Spartan@HIDevolution Company Representative
It shouldn't, but we can ask the same question about why Windoze OS X impairs CPU performance. The short answer is because it sucks, and the people that made it suck at their jobs.
Think of it this way: before displaying anything on screen, you ask the CPU to allocate the memory, data, and everything else you need beforehand (or dynamically, if you want to keep everything as constrained as possible). Once your requirements are met, you call the graphics API to draw, move the data back and forth in buffers, submit workloads as batches, and get the results on screen. If along the way your PC runs out of memory, the OS must push non-essential drivers/apps to the page file, then go back to allocating memory and follow the control flow of the graphics API, causing little overheads here and there.
I hope you get the gist of it. The GPU, SSD, and other peripherals are simply slaves of the CPU.
saturnotaku Notebook Nobel Laureate
That weird gamma issue I mentioned in my previous post was not a result of the GeForce drivers but of NOD32, of all things. After rebooting my machine once the latest build of that had installed, the problem hasn't cropped up again.
yrekabakery Notebook Virtuoso
saturnotaku Notebook Nobel Laureate
yrekabakery Notebook Virtuoso
And as mentioned in the NotebookCheck review that @Papusan linked, the GX740 peaked at 86C/102C on the CPU/GPU in the stress test, while the G73Jh peaked at 80C/93C on the CPU/GPU in the same test.
yrekabakery Notebook Virtuoso
saturnotaku Notebook Nobel Laureate
yrekabakery Notebook Virtuoso
Last edited: Sep 22, 2018
What settings do you set the drivers to in the NVIDIA Control Panel? I always left it at the default quality setting, but I'm trying high performance now.
nVIDIA GeForce Drivers v411.63 WHQL Findings & Fixes
Discussion in 'Gaming (Software and Graphics Cards)' started by Papusan, Sep 19, 2018.