The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    New NVIDIA Geforce WHQL Driver 353.06

    Discussion in 'Gaming (Software and Graphics Cards)' started by KING19, May 31, 2015.

  1. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,604
    Messages:
    23,562
    Likes Received:
    36,866
    Trophy Points:
    931
    Thanks for this golden information! I have always hated Optimus because I am always wondering, hmm, is this app, video player, browser, etc., running off my crappy Intel HD Graphics or my dedicated GPU? If I buy a performance gaming laptop, I sure as heck want everything to run at max performance (i.e. on the dedicated GPU).

    sorry zb0t :(
     
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You have another benefit. Desktop capture is now available for Shadowplay. They're trying to work on a solution for this for Optimus users, but since the iGPU is displaying the desktop, the only way to do it would likely be (my guess, at least) to force the dGPU to number crunch even the desktop display, but then that kills the battery life benefits of Optimus.
     
    Spartan@HIDevolution likes this.
  3. zb0t

    zb0t Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    1
    Trophy Points:
    6
    Thanks for this info Ultima! :)
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Recording and windowed/desktop capture is a non-issue with Optimus since you can simply use QuickSync or NVENC in Afterburner. Stuff like downsampling/DSR, refresh rate overclocking, and G-Sync are the real bummers.
     
  5. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Yes, all the issues with compatibility of new technologies have led me away from laptops.
     
  6. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Unless you have a 4K screen - then only two downsides ;)

    I really would like G-Sync though. NVIDIA should partner with Intel to support Optimus based systems.
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That offers them no benefit though... they'll be giving away their proprietary tech for GPUs that can't handle them just so people can have Gsync and battery life. I may be an Optimus hater, but I still cannot logically see why so many compromises should be made for Optimus. Improving the tech is one thing, sure, but sacrifices still need to be made. I'm starting to ask them to let us turn on/off SLI without restarting our PCs cuz desktops (and desktop modified drivers) can do it.
    But those don't get the shadowtime ability XD.
     
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Afterburner does. It's called prerecord and it's better than shadow time because you have complete control over the duration and can put the rolling buffer into RAM instead of disk.
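[Editor's note] The rolling prerecord buffer described here is, at heart, a fixed-size ring buffer: frames are appended continuously, the oldest fall off once the buffer is full, and hitting the hotkey dumps whatever is currently held. A minimal Python sketch of the idea — the frame rate, duration, and function names are illustrative assumptions, not Afterburner's actual internals:

```python
from collections import deque

FPS = 60
PRERECORD_SECONDS = 30  # assumed buffer duration, fully user-controlled in AB

# Ring buffer: once full, appending a new frame silently drops the oldest,
# so memory use stays bounded at FPS * PRERECORD_SECONDS entries.
buffer = deque(maxlen=FPS * PRERECORD_SECONDS)

def capture_frame(frame):
    """Called once per rendered frame; keeps only the last N seconds."""
    buffer.append(frame)

def save_prerecord():
    """Hotkey handler: flush the current buffer contents out for encoding."""
    return list(buffer)

# Simulate 60 seconds of capture; only the last 30 s survive.
for i in range(FPS * 60):
    capture_frame(i)

clip = save_prerecord()
print(len(clip), clip[0])  # 1800 frames, starting at frame index 1800
```

Because the buffer lives in ordinary process memory here, this also illustrates why AB can keep the rolling buffer in RAM rather than on disk.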
     
  9. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Can't wait to try out the rog swift tomorrow :D
     
  10. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    1) So....? Consumer wins.
    2) Wait, what compromises?
     
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No:

    G-Sync
    DSR and downsampling
    VR
    3D
    120Hz
    Refresh rate overclocking
     
  12. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Ah, but what I meant was: what compromises must be made specifically in regards to G-Sync to get it functioning on Intel GPUs? I can't see how it would impact non-Optimus machines, and it would benefit millions of NVIDIA consumers. I say an NVIDIA + Intel partnership to enable G-Sync would be a wonderful thing.
     
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You can still set the duration for shadowtime and the location of the buffer, but I can't set it into RAM you're right. That being said, MSI AB's pre-record thing isn't driver level so it requires some CPU power (however minimal). Also the capture type differs; some games REALLY hate being captured in certain ways, and shadowplay doesn't seem to trigger those ways (likely because it doesn't hook to the games).
    Yeah, consumer wins, but they don't make money off it. In fact they lose money more than anything else. It's not like people are buying AMD GPUs for mobile when they want performance, and all these thin laptops cannot even use AMD's cards, so it's not like they're gonna sell more.
    Compromises like... well everything Octiceps just said above. I don't see why nVidia and Intel should fight so hard just to get these working for everyone when they're not going to make any money off of it.
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Doesn't sound optimistic: http://anandtech.com/show/9303/nvidia-launches-mobile-gsync

    [image]
     
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Haven't used ShadowPlay in ages. Can you specify exact time limit (in s) and file size limit (in MB) for the prerecord buffer like you can in AB?

    Funny, all the stories I've read of games not being recordable (like TW3 at release) involved ShadowPlay not AB.
     
  16. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You can specify minutes from 1 to 20 for shadowtime, and it's always active. The MB needed is shown below the minutes, and you can custom-set how much bitrate, resolution and FPS you want shadowplay to use (and that affects the pre-record buffer). No way through the program itself to set down to the seconds.
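[Editor's note] The buffer size ShadowPlay shows below the minutes slider follows directly from the bitrate and duration: a constant-bitrate stream of B Mbps over M minutes occupies roughly B × M × 60 / 8 megabytes. A quick sanity check (the 50 Mbps figure is just an example, not ShadowPlay's default):

```python
def shadowtime_buffer_mb(bitrate_mbps, minutes):
    """Approximate storage for a constant-bitrate prerecord buffer.

    megabits = bitrate (Mbit/s) * seconds; divide by 8 for megabytes.
    """
    return bitrate_mbps * minutes * 60 / 8

# e.g. 50 Mbps at the 20-minute maximum:
print(shadowtime_buffer_mb(50, 20))  # 7500.0 MB, i.e. ~7.5 GB
```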

    TW3 was a rare case, yeah. I heard people complain about it. But I've never seen it be a problem in other games, really. Except for Binding of Isaac Rebirth; it wouldn't grab that game (I assume due to the kind of OpenGL he coded it in). I will admit it could do with better audio settings and such though. But games like Dark Souls 2 don't like being grabbed and recorded all that much, and other games like BF4 etc show some issues with conventional grabbing at higher FPSes for certain programs' capture methods. If you read up on the nVidia driver tweaking thing that was linked some time ago, it shows that Shadowplay only grabs the entire framebuffer and doesn't actually hook to a game, which is why it can't be used for windowed mode. But desktop recording for borderless windowed games would be pretty sweet, I won't lie.
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    OK I see. ShadowPlay hasn't changed then. AB lets you set time and file size limits for the prerecord buffer independently, whichever it hits first. No 20min/1min upper/lower limit and such.

    And yeah, ShadowPlay's audio quality and settings are a joke. I know people who are forced to use AB for that reason alone. AB is just a much better recording utility for power users. The last time I used ShadowPlay, AB blew it away in terms of customization and functionality. I don't know how much ShadowPlay has improved since then but it doesn't sound like much if at all.
     
  18. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    You'd think with all the coding experience they have at Nvidia they would be able to make a decent recording program for their own hardware.
     
    TomJGX likes this.
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You would be surprised.
    I don't doubt that at all. I mean, I could do it with Playclaw using NVENC and pre-record buffers too, etc. I just mean that for an absolute minimal performance hit, with no need to set up a pre-record before starting, ShadowPlay is it. I hate its barebonesness though. I'm waiting for Playclaw 5.5 to come out. Looks like they're really getting into it, which is good since it hasn't been updated in about a year.
     
  20. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    "Coding experience"

    Recent Nvidia drivers

    Does not compute
     
    TomJGX likes this.
  21. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Just put my 780ti's up for sale on eBay, this will fund my second Titan X.
     
    TomJGX likes this.
  22. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Cool story bro
     
  23. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Really didn't want to part with them :(
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Please continue
     
  25. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    Well... I'm giving up on the M18xR2 working right with these drivers. I did make progress. I modded the 353.06 desktop driver and used the strings for GTX 980, and the throttling is much improved, but it's still there and still hurts performance. I am able to overclock the snot out of a single 780M now, so it is definitely something they added to the drivers to restrict power and stay within some sort of arbitrary limit. I am guessing that code was in the BIOS all along and only recently did they add something to their drivers to stay within an arbitrary threshold. Sucks. If they do not fix their drivers and knock off this foolish nonsense soon, I will be selling all of my Alienware laptops. It's probably safe to assume this is what is happening with 980M SLI upgrades as well. They have incorporated filth for 980M "power compliance" that is ruining my Kepler experience.

    Tweet from Mr. Fox (@Mr_Fox_Rox) to @ManuelGuzman @nvidia @NVIDIAGeForce, June 2, 2015: "Here is a brief 3 minute video demonstrating the problem Please fix it. https://t.co/mhDRD2qBl3"
    Video: https://www.youtube.com/watch?v=zkqOfN0AQ7k
     
  26. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    Has anyone experienced BSODs with the new drivers? My 336 or 346MHz overclock was fine for Project CARS on 350.12, but now it crashes to desktop or BSODs (happened 2-3 times). They are making it less stable to push us into upgrading. I read today that AMD Fiji will start at $599. I wonder how it will compare to the 980 or 980 Ti. Windows 10 is made for AMD. Nvidia will lose a lot of customers because of the new Windows and their behaviour.
     
  27. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    It better perform at least as well as a 980ti or AMD are dead in the water
     
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Are you on an Alienware and/or have a Killer wireless card?
    I tried a +158/500 overclock today and it wasn't really an issue, but I didn't exactly leave it running for long.
     
  29. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    No no. I got an Acer V3-772G, which is an old system haha :) but it lasted long enough until Nvidia screwed it up. Next year I'll buy a desktop, which is better suited for gaming.

    OFF TOPIC: Has anyone heard of this game? Looks really cool and comes out today: http://store.steampowered.com/app/346110/
     
  30. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
  31. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    1. Did you at least know that the Graphics score is what matters for estimating a GPU's performance? Now look again at why you got lower scores with 353.06. I would retest it.
    Well, I think it's cool that a 10-minute video sacrifices just 1MB for audio and it sounds OK. But yeah, more options are better.
     
    Last edited: Jun 1, 2015
  32. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Well, these drivers brought back the strange image-doubling blur phenomenon on occasion on my desktop 980, which I've also experienced on the 980M. Not to mention one of the dirtiest installations ever: black screen after install with a strange blinking white icon in the bottom left, reboot, black screen, reboot, and then OK.
    No thanks, Nvidia... no thanks.
    Geez, going on 1 year of rubbish drivers.
     
    Last edited: Jun 1, 2015
    Mr. Fox and Spartan@HIDevolution like this.
  33. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    AW 18: 880M SLI, 4910MQ CPU OC'd to 4.3GHz, 32GB Corsair Vengeance RAM, 1TB Samsung 850 Evo m-SATA SSD... On my 4.3GHz overclock profile it crashes Project CARS. It was fine before the latest driver. But going down to the 4.2GHz profile: flawless gameplay, and a VERY nice increase in FPS with this new driver, averaging 10-15 FPS more on various tracks.
     
  34. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
    This was really not much to brag about. The difference between 353.06 vs 352.86 WHQL drivers is minimal... 0.2% increase in graphics scores with 3DMark Fire Strike is a joke. LoL. This is all Nvidia could boast of... http://www.3dmark.com/compare/fs/4977946/fs/4934209
     
  35. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Nvidia promised game performance not benchmark performance
     
    TomJGX likes this.
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
    Sure... But it would have been a little more fun with a larger increase than 0.2% in benchmarks. Still, a 0.2% increase is better than lower scores. LoL.
     
  37. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Ehh... 0.2 is well within the margin of error... These drivers suck again.
     
  38. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
  39. Ionising_Radiation

    Ionising_Radiation Δv = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Kurwa!

    Do you mean GTX 850/860/960M, i.e. GM107 users don't get those tessellation improvements too, and are stuck in the same boat as Kepler users?

    And despite all that, I finally managed to get 45-50 FPS in TW3 at medium-high settings with this driver - also dragged down the resolution slider to 1600x900. The smoothness is more than worth the loss in visual fidelity.

    And yet again, GTA V performance is best with 350.12. I cannot achieve frame-rates higher than 40 FPS with the last two drivers when I used to easily reach 60 FPS at high-very high settings with 350.12.

    Nvidia, get your **** together.
     
    Last edited: Jun 2, 2015
    TomJGX and Mr. Fox like this.
  40. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    @SRSR333

    [image]

    Too tired/bothered to compare with Kepler GPUs in-class, but I'm sure octiceps will be more than happy to help you out :D
     
  41. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    The whitepaper says only Maxwell 2 has PolyMorph 3.0 but Maxwell 1 still has tessellation optimizations over Kepler.

    Okay... What settings are they using besides x64 though? Because when I run it on my 980M, the results are quite BAD. 253 FPS with a +100 core and memory overclock... This is also an OpenGL benchmark; it's not going to be relevant for DX11 tessellation performance.

    EDIT: Nevermind... "The latest version of TessMark is designed to focus on tessellation via OpenGL 4. We run the latest version of the benchmark using the high resolution map set at maximum tessellation while 1080p full screen, reporting the average FPS."

    Power limit throttle brings the core down to around 1100

    [attachment: tess.PNG]

    Okay... Anand did something different... According to http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/17 I should be getting over 2k FPS on my 780 Ti, however...

    [attachment: unnamed.png]
     
    Last edited: Jun 2, 2015
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maxwell 2 is PolyMorph Engine 3.0:

    [image]

    Maxwell 1 is PolyMorph Engine 2.0 like Kepler:

    [image]

    [image]

    However, based on these TessMark results, it seems Maxwell 1 has improved tessellation performance over Kepler like Maxwell 2 does.

    980 (GM204): 2321 FPS
    770 (GK104): 1625 FPS

    750 Ti (GM107): 796 FPS
    650 (GK107): 412 FPS

    Notice how 980 scores about 3x as high as 750 Ti, which makes sense as the GM204 GPU (16 SMMs) is about 3x as big as GM107 (5 SMMs).

    Interesting to note, Maxwell 2 is nowhere near 3x as fast as Kepler in tessellation despite what Nvidia claims in its GTX 980 whitepaper.
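[Editor's note] The scaling argument in this post is easy to check numerically. Using the quoted TessMark scores, the 980/750 Ti ratio lands close to the 16/5 SMM ratio, while the Maxwell-over-Kepler per-GPU gains come out well under the 3x whitepaper figure:

```python
scores = {  # TessMark FPS figures quoted above
    "980 (GM204)": 2321,
    "770 (GK104)": 1625,
    "750 Ti (GM107)": 796,
    "650 (GK107)": 412,
}

# Scaling within Maxwell: score ratio vs. SMM ratio (16 SMMs vs. 5 SMMs)
print(round(scores["980 (GM204)"] / scores["750 Ti (GM107)"], 2))  # 2.92
print(16 / 5)                                                      # 3.2

# Maxwell vs. Kepler, per GPU: nowhere near the claimed 3x
print(round(scores["980 (GM204)"] / scores["770 (GK104)"], 2))     # 1.43
print(round(scores["750 Ti (GM107)"] / scores["650 (GK107)"], 2))  # 1.93
```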
     
    Last edited: Jun 2, 2015
  43. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    ^ I think the fairest comparison of Kepler vs Maxwell would be 980M vs GTX 770, as they have the same amount of cores. That'd show the direct Maxwell improvements. Of course you'd likely have to clock the 770's memory down to match, but still.
     
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The core counts are not directly comparable because they're different architectures. Nvidia's explanation was that "a 128 CUDA core Maxwell SMM has 90% of the performance of a 192 CUDA core Kepler SMX", which mathematically translates to each Maxwell core being about 35% stronger than a Kepler core (0.9 × 192/128 = 1.35). So it's not fair to compare a neutered GM204 vs. a fully enabled GK104 just to match CUDA core counts.

    There is one PolyMorph Engine per SMM/SMX. GM204 has double the tessellation engines of GK104 because GM204 has 16 SMMs while GK104 has 8 SMXs. That's where GM204's tessellation advantage comes from.

    Similarly, GM107 has 5 SMMs and tessellation engines while GK107 has 2 SMXs and tessellation engines.
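[Editor's note] The per-core and per-chip arithmetic in this post can be checked in a couple of lines:

```python
# Per-core throughput: one 128-core SMM delivers ~90% of a 192-core SMX,
# so each Maxwell core does roughly 0.9 * 192/128 = 1.35x a Kepler core's work.
per_core_factor = 0.9 * 192 / 128
print(round(per_core_factor, 2))  # 1.35

# One PolyMorph (tessellation) engine per SMM/SMX:
gm204_vs_gk104 = 16 / 8   # 2x the tessellation engines of GK104
gm107_vs_gk107 = 5 / 2    # 2.5x the tessellation engines of GK107
print(gm204_vs_gk104, gm107_vs_gk107)
```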
     
    Last edited: Jun 2, 2015
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Welp. That's a GG I guess. Now if only Maxwell had stable voltage and kept proper clocks in SLI and stuff. That'd be great.
     
  46. karasahin

    karasahin Notebook Consultant

    Reputations:
    31
    Messages:
    268
    Likes Received:
    47
    Trophy Points:
    41
    I'll be damned. This driver really improved TW3's performance. I can play it now with these settings, rarely dropping below 30 FPS in town and a stable 40 FPS outside with a GTX 870M. The 3DMark 11 score is at almost the same level (there is a little drop, but that happens all the time). This performance should've been there in the first place, but I'm glad that NVIDIA cleaned up their mess. I must say, though, the graphics don't look that good even on ULTRA. I mean it's good, but not as anticipated. It's like something is unnatural.
     

    Attached Files:

    Last edited: Jun 2, 2015
  47. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    I take it you haven't read the last couple of pages... Nice it's working for you though.
     
  48. LeRoyVoss

    LeRoyVoss Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Does anyone know why the latest 353.06 WHQL nVidia drivers are available for cards such as the GTX 860M, 850M and so on, but not for the GTX 960M? Looks very strange to me...
     
  49. Ionising_Radiation

    Ionising_Radiation Δv = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Indeed, I get improved performance in TW3. Your GPU is more powerful than mine - you should consider moving up a few of your settings (especially water and texture quality). Barely any FPS impact, and the game looks a lot better. Also, I notice you don't have AO switched on. Use SSAO - it makes all the rough corners stand out a little less.

    And finally, you might consider sacrificing 1080p resolution for 900p, to get a boost of up to 20 FPS - yes, 20.

    Yet, Nvidia nuked performance in everything else. GTA V is still a mess (-25 FPS on average). Damn it Nvidia, I want to play all games on a single driver, not switch between drivers for different games. This isn't the 1990s.
     
  50. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    TessMark is useless because it is OpenGL 4 tessellation performance, not DirectX 11. A comparison of Heaven results and measuring the performance hit the respective cards take with the extreme tessellation option would be a much better example.
     