The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    RTX 2060 review, 3rd gen Ryzen news, and best of PCs at CES 2019 | The Full Nerd Ep. 81
    PCWorld
    RTX 2060 discussion starts @ 05:25 ends @ 17:00
    Streamed live 17 hours ago
    Join The Full Nerd gang as they talk about the latest PC hardware topics. In the first Full Nerd episode of 2019 we talk about Brad's review of Nvidia's RTX 2060, the news around 3rd Gen Ryzen, and some of our favorite things at CES 2019. As always we will be answering your live questions so speak up in the chat.


    Gordon Ung 14 hours ago
    "But as we point out: remember we also had a process shrink between 970 and 1070. To be fair, should this be 970 vs. 770? Both were 28nm parts I believe. Pascal got that nice fat bump going from 28nm to 16nm.

    You do have to factor that into it, which is also why I think people are underestimating where Radeon VII is going to land.

    Should Nvidia have waited for 7nm? Well, that's a business decision that we have no true insight into externally. But I can understand why it wants to get the ball rolling on HRT. You can't just go to 10 from 0. Developers have to see a reason to support new features for it to happen.

    Should a consumer wait for 7nm RTX? Brad pretty much said that when RTX first launched. But again, that's what you do with your money.

    I do agree this is a somewhat of a sidegrade--because that's what Nvidia decided to do. They decided to Zig When AMD Zagged and pushed HRT as something to put on everyone's plate. If they actually get developer support for it and it works out--great.

    If HRT support is as slow as everything else, then yeah, you got that Radeon VIII option yes?"

    Nvidia GeForce RTX 2060 Founders Edition review: Ray tracing and 1440p gaming get more affordable
    Ray tracing goes mainstream as prices go upstream.
    By Brad Chacos Senior Editor, PCWorld | JAN 7, 2019 6:00 AM PT
    https://www.pcworld.com/article/333...geforce-rtx-2060-founders-edition-review.html
     
    Last edited: Jan 15, 2019
  2. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,701
    Trophy Points:
    431
    An early adopter tax on new hardware? I'm shocked, shocked I tell you!
     
    Talon, bennyg, Papusan and 2 others like this.
  3. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    It's RTX 2060 launch day, and the Nvidia reddit seems to be flooded with people loving the new driver and free G-Sync, and RTX 2060s are being bought up.

    RTX for the masses, GTX 1080 performance for $350.
     
    Papusan and Vistar Shook like this.
  4. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Disregard full screen matters.
     
    Last edited: Jan 15, 2019
  5. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    A Peek Into NVIDIA's AI Robotics Lab in Seattle
     
    Vistar Shook and hmscott like this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Given the claims that these 2060's would be available at $349 for both FE and AIB cards, I knew - and most / all of you knew - that wasn't going to happen: the price was going to be higher for the most desirable models, making the 2060 even more overpriced than the $100 bump between the 1060 and 2060 release AIB MSRP pricing already implied.

    Nvidia's attempt - and the attempts of Nvidia-leaning reviewers and commenters - to pitch the 2060's price point as a 1070ti upgrade is silly: with the 2060 merely matching 1070ti performance, it's a same-price side-grade with little or no difference in performance and a downgrade in VRAM from 8GB in the 1070ti to 6GB in the 2060.

    The 2060 is supposed to be the mid-range 20 series product, and if it came out at $249 - or even $199, like previous '60 series products - then it would be.

    With the 2060's $349 MSRP $100-$150 higher than previous '60 AIB prices, it's a price increase too far, and it's not a '70 price point model without the associated performance bump.

    The RTX 2060 is yet another failed 20 Series release, a complete flop.

    Yet some people will buy it. Given no other new offering from Nvidia or AMD at $349, some people will buy it instead of following their instincts and walking away to find used or new previous generation alternatives.

    Looking at which 2060 listings have sold out, there are only 2 sold out on Newegg, the highest priced and the lowest priced. The Nvidia.com 2060 FE is still available, not sold out. Demand for the 2060 is low if there are still so many in-stock SKUs after the first day of sales.

    IDK why people would buy a budget GPU and then pay the highest price (about $450) for the most expensive model. $500 used to be the cost of the highest priced model in the lineup; now it's more than double that. Why support that kind of corporate greed and encourage them to do that to us all?

    Paying $450+tax for a 2060 doesn't make sense when you can get a used previous generation 1080ti for the same or $50-$100 more, or a 1070ti / 1080 / Vega 56/64 instead for less.
     
    Last edited: Jan 16, 2019
  7. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Reddit seems to be all but ignoring the 2060 launch; maybe you can provide some links to what / where you are seeing the flood of 2060 interest?

    Here's the front page of /r/Nvidia, and there's hardly a mention of the 2060:

    nvidia subreddit 1-16-2019 #1.JPG
    Except for 1 thread about a specific 2060 review with 65 comments (hardly a flood), and a couple of questions with 0 comments, there's only the PSA warning about 2060's not having USB-C VirtualLink output connectors, and that ranks higher than the 2060 review:
    PSA: Almost All AIB RTX 2060 Don't Have USB-C Virtual Link Output
    Submitted 17 hours ago by Nourdon
    https://old.reddit.com/r/nvidia/comments/agbl6g/psa_almost_all_aib_rtx_2060_dont_have_usbc/

    "I first noticed this from browsing the RTX 2060 listing on Newegg, but there seems to be a disturbing pattern of lack of Virtual Link Output on all of the RTX 2060 there.

    I then look into it further on the techpowerup gpu database for the RTX 2060 and only found 4 model of AIB below support the USB-C Virtual Link output out of the 40+ model available there. Even the highest end model from ASUS, EVGA, and MSI don't have the virtual link output.

    Here are the list of AIB model that have the virtual link output:
    • Colorful iGame RTX 2060 Ultra OC
    • Colorful iGame RTX 2060 Vulcan X OC
    • GIGABYTE AORUS RTX 2060 XTREME
    • ZOTAC RTX 2060 Extreme Plus OC6"
    From looking at reddit I'd have to conclude no one is interested in or talking about the 2060 launch - another complete flop of an RTX GPU hardware release from Nvidia.

    Even Nvidia's 20 Series Hardware Support Forum doesn't have any threads about the 2060, Nvidia didn't even start the typical new hardware feedback thread:
    And, as I noted in my previous post about 2060 pricing, the vast majority of the for-sale listings I posted earlier this morning are still showing up now; only 2 show "sold out" on Newegg - the highest and lowest priced SKUs. Compared against typical releases that sell out on the first day, the RTX 2060 is showing the lowest interest for a new GPU release in recent memory.

    Nothing but the sound of crickets... and even they aren't discussing the RTX 2060. ;)
     
    Last edited: Jan 16, 2019
  8. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Vistar Shook likes this.
  10. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Nvidia CEO Says GTX 10-Series Inventory Almost Depleted

    https://www.tomshardware.com/news/nvidia-gtx-10-series-gpu-inventory-depleted,38455.html

    What are the odds Nvidia does some huge price slashing on RTX after GTX 10 is gone? I think very high. Genius really: leave RTX prices high and sell off the remaining 10 series stock left over from the crypto hangover to make consumers think they're getting a "deal". Once stock is gone, slash prices on RTX and get back on track.

    Nvidia won't suffer the same issue next year with 7nm as they are probably producing far fewer 20 series 12nm chips.
     
    Vistar Shook likes this.
  11. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,701
    Trophy Points:
    431
    hmscott, Vistar Shook and Talon like this.
  12. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Market analysts and Jensen were saying it was going to take 2 quarters - 6 months - to run down the $2B inventory, and the analysts thought it was going to take longer due to the new GPU's being announced.

    Eliminating that huge $2B inventory in one month would be incredibly fast if true, but Jensen has sounded like this before when inventory numbers started rising, so IDK how far the inventory has really been depleted.

    The 10 series new prices were held so high for so long, and the lower prices lasted such a short time - prices went back up after the RTX cards flopped, making them overpriced again - and that's why I and so many others recommended buying 10 Series used, as used prices are still low. You have to shop patiently, but you can still find them.

    I hope the 10 Series inventory is really greatly reduced and it does allow Nvidia to drop the RTX prices, but historically Nvidia has held strong on pricing even without mining buyers sucking up inventory.

    My guess is Nvidia will lose money on RTX sooner than lower prices, and unless someone else steps into Jensen's position that's not going to change.
     
    Papusan and Talon like this.
  13. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Jensen will tell you what you want to hear when you have no choice.
    A quick look at Newegg and all but 3 cards are gone for the GTX 1080, 1070 Ti, and GTX 1070. Since the RTX 2060 at $349 essentially replaces all of those cards hands down on performance-to-price ratio, I look at just those. The GTX 1060 and 1050/Ti can't even be considered at this point with RX 570/580 prices. The RX 590 not so much.

    IMO the only cards that should be considered at this point are:

    RTX 2080 Ti for the high end consumer.

    RTX 2070 if you get it at $499

    RTX 2060 for $349-$375~ max

    RX 570/580 for the low end consumer.

    RTX 2080 is too expensive unless you get it under $700 on a sale. Vega 7 offering GTX 1080 Ti (maybe) performance at $700 is a joke with its power consumption and lack of AI/DXR features when a GTX 1080 Ti can be purchased used for far cheaper.

    Both Vega 56 and 64 are quite easily beaten out by the RTX 2060 in more than one way.
     
    Robbo99999, Papusan and Vistar Shook like this.
  14. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331


    Great news for consumers. Pro-consumer move from Nvidia that allows all of those with cheaper FreeSync monitors to enjoy adaptive sync with their Nvidia GPU.
     
  15. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331


    Nvidia's RTX NvEnc is beyond impressive... (GPU encoding explanation, x264 Medium Comparison)
     
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Not really, Nvidia has already trashed VESA Adaptive-Sync and AMD's original invention FreeSync, on which VESA Adaptive-Sync was based.

    Nvidia also said they could only get 12 out of 400 FreeSync monitors to work 100% compatible with Gsync, which makes 97% of FreeSync monitors not 100% compatible with Gsync. So unless you have one of those 12 makes / models of FreeSync monitor, you could / will experience glitches with Gsync.

    AMD adjusts / tunes support for monitors to cover FreeSync function across all FreeSync monitors, whereas Nvidia seems to be drawing a hard line at published specs, which may be defensible, but it makes for a whole lot of unhappy users when their FreeSync monitor glitches under Gsync.

    There are reports from people already; in fact most of the /r/Nvidia threads refer to Gsync compatibility with FreeSync:
    nvidia subreddit 1-16-2019 #1.JPG
    Look at the titles, some are pretty funny: " FreeSync breaks my entire setup "gone sexual" *not clickbait*" => Nice how they get the name FreeSync right in a clickbait title, instead of "Gsync Compatible". ;)

    Nvidia CEO Says GTX 10-Series Inventory Almost Depleted
    Submitted 3 hours ago by eric98k
    https://www.reddit.com/r/nvidia/comments/agog5k/nvidia_ceo_says_gtx_10series_inventory_almost/

    tamasmagyarhunor 12 points 3 hours ago
    "********. I don't believe this. or maybe he's talking about nvidia FE cards :/"

    twistr36O 5 points 3 hours ago
    "That makes sense. 3rd party cards must still be a thing, in terms of stock."

    ARabidGuineaPig 3 points 2 hours ago
    "Sold my 1080ti for 550$. Not shabby imo for almost owning it two years"

    TheWalkingDerp_ 1 point 2 hours ago
    "Might also talk about GPU inventory and not cards."

    That last one makes sense; the rumor was Nvidia was forcing AIB partners to buy quantities of 10 series GPU's along with their 20 series GPU orders. If that's the case, there is still an inventory of GPU's out there ready to put into cards; maybe if the RTX cards continue to fail, the AIB partners will make 10 series cards (if Nvidia lets them) and that 2nd wave of 10 series GPU's will be more reasonably priced.
     
    Last edited: Jan 20, 2019
  17. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    If you had watched the video then you would have seen they reported that 6/6 DisplayPort FreeSync monitors not listed by Nvidia all worked 100%.

    3 of my friends with FreeSync 144Hz monitors (all different models) tested theirs yesterday and have reported 0 issues.

    Seems to be working great for the majority of users so far. Reading through reddit a lot of users are reporting total success with their screens as well.
     
    Vistar Shook and mitchega like this.
  18. mitchega

    mitchega Notebook Consultant

    Reputations:
    23
    Messages:
    103
    Likes Received:
    130
    Trophy Points:
    56
    Agreed...almost all well manufactured adaptive sync monitors are up and running without issue under the new driver update, from what I have experienced and have seen others reporting. Some cheap adaptive sync monitors don't always perform well even when an AMD card is at the wheel. Nvidia is casting shade so as not to alienate their premium Gsync module. They are going to dance around this gently, trying to emphasize the better performance of a true Gsync monitor so they don't cannibalize their own product market. It's an interesting situation...because having owned and used both products over the years, the Gsync module is by far the superior product...but how do you emphasize that while giving customers choice? Nvidia has put themselves into a bit of a pickle with this driver update. It was probably the least intrusive good will gesture they had available to combat the negativity surrounding the RTX launch and the known fact that AMD would launch something in Q1 2019. It will be interesting to see how they continue to market and brand the Gsync technology going forward.
     
    Last edited: Jan 16, 2019
    Talon, Vistar Shook and hmscott like this.
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Beyond impressive actually means it's caught up to x264 and HEVC, so that NVENC can now be used for streaming, hence its addition to streaming interfaces. It's still marginal in areas and is just now catching up enough to be used for streaming.

    Also, these improvements are available to all GPUs that have NVENC chips, which go back several generations, so it's not an RTX exclusive. There is supposed to be improved display output hardware on the RTX cards, but the encoding doesn't use those RTX hardware output improvements, only the decoding does, so you could encode on any NVENC capable GPU and get the same results for uploading.

    From the posted video:
    NVENC improvements are for all GPUs that include NVENC chips.jpg
     
    Last edited: Jan 16, 2019
  20. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You seem to only selectively see what you want to see when looking. :)

    If you open your eyes a bit wider you'll find that the 2060 is going for more than you are seeing, "RTX 2060 for $349-$375~ max".

    Here's a new snippet from Newegg's available inventory; there were more above $400 earlier today:
    Several 2060 above 350-375.JPG
    And, if you go elsewhere, there are more at and well above $400; here's one example, which is more indicative of what people will find outside Newegg:
    BLT 2060 pricing #2.JPG
    and many are not in stock until February:
    BLT 2060 pricing.JPG
    BLT 2060 pricing #3.JPG

    But, on the other hand, there sure are a lot of 2060's still in stock this close to launch, so maybe Nvidia made more available at launch thinking that their Nvidia RTX BS would work on more people this time? Or, nobody's buying them. :)
     
    Last edited: Jan 16, 2019
  21. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    So that means Nvidia was caught lying again? Not a surprise of course, but it seems like a pretty big whopper of a lie if Nvidia tried to convince people that 988 out of 1000 monitors failed 100% compatibility with Gsync, don't you think?

    I hope the Gsync compatibility works for most people, that wasn't my point, it was simply that Nvidia was pushing out the idea that hardly any were going to work based on their testing.

    Glad to see it was just Nvidia fibbing again. Those rascals just can't seem to get it right; even when they do something good they have to trash it.

    Oh yeah, it's FreeSync, not Gsync, that's why Nvidia got confused and they thought they needed to trash it because it was AMD's thing, not theirs.

    At least in Nvidia's dastardly mind they know the truth. Nvidia's strict practice of deception makes them do weird things like trashing their own good work when being compatible with AMD's original invention, FreeSync. :D

    Update: Nvidia only tested 400 monitors, with 12 working, so only 388 failed, which is a slightly better failure rate than I thought: 97% instead of 98.8%, big improvement. ;)
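
    For anyone checking the math, here's the arithmetic behind those two percentages as a quick Python sketch - the only inputs are the 400-tested / 12-passed figures quoted above and the earlier mistaken 1000-monitor guess:

        # Failure-rate arithmetic for Nvidia's "G-Sync Compatible" validation,
        # using the figures quoted in this thread (400 tested, 12 passed) and
        # the earlier mistaken assumption of 1000 tested.

        def failure_rate(tested: int, passed: int) -> float:
            """Percentage of tested monitors that failed validation."""
            return 100.0 * (tested - passed) / tested

        print(failure_rate(400, 12))   # 97.0 - the corrected figure
        print(failure_rate(1000, 12))  # 98.8 - the earlier assumption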
     
    Last edited: Jan 16, 2019
  22. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    No, as you can see from the screenshot you posted, NvENC on Turing has the best encoding quality. Previous Nvidia generations have needed x264 (CPU) encoding to achieve the same level of quality at compression ratios suitable for streaming.
     
    Talon and hmscott like this.
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    If you listen to what he is saying, and watch the whole video, the improvements come from work on the OBS processing software and from driver improvements, and they apply across all the generations of GPU's that have NVENC chips.

    Which is why I posted the chart he included with CC enabled, "It's a good time for streamers", covering all NVENC assisted GPU's, not just RTX GPU's. Here it is again, showing streaming OBS improvements on all generations of NVENC GPU's, "Works with any compatible GEFORCE" in green font:
    NVENC improvements are for all GPUs that include NVENC chips.jpg

    Watch it again and maybe turn on CC so you can catch the nuances of what he is saying. The quality "output" improvements are in the RTX GPU's, but the encoding is the same. He should have shown a Pascal or earlier card encoding to demonstrate this; perhaps another reviewer will do a side by side comparison. Improving performance lets you increase quality settings on previous generation GPU's, and improving quality settings improves encoding results.

    Here's an ongoing discussion for OBS NVENC users:

    NVENC Performance Improvements (Beta)
    https://obsproject.com/forum/threads/nvenc-performance-improvements-beta.98950/

    "The quality improvements you may have been hearing about will largely only be seen on Turing GPUs (RTX 20XX), but the performance improvements should be measurable on all GPUs that have NVENC (GTX 6XX and higher)."

    Sunday at 9:06 PM #74
    "I'd like to see some comparison videos to be posted."

    I'd like to see a comparison between a 1080ti and a 2080 for NVENC encoding, with both at maximum quality settings, now that OBS performance is improved enough to allow this on the non-RTX GPU's... that would show any improvements in the RTX NVENC hardware over previous generations. He did the test comparisons using only a 2080 as the test GPU - he only showed, and you only saw, same-GPU comparisons (see the sketch at the end of this post).

    He would have needed to include a non-RTX GPU, like a 1080ti, encoding against an RTX 2080 to fairly compare results against the RTX encoding and see if there are actual quality differences.

    With the improvement in OBS performance allowing previous generation GPU's to encode at higher quality - maximum quality settings - I think the difference will be minimal.

    From reading the posts in that thread it seems the OBS software and Nvidia drivers are still "not quite there yet"; there are still lots of problem reports about stability and results.

    Monday at 7:00 AM #75
    "so will this fix my issue with obs not working well with my 2080 card? in obs i keep dropping fps and i can no longer stream anymore. i had a 1080 and everything was working great. my PC specs are 2080, 32gb of ddr4, i7 8700k not oc, 750x psu. all this happened when i upgraded my 1080 to the 2080. with some of my games i have to have vsync on."

    Jan 9, 2019 #42
    Trixz2007 said:
    "Thank you and what about live streaming"

    "the improvements are on the encoding, you will see it in streaming and recording. I can confirm that as i have already streamed well over 12 hours with this beta.

    After this comment i will be streaming and testing out performance with streaming 3440x1440 60fps since both 1080p and 1440p standards have already shown stellar results its time to see how far it can be pushed."
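
    For anyone who'd rather run that comparison themselves than wait for a reviewer, here's a minimal sketch of the test being asked for, assuming an ffmpeg build with h264_nvenc and libvmaf support - the file names and the 6 Mbps target are placeholders, not settings from the video:

        # Hypothetical A/B test: encode the same source clip with NVENC at
        # identical settings on a Pascal machine and a Turing machine, then
        # score each encode against the source with VMAF. Higher VMAF at the
        # same bitrate = better encoder hardware.
        import subprocess

        SOURCE = "source_1080p60.mp4"  # same master clip on both machines

        def nvenc_encode(outfile: str, bitrate: str = "6M") -> None:
            """Encode SOURCE with this machine's NVENC at a fixed bitrate."""
            subprocess.run([
                "ffmpeg", "-y", "-i", SOURCE,
                "-c:v", "h264_nvenc", "-b:v", bitrate,
                "-an", outfile,
            ], check=True)

        def vmaf_score(encoded: str) -> None:
            """Print the VMAF score of an encode measured against SOURCE."""
            subprocess.run([
                "ffmpeg", "-i", encoded, "-i", SOURCE,
                "-lavfi", "libvmaf", "-f", "null", "-",
            ], check=True)

        # Run nvenc_encode("pascal_1080ti.mp4") on the 1080ti machine and
        # nvenc_encode("turing_2080.mp4") on the 2080 machine, then
        # vmaf_score() both encodes against the same SOURCE clip.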
     
    Last edited: Jan 16, 2019
  24. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060/ -- Stock clocked "A" chip that can be overclocked just as easily.

    https://www.newegg.com/Product/Prod...tx 2060&cm_re=rtx_2060-_-14-137-380-_-Product -- Overclocked dual fan aftermarket "A" chip.

    https://www.newegg.com/Product/Prod...tx 2060&cm_re=rtx_2060-_-14-500-457-_-Product -- $369.99 gets you an 1800mhz overclocked card that will boost even further with GPU boost and manual overclocking. One of the best 2060s so far and matched a 1080 in performance when overclocked.

    https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2060_AMP/35.html

    https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2060_AMP/29.html -- Even beats out the mighty 1080 Ti in Wolfenstein II at 1080p. :)


    For those that want a $349-$375 RTX 2060 because according to some they're impossible to find....
     
  25. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331


    From Reddit..

    "Just seems like they were working through the 400 monitors alphabetically and just said "**** it" when they hit BenQ."

    It's neither. It's adaptive sync, which is an open standard. Putting your sticker on the box doesn't make it yours.
     
    raz8020 and hmscott like this.
  26. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,701
    Trophy Points:
    431
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The encoding performance improvement is present on all generations of Nvidia cards containing NvENC, but the image quality improvement is exclusive to Turing. Improving performance does not allow previous generation GPUs to increase stream quality, idk where you came up with that. At the same bit rates, Turing NvENC provides a much better looking stream than previous generation GPUs. Turing NvENC is comparable to x264 Medium or better in image quality, while Pascal NvENC is comparable to x264 Very Fast in IQ.

    Turing is a huge boon to gamers with mainstream Intel CPUs like the 8600K/9600K, 8700K/9700K, and 9900K, as these happen to be the best CPUs for high refresh rate gaming, but may not have the core/thread count necessary for good quality x264 streaming in heavily multi-threaded AAA games at high frame rates without potentially affecting performance significantly. Turing is also great for laptop streamers using power/thermal limited mobile CPUs, freeing up that large chunk of CPU processing power for the game itself and allowing the CPU to clock higher within the same TDP and temperature limits.
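
    To put numbers on "quality at the same bit rate": at Twitch-style bitrates there are very few bits to spend per pixel, which is why encoder efficiency, not maximum bitrate, decides stream quality. A small Python sketch (the resolutions and bitrates below are illustrative, not from the post):

        # Average bits available per pixel per frame at streaming bitrates.
        # The fewer bits per pixel, the more the encoder's compression
        # efficiency (x264 Medium, Turing NVENC, etc.) matters.

        def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: int) -> float:
            """Average bits per pixel per frame at a given bitrate."""
            return bitrate_bps / (width * height * fps)

        print(bits_per_pixel(6_000_000, 1920, 1080, 60))  # ~0.048 bpp, 1080p60 @ 6 Mbps
        print(bits_per_pixel(3_500_000, 1280, 720, 60))   # ~0.063 bpp, 720p60 @ 3.5 Mbps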
     
  28. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Saved by the mining crash early this year :D
     
    hmscott likes this.
  29. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    There are generational improvements in hardware between all of the GPU's with NVENC chips, but they are always small. The biggest improvement this time comes from software updates available to all GPU's with NVENC that off-load work from the CPU - the copying back and forth for processing - and that off-load should allow previous generations to enable maximum quality settings, an increase over the lower settings previously forced by CPU load. At the new maximum quality settings, all GPU's preceding Turing should see quality improvements.

    Any additional improvement is going to be marginal at best when seen side by side; it's going to take a direct comparison to see the difference. I'd compare results at your existing GPU's old settings against the maximum quality settings now allowed by the software / driver update, before spending money on an RTX GPU to get that "better" result.

    I don't believe Nvidia on most everything, so I always recommend double checking - verifying - before buying.
     
    Last edited: Jan 16, 2019
  30. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    @Talon already posted that video a couple of posts back. :)

     
  31. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Hmm, darn I thought I saw it was 1000, that made the percentage calculation easier I guess, at 98.8% :)

    So Nvidia's "Gsync compatible" failed on 388 out of 400 monitors, so that makes Nvidia "Gsync" failure rate go down to only 97% failure rate, yeah that helped. ;)
    Oh come on man, you know that AMD submitted FreeSync to VESA for standards adoption, right? VESA renamed it Adaptive-Sync, but it's still all AMD's design specification; it's FreeSync down to the bone.

    So what, you work for Nvidia or something? Cause you sure seem to want to make sure to put out the company line on everything... does 'ole Leather Chaps help pay your bills? ;)

    AMD FreeSync Proposal Adopted by VESA – Will become a Standard for Display Port 1.2a
    By Usman Pirzada, Apr 8, 2014
    https://wccftech.com/amd-freesync-adpoted-vesa-standard-display-port-12a/

    "We have received word that VESA has accepted AMD’s proposal and FreeSync will become a standard for Display Port 1.2a. FreeSync, who was brought to life to rival Nvidia’s G-Sync, will now be a much greater force to be reckoned with then it was before with just a prototype to support it."

    Display Port 1.2A to have FreeSync Standard – Proposal accepted by VESA

    That's why Nvidia's "Gsync Compatible" AMD FreeSync "Copycat" doesn't work on HDMI, that's still reserved for AMD hardware as AMD is the originator of the standard ratified by VESA, for Display Port only.

    You'll need to buy the real thing - AMD GPU's with FreeSync - to get the *FULL* FreeSync experience, including HDMI :)

    AMD FreeSync™ Technology Over HDMI®
    https://www.amd.com/Documents/freesync-hdmi.pdf
     
    Last edited: Jan 17, 2019
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    That’s an OBS performance optimization due to the Nvidia collab, not an NvENC hardware improvement. The hardware improvement is the improved quality per bit rate. Even before the optimization, NvENC was much more performant than x264 at the expense of quality. With Turing, you get the best of both worlds, with even better performance than before.
     
    hmscott likes this.
  33. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    NVIDIA Has No Plans for Adaptive Sync Support on Maxwell, Prior GPUs
    Techpowerup.com | Yesterday, 16:30

    In case anyone's been living under a rock (and in these times, if you can do that, I probably envy you), NVIDIA at CES 2019 announced it was opening up G-Sync support to non-G-Sync-toting monitors. Via adoption of VESA's open VRR standard (Adaptive Sync, on which FreeSync is based), the company will now add support for monitors that usually only support FreeSync. The company also vowed to test all configurations and monitors, with a whitelist of automatically-enabled panels and a manual override for those that don't pass the certification process or still haven't been subjected to it.

    Now, via a post on NVIDIA's GeForce forums, ManuelGuzmanNV, with a Customer Care badge, has said, in answer to a user's question on Variable Refresh Rate support for NVIDIA's 900 series, that "Sorry but we do not have plans to add support for Maxwell and below". So this means that only NVIDIA's 1000 and 2000 series of GPUs will be getting said support, thus reducing the number of users for which VRR support on NVIDIA graphics cards is relevant. At the same time, this might serve as a reason for those customers to finally make the jump to one of NVIDIA's more recent graphics card generations, in case they don't already own a VRR-capable monitor and want to have some of that smoothness.

     
    raz8020, CaerCadarn and hmscott like this.
  34. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I'd need to see the results of a 1080ti vs 2080 encoding comparison at the highest bit rate / quality settings on both to judge for myself. I'm not going to believe Nvidia statements parroted by others that haven't verified them, without verifying myself or seeing it compared by someone that has actually done the comparison / verification.

    Trust, but verify - Reagan

    Don't trust them bastards, verify. - me
     
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    AMD FreeSync - the original and only true Adaptive-Sync implementation in GPU's - supports FreeSync way back into their past GPU's and APU's:

    AMD Radeon™ FreeSync Technology
    https://www.amd.com/en/technologies/free-sync
    FreeSync AMD Supported GPUs and APUs.JPG

    Nvidia couldn't get "Gsync Compatibility" working on 97% of the monitors they tested - out of 400 FreeSync monitors tested, only 12 passed the Nvidia "Gsync" test - while 100% of FreeSync (VESA Adaptive-Sync) monitors are fully supported by AMD GPU's.

    AMD FreeSync Proposal Adopted by VESA – Will become a Standard for Display Port 1.2a
    By Usman Pirzada, Apr 8, 2014
    https://wccftech.com/amd-freesync-adpoted-vesa-standard-display-port-12a/

    Oh, yeah, "Gsync Compatible" doesn't have an HDMI capability, only AMD has that:

    AMD FreeSync™ Technology Over HDMI®
    https://www.amd.com/Documents/freesync-hdmi.pdf
     
    Last edited: Jan 16, 2019
  36. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Comparing at the highest is pointless due to limits enforced by streaming platforms like Twitch. The comparison needs to be done at low bit rates like 3.5K-6K.
     
    hmscott likes this.
  37. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Of course... AMD needs every sale they can get. Monitors not being fully supported by AMD GPU's would force more gamers over to Nvidia.
     
    hmscott likes this.
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Cute, then why would you point out the higher rates allowed by the NVENC chip on the Turing, if it doesn't matter? Wasn't that what I said in the first place?
     
  39. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    What’s cute is your cluelessness. I was talking about higher quality at the same bitrate, not higher max bitrate. Max bitrate is pointless for streaming at low bitrate, it’s the quality and compression ratio at low bitrate that matters, and that’s where x264 has traditionally ruled the roost.
     
    hmscott likes this.
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Once again, you've got it backwards. AMD is the source for FreeSync, so Nvidia is the one to be worried about FreeSync Compatible Gsync support.

    Nvidia is worried enough to enable Gsync compatibility on their crappy GPU's so as to open up more of their market to owners of FreeSync monitors (FreeSync being invented by AMD and submitted to VESA for adoption).

    Too bad Nvidia are too twisted to give credit where credit is due, they are riding on the success of AMD's FreeSync, where Gsync failed to gain wide adoption.

    AMD and Freesync are synonymous, that's why Nvidia is afraid to "say the words", and instead just wants to take the credit....
    1a332akwxx821.png
    cmozokhyh6921.jpg
    bzjtp5trbeiy.png
     
    Last edited: Jan 16, 2019
  41. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    So testy. You're the one that brought up bit rate as being the difference between Turing and previous NVENC chips.

    So have you actually compared the output of NVENC encoded files on Pascal vs Turing and compared them side by side, or are you parroting Nvidia marketing BS?

    Find an actual comparison of both Pascal and Turing running the latest driver and software for encoding / streaming and compare. That's the only way to know for sure.

    That video posted only showed everything compared from the same 2080, not from a Pascal GPU under the same updated software in comparison to the Turing GPU.
     
    Last edited: Jan 16, 2019
  42. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    No, you’re the one being dense and putting words in my mouth. This whole time I was talking about Turing NvENC’s improved image quality at the same bit rate, while you assumed I was talking about the increase in max bitrate. Did you even watch the video? He was comparing the stream quality of Turing NvENC vs. x264 Very Fast/Fast/Medium at 3.5K, 6K, and 8K bitrates.
     
    Last edited: Jan 17, 2019
    hmscott likes this.
  43. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    He also said @ ~04:20 you can't push the highest 8k bit-rate encoding to Twitch or you might have issues with it, or weren't you paying attention?

    But, I wasn't talking about streaming in particular, I was talking about encoding at the highest rates whether you can stream it or not - and his comparisons were at the highest bit-rate, so to be fair I suggested doing that on Pascal too; you added the streaming limitation.

    IDC, either way is fine. At the lower rates Pascal should have a better chance of matching Turing anyway, so if you want to make it easier for Pascal to win - or should I say for Turing to lose - that's fine by me.

    Let's wait for actual encoding comparisons between Pascal and Turing before we accept Nvidia's BS, I don't trust Nvidia and you shouldn't either.
     
    Last edited: Jan 17, 2019
  44. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    8K is allowed for some Twitch partners.

    OBS is the most popular game streaming software. Almost the whole video's discussion was about Turing NvENC's quality improvement for streaming, and now you're moving the goalposts? Brilliant.
     
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Geez man, pay attention: the video encoding comparisons he did were at the highest bit rates; then he added the caveat that when streaming at that highest rate you might run into problems with Twitch.

    The encoding comparisons in that video were done at the highest bit rate; have you caught up now? Good. :)
     
  46. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Hopefully, but if they launch these 11xx series chips soon, then it will be those that are priced cheaper than the RTX lineup, thereby 'allowing' RTX to stay at the higher price. I'd like to see RTX prices decrease though, and if they do launch the 11xx series I'd like to see those cheaper again, but they don't seem to have enough competition from AMD to warrant price drops on RTX.
     
    hmscott likes this.
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Maybe Nvidia will do a numbskull move like Intel, which released reduced-capability "F" CPU's at the same MSRP:
    Intel F CPUs at same price as igpu models.JPG
    Nvidia might release their GTX 11xx GPU's at the same price as the RTX GPU's - so as to not cannibalize the RTX GPU sales. :)
     
    Papusan and Robbo99999 like this.
  48. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yeah, quite a few positive reports over on guru3d too: https://forums.guru3d.com/threads/geforce-417-71-game-ready-driver-download-discussion.424859/
     
  49. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Ha, well good humour, but that's not gonna happen!
     
    hmscott likes this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Sheez, well Nvidia Gsync Compatible mode sucks if it can only activate on 1 monitor:

    Astyanax Master Guru
    " Only single displays are currently supported; multiple monitors can be connected but no more than one display should have G-SYNC enabled."
    https://forums.guru3d.com/threads/g...iver-download-discussion.424859/#post-5627554

    With AMD Radeon GPU's FreeSync works on all FreeSync / Adaptive-Sync connected monitors simultaneously.

    Maybe AMD will send help to Nvidia to give those guys a leg up, help'em figure out how to make Gsync Compatibility work right. I'm sure AMD would be happy to help, if Nvidia asked nicely. :)
     
    Last edited: Jan 17, 2019
    Papusan likes this.