RTX 2060 review, 3rd gen Ryzen news, and best of PCs at CES 2019 | The Full Nerd Ep. 81
PCWorld
RTX 2060 discussion starts @ 05:25 ends @ 17:00
Streamed live 17 hours ago
Join The Full Nerd gang as they talk about the latest PC hardware topics. In the first Full Nerd episode of 2019 we talk about Brad's review of Nvidia's RTX 2060, the news around 3rd Gen Ryzen, and some of our favorite things at CES 2019. As always we will be answering your live questions so speak up in the chat.
Gordon Ung 14 hours ago
"But as we point out: remember, we also had a process shrink between the 970 and 1070. To be fair, should this be 970 vs. 770? Both were 28nm parts, I believe. Pascal got that nice fat bump going from 28nm to 16nm.
You do have to factor that in, which is also why I think people are underestimating where Radeon VII is going to land.
Should Nvidia have waited for 7nm? Well, that's a business decision that we have no true insight into externally. But I can understand why it wants to get the ball rolling on HRT. You can't just go from 0 to 10. Developers have to see a reason to support new features for it to happen.
Should a consumer wait for 7nm RTX? Brad pretty much said that when RTX first launched. But again, that's what you do with your money.
I do agree this is somewhat of a side-grade--because that's what Nvidia decided to do. They decided to zig when AMD zagged and pushed HRT as something to put on everyone's plate. If they actually get developer support for it and it works out--great.
If HRT support is as slow as everything else, then yeah, you've got that Radeon VII option, yes?"
Nvidia GeForce RTX 2060 Founders Edition review: Ray tracing and 1440p gaming get more affordable
Ray tracing goes mainstream as prices go upstream.
By Brad Chacos Senior Editor, PCWorld | JAN 7, 2019 6:00 AM PT
https://www.pcworld.com/article/333...geforce-rtx-2060-founders-edition-review.html
-
saturnotaku Notebook Nobel Laureate
-
RTX 2060 launch day, and the Nvidia subreddit seems to be flooded with people loving the new driver and free G-Sync, and RTX 2060s are being bought up.
RTX for the masses, GTX 1080 performance for $350. -
A Peek Into NVIDIA's AI Robotics Lab in Seattle
-
Nvidia's attempt - and the attempts of other Nvidia-leaning reviewers and commenters - to position the 2060's price point as a 1070 Ti upgrade is silly. The 2060 merely matching the 1070 Ti's performance makes it a simple same-price side-grade, with little or no difference in performance and a downgrade in VRAM from the 1070 Ti's 8GB to the 2060's 6GB.
The 2060 is supposed to be the mid-range 20 Series product, and if it had come out at $249 - or even $199, like previous '60 Series products - then it would be.
With the 2060's $349 MSRP $100-$150 higher than previous '60 Series AIB prices, it's a price increase too far, and it's at a '70 price point without the associated performance bump.
The RTX 2060 is yet another failed 20 Series release, a complete flop.
Yet some people will buy it. Given no other new offering from Nvidia or AMD at $349, some people will buy one instead of following their instincts and walking away to find used or new previous-generation alternatives.
Looking at which 2060 listings have sold out, only two are sold out on Newegg: the highest-priced and the lowest-priced. The Nvidia.com 2060 FE is still available, not sold out. Demand for the 2060 is low if so many SKUs are still in stock after the first day of sales.
IDK why people would buy the budget GPU and then pay the highest price (about $450) for the most expensive model. $500 used to buy the highest-priced model in the lineup; now that costs more than double. Why support that kind of corporate greed and encourage them to do that to us all?
Paying $450+tax for a 2060 doesn't make sense when you can get a used previous-generation 1080 Ti for the same money or $50-$100 more, or a 1070 Ti / 1080 / Vega 56/64 for less. -
Here's the front page of /r/Nvidia, and there's hardly a mention of the 2060:
Except for one thread about a specific 2060 review with 65 comments (hardly a flood) and a couple of questions with 0 comments, there's only the PSA warning about 2060s not having USB-C VirtualLink output connectors, and that ranks higher than the 2060 review:
PSA: Almost All AIB RTX 2060 Don't Have USB-C Virtual Link Output
Submitted 17 hours ago by Nourdon
https://old.reddit.com/r/nvidia/comments/agbl6g/psa_almost_all_aib_rtx_2060_dont_have_usbc/
"I first noticed this while browsing the RTX 2060 listings on Newegg, but there seems to be a disturbing pattern of missing VirtualLink output on all of the RTX 2060 cards there.
I then looked into it further in the TechPowerUp GPU database for the RTX 2060 and found that only the 4 AIB models below support the USB-C VirtualLink output, out of the 40+ models available there. Even the highest-end models from ASUS, EVGA, and MSI don't have the VirtualLink output.
Here is the list of AIB models that have the VirtualLink output:
- Colorful iGame RTX 2060 Ultra OC
- Colorful iGame RTX 2060 Vulcan X OC
- GIGABYTE AORUS RTX 2060 XTREME
- ZOTAC RTX 2060 Extreme Plus OC6"
Even Nvidia's 20 Series Hardware Support Forum doesn't have any threads about the 2060; Nvidia didn't even start the typical new-hardware feedback thread:
And, as I noted in my previous post about 2060 pricing, the vast majority of the for-sale listings I posted earlier this morning are still showing up now; only two show "sold out" on Newegg, the highest- and lowest-priced SKUs. Compared against typical releases that sell out on the first day, the RTX 2060 is drawing the lowest interest for a new GPU release in recent memory.
Nothing but the sound of crickets... and even they aren't discussing the RTX 2060. -
https://videocardz.com/newz/rumor-nvidia-geforce-gtx-1660-ti-to-feature-1536-cuda-cores
Rumor: NVIDIA GeForce GTX 1660 Ti to feature 1536 CUDA cores -
Robbo99999 Notebook Prophet
-
Nvidia CEO Says GTX 10-Series Inventory Almost Depleted
https://www.tomshardware.com/news/nvidia-gtx-10-series-gpu-inventory-depleted,38455.html
What are the odds Nvidia does some huge price slashing on RTX after GTX 10 is gone? I think very high. Genius, really: leave RTX prices high and sell off the remaining 10 Series stock left over from the crypto hangover to make consumers think they're getting a "deal". Once stock is gone, slash prices on RTX and get back on track.
Nvidia won't suffer the same issue next year with 7nm as they are probably producing far fewer 20 Series 12nm chips. -
saturnotaku Notebook Nobel Laureate
*rimshot* -
Eliminating that huge $2B inventory in one month would be incredibly fast if true, but Jensen has sounded like this before when inventory numbers started rising, so IDK how far the inventory has really been depleted.
The 10 Series new prices were held high for so long, and the lower prices lasted only a short time - prices went back up after the RTX cards flopped, making them overpriced again. That's why I and so many others recommended buying 10 Series used, as used prices are still low - you have to shop patiently, but you can still find them.
I hope the 10 Series inventory is really greatly reduced and it does allow Nvidia to drop the RTX prices, but historically Nvidia has held strong on pricing even without mining buyers sucking up inventory.
My guess is Nvidia will lose money on RTX before it lowers prices, and unless someone else steps into Jensen's position, that's not going to change. -
IMO the only cards that should be considered at this point are:
RTX 2080 Ti for the high end consumer.
RTX 2070 if you get it at $499
RTX 2060 for $349-$375~ max
RX 570/580 for the low end consumer.
RTX 2080 is too expensive unless you get it under $700 on a sale. Vega 7 (Radeon VII) offering maybe GTX 1080 Ti performance at $700 is a joke, given its power consumption and lack of AI/DXR features, when a GTX 1080 Ti can be purchased used for far cheaper.
Both Vega 56 and 64 are quite easily beaten by the RTX 2060 in more than one way. -
Great news for consumers. A pro-consumer move from Nvidia that allows all of those with cheaper FreeSync monitors to enjoy adaptive sync with their Nvidia GPU. -
Nvidia's RTX NvEnc is beyond impressive... (GPU encoding explanation, x264 Medium Comparison) -
Nvidia also said they could only get 12 out of 400 FreeSync monitors to work 100% compatible with Gsync, which makes 97% of FreeSync monitors not fully Gsync-compatible. So unless you have one of those 12 makes/models of FreeSync monitor, you could / will experience glitches with Gsync.
AMD adjusts / tunes support to cover FreeSync function across all FreeSync monitors, whereas Nvidia seems to be drawing a hard line at published specs, which may be defensible but makes for a whole lot of unhappy users when their FreeSync monitors glitch under Gsync.
There are reports from people already, in fact most of the /r/Nvidia threads refer to Gsync compatibility with FreeSync:
Look at the titles, some are pretty funny: " FreeSync breaks my entire setup "gone sexual" *not clickbait*" => Nice how they get the name FreeSync right in a clickbait title, instead of "Gsync Compatible".
Nvidia CEO Says GTX 10-Series Inventory Almost Depleted
Submitted 3 hours ago by eric98k
https://www.reddit.com/r/nvidia/comments/agog5k/nvidia_ceo_says_gtx_10series_inventory_almost/
tamasmagyarhunor 12 points 3 hours ago
"********. I don't believe this. or maybe he's talking about nvidia FE cards :/"
twistr36O 5 points 3 hours ago
"That makes sense. 3rd party cards must still be a thing, in terms of stock."
ARabidGuineaPig 3 points 2 hours ago
"Sold my 1080ti for 550$. Not shabby imo for almost owning it two years"
TheWalkingDerp_ 1 point 2 hours ago
"Might also talk about GPU inventory and not cards."
That last one makes sense; the rumor was that Nvidia was forcing AIB partners to buy quantities of 10 Series GPUs along with their 20 Series GPU orders. If that's the case, there is still an inventory of GPUs out there ready to put into cards. Maybe if the RTX cards continue to fail, the AIB partners will make 10 Series cards (if Nvidia lets them) and that second wave of 10 Series GPUs will be more reasonably priced. -
3 of my friends with FreeSync 144Hz monitors (all different models) tested theirs yesterday and all reported 0 issues.
Seems to be working great for the majority of users so far. Reading through Reddit, a lot of users are reporting total success with their screens as well. -
-
Also, these improvements are available to all GPUs that have NVENC chips, which goes back several generations, so it's not an RTX exclusive. There is supposed to be improved display output hardware on the RTX cards, but the encoding doesn't use those RTX output improvements, only the decoding does, so you could encode on any NVENC-capable GPU and get the same results for uploading.
From the posted video:
-
If you open your eyes a bit wider you'll find that the 2060 is going for more than you are seeing: "RTX 2060 for $349-$375~ max".
Here's a new snippet from newegg's available inventory, there were more above $400 earlier today:
And, if you go elsewhere, there are more around and well above $400, here's one example, which is more indicative of what people will find outside newegg:
and many are not in stock until February:
But, on the other hand, there sure are a lot more 2060s out there still in stock near launch, so maybe Nvidia made more available at launch, thinking that their RTX BS would work on more people this time? Or nobody's buying them. -
I hope the Gsync compatibility works for most people; that wasn't my point. It was simply that Nvidia was pushing out the idea that hardly any monitors were going to work, based on their testing.
Glad to see it was just Nvidia fibbing again. Those rascals just can't seem to get it right; even when they do something good, they have to trash it.
Oh yeah, it's FreeSync, not Gsync - that's why Nvidia got confused and thought they needed to trash it, because it was AMD's thing, not theirs.
At least in Nvidia's dastardly mind they know the truth. Nvidia's strict practice of deception makes them do weird things, like trashing their own good work when being compatible with AMD's original invention, FreeSync.
Update: Nvidia only tested 400 monitors, with 12 working, so only 388 failed, which is a slightly better failure rate than I thought - 97% instead of 98.8%. Big improvement. -
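Incidentally, the math on Nvidia's own numbers is easy to sanity-check; a quick sketch (the 400-tested / 12-passed figures are the ones Nvidia published at CES):

```python
# Nvidia's reported "Gsync Compatible" validation numbers
tested = 400
passed = 12

failed = tested - passed                 # 388 monitors
fail_rate = 100 * failed / tested        # 97.0%
pass_rate = 100 * passed / tested        # 3.0%

print(f"{failed}/{tested} monitors failed validation: {fail_rate:.1f}% failure rate")
```

So 97% failed, not 98.8% - though both numbers make the same point.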
yrekabakery Notebook Virtuoso
-
Which is why I posted the chart he included with CC enabled, "It's a good time for streamers", covering all NVENC-assisted GPUs, not just RTX GPUs. Here it is again, showing streaming OBS improvements on all generations of NVENC GPUs, with "Works with any compatible GEFORCE" in green font:
Watch it again, and maybe turn on CC so you can catch the nuances of what he is saying. The quality "output" improvements are in the RTX GPUs, but the encoding is the same. He should have shown a Pascal or earlier card encoding to demonstrate this; perhaps another reviewer will do a side-by-side comparison. Improving performance allows you to increase quality settings on previous-generation GPUs, and improving quality improves encoding results.
Here's an ongoing discussion for OBS NVENC users:
NVENC Performance Improvements (Beta)
https://obsproject.com/forum/threads/nvenc-performance-improvements-beta.98950/
"The quality improvements you may have been hearing about will largely only be seen on Turing GPUs (RTX 20XX), but the performance improvements should be measurable on all GPUs that have NVENC (GTX 6XX and higher)."
Sunday at 9:06 PM #74
"I'd like to see some comparison videos to be posted."
I'd like to see the comparison between a 1080 Ti and a 2080 for NVENC encoding, with both at maximum quality settings, now that OBS performance is improved enough to allow this on non-RTX GPUs - that would show any improvements in the RTX NVENC hardware over previous generations. He did the test comparisons using only a 2080 as the test GPU, so he only showed, and you only saw, same-GPU comparisons.
He would have needed to include a non-RTX GPU, like a 1080 Ti, encoding against an RTX 2080 to fairly compare results and see if there are actual quality differences.
With the improvement in OBS performance allowing previous-generation GPUs to encode at maximum quality settings, I think the difference will be minimal.
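For an actual comparison, an objective metric beats eyeballing; PSNR over decoded frames is the simplest one. A minimal sketch with NumPy - the arrays here are random stand-ins for real frames decoded from a Pascal encode and a Turing encode, so the setup is illustrative, not a real benchmark:

```python
import numpy as np

def psnr(reference: np.ndarray, encoded: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two 8-bit frames (higher is better)."""
    mse = np.mean((reference.astype(np.float64) - encoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10 * np.log10(peak ** 2 / mse)

# Stand-in frames: a "source" frame and a lightly degraded "encoded" copy
rng = np.random.default_rng(0)
source = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
noise = rng.integers(-2, 3, size=source.shape)
encoded = np.clip(source.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"PSNR: {psnr(source, encoded):.1f} dB")
```

Run the same source clip through both encoders at the same bitrate, decode, and compare each result against the original; whichever averages a higher PSNR (or SSIM/VMAF, which track perception better) wins.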
From reading the posts in that thread, it seems the OBS software and Nvidia drivers are still "not quite there yet"; there are still lots of problem reports about stability and results.
Monday at 7:00 AM #75
"so will this fix my issue with obs not working well with my 2080 card? in obs i keep dropping fps and i can no longer stream anymore. i had a 1080 and everything was working great. my PC specs are 2080, 32gb of ddr4, i7 8700k not oc, 750x psu. all this happened when i upgraded my 1080 to the 2080. with some of my games i have to have vsync on."
Jan 9, 2019 #42
Trixz2007 said:
"Thank you and what about live streaming"
"the improvements are on the encoding, you will see it in streaming and recording. I can confirm that as i have already streamed well over 12 hours with this beta.
After this comment i will be streaming and testing out performance with streaming 3440x1440 60fps since both 1080p and 1440p standards have already shown stellar results its time to see how far it can be pushed." -
https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060/ -- Stock clocked "A" chip that can be overclocked just as easily.
https://www.newegg.com/Product/Prod...tx 2060&cm_re=rtx_2060-_-14-137-380-_-Product -- Overclocked dual fan aftermarket "A" chip.
https://www.newegg.com/Product/Prod...tx 2060&cm_re=rtx_2060-_-14-500-457-_-Product -- $369.99 gets you an 1800mhz overclocked card that will boost even further with GPU boost and manual overclocking. One of the best 2060s so far and matched a 1080 in performance when overclocked.
https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2060_AMP/35.html
https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2060_AMP/29.html -- Even beats out the mighty 1080 Ti in Wolfenstein II at 1080p.
For those that want a $349-$375 RTX 2060 because according to some they're impossible to find.... -
From Reddit..
"Just seems like they were working through the 400 monitors alphabetically and just said "**** it" when they hit BenQ."
-
saturnotaku Notebook Nobel Laureate
-
yrekabakery Notebook Virtuoso
Turing is a huge boon to gamers with mainstream Intel CPUs like the 8600K/9600K, 8700K/9700K, and 9900K. These happen to be the best CPUs for high-refresh-rate gaming, but they may not have the core/thread count necessary for good-quality x264 streaming in heavily multi-threaded AAA games at high frame rates without significantly affecting performance. Turing is also great for laptop streamers using power- and thermal-limited mobile CPUs, freeing up a large chunk of CPU processing power for the game itself and allowing the CPU to clock higher within the same TDP and temperature limits. -
-
Any additional improvement is going to be marginal at best, even seen side by side; it's going to take a direct comparison to spot the difference. I'd check maximum settings on your existing GPU and compare your old results against the maximum quality settings allowed by the software / driver update before spending money on an RTX GPU to get that "better" result.
I don't believe Nvidia on most everything, so I always recommend double-checking - verifying - before buying. -
-
So Nvidia's "Gsync Compatible" failed on 388 out of 400 monitors, which brings Nvidia's "Gsync" failure rate down to only 97%. Yeah, that helped.
So what, you work for Nvidia or something? Cause you sure seem to want to make sure to put out the company line on everything... does 'ole Leather Chaps help pay your bills?
AMD FreeSync Proposal Adopted by VESA – Will become a Standard for Display Port 1.2a
By Usman Pirzada, Apr 8, 2014
https://wccftech.com/amd-freesync-adpoted-vesa-standard-display-port-12a/
"We have received word that VESA has accepted AMD’s proposal and FreeSync will become a standard for Display Port 1.2a. FreeSync, which was brought to life to rival Nvidia’s G-Sync, will now be a much greater force to be reckoned with than it was before, when it had just a prototype to support it."
Display Port 1.2A to have FreeSync Standard – Proposal accepted by VESA
That's why Nvidia's "Gsync Compatible" copycat of AMD's FreeSync doesn't work over HDMI; that's still reserved for AMD hardware, as AMD is the originator of the standard, which VESA ratified for DisplayPort only.
You'll need to buy the real thing - AMD GPUs with FreeSync - to get the *FULL* FreeSync experience, including HDMI
AMD FreeSync™ Technology Over HDMI®
https://www.amd.com/Documents/freesync-hdmi.pdf -
yrekabakery Notebook Virtuoso
-
NVIDIA Has No Plans for Adaptive Sync Support on Maxwell, Prior GPUs
TechPowerUp.com | Yesterday, 16:30
In case anyone's been living under a rock (and in these times, if you can do that, I probably envy you), NVIDIA at CES 2019 announced it was opening up G-Sync support to non-G-Sync-toting monitors. Via adoption of VESA's open VRR standard (Adaptive Sync, on which FreeSync is based), the company will now add support for monitors that usually only support FreeSync. The company also vowed to test all configurations and monitors, with a whitelist of automatically-enabled panels and manual override for those that don't pass the certification process or still haven't been subjected to it.
Now, via a post on NVIDIA's GeForce forums, ManuelGuzmanNV, with a Customer Care badge, has said, in answer to a user's question on Variable Refresh Rate support for NVIDIA's 900 series, that "Sorry but we do not have plans to add support for Maxwell and below". So this means that only NVIDIA's 1000 and 2000 series of GPUs will be getting said support, thus reducing the number of users for whom VRR support on NVIDIA graphics cards is relevant. At the same time, this might serve as a reason for those customers to finally make the jump to one of NVIDIA's more recent graphics card generations, if they already own a VRR-capable monitor and want some of that smoothness.
-
Trust, but verify. - Ronald Reagan
Don't trust them bastards, verify. - me -
AMD Radeon™ FreeSync Technology
https://www.amd.com/en/technologies/free-sync
Nvidia can't even get "Gsync Compatibility" working on more than 3% of the monitors they tested - out of 400 FreeSync monitors, only 12 passed the Nvidia "Gsync" test - while 100% of FreeSync (VESA Adaptive-Sync) monitors are fully supported by AMD GPUs.
Oh, yeah, "Gsync Compatible" doesn't have an HDMI capability, only AMD has that:
AMD FreeSync™ Technology Over HDMI®
https://www.amd.com/Documents/freesync-hdmi.pdf -
yrekabakery Notebook Virtuoso
-
yrekabakery Notebook Virtuoso
-
Nvidia is worried enough to enable Gsync compatibility on their crappy GPUs so as to open up more of their market to owners of FreeSync monitors (FreeSync was invented by AMD and submitted to VESA for adoption).
Too bad Nvidia is too twisted to give credit where credit is due; they are riding on the success of AMD's FreeSync, where Gsync failed to gain wide adoption.
AMD and FreeSync are synonymous; that's why Nvidia is afraid to "say the words" and instead just wants to take the credit... -
So have you actually compared the output of NVENC-encoded files on Pascal vs. Turing side by side, or are you parroting Nvidia marketing BS?
Find an actual comparison of both Pascal and Turing running the latest driver and encoding / streaming software, and compare. That's the only way to know for sure.
That video only showed everything compared on the same 2080, not a Pascal GPU under the same updated software compared against the Turing GPU. -
yrekabakery Notebook Virtuoso
-
But I wasn't particularly talking about streaming; I was talking about encoding at the highest rates, whether you can stream them or not - and his comparisons were at the highest bit rate, so to be fair I suggested doing that on Pascal too. You added the streaming limitation.
IDC, either way is fine. At lower rates Pascal should have a better chance of matching Turing anyway; if you want to make it easier for Pascal to win - or should I say for Turing to lose - that's fine by me.
Let's wait for actual encoding comparisons between Pascal and Turing before we accept Nvidia's BS. I don't trust Nvidia, and you shouldn't either. -
yrekabakery Notebook Virtuoso
OBS is the most popular game streaming software. Almost that whole video of discussion was about Turing NvENC’s quality improvement for streaming, and now you’re moving the goalposts? Brilliant. -
The encoding comparisons in that video were done at the highest bit rate; have you caught up now? Good. -
Robbo99999 Notebook Prophet
-
Nvidia might release their GTX 11xx GPUs at the same price as the RTX GPUs - so as not to cannibalize RTX GPU sales. -
Robbo99999 Notebook Prophet
-
Robbo99999 Notebook Prophet
-
Astyanax Master Guru
" Only single displays are currently supported; multiple monitors can be connected but no more than one display should have G-SYNC enabled."
https://forums.guru3d.com/threads/g...iver-download-discussion.424859/#post-5627554
With AMD Radeon GPUs, FreeSync works on all connected FreeSync / Adaptive-Sync monitors simultaneously.
Maybe AMD will send help to Nvidia to give those guys a leg up and help 'em figure out how to make Gsync compatibility work right. I'm sure AMD would be happy to help, if Nvidia asked nicely.
Nvidia Thread
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.