I just see a major change coming, with lots of uncertainty, due to climate change and the impending collapse of the USD (not an absolute if things continue as they currently are, but a distinct possibility as countries de-dollarize, more of them recommend going back to the gold standard, and the petro-dollar wanes due to changes related to dealing with climate change).
Because of the political climate, the 2030-50 horizon is likely to bring large changes. We have idiotic oligarchs in the US, whose policies Trump is implementing without consideration of the ramifications, along with other idiots like Marco Rubio trying to prevent Huawei from bringing patent infringement cases in the US by banning any country or company on the national security list from access to US courts or the US International Trade Courts. That likely runs afoul of many WTO-related treaties, and we could see real harm to the international patent system - in the extreme, China simply invalidating and refusing to enforce US corporations' patents filed in China, opening a pandora's box of cheaply made goods built on publicly available information (after the implementation R&D occurs, of course).
So I hear ya on the not wanting to derail the thread (otherwise I would talk policy and politics a lot more).
But, to the disposable point: due to wage stagnation in the US, I do see right to repair gaining a bit of traction. (Wage stagnation is largely overlooked in economic analyses of the tech sector on when, why, and how consumers upgrade, along with why phones break more commonly and need to be replaced more often, taking more of the average citizen's capital to replace those devices instead of larger investments in computers. Those analyses also ignore the subsidization by telecoms, spreading the cost over the 2-year life cycle, versus the ability to get financing on certain computer products, etc.)
Because of this, I would not be surprised if, over the next decade or two, more people move to modular devices able to upgrade specific components, rather than forced disposability and obsolescence. Truth is, those are a factor of capitalism, and aspects of capitalism and neoliberalism are being rejected globally to varying degrees (the counterpoint is that fascism and fascistic principles and policies are also on the rise globally). So...
-
You're stuck with whatever you draw in the silicon lottery. A disgusting way to add hardware in notebooks. On top of that... with BGA parts the ODM's can shrink the laptop chassis down to near pancake size (see crApplebooks). In the end this means worse cooling.
Of course, thermal throttling issues have to do with whether socketed or soldered attachment is used in laptops. The two are very closely connected.
See... today's laptop physical size, driven by the use of BGA, will inevitably lead to thermal throttling. All this is well documented, and there is no cure for it outside of going back to socketed hardware.
With better process tech such as 3-7nm improving power efficiency, all we will get is even thinner models, and the same throttling problems will just continue. There is no good solution for the overheating/throttling mess. -
It is right now, because OEM's use inefficient cooling materials and methodology for thin/light chassis (or virtually any laptop).
For example... instead of using copper, they should be using synthetic diamond/graphene/carbon nanotube composite materials. Even though such composites could have been made for a while, along with different air cooling techniques, none of that was implemented (at least I know of no OEM's that did so), mainly due to thinking it would shoot their costs up to high levels (even though affordable mass production was/is doable, and the quantities of any of the 3 materials needed would be small - heck, why not make a composite that still uses copper as a basis but meshes CNT/graphene and synthetic diamond with it to vastly improve cooling?).
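To put rough numbers on why those materials are tempting, here's a back-of-the-envelope Fourier's-law sketch. The conductivities are approximate bulk values, and a real composite heatsink (contact resistances, anisotropy, etc.) would behave differently, so treat this as illustrative only:

```python
# Approximate bulk thermal conductivities in W/(m*K); ballpark figures only.
conductivity = {
    "copper": 400,
    "synthetic diamond": 2000,
    "CNT (along tube axis)": 3000,
}

def heat_flow_watts(k, area_m2=1e-4, thickness_m=2e-3, delta_t=40.0):
    """Fourier's law for a simple slab: Q = k * A * dT / L."""
    return k * area_m2 * delta_t / thickness_m

for name, k in conductivity.items():
    print(f"{name}: ~{heat_flow_watts(k):.0f} W through a 1 cm^2, 2 mm slab at dT = 40 K")
```

Same geometry, same temperature delta: the exotic materials move several times the heat copper can, which is the whole appeal.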
As for the thermal solder used... I have to agree that this could play a role in CPU operational temperatures. We've seen first hand with Intel how delidding to remove the inefficient thermal paste and replacing it with a better one dropped temperatures between 10 and 20 deg C, and similar thermal paste is used between the lid and heatpipe (OEM's couldn't be bothered to properly apply the thermal paste - they could have automated the process by now).
AMD Zen CPU's all have solder in place, which was demonstrated to be a superior approach to Intel's (the only ones not using solder were the Zen 1 APU's; the Zen+ parts onward, including Zen 2, use solder). So far, with the limited implementation of AMD Zen CPU's in laptops such as the GL702ZC and Helios 500, we haven't seen issues with the CPU's themselves (if anything, the GL702ZC issues were probably down to the overall cooling assembly)... and as for the laptops using Zen-series APU's (not referring to the Asus TUF series), again: poor cooling execution (and those didn't use solder).
Even if a CPU has solder as its thermal material, it's the cooling assembly that will make or break the thing.
OEM's need to vastly step up their game in that department.
Asus apparently started implementing liquid metal in their units... which is a small step forward, but they (and other OEM's) need to tailor the cooling to each unit/model.
You can't take the cooling approach from an Intel laptop and slap it into an AMD laptop, or vice versa. For that matter, you can't take Intel U-series cooling and apply it to an AMD U-series.
I don't care about 'cutting corners'... OEM's should be able to afford doing all this on the fairly cheap without cutting corners.
The cooling even for 15-25W APU's is atrociously inefficient.
As for better process tech at 3-7nm improving power efficiency... yes, but don't forget that the smaller the node, the higher the heat density, as you pack that many more transistors next to each other. One way of managing that is to clock the hw slightly lower (or find its 'sweet spot' on a given node) and make large leaps in IPC and uArch to increase performance.
Polaris, for example, was really efficient at around 1100 to 1200MHz... but AMD overshot the clocks and went way past the 'comfort zone' to improve performance (which also increased power draw)... plus it didn't help that they used a 14nm/12nm node from Samsung designed for low clocks and mobile parts, and that they never optimized the voltages out of the factory (this one really messed them up).
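A toy illustration of the density point (the numbers are made up, not real die figures): total power can drop on a shrink and the chip can still be harder to cool, because W/mm^2 is what the cooling assembly actually fights.

```python
# Hypothetical shrink: power falls ~30% but die area halves, so power
# *density* (W/mm^2) still rises and hotspots get harder to pull heat from.
def power_density(watts, area_mm2):
    return watts / area_mm2

old_node = power_density(45.0, 200.0)  # older node: 45 W over 200 mm^2
shrink = power_density(32.0, 100.0)    # new node: 32 W over 100 mm^2
print(f"old: {old_node:.3f} W/mm^2, shrink: {shrink:.3f} W/mm^2")  # 0.225 vs 0.320
```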
Anyway, cooling execution in laptops is down to OEM's...
If they have certain hw, they should really tune it individually to work as it should. -
In laptops, even if you get socketed hw such as CPU and even GPU, there is no guarantee you will be able to upgrade either one of them.
Mainly because CPU upgrades these days depend on OEM BIOS releases (and we've seen they tend to drop support for laptops about 6 months down the line), and even if they DO release a new BIOS, it's usually just some fixes for the OS or general improvements in performance - which would be nice if you ARE experiencing issues, except you can't really notice any changes... mostly.
Ask them about new CPU microcode integrations, and they will either ask you 'what are you talking about?' or just say flat out 'no'.
As for MXM gpu's... sadly, upgrading that is usually even more cost prohibitive and will (again) depend on BIOS support (except possibly in cases of motherboards specifically based off their desktop counterparts that identify the same on laptops - but since MXM is a different format, I doubt that modularity would be in place).
The prices of newer mobile GPU's on eBay alone can be staggeringly high and not worth what the sellers ask... in some cases, the price of that new GPU will end up being half the cost of what you paid for the laptop itself (if not exceeding the laptop's value).
So, while having a replaceable CPU and GPU is a neat idea... it's not worth anything unless you can actually upgrade those components later at an affordable price. As it stands right now, you're more likely to just replace the entire unit rather than the GPU (at least when it comes to laptops).
This is completely unacceptable.
Laptops should be designed with modularity in mind and easy/cheap (aka very affordable) access to components for repair/maintenance/upgrades.
You probably won't RMA the laptop just to clean the dust out and have the thermal paste re-applied if you know how to do this yourself... no, OEMs should give us unrestricted access for that (and provide completely unlocked BIOSes), because some people live in dustier regions where they need to clean their laptops out every half a year, for example. -
A short view of the Asian leg of the AMD Ryzen / Navi presentation, with a 3900x / 5700XT demo showing the new FidelityFX and Anti-Lag features, along with a list of game developers including them in their future games:
AMD is playing differently this time with Radeon RX 5700 XT
Gadget Pilipinas
Published on Jun 27, 2019
We attended the asian leg of AMD's Ryzen and Radeon Briefing in South Korea. Here's one of the things we've learned so far.
AMD Ryzen 3900X + 5700XT a little faster than Intel i9 9900K + RTX 2070 in the game World War Z. Today, AMD hosted a media briefing in Seoul, Korea. Air-cooled Ryzen, water-cooled Intel.
https://www.reddit.com/r/Amd/comments/c54hls/amd_ryzen_3900x_5700xt_a_little_faster_than_intel/
Benchmark 3900x + 5700XT/ 9900k + RTX 2070 Filtrado!
Wiki Geeks
Published on Jun 25, 2019
A benchmark of AMD Zen 2 with the Navi 5700XT vs the Intel i9 9900k with the Nvidia RTX 2070 leaked on reddit. The battle is in the game WWZ, under DX11, on a few machines from S. Korea. Who will win the battle in what remains of the year?
uzzi38 479 points 2 days ago
"For the record, World War Z doesn't scale well on multiple threads for DX11. So this is AMD showing that in terms of the GPU and single core CPU perf, AMD compete incredibly well. Course, we don't know much about the demo setup, and as such have no clue about RAM etc, so take what you want from these numbers."
Patient-Tech 42 points 1 day ago
"Looks like worst case, if AMD isn’t the absolute winner, they’re within a margin of error to match. At a price point that intel needs to be worried about. Glad AMD is doing well, intel was getting a bit greedy there..."
uzzi38 25 points 1 day ago
"...Okay, I thought I was being overly optimistic when I thought Ryzen 3000 might be on par with 9th Gen Intel for gaming a few months ago, but you my friend, you win."
AMD released a new video on twitter, it's not on YT yet:
https://twitter.com/AMD/status/1144040192325640192
AMD just released a trailer showcasing the Zen 2 launch. Only ~10 days to go, get hyped!
Submitted 9 hours ago by jedidude75
https://www.reddit.com/r/Amd/comments/c5z0lu/amd_just_released_a_trailer_showcasing_the_zen_2/
3rd Gen AMD Ryzen™ Technology
AMD
-
I would be pumped if the 3950x were being released but sadly not yet.
-
The good news is that there is a bunch of Ryzen 3xxx CPU SKU's releasing, including a new Ryzen 9 3900x 12-core / 24-thread SKU, and lots of people are pumped about those CPU's coming out on July 7th.
Here's some Q&A's from AMD guys giving the inside scoop on Ryzen 3 / Navi:
AMD talks Ryzen 3000 and Radeon 5700 series launch | The Full Nerd special edition
PCWorld
Streamed live 2 hours ago
Join The Full Nerd gang as they talk about the latest PC hardware topics. In today's show we are joined by two very special guests from AMD, Scott Herkelman and Robert Hallock, to talk about the impending release of Ryzen 3000 series CPU's and Radeon 5700 series GPU's! As always we will be answering your live questions so speak up in the chat.
The Bring Up - Extra Bytes: AMD Radeon™ Software Updates for Radeon RX 5700 Series
AMD
Published on Jun 27, 2019
We Bring Up: New features for the AMD Radeon™ Software Adrenalin 2019 Edition with Radeon™ Software Product Manager, Scott Wasson!
01:17 “Navi” and the meaning of a new GPU architecture
02:34 AMD Radeon™ Image Sharpening
05:45 AMD Radeon™ Anti-lag
09:22 Measuring click-to-response latency
10:40 Cards supporting new features
11:12 Sneak Peek at The Bring Up Episode 10
-
It seems they do plan to bump up core counts on Threadrippers. Now comes the wait until Q4-Q1 to find out more.
Also, did you see the leaked internal Intel doc confirming their problems on desktop? (Look up the Moore's Law Is Dead video from this afternoon, then find the doc.) -
I've been pointing this out clearly for many years, so for me it's not a surprise at all.
Eventually Intel was going to have to come out publicly (even if only to employees) with "bad news", but I think the news coming out of Intel is still too "positive"; there is still a long way to go. Acknowledging AMD is only the start. -
-
Are there any Intel customers remaining who haven't heard from their engineers and systems people to not renew their Intel contracts and stop buying Intel hardware?
The Intel hardware, firmware, and software mitigations required to be compliant with security requirements have been killing performance and eating up capacity, space, and power, and dealing with Intel takes so much of support people's time that it negatively affects customer projects directly and indirectly (some rough math on the capacity cost below).
Who wants to keep buying Intel server CPU's burdened with performance sapping and confidence killing vulnerability mitigations?
Who wants to overbuy hardware, rack space, power and cooling, and spend all of that extra time dealing with Intel's security unknowns - only to be caught out if another Intel vulnerability hits - sapping what's left of performance - causing another round of capacity re-designs?
Who wants to take on the responsibility of signing for millions of $ of Intel equipment with those knowns and unknowns hanging out there?
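To make the capacity point concrete, here's the arithmetic with hypothetical overhead percentages (assumed, not measured): even a modest per-box throughput loss compounds into real rack, power, and budget overhead, because you must add machines just to stand still.

```python
# If mitigations cost a fraction X of each server's throughput, holding
# total fleet capacity constant requires 1/(1-X) - 1 extra servers.
for overhead in (0.05, 0.15, 0.30):
    extra = 1.0 / (1.0 - overhead) - 1.0
    print(f"{overhead:.0%} perf loss -> {extra:.1%} more servers, racks, power")
```

And that extra fleet has to be re-derived every time a new vulnerability lands, which is the re-design churn described above.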
Here's that "internal memo" and AdoredTV's take on it:
Intel Internal Memo Reveals that even Intel is Impressed by AMD's Progress
https://www.techpowerup.com/256842/...that-even-intel-is-impressed-by-amds-progress
The Intel Challenge
AdoredTV
Published on Jun 28, 2019
A video analyzing Intel's bizarre internal leaked memo reveals their desktop and server game plans.
-
So, new info on Milan, which can translate into what to expect for Zen 3. Milan is due mid-2020. Here is a clip from Anandtech:
https://www.anandtech.com/show/14568/an-interview-with-amds-forrest-norrod-naples-rome-milan-genoa
What this means is that DDR5 will not be coming to AMD with Zen 3; instead look to Zen 4 (Genoa). It will take a new socket, which they refused to talk about in the PCWorld interview @hmscott posted yesterday or the day before. So there is a chance for one more year of socket compatibility before AM5 or another socket comes out (not a guarantee, but a possibility).
That also says TR3 and possibly TR4 will be compatible with current X399 boards, although there will surely be an updated PCIe 4.0 board lineup.
Interestingly, 2021 was expected to be the earliest consumers would see DDR5 integration. Whether AMD plans a full lineup from bottom to top that is compatible with DDR5 is yet to be seen. But I'll venture a guess they will either do a nightmarish (for the board engineers) dual-compatibility DDR4/5 socket, or they will just switch to DDR5, but not PCIe 5, in 2021 (server is easier to predict on this than HEDT or mainstream, but once we learn more about the I/O die, it may become clearer).
This doesn't address the rumors/desires to see AMD increase the number of memory channels for HEDT. Intel may have 6-channel memory on their entire HEDT Cascade-X lineup. If so, it will be interesting to see what AMD may counter with regarding memory channels and PCIe lanes moving forward. (My theory is TR moves closer to Epyc over time, but just a theory, as they could open up certain OCing MBs and certain speed optimized 1P server chips alongside TR for a couple hundred bucks more, but this is pure speculative desire, not rumor).
Let's hope, given how Epyc OCing has gone, that they don't close off the way to OC Rome or Milan.
Edit: and for those that didn't catch it in the PCWorld interview, it was said, either mistakenly or not, that going from one CCX to another requires going to the I/O die and back. They may have meant this as a mistake, but I believe what they have done is standardize all inter-CCX communications to the same exact latency, thereby combating stale data (you've seen my explanation of stale data related to the 32-core 2990WX, so no need to rehash; it is sprinkled through this thread, the Ryzen vs Intel thread, and other places on this and other forums). Previously the IF controller was on the core die; now it is on the I/O die (although there is still an IF connection on the core die for Zen 2). Because of this, it may act, even though invisible to the OS regarding NUMA nodes, as 4x 4-core chips tied together, which explains why the additional cache is SO important for Zen 2, along with them moving up that one item from Zen 3 (the TAGE branch predictor, I believe, but I'd have to go back through many articles and notes to confirm).
https://www.hexus.net/tech/news/cpu/131549-the-architecture-behind-amds-zen-2-ryzen-3000-cpus/
But seeing the performance of the 3950X, it shows that memory bandwidth WAS NOT the problem with the 2990WX! That is settled once and for all! Instead, between the scheduler (including the thread thrashing), the hardware controlling core speeds, the uneven latencies causing stale data (amplified by the scheduler problem), and the longer mem calls due to no direct memory access plus caches too small for the uArch (IMO; they were still large, and workloads that stayed in cache rather than making mem calls performed better, which is why Blender did so well on the architecture), we saw issues in some workloads. What the 3950X has shown us is that AMD's 32-core and 64-core HEDT chips will not have that performance issue, meaning the 32-core will bat against the 28-core overclockable Intel Xeon, while the 64-core AMD TR chip will just demolish any multithreaded workload in an unparalleled fashion, so long as the software can handle that number of cores! -
I'm extremely curious about the power efficiency and the frequency wall. With those I'll make up my mind on which 24/32-core to go for, or Zen 3.
Zen 3 still has its issues, being the last AM4 socket, DDR4, etc. Waiting forever is a pain. -
AMD's AM4 socket is supposed to be supported *through* 2020, so the working theory is that AMD will continue to support AM4 through all the Ryzen CPU's released for AM4 in the year 2020...
That would include Ryzen 4 / Zen 3.
The working assumption for DDR5 is an announcement in 2020 and shipping in 2021, and that would fit with the Ryzen 5 / Zen 4 release on a new AM5(?) socket.
Typically, at the transition point between sockets, AMD tries to give split support between the old socket and the new socket for one more CPU release. In this case the transition from DDR4 to DDR5 gives AMD the chance to support both in one generation of transitional CPU's.
If AMD does this for Ryzen 5 / Zen 4 that means you could use the new CPU on AM4 with DDR4 or buy a new socket motherboard - AM5? - and run the new CPU on AM5 with DDR5 memory.
It's possible, but not definite.
So there is at least one more release cycle of AM4 CPU's to come beyond the Ryzen 3 / Zen 2 CPU's coming out on 7/7/2019, with that next release being the Ryzen 4 / Zen 3 CPU's in 2020, on AM4, using DDR4. -
DDR5 sounds good and all, but new memory is always expensive and has crappy timings. Give me AM5 (the new socket name?) with DDR4 support, and when DDR5 somewhat matures I can just buy a mobo and swap. -
Why wait for 16c/32t => 32c/64t on Zen 4 - why not 32c/64t on Zen 3? If AMD waits till Zen 4, the core count increase would only be for 1 or 2 SKU's (a 24-core too), so AMD would need to feed owners of the other SKU's something special to justify a socket change; again, DDR5 is all that comes to mind.
I suppose simply adding more memory channels might be enough of a reason for a socket change, but why not add the memory architecture change to DDR5 to make it a real improvement? Usually the initial DDRx upgrade doesn't provide much of a performance improvement over the previous generation, but doubling (or more) the memory channels would put a DDR5 upgrade over the top.
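Rough peak-bandwidth math behind that thought (JEDEC-style transfer rates used as assumptions; real sustained bandwidth is lower): doubling channels is a bigger jump than the first DDR5 speed grades alone would give.

```python
# Peak theoretical bandwidth: channels * MT/s * 8 bytes per 64-bit channel.
def peak_gb_s(channels, mega_transfers):
    return channels * mega_transfers * 8 / 1000.0

print(peak_gb_s(2, 3200))  # dual-channel DDR4-3200: ~51.2 GB/s
print(peak_gb_s(4, 3200))  # quad-channel DDR4-3200: ~102.4 GB/s
print(peak_gb_s(2, 4800))  # dual-channel DDR5-4800: ~76.8 GB/s
```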
It also depends on when Intel looks like it would throw down DDR5 / PCIE 5 as AMD will want to beat Intel to market by getting either or both out there first.
Should be fun, but I think the point really is AMD has a long year+ ahead of it still. And even into the AM5 era and beyond, AM4 should sell well. -
In other words, from AMD's interviews: they have said DDR5 in 2021, and the PCIe standards take 18-24 months from final spec to implementation (putting it around 2021 or later), so Intel's claims are not changing AMD's plans, because that timeline for implementation is unrealistic.
Also DDR5 offers 30% extra performance at the same speeds, fwiw.
I just don't think MB MFRs, who were dragged kicking and screaming into PCIe 4.0, will be ready to go PCIe 5.0 until they recoup a bit. -
-
Would be nice to skip Zen 2 and upgrade to Zen 3 in my Helios 500.
Would probably get a nice frequency and core increase at same power draw (or at least a frequency increase at same power draw).
Now all I need is a BIOS update to actually support that (along with fast frequency RAM in quad configuration - or remove the original sticks and replace them with two high frequency 32GB RAM sticks). -
Rumor: Ryzen 3 pre-orders start July 1st... I can't find official mention anywhere, but July 1st is getting quoted often. Why have pre-orders only a week before release on 7/7? Maybe it is limited to some retailers / products?
The MSI Godlike x570 is listed on Newegg Canada - price in Loonies $1049 ($800 USD). I wonder why it's been listed without any other x570 showing?
I saw the X570 post yesterday and I decided to check it out, it's out of stock now
Submitted 3 hours ago by Batsinvic888
https://www.reddit.com/r/Amd/comments/c7n7hk/i_saw_the_x570_post_yesterday_and_i_decided_to/
PCPartPicker also recently added the official 3900X product box art
https://www.reddit.com/r/Amd/comments/c7kenv/pcpartpicker_also_recently_added_the_official/
A thing of beauty to behold...
Rumor: the Ryzen 3600 tops the single-threaded Passmark chart. With only 2 scores submitted to Passmark, the margin for error is high:
https://www.cpubenchmark.net/singleThread.html
https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+5+3600&id=3481
With a 36% improvement in single-threaded score over the Ryzen 2700x, this seems unlikely... it's only a 15% increase in overall score... but the Ryzen 2700x is 8c/16t vs the Ryzen 3600's 6c/12t...
https://www.cpubenchmark.net/compare/AMD-Ryzen-7-2700X-vs-AMD-Ryzen-5-3600/3238vs3481
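A quick back-of-envelope on why that combination looks fishy (pure arithmetic on the claimed percentages, not real benchmark data): a 15% higher overall score from a chip with 25% fewer cores implies a huge per-core jump.

```python
# If the 6-core 3600 really scores 15% higher overall than the 8-core 2700X,
# the implied per-core throughput gain is enormous for one generation:
per_core_gain = 1.15 * (8 / 6) - 1
print(f"implied per-core gain: {per_core_gain:.0%}")  # ~53%
```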
-
PassMark doesn't seem like a very valid bench to rely on.
36% increase in single thread?
That would be nice... although we might be forgetting something that could contribute at least part of the extra 21% differential (apart from the IPC).
Breaking it down:
15% for IPC.
10-15% for Infinity Fabric improvements (if used with high-frequency, low-latency RAM).
An additional 6-11% for single-core frequency boost (depending on how high the 3600 clocks on a single core)... but according to AMD's own website, the 3600's boost clock only goes up to 4.2GHz (that's 100MHz less than the 2700x... the 3600x, on the other hand, boosts to 4.4GHz)?
So, not sure where the extra 6-11% would come from to account for the 36% total difference... unless the chiplet design is reducing latency, or some other design feature we haven't accounted for (perhaps the difference between being run on X470 and X570 motherboards)?
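One nuance: these gains stack multiplicatively, not additively, so the gap left to explain is smaller than straight subtraction suggests. A quick sketch reusing this post's own rough figures (assumptions, not measurements):

```python
# With 15% from IPC alone, hitting a 36% total only needs ~18% more from
# all remaining sources (fabric, boost behavior, platform) combined.
claimed_total, ipc = 1.36, 1.15
remainder = claimed_total / ipc - 1
print(f"needed from non-IPC sources: {remainder:.0%}")  # ~18%
```

So a 10-15% fabric gain plus a little from anywhere else would already cover it.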
P.S. Could Infinity Fabric improvements cause more than a 15% increase in performance?
We've seen AMD do it with Zen+, but I'm not sure Zen 2 could have gotten a 21% improvement from IF alone... or could it have? -
Oh boy, looks like Ian won't be doing the in-depth review for Ryzen 3000. Kinda sux. Good thing we've still got Paul at Tom's to do it.
-
He said he'll work on it when he comes back. He seems happy to be out of it this time...
Dr. Ian Cutress Verified account @IanCutress
Personal tweet: Sorry everyone, I won't be doing our Ryzen 3000 review. Some personal things have come up, and I'm taking a couple of weeks off or more to deal with them. This recent Q2/Q3 hectic industry schedule is taking its toll on my health and various other parts of my life - 9:14 AM - 1 Jul 2019
Dr. Ian Cutress Verified account @IanCutress 7h7 hours ago
Gavin is going to be doing some testing - he has all my benchmarks, and my pre-1903 data for comparison, so AT will likely have something up for launch. But I won't be taking a look at the hardware until I'm back in the office. -
Hope he gets the time he needs to get whole again. -
This could be a good sign for Ryzen 3000 overclocking: Silicon Lottery now thinks they can bin and offer Ryzen 3000 CPU's, which suggests there is headroom for manual OC'ing beyond AMD's autotune, as there was with the Ryzen 1000 and 2000 series CPU's:
Silicon Lottery to Bin and Sell Ryzen 3000 CPUs
by Matthew Connatser June 30, 2019 at 10:24 AM
https://www.tomshardware.com/news/silicon-lottery-amd-ryzen-3000,39774.html
"...So why has the company decided to provide bins for the 3800X and 3900X? Although the chips are already soldered, meaning delidding is not only unnecessary but actually dangerous, Silicon Lottery can still bin these CPUs and sell them at a premium.
The company likely believes Ryzen 3000 will at least sell enough to make a profit. Or perhaps, if AMD has left a good deal of overclocking potential on the table with the 3800X and 3900X, and if Silicon Lottery knew that already, then it's a no-brainer to bin the 3800X and 3900X just like they do with Intel CPUs. The higher the clock speed, the bigger the margins, after all.
At the very least, Silicon Lottery's new Ryzen segment of binned CPUs shows something we've been seeing for a while over the past two years and especially within the past few months: AMD is gaining an incredible amount of traction within the market with both old partners like MSI and new companies (new to AMD, anyways) like Silicon Lottery.
Of course, we're all hoping this also means Ryzen 3000 will overclock very well, but we'll just have to wait and see about that." -
But, with the new chips on a new node, there can be more variance. Also, with higher IPC, even if the spread stays small, the difference in performance is larger, helping to justify it.
Finally, due to the excitement and anticipation surrounding the launch, it makes it less likely to get stuck with aging inventory. -
-
Update: So this might be true for the 3950X, as it turbos to 4.7GHz, but it doesn't appear to apply to the lower SKU's, which max out at about 4.3GHz on air (maybe rarely 4.4GHz); on LN2 the 3900x scales from 4.2GHz to 4.85GHz - see der8auer. -
Let's say a 4.7GHz boost comes with 4.55GHz all-core, even though leaks do not support that (and the 4.7GHz single-core boost is on the 16-core CPU, whereas the 8-cores get 4.4 and 4.5 for the 3700X and 3800X, respectively). With PB2, what they have is an extra 200MHz instead of the 100MHz of the Ryzen 2000 series. That means a potential single-core boost of 4.9GHz on the 3950X, 4.8GHz on the 3900X, 4.7GHz on the 3800X, and 4.6GHz on the 3700X.
We still do not have accurate final multi-core boost speeds.
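Putting those speculated numbers in one place (a sketch of the hypothetical +200MHz PB2 headroom; these are leak-based guesses, not confirmed clocks):

```python
# Rumored spec single-core boost (GHz) plus the speculated +200 MHz headroom.
spec_boost = {"3700X": 4.4, "3800X": 4.5, "3900X": 4.6, "3950X": 4.7}
for sku, ghz in spec_boost.items():
    print(f"{sku}: {ghz:.1f} GHz spec -> {ghz + 0.2:.1f} GHz potential")
```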
Now, if you remove the limits on the three variables playing into PBO (PPT/TDC/EDC), then it can boost until the temps wind up too high to control, or until other limits on the board are hit, or you destroy something. But the point is, if you change the limits, you can allow for higher boosts than even the 200MHz mentioned here.
Here is an explanation of OCing on PBO on Ryzen 2000:
"All core OC is the worst way to OC on Ryzen. Its best to enable PBO, core performance boost, leave Performance Enhancer to default, enable C-States and theres numerous other tweaks depending on your motherboard, but once you want to get more boost you just bump up the base clock while using the offset voltage adjustments to keep it stable but be mindful of the baseclock overclocking the PCIe as well so memory and GPU get overclocked along with it. Also, within windows just use High Performance power plan and go into advanced settings and change the "minimum processor" to something like 10% It will ultimately follow what your P States are set to anyway which is usually around 2200mhz idle unless you have hibernate or sleep enabled then it will follow those as well. PBO is the most simple and most effective way to get a good overclock on Ryzen. Manual all core OC limits you in every way in comparison. In reference to the PPT, TDC, and EDC limits, once you enable PBO the limits for those values will be implemented. If you still arent hitting the max its because either you are limited by voltage or thermally limited as PBO will monitor thermals and adjust clocks. Baseclock overclocking helps with that as PBO and all automated overclocking on Ryzen is limited to a 43.5x multiplier so to go above 4.35ghz base clock OC is necessary when using PBO or Performance Enhancer options if your mother board supports it." https://www.overclock.net/forum/11-...ster-edc-limit-100-cpu-load.html#post27750684
As soon as it needs BCLK, it's a bad idea with Zen, because on the first generations, going over ~104MHz switches you from PCIe 3.0 to 1.0. Even though this IS fixed in Ryzen 3000 according to AMD, you run the risk of corrupting NVMe drives, etc.
What they need to do is leave Precision Boost in place, but allow a more robust choice of multiplier to bring it up while keeping the different boost states. I have not played with Zen+, but I have seen ignorant people use beta BIOSes to attempt PBO overclocks on Zen 1 and destroy their 1950X (they were also running the board at 107MHz BCLK because of the above type of advice). Then you have P-state overclocking, which I abandoned quickly due to its implementation on the Asrock Taichi X399. It doesn't matter much for me, as I have 4.2 all-core, but when a single core can go higher and give better performance, finding a way to keep that while pushing up the all-core multiplier is desirable.
All I care about is that AMD doesn't restrict the boosting OCs of people with water (and, to a lesser degree, chilled water), as there is NO NEED to force them into BCLK OCing when we have the 25MHz divisors. Sure, it's a fun pastime for people that used to have to OC that way (including myself), but why create the need?
With a single core boosting to 4.6-4.9 on these chips, I'm just hoping they left the room for that plus maximizing the all core boost. As I said, I have not played with it, so am less versed in Zen+ OCing.
Either way, 3 days until we find out! -
Given the % improvements in AMD's presentation for the 5700XT/5700 against the 2060 and 2070, how do those "numbers" compare to the 2060 Super / 2070 Super? HardwareCanucks' take on the comparison, and on who should be worried:
RX 5700 vs RTX Super - Should AMD OR NVIDIA Be Worried?
HardwareCanucks
Published on Jul 3, 2019
The NVIDIA RTX 2070 Super and RTX 2060 Super will soon find themselves in benchmarks against the RX 5700 XT and RX 5700. But which will end up the winner? We ended up doing a quick and fun little experiment before the RX 5700 review by using AMD's own public performance numbers to see how things MIGHT line up. But we'll know how accurate these projections are in a few days.
Looks pretty close... hopefully AMD bumps up the whole shebang just before launch and passes Nvidia on the outside lane. -
AMD even said they can get results similar to DLSS by using the MS API without quality losses, while industry devs said RT can be done via general compute (and AMD even submitted a patent for hybrid raytracing: basically optimizing software and hardware together to produce raytracing results without the performance impact seen on NV and without needing dedicated hw; basically, open-source raytracing - heck, even Crytek demoed RT on a Vega 56).
But none of that matters, because reviewers apparently never mention these points and ALWAYS say 'AMD is lacking some new features'. I mean, come on. If you want to write an unbiased review, you need to include the information that DLSS and RT aren't really all that special and technically CAN be done efficiently (and well) on AMD gpu's - but no, you don't hear about that.
This is what I think will continue to damage AMD's reputation even with releasing new GPU's.
Already some people are using FFXV benchmarks as an indication that AMD 'failed' in the GPU arena... oh well. -
If you watched that AdoredTV video on the Supers: Nvidia has lost a lot of sales due to the Turing / RTX GPU's not selling well; Nvidia's gaming sales dropped by half and still haven't recovered. Nvidia is keeping prices high for the Supers, with a paltry % of performance increase, so their sales won't recover.
That's where AMD should fill the gap, with enthusiasts building new Ryzen and Threadripper builds, needing a matching Navi GPU. Who wants a go nowhere failed RTX / DLSS GPU to bring down the cool of your build? Get an AMD GPU to match your new AMD Ryzen CPU.
Nvidia's gaming sales tanked to half of their late-2018 high and still aren't recovering... (note the chart's FY dates are 1 yr too high).
Raytracing on AMD hardware will be a big seller in 2H 2020; for now AMD only needs to deliver to market the best performance / cost models against Nvidia, which will take tuning along the way with new higher / lower performance models and price matching Nvidia to keep AMD Radeon sales going and margins up. -
AMD should have tried to hit a lower price point, which is what I've said since prices were announced. Now, AMD can lower prices from here, but generally their cards will likely just be competing head to head, depending on task/game.
Now, I don't think RT and DLSS were ready for prime time, and those features don't matter to me. But that means I'd look at performance on the best performing API. With the Super cards, that is going to make the AMD cards a harder sell on price/performance. We will find out more in a couple days.
Sure, some people will choose on brand loyalty or doing an all-AMD build, but I was critical of AMD GPUs being pushed with the AMD desktop CPUs in the two such builds to date (Asus and Acer). In the same way, I repeat that criticism for picking graphics here. -
" ...for now AMD only needs to deliver to market the best performance / cost models against Nvidia, which will take tuning along the way with new higher / lower performance models and price matching Nvidia to keep AMD Radeon sales going and margin's up."
That means AMD only needs to maintain price / performance competitiveness as best they can against Nvidia's offerings. AMD doesn't need to, nor should they, lowball their pricing - AMD doesn't need to give away competitive-performance GPU's to "beat" Nvidia. AMD can now drop their price as needed to match price / performance moving forward.
There is always going to be a range of performance differences, as some games perform better on one brand or the other, and if an article / review skews the results with tests favoring one brand, the result will be skewed - to prove a point or sell a brand.
I don't need to be a slave to performance any more than I need to be a slave to a brand, or to technologies that exist for no particular useful purpose. RTX / DLSS as they exist are useless to me, so they count against a buying decision instead of for it.
If I bought an RTX / DLSS card I would be wasting time trying to make use of it, like a fool who wasted a ton of $ on useless BS, looking for news of new games featuring RTX / DLSS only to wait in vain for months upon months with nothing to show for my time and $ investment.
AMD can't provide top performance in every test and game at every price point against Nvidia, and neither can Nvidia, but you can get "close enough" and then decide to support / reward one brand or the other and buy what you want, instead of blindly buying to a feature or to individual performance differences.
Besides, things change over time: drivers update, games update, and your experience tuning your hardware and software improves, getting better results.
Pick what you want that fits your build; that applies to all the parts that go into a build. If there are AMD GPU's that fit your budget and your build, then you don't have to buy an Nvidia GPU that costs more and is burdened with "must have" features that do nothing useful. Get the AMD GPU instead and enjoy it, and learn to enjoy ignoring the Nvidia BS. -
Rumor: Did AMD pull a fast one on Nvidia after all?...or is the joke on us?
https://twitter.com/sherkelman/status/1146827961162711045
https://videocardz.com/newz/amd-to-lower-the-price-of-radeon-rx-5700-series-ahead-of-launch
https://www.reddit.com/r/Amd/comments/c9djx8/amd_to_lower_the_price_of_radeon_rx_5700_series/
ryandtw 36 points 2 hours ago
- RX 5700 XT 50th AE - $449
- RX 5700 XT - $399
- RX 5700 - $349
-
As such, I have a solid recommendation, as it currently stands: purchase used Pascal cards only, at least down to 1060/580 levels of performance, where the pick usually switches to the 580 on pricing. Right above that, it depends on the used Pascal price versus the 1660. Anything above that, used Pascal if available. Make them both hurt for price gouging the market. -
-
Nvidia had a pop in sales during the first four weeks after launch, primarily because no one had the cards in hand before then. After that, sales plummeted. Why? No new performance, no reduction in price, and extra features that were unusable.
Because there was no new performance, but pricing remained roughly the same to slightly higher - with the 2080 Ti costing about 70% more than the prior-gen Ti for 35% more performance - it was expected to be a POO show.
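That bilking claim in one line of arithmetic (using the rough 70% / 35% figures above as given):

```python
# ~70% more money for ~35% more performance: cost per unit of performance rises.
price_ratio, perf_ratio = 1.70, 1.35
print(f"{price_ratio / perf_ratio:.2f}x cost per unit of performance")  # ~1.26x
```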
Now, what does AMD do? They jump into the fray at the same price points. So: no real change in performance, same pricing, all dating back to what, 2015 or 2016? That is called stagnation. It should not be rewarded on either side.
Now, I'm aware of how low AMD's margins are on the GPU side. The crypto bust made that painfully apparent: with less demand and fewer GPU sales for crypto, while CPU demand went up during the holiday, it showed that the CPUs have much more margin and that the GPUs were dragging down AMD's margins.
Even with that, it doesn't excuse stagnation in the market from both companies. One was behind, the other bilking. Now that AMD is roughly catching up, instead of undercutting to gain market share, they want to share in the bilking. Hence, screw them both, in regards to GPUs.
Hell, it was obvious from the start that Nvidia was trying to slide the 2070 onto a lower-deck GPU die. But even with that corrected, it doesn't get at how they have moved a mid-range GPU to be perceived as high end.
Now, you can commend AMD for flushing out the Super series and responding with price cuts before launch. Strategy-wise, that is fine, EXCEPT that it shows consumers are worthless in their equation. Expendable. By not coming in low and hot, and instead testing the waters, it shows that had Nvidia not done this, AMD would have been content with those higher prices, further cementing the HORRIBLE situation we have on the GPU side. Consumers should be livid! Both sides need a BIG rejection!
EDIT: Also, that is the problem with consumers: we ALLOWED prices to go up. See, there is a simple equation. All you have to do is look at whether the reduced number of sales at the higher price point is made up for by the higher margin on the product. If the lower sales numbers with the higher margin are more profitable, or equally profitable, then you go with lower sales at higher prices. PERIOD. Why? Because you buy fewer wafers, use less fab time, and expend fewer resources overall. Hence, so long as consumers buy at the higher price, we are all in a bad situation, all because of the actions of a few. Consumers, to a degree, need to stand consolidated in order to force innovation. If companies find out they don't have to go as far to get consumers' money, they will not do so. They want our resources: CASH! If we don't make them earn it, we lose!
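That "simple equation" sketched with made-up numbers (both scenarios are hypothetical; only the comparison matters):

```python
# Compare total profit: fewer units at a higher margin vs more units at a lower one.
def profit(units, price, unit_cost):
    return units * (price - unit_cost)

high_price = profit(100_000, 700, 400)  # 30,000,000
low_price = profit(160_000, 550, 400)   # 24,000,000
print(high_price >= low_price)  # True: the high price wins, so prices stay high
```

Whenever the first line wins, the vendor also ships fewer wafers and burns less fab time, which is exactly why high prices stick once buyers accept them.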
Edit 2: So, let's look at the rumors and leaks from BEFORE release and where the current products slide into AMD's stack. Nvidia has been bilking consumers for a long while now, so just accept that I already have an issue in that regard.
Granted, internal pricing models are always subject to change. Prices can change on bill of materials, yields, etc. Yes, AMD had to do a whole other tape-out on the product (not cheap). We are getting these 6 months after the originally planned release. But the pricing should also be compared to what they are putting out into the market.
The 5700 exists to salvage the defective dies, slotting between what the leaks called the Navi 12 3070 and the Navi 12 3060. That's fine. But then look at the pricing that was considered. What we got was the 40-compute-unit part as the 5700 XT at $450, with the binned ones costing $500. We can assume that AdoredTV has decent contacts, having had decent information on other products. If that is the case, the price for the 3070, now the 5700 XT, went from closer to $200 to $450.
Now, granted, the 40 compute units turned out to give closer to 2070 performance, albeit with larger power consumption (AdoredTV mentioned in his second video that it was running hot and needed more power, so we get 225W). That places it where the Navi 10 3080 was rumored to slide in on price and performance, at a rumored $250, so call that $200 over rumor. Even with the price drop to $400, that leaves $150 between the two, while consuming as much as a rumored Navi 20 die. But as mentioned, it runs hotter and draws more power than they estimated, and we also do not know how far beyond the efficiency curve they pushed the die.
Had they dropped the 5700 at $300 initially, the 5700 XT at $350, and the SE at $400, I likely wouldn't have said anything, because that would have been moving pricing down relative to the performance we got, making the heat and power consumption more than fine, even while being about $100 more than expected. A little chafed, but I think that would have been a good sign to the market that they were going to make Nvidia truly come to a war on the GPU side. Instead, they did as I described above, trying to maximize profits, and signaled to the market that the higher pricing was here to stay. Sure, I acknowledged their strategy. But their strategy had nothing about making consumers feel special and everything to do with chasing money, while cementing in the higher costs. What that brings when Nvidia drops Ampere next year on 7nm and AMD drops Navi 20 is maintained-to-growing prices, not lower ones. Granted, with 7nm Nvidia should have performance gains this time, but they would price accordingly. AMD had a chance to cudgel them and chose instead to embrace their greed, much as Anakin embraced the Emperor. And attempting redemption after a betrayal takes a greater act.
This is why, generally speaking, I say both companies need to be made to feel pain. Buy old and used cards where practicable, in solidarity, so that neither gets money from the Supers or the RX 5700/XT/SE. Give a big middle finger to both. If you can grab a used 1080 Ti for $435-550, compare that performance to the 2070/Super/5700 XT/SE. Just grab the used card and say it is enough. Time for this to end. -
Both are priced below the Vega 56 ($399) / 64 ($499) release pricing while outperforming both.
Are the prices as low as the rumors? No, but then again the SKU's aren't the same as rumored; there are fewer of them, and it looks like AMD kept the top SKU's, which would be the most expensive. This is where the rumors don't do the release reality justice.
The AIB boards might be more expensive, but the same goes for the Nvidia AIB boards; and given the AMD Navi GPU's are coming out for less than the Nvidia Super GPU's, the AMD AIB boards should also come out cheaper than the Nvidia AIB boards.
I think you are being a bit over sensitive about the pricing.
So far I've seen a few leaked benchmarks that put the 5700XT at or around 1080ti performance, and that Pascal GPU is going used for more than the MSRP of the 5700XT, without a warranty:
https://www.ebay.com/sch/i.html?_from=R40&_trksid=m570.l1313&_nkw=1080ti&_sacat=0
We need to wait for benchmarks for all 3 sku's to see where they place in the performance spectrum, but I don't think pricing is going to be a problem. -
I'm not simply doing a comparative competitive analysis. I'm looking at how Nvidia marched up pricing and at AMD deciding to go along with it and bilk consumers.
I told people not to buy the 2000 series from Nvidia, and considering the performance/price on the AMD cards, I am saying the same thing.
That is called consistency and principled analysis, and the $50 price cut still isn't enough, as I said. $100 would have changed things. $50 changes nothing beyond a couple of consumers choosing AMD over Nvidia. And I don't care which company it is: if it is overpriced, it is overpriced.
By buying used, neither company gets new revenue. No new revenue leads to price cuts and to figuring out how to get that performance into the price envelope the consumer wants, thereby promoting innovation. If you want innovation, buy used and make their bottom lines suffer! -
AMD fed Nvidia high prices for their new Navi GPU's to get them to go in high with the Super GPU's, so AMD could drop their prices before release and gain the higher ground, better price / value. We don't have an official price change yet for the new Navi's...
You aren't going to get anyone to not buy a new AMD or Nvidia GPU, but you can recommend one over the other.
Trying to convince someone to buy used when they are ready to buy new isn't likely to work these days, unless they already have thought of it themselves. There is too much concern over mining GPU's, and most people don't realize mining usage puts less load on the GPU than gaming - when detuned for lower power consumption to increase profit margin.
I think you might have gone a bit over the edge, but I get the motivation; it's just not realistic. -
2) AMD pulling that strategy doesn't prevent Nvidia from doing their own price cuts. But there is no effort to create a true pricing war. Instead, all they are trying to do is undercut enough to be the better value than Nvidia for some consumers, even while lacking RT (which is still unusable, but is at least in the cards on the Nvidia side). Because they are not trying to force Nvidia into a pricing war, they used consumers as something to dangle in front while still charging too high a price for what is being offered. Value is NOT set only by examining the market equivalent and the price that market participant set. It is also set by the consumer, i.e., whether the consumer thinks the price per performance is good enough. I think it fails on the latter, and customers should sit this out until next year, when Intel brings more pressure to the party, creating a better situation for consumers, with all producers on 7nm, whether TSMC's or Samsung's node, or whatever it is Intel will be using.
3) You would be surprised. By showing that you get the same performance from older hardware that is marked down because it's used, you get people to pay less for the performance in the interim, and you educate them on the value of collective bargaining. Instead of just going for whatever is shoved down your throat, you can select, to a degree, what you want.
4) As you just admitted, and as many YouTubers have covered, as long as cards weren't allowed to seize on their cooling during crypto, many are in good shape. A year-old used card isn't that much of an ask when considering the benefit of pressuring multi-national corps to pull their heads out of their bums. Graphics cards are not insulin. You do not have to buy them to survive.
As to whether you think it is realistic, that doesn't matter. Nvidia's sales say consumers are smarter than the corps give them credit for. In fact, the 1080 Ti still pulling $500 on eBay PROVES that the new cards' lack of innovation has artificially kept that price higher than it would otherwise be. This is what, three years after release? The 980 Ti was already below $400, often at $300, by the time the 1080 Ti was released. So here's hoping enough people say enough is enough of this BS.
EDIT: Here is the point. Before the 20 series, this is how the 20 series would have been priced:
2080 Ti = $700
2080 Super = $500
2080 = would not have been built
2070 Super = $350-400
2070 = 2060 = $250-280
2060 Super = not built
2060 = not built or 2050 Ti
Now, AMD is trying to get us to pay $400 for where the 2070 sits. Look at that list: not where the 2070 Super would be, just the 2070. In other words, Nvidia and AMD are both gouging consumers and still marching prices up more than they should be. Consumers need to stop this crap and refuse to buy these cards.
Above, I put $350 for the 5700XT as high but said I likely wouldn't have commented at that level. You would have that performance at around the $300 mark where it should have been, but maybe there is something in the bill of materials of the cards that made it need the extra $50 at launch, or it was the re-tape. In any regard, it SHOULD NOT BE $400.
So, if I am saying Nvidia's prices are too high and should have been at those levels, I cannot say buy AMD GPUs at their current pricing either. I am very clear about what I am saying.
I also said the AMD CPUs are priced higher than they should be, all to accommodate moving them to a 50% margin. Still, even with that, I gave them a buy recommendation. Why? Because even with the price being a bit high by my calculations, they are still a good value. That is NOT true on the GPU side! Instead, their actions on the GPU side help keep prices high in that market, marching upward for low-mid-range and mid-range cards.
I even made the comment that AMD's CPU pricing did what I thought could not be done: it made Intel's 8-core pricing at $500 seem more reasonable. That shows the pricing is too high on their products. But it also got Intel to cut pricing, showing even Intel knew it was unreasonable. Just, Intel has nowhere to go on desktop. Nvidia just did a refresh using the chips they should have been using for the numbering scheme from the beginning, gave 7-15% more performance depending on the card, and kept pricing at the overly inflated rates, meaning they screwed everyone that bought the 20 series before this release, and they are still screwing consumers on pricing. AMD doing just a little better than that on pricing, while still inflating prices from where they should be, will NOT get applause or a seal of approval from me. -
There are too many unknowns, so let's wait until the benchmarks place the 5700 / 5700XT / 5700XT AE in their performance tiers vs the Nvidia Supers, and until the AMD Navi release pricing firms up.
Here's an example of what I mean about comparing performance vs price against the Supers and the previous generation, starting with these leaked results at AMD's reduced prices - this is a good review of the results and pricing comparison:
Navi Prices LOWERED, FIRST Navi Review Leaked!
Gamer Meld
Published on Jul 5, 2019
AMD lowers Navi's pricing in response to RTX Super, and the first RX 5700 and 5700XT review leaks!
Of course we need to wait for AMD release drivers (these results are on pre-release drivers) and release prices to have a good comparison. -
It's just that they decided to drop the price down another $50, I think.
Question becomes, how low could AMD go with the prices and still retain profitability?
7nm GPU's aren't cheap to produce, so while people will happily justify NV charging more for its features (which AMD can also deliver via open source anyway), they will say AMD has no grounds to charge as much money as they did (despite AMD charging less to begin with). -
So what's the problem? I'd expect excited, cheerful appreciation rather than the ungrateful non-appreciation some are expressing.
If you are getting an AMD CPU - or even an Intel CPU (why?) - why not get an AMD GPU? AMD's GPU is affordable, fairly priced for its performance against Nvidia, so why not reward AMD for their hard work and buy their GPU's?
It looks like the 5700XT OC'd might reach the Radeon VII performance for $250+ less, so IDK what all the consternation over AMD pricing is about.
Nvidia has overcharged everyone for years; AMD releases a new-architecture GPU on a new 7nm process, and all of a sudden some are penny-wise and pound-foolish, looking at a good thing with a blind eye toward common sense.
I'm not ready for a new GPU, so I don't have to decide for myself, but if I did need a 5700 series range performance GPU, they currently look like good choices to me.
Of course we need to see how they perform in games, applications, streaming, video, noise, and power / thermals. Little things like that, you know, like waiting for it to actually happen before complaining about it? -
TimmyJoe seems to have jumped the gun; he probably thought the embargo lifted on East Coast time, as he uploaded at 3am Pacific... 6am Eastern... anyway, here's the first review uploaded "after the embargo lifted":
AMD Ryzen 3700x Review, 9900k for $329?!
Timmy Joe PC Tech
Published on Jul 7, 2019
The AMD Ryzen 3700x 8 core 16 thread CPU is a leap ahead for AMD in performance but can it beat the ultimate gaming CPU?
Multitechnopark (now Benchmark PC Tech) usually posts a bunch of head-to-head comparisons and finally a combined graph of all their tests, so I'll post this early one, which compares the 3700x vs the 9900k favorably - looks like a "dead heat" - and the combined one later when it finally shows up.
Ryzen 7 3700X vs I9 9900K Benchmarks | Test Review | Comparison | Gaming | 13 Tests
Benchmark PC Tech
Published on Jul 6, 2019
AMD Ryzen 7 3700X vs I9 9900K BENCHMARK Included tests: Hitman, Total War, Tomb Raider, Far Cry | 7-ZIP | Truecrypt | X264 | Playerunknown's battlegrounds PUBG | FAR CRY 5 | WITCHER 3 4K| GTX 1080, GTX 1080 Ti, RTX 2080 Ti 1080p, 1440p and 4K test included – Gaming and Productivity | Encoding | Transcoding | Compression Tests Review on Windows 10
-
It's gonna take some time for reviewers / owners to get used to tuning the new CPU's / GPU's and to get the most out of them. Likely some AMD driver updates too, along with perhaps firmware updates. It's always that way with AMD; first impressions aren't showing their best. -
Review dump time
Gamers Nexus mentioned in their R5 3600 review that their 3900X OC'ed to 4.3GHz.
Another review of the 3900X with a 4.3GHz all-core OC:
Delid/temp info showing the 4.3/4.4/4.5GHz OC ceiling (der8auer mentions he's tested over a dozen individual CPUs):
Some Navi:
https://www.youtube.com/watch?v=EgOKi2Okn2g
and if you want your reviews with ASMR-voice...
-
Nvidia just recently stopped using the blower because they wanted to sell a premium GPU - to the detriment of their AIB partners.
As expected, the 5700/5700XT are still price/performance winners. The Navis beat the 2060/2070 and get close enough to 2060S/2070S performance at a lower price.
Even better, the Navis don't have coil whine, while the 2060S did - at least in this review, their 2060S sample had coil whine:
AMD RX 5700 XT & RX 5700 Review - Benchmarks Are Finally Here!
HardwareCanucks
Published on Jul 7, 2019
Our Radeon RX 5700 XT and RX 5700 review with gaming performance and benchmarks is finally here! There's a lot to tackle with final pricing, power consumption and specs along with how well it does in gaming, value and how Navi performance compares with the RTX 2070 Super and RTX 2060 Super.
So far Navi looks to be exactly as AMD advertised, and with the added price drop it's even better.
As always buying a used 1080ti for $400 or lower will also work:
https://www.ebay.com/sch/i.html?_from=R40&_nkw=nvidia 1080ti&_sacat=0&_udlo=300&_udhi=600&_sop=15&LH_BIN=1&rt=nc
Linus: AMD is ahead of Intel??
I had given up on AMD… until today - Ryzen 9 3900X & Ryzen 7 3700X Review
Linus Tech Tips
Published on Jul 7, 2019
AMD is launching ALL of their third-gen Ryzen CPUs today – We’ve got our hands on two of the best to see if they really can take the fight to Intel’s doorstep: Gaming.
Radeon VII is taken out by the 5700XT in many tests, making the Radeon VII's $650 price point obsolete when you can replace it with a $449/$499 5700XT/5700XT AE GPU. I wonder if the Radeon VII will now get a nice price drop as well?
Navi Takes The GPU Fight Right To NVIDIA's Wallet
BPS Customs
Published on Jul 7, 2019
Check out the review of the RX 5700 and RX 5700XT graphics cards from AMD!
AMD's BACK! Radeon RX 5700 & RX 5700 XT Review
Techtesters
Published on Jul 7, 2019
For the record, I didn't sleep for a week. So if anything is unclear.. ask!
Steve says: it's fine as a GPU, but buy it with a better AIB cooler, or buy it and swap on a water block. The driver needs overclocking fixed, and we get funny colors at install before rebooting. Yup, those are the kind of frank and obvious conclusions we have come to expect from GamersNexus:
AMD Radeon RX 5700 XT Review: Thermals, Noise, Gaming, & Broken Drivers
Gamers Nexus
Published on Jul 7, 2019
In this video, we're reviewing and benchmarking the new AMD RX 5700 XT Navi GPU. The RX 5700 XT is most directly compared versus the NVIDIA RTX 2070, 2060 Super, and 2070 Super cards, falling between all 3 in price. We'll review the AMD RX 5700 separately. The 5700 XT has gone through our bench for thermals, frequency, noise, broken drivers (again), and build quality, the last of which will be looked at in a separate video.
Nice graphs / review with stock 9900k vs 5.0ghz 9900k vs stock 3700x / 4.3ghz + 2700x:
Should You Buy the Ryzen 7 3700X for Gaming?
Science Studio
Published on Jul 7, 2019
AMD's Zen 2 launch is here! In this video, we assess the viability of the 3700X for gamers in light of Intel's current 9900K "king." Three CPUs are compared: the 2700X, 3700X, and 9900K. My decision to forego the 9700K's inclusion was a direct result of the 3700X's performance (which exceeded my expectations).
It looks like the Navi's do great multi-stream - outperforming the 2080ti / V100, it's new and there is an updated plugin to download for OBS:
FASTER THAN 2080ti?! - AMD Radeon RX 5700 & 5700XT for Streamers & Content Creators (AMF UPGRADE?!)
EposVox
Published on Jul 7, 2019
AMD Zen 2 architecture is here! And thanks to Wendell at Level1Techs, I got a chance to get my hands on the Ryzen 7 3700X and Ryzen 9 3900X and put them to the test for content creation, video production, and live streaming - and even look at Navi! In the last video we took a look at the CPUs for video production & content creation, in this video we're looking at the Navi GPUs - the Radeon RX 5700 and Radeon RX 5700XT and how the new AMD AMF/VCE encoder for OBS and FFMPEG works... or doesn't. And we talk about Anti-Lag and CAS (content adaptive sharpening) or Radeon Sharpening. I eat my words!
►► AMD BEATS EVERYTHING! (Including themselves...) AMD Ryzen 3700x & 3900X Content Creation Performance
►► BUDGET STREAMING ON A WHOLE NEW LEVEL | 3700X + B450 + RX 5700 = Production & RENDERING BEAST
So far, no 3800x reviews... and no AMD 5700XT 50th AE reviews either...