Now there is an analysis I can get behind. But you missed a crucial factor for the difference as well: They used an Nvidia Founders Edition card, you are using the EVGA XC Ultra aftermarket design with a better heatsink on the card allowing for higher boost speeds at stock than the FE. That, too, likely contributed to the difference in performance you have seen versus their testing (in reality, that may be the most significant factor).
But addressing the driver, the game optimizations, OS optimizations, etc., is quite a potent argument that the score is invalid.
I do remember the testing you are talking about with the 8700K and 9900K, but isn't Far Cry 5 also one of the games that recently received updates for better core scaling? Granted, that is more on the CPU side and for processing of CPU items, but I don't know whether that may affect the score. So this point is entirely speculation and I may be outright wrong about its effect. But it rolls in as a subpoint under subsequent game optimizations, so it is a minor matter and would be addressed by retesting with the corrections we identified above.
-
https://www.pcgamesn.com/nvidia/nvidia-amd-q4-market-share
Oh well. -
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-203#post-10877854
The upshot is that even during a 10.7% overall downturn in GPU sales for the quarter, right when Nvidia launched their brand-new RTX Turing GPUs, AMD only lost 7% to Nvidia in *a single quarter*!
Check the numbers again: AMD was up quarter after quarter against Nvidia, and only lost market share during the period when Nvidia should have done much better, the best sales quarter of the year!
Nvidia is in trouble: even against AMD's weak competition, Nvidia is shooting itself in both feet, repeatedly.
Those sales and market share numbers reflect why Nvidia has lost 50% of its stock price and its biggest institutional investors; they've given up on Nvidia.
tl;dr Nvidia is in big trouble, AMD is doing well with nothing new on offer, Nvidia is going to continue to fail, AMD is going to continue to prevail. -
I'd rather not make assumptions about AMD or NV success or failure down the line.
However, you are correct that AMD lost a relatively small portion of market share during RTX launch and especially during the time when NV was expected to do better.
Hopefully, we will see the trend move further in AMD's favor once the RVII is included in the projections, and of course the upcoming Navi (both as an iGP and as a dGPU). Zen 2 should help matters too (and it's possible that AMD is releasing Navi alongside it exactly to capitalize on Zen's success).
It's more that the 'tone' of the article bothers me. These people (or bots, depending on who or what is writing) seem incapable of remaining objective and instead project extremes. -
Devil May Cry 5 is one of the 3 free games for new buyers of AMD Vega GPUs:
Devil May Cry 5 | DX11 vs DX12 1440p Maximum, SSAO | Radeon VII 9900k
giantmonkey101
Published on Mar 9, 2019
ReLive Game Capture @ up to 3.8% performance hit.
Final game captured video may exhibit Stutter due to ReLive Capture.
ReLive Game Capture @100Mbps/1440p 60FPS AVC, 192kbps Audio.
AMD Radeon VII 16GB UV/OC Wattman Settings
1926MHz Boost Clock @1061mV (work in progress, may change)
+200MHz HBM2
Power Limit 10%
Windows 10 Home 64 Bit (v1809)
ASUS Maximus XI Hero Wifi z390
ASUS Ryujin 360 AIO
i9-9900k@5100Mhz
GSkill TridentZ 2x8GB 3333Mhz 14 14 14 32 416
EVGA 1000W G2 Power Supply
Radeon Adrenalin Software 19.3.1
Default Radeon Global Settings
[4K] Devil May Cry 5 on AMD Radeon VII - Gameplay Benchmark Test
TechEpiphany
Published on Mar 9, 2019
AMD Radeon VII Test, Devil May Cry 5
Radeon VII all Tests: https://bit.ly/2TrKAbA
AMD Ryzen 7 2700X
MSI B350M Gaming Pro
2x8GB DDR4-3466
Powercolor Radeon VII
Driver: 19.3.1
Recorded with external capture device. No performance hit!
Tech Talk 5 - Quitting Reddit, New Zen II "Leaks" and OC'ing the Radeon VII!
AdoredTV
Published on Mar 8, 2019
Plus why I'll never be writing for Anandtech.
-
Radeon VII! Awesome; a 2080 in the (GTA V) streets; Creator Powerhouse w/Premiere feats
Level1Techs
Published on Mar 10, 2019
16 GB of Vram is potentially huge for content creators. No crashes in the driver stack even with complex projects involving RedGiant plugins.
Israel Lewis 8 hours ago
"AdoredTV mentioned and experienced the same stuttering in ROTR with the 2080 and it does seem to be a VRAM issue, but was flamed for it ofc ;P"
TheDarthTux 7 hours ago (edited)
"And yet the popular consensus is to say no games need more than 8GB of VRAM. Using DXVK, I have seen Resident Evil 2 yell at me for not having 13GB VRAM (though because Linux still doesn't recognize my Radeon VII properly) and in places Star Citizen 3.4.3 actually allocates over 11GB VRAM and uses over 9GB of VRAM at 4K mind you."
Carl Wells 6 hours ago
"Sooo it's REALLY good at Premiere! Funny how that is omitted in all the reviews considering how many people cut Twitch content and YouTube stuff. Just needs that underclock[ / undervolt] to bring those fans quieter!"
Googlar 20 minutes ago
"It's important to undervolt it and to take advantage of the features which help it run more efficiently to allow it to run more quietly. If I were in the market for a card in this price range, I would strongly prefer to have an even beefier cooler, such as a hybrid liquid cooler or just full on liquid cooler. I mean, how much more would it cost to have put a liquid cooler on it instead of this one? I guess another good option would just be to put it in a very large well ventilated case which would help to dampen the noise."
Dead or Alive 6 | Radeon VII 9900k 4k Maximum Settings 60 FPS Performance test
giantmonkey101
Published on Mar 10, 2019
ReLive Game Capture @ up to 3.8% performance hit.
Final game captured video may exhibit Stutter due to ReLive Capture.
ReLive Game Capture @100Mbps/1440p 60FPS AVC, 192kbps Audio.
AMD Radeon VII 16GB UV/OC Wattman Settings
1926MHz Boost Clock @1066mV (work in progress, may change)
+200MHz HBM2
Power Limit 10%
Windows 10 Home 64 Bit (v1809)
ASUS Maximus XI Hero Wifi z390
ASUS Ryujin 360 AIO
i9-9900k@5100Mhz
GSkill TridentZ 2x8GB 3333Mhz 14 14 14 32 416
EVGA 1000W G2 Power Supply
Radeon Adrenalin Software 19.3.1
Default Radeon Global Settings
-
I just got in the Matebook 14D. It blows the Celeron-based Nextbook out of the water. A superb laptop all in all.
-
Huawei Matebook D with AMD Ryzen Review - Value packed laptop
Lon.TV
Published on Oct 20, 2018
Huawei's Matebook D offers solid performance for its price point. The AMD Ryzen processor delivers adequate gaming performance too.
00:42 - Hardware Overview, Build Quality and Price
01:02 - Display
01:26 - CPU, RAM, Storage and Upgradeability
02:14 - Weight
02:19 - Ports
02:50 - Keyboard and Trackpad
03:53 - Huawei PC Manager Application
04:13 - Performance: Web Browsing
04:34 - YouTube 1080p 60FPS Video Playback
04:57 - Benchmarks: Speedometer
05:42 - Battery Life
06:31 - Benchmark: 3DMark Time Spy (DX 12)
07:24 - Benchmarks: 3DMark CloudGate
08:20 - Gaming: Fortnite
08:37 - Gaming: Rocket League
09:09 - Gaming: The Witcher 3
09:37 - Thermal Performance and Fan Noise
10:26 - Speaker Quality
10:45 - Kodi Performance
11:15 - Running Linux
11:44 - Final Thoughts
Huawei Matebook D 14 review
ModernDad
Published on Jan 7, 2019
Two surprising things about this Huawei Matebook D 14 Windows laptop (the review of which is sponsored by Huawei) — It's got Dolby Atmos on board, which I didn't really expect in a $600 midrange machine.
Huawei Laptop MateBook D 53010CRG AMD Ryzen 5 2500U (2.00 GHz) 8 GB Memory 256 GB SSD AMD Radeon Vega 8 14.0" Touchscreen Windows 10 Home 64-bit
Regular price:$629.99, Sale Price: $549.99 Save: $80.00 (13%)
https://www.newegg.com/Product/Product.aspx?Item=N82E16834324036
Gaming on AMD Ryzen 5 2500U Vega 8 Part 1. 20 Games Test. Ryzen Mobile Review
TechEpiphany
Published on Dec 1, 2017
Ryzen 5 2500U all Tests: http://bit.ly/2EOnm4p
00:01 - Destiny 2
03:12 - Titanfall 2
05:32 - Rise of the Tomb Raider
08:09 - Microsoft Flight Simulator X Steam Edition
10:51 - Project CARS
12:39 - The Elder Scrolls V: Skyrim
14:56 - Micro Machines World Series
16:56 - Redout
18:53 - The Crew
20:53 - Rainbow Six Siege
22:33 - Crysis 3
25:21 - Total War: Warhammer II
26:21 - Overwatch
29:56 - Battlefield 1 Multiplayer 64
32:40 - Fortnite
35:28 - Middle-earth: Shadow Of War
38:30 - Rocket League
40:55 - Dota 2
44:13 - Counter-strike: Global Offensive (CS:GO)
AMD Ryzen 5 2500U can also be found in:
Acer Swift 3 SF315-41
ASUS VivoBook 15 X505ZA
ASUS Laptop X570ZD
DELL Inspiron 15 5000 15 5575
HP ProBook 645 G4
HP EliteBook 735 G5
HP EliteBook 745 G5
HP EliteBook 755 G5
HP Envy x360 13
HP Envy x360 15
HP Pavilion 15-cw0003ng
HP 17-ca0305ng
HP 14-cm0202ng
Huawei The MateBook D
Lenovo Ideapad 720S
Lenovo Ideapad 330
Lenovo Yoga 530
Lenovo ThinkPad E485
Lenovo ThinkPad E585
AMD Ryzen 5 2500U Gaming Performance - Part 2 - 9 Games Test - Ryzen 5 2500U Review
TechEpiphany
Published on Dec 4, 2017
Ryzen 5 2500U all Tests: http://bit.ly/2EOnm4p
00:01 - Wolfenstein II
02:46 - Pro Evolution Soccer 2018
05:36 - Mad Max
08:43 - Metal Gear Solid V
14:39 - TrackMania Turbo
15:49 - CoD: Infinite Warfare
18:23 - Overwatch
20:23 - Assassin's Creed: Origins
22:33 - Far Cry Primal
AMD Ryzen 5 2500U Gaming Performance - Part 3 - 12 Games Test - Ryzen 5 2500U Review
TechEpiphany
Published on Dec 10, 2017
Ryzen 5 2500U all Tests: http://bit.ly/2EOnm4p
00:01 - Watch Dogs 2
02:12 - Deus Ex: Mankind Divided (DirectX 11)
04:13 - X-Plane 11
08:23 - The Surge
10:28 - Team Fortress 2
12:09 - Sniper: Ghost Warrior 3
13:40 - Prey
16:09 - Mortal Kombat XL
19:18 - Killer Instinct
21:04 - Fallout 4
24:25 - The Witcher 3
32:29 - Ashes of the Singularity: Escalation (DirectX 12)
AMD Ryzen 5 2500U Gaming Performance - Part 4 - 12 Games Test - Ryzen 5 2500U Review
TechEpiphany
Published on Dec 17, 2017
Ryzen 5 2500U all Tests: http://bit.ly/2EOnm4p
00:01 - The Division (DirectX 12)
02:08 - Dead Or Alive 5: Final Stand
04:15 - Payday 2
06:10 - Street Fighter V
07:53 - Dirty Bomb
10:01 - Path Of Exile
10:51 - FIFA 18
13:14 - Fly2K
15:25 - War Thunder
17:51 - Resident Evil 7
19:28 - Vanquish
20:56 - GTA V
-
Yes it is. It is not the brute-force workhorse the 7nm parts will be, but it is more than enough. Now that I have my 1850x I would not be doing video encoding etc. on the laptop anyway. For daily cruising, though, it is very fast. I was going to get a large SSD, but since there is no major productivity work I really do not need it.
-
AMD Ryzen 5 3550H Vega 8 - Dead or Alive 6 - Gameplay BenchmarkTest (New Bios)
TechEpiphany
Published on Mar 11, 2019
Dead or Alive 6, Ryzen 5 3550H Vega 8 iGPU
Ryzen 5 3550H all Tests: https://bit.ly/2EIswSk
ASUS TUF FX705DY
Bios 306
AMD Ryzen 5 3550H Vega 8 integrated Graphics
2x8GB DDR4-2400
Driver: 19.3.1
Almost all tests:
Recorded with external capture device. No performance hit! -
Still not the 7nm part. This, however, is more than enough for my usage. I also went for a refurb at $450.
-
Ryzen 5 2600 + Rx 580 | 4k Live Streaming YouTube & Twitch (AMD Relive) = Devil May Cry 5
Light Speed
Streamed live on Mar 11, 2019
Live Streaming 4k 60 FPS on Ryzen 5 2600 + Rx 580
Ryzen 2600 Live Streaming
Rx 580 Live Streaming
Ryzen 2600 1080p Live Streaming
amd relive live streaming
amd live streaming
worm 17 hours ago
"this is why devs gotta optimize more, for great stuff like this."
kahaneck 1 day ago
"how ?"
Light Speed 1 day ago
"Using the ReLive streaming option. This is insane. I'm able to stream at 4k 30 fps with very, very little impact on performance."
MKS_GONZALEZYT 20 hours ago
"And at 1080p?"
Light Speed 20 hours ago
@MKS_GONZALEZYT "I did video on 2k and 1080p on my channel check it out." -
Using the laptop now for a couple of days my only 2 issues are the screen size and trackpad control.
1. The screen at 100% scale does not work well when sitting back from the screen.
2. There is no simple Fn key to enable/disable the trackpad.
Otherwise it is as fast and smooth as any other laptop I have used so far. It feels at least as responsive as my old i7-3820 was, which also had a SATA SSD.
-
-
Huawei Matebook 14D, AMD 2500U.
-
Devil May Cry™ 5
AMD
Published on Mar 8, 2019
Prepare to get demonic—the Devil Hunters are back in the newest iteration of the over-the-top, stylish action series Devil May Cry™.
Get Devil May Cry™ 5 when you purchase an AMD Radeon graphics card with the Raise the Game Fully Loaded bundle: https://www.amd.com/en/where-to-buy/r...
*Offer available through participating retailers only. 18+ only. Following purchase, Coupon Code must be redeemed by May 6, 2019, after which coupon is void. Residency and additional limitations apply. Combination with any other promotion is prohibited. For full Terms & Conditions, visit www.amdrewards.com/terms.
Campaign period begins November 15, 2018 and ends April 6, 2019 or when supply of Coupon Codes is exhausted, whichever occurs first. Eligible AMD Product must be purchased during Campaign Period. Offer void where prohibited. -
Real-time ray-tracing reflections on AMD Radeon RX Vega 56:
NEON NOIR: Real-Time Ray Traced Reflections - Achieved With CRYENGINE
CRYENGINE
Published on Mar 15, 2019
Technology Reveal: Real-Time Ray Traced Reflections achieved with CRYENGINE. All scenes are rendered in real-time in-editor on an AMD Vega 56 GPU. Reflections are achieved with the new experimental ray tracing feature in CRYENGINE 5 - no SSR.
Neon Noir was developed on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.
CRYENGINE - The most powerful game development platform is now available to everyone. Full source code. 5% Royalties. No license fee. Join us over at http://www.cryengine.com
Subscribe: https://www.youtube.com/cryengine?sub...
Add us on Twitter: https://twitter.com/cryengine
Follow us on Facebook: https://www.facebook.com/cryengineoff...
Join the discussion on: https://forum.cryengine.com/ and Discord: https://discord.gg/cryengine
Browse our Marketplace: https://www.cryengine.com/marketplace
Crytek showcases real-time ray tracing in CRYENGINE with its Neon Noir tech demo
MARCH 15, 2019 JOHN PAPADOPOULOS 74 COMMENTS
https://www.dsogaming.com/news/cryt...ng-in-cryengine-with-its-neon-noir-tech-demo/ -
-
Ray-Tracing without a performance hit won't happen unless the RT methods replace the pre-existing methods from the start of development.
The tacked-on partial RT features add a substantial performance hit. Optimizing RT over previous methods will take time and high-performance optimized tools that still don't exist; RTX silicon isn't it.
Until then, RT / RTX is a high-cost dog and pony show to be avoided. Focus on game performance and let the developers do the shiny stuff in the most efficient way they can, to keep the gameplay fast and fluid.
-
Zen 2 will officially allow the Infinity Fabric ratio to be clocked 1:1 with the memory clock, instead of 1:2 like Zen and Zen+. That explains the doubled bandwidth, but if anyone wants to do the math to find the limit on latency reduction, let me know and I can provide data points for calculating the latency curve as the fabric speed increases. -
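A rough, illustrative sketch of why that latency reduction has a hard limit: only the fabric-cycle portion of total memory latency shrinks as the fabric clock rises, while the rest stays fixed. All constants below are made-up placeholders, not measured Zen 2 figures.

```python
# Hypothetical latency model: only the fabric portion scales with FCLK.
# Both constants are illustrative assumptions, not measured values.

FIXED_NS = 40.0       # core + DRAM latency independent of fabric clock (assumed)
FABRIC_CYCLES = 30    # fabric traversal cost in FCLK cycles (assumed)

def total_latency_ns(fclk_mhz: float) -> float:
    """Total load-to-use latency: fixed part + fabric cycles at FCLK."""
    return FIXED_NS + FABRIC_CYCLES / (fclk_mhz / 1000.0)

for fclk in (1200, 1467, 1600, 1800):
    print(f"FCLK {fclk:4d} MHz -> {total_latency_ns(fclk):.1f} ns")
# As FCLK grows without bound, latency only approaches FIXED_NS:
# that asymptote is the limit on latency reduction.
```

The curve flattens quickly, which is why doubling the fabric clock does not come close to halving latency.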
Zen 2 Analysis - A Different Perspective
AdoredTV
Published on Mar 17, 2019
What if AMD cherry picked the best scenarios and Zen 2 actually sucks?
If only AMD had talked to their engineers and pre-undervolted / tuned their Polaris / Vega GPUs, which can achieve great power efficiency when tuned, or really bad efficiency out of the box when left untuned at high power. Maximum-performance tuning isn't the same as tuning for optimized performance at optimal efficiency.
Hopefully the new automatic power / performance tuning in AMD's software portends a move to do this out of the box by default with Navi and future GPUs / CPUs.
-
In any case the tweaking/overclocking potential is nice, but if they are needlessly crippling the efficiency of their products by shoving higher-than-necessary voltages through them out of the box, then they deserve to be criticised. -
All the Intel CPUs need an undervolt (or voltage tuning) to run coolest with the best performance. Undervolting a laptop CPU by -100mV reduces temperature at all-core 100% load by 10C, and that's pretty consistent, if the chip will undervolt that far; some will undervolt more.
Same for GPUs, same for Pascal. If you've been paying attention, the Pascal "tuning" is undervolting to allow for higher clocks and performance.
AMD isn't any different. None of the vendors tune for best minimum-voltage performance on the fly, which is what is needed. AMD has added automatic voltage tuning, but it's not continuous or on the fly; maybe someday AMD and the other vendors will do that in hardware + software, though of course that would end overclocking on stock cooling.
The AMD RX 4xx / 5xx / Vega GPUs will all undervolt, and once set to a stable voltage, just like an undervolted CPU, will allow for higher clocks and lower power / temperature, with quieter fans too.
It's worth doing if you aren't doing it yourself already. -
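A back-of-the-envelope sketch of why a modest undervolt cuts heat so noticeably: to first order, dynamic power scales with V² at a fixed clock. This is a simplified model, and the 1.20V stock voltage below is an assumed example, not any specific chip's value.

```python
# First-order model: dynamic power ~ C * V^2 * f, so at a fixed clock a
# voltage drop cuts power by the square of the voltage ratio. Illustrative only.

def dynamic_power_ratio(v_stock: float, v_undervolt: float) -> float:
    """Fraction of stock dynamic power remaining after an undervolt."""
    return (v_undervolt / v_stock) ** 2

v_stock = 1.20           # assumed stock voltage, volts
v_uv = v_stock - 0.100   # the -100mV undervolt discussed above

ratio = dynamic_power_ratio(v_stock, v_uv)
print(f"-100mV at 1.20V stock leaves {ratio:.1%} of dynamic power")
```

At these assumed voltages that is roughly a 16% cut in dynamic power at the same clock, which lines up with the kind of temperature drop described above.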
My point is that since they ALL do it, perhaps there is a genuine reason behind it. There is definitely excess voltage, since they can all run stock speeds with an undervolt: Polaris, Vega, and Zen, as well as most Skylake-based Intel CPUs, take at least -100mV, and all 4 of my GP104Ms take -150mV to -175mV. Maybe it is for OC headroom, but worse stock power figures seem a strange tradeoff to make (especially considering overclocking is changing from "run higher clocks at the same voltage" to tweaking around the edges of power limits and boost multipliers/curves, with the internal algorithm in ultimate control of what it does).
-
To do it the "right way", the optimal delivery of an individually tuned CPU / GPU would cost more than simply cranking the voltage up to a value high enough that all CPUs / GPUs pass as stable goods out the door, power draw and thermal output be damned.
Each CPU / GPU / memory chip is unique in that it can be tuned for optimal voltage and performance at best efficiency, but since they are all different, it's an all-or-none proposition.
And spending time and effort on none costs a hell of a lot less than spending time tuning each individual CPU / GPU to an optimized, efficient level.
That's where boutique sellers, and places like NBR come in to help out.
If the vendors all shipped AI controlled fully tuned active optimizing products, we could all save a bunch of time and turn back to enjoying our laptops instead of spending time getting them to work optimally.
Maybe someday.
-
The problem usually lies in the fact that stable voltages are settled on early in silicon development. As the process matures, it does not seem that better-running voltages are characterized again. This can be seen with the 12nm CPUs, where the same cores were now running better in both power and clocks.
-
Google's Stadia Announcement at GDC 2019 in Under 14 Minutes
Engadget
Published on Mar 19, 2019
AMD, Linux, Vulkan, UNREAL Engine, Unity, Havok, Id Software, Multi-GPU support, Cross-Platform Play @ 04:00
Google’s Project Stream cloud gaming will use AMD Radeon Graphics
DEAN TAKAHASHI @DEANTAK JANUARY 9, 2019 10:15 AM
https://venturebeat.com/2019/01/09/googles-project-stream-cloud-gaming-will-use-amd-radeon-graphics/
"Lisa Su, CEO of Advanced Micro Devices, announced that Google’s Project Stream cloud gaming project is using AMD’s Radeon graphics.
Su announced AMD’s Radeon VII is coming in February, but she didn’t provide details on whether Google is using that. Late last year, Google announced that it was able to stream Ubisoft’s Assassin’s Creed: Odyssey, a demanding and new Triple-A game, from the cloud directly to gamers on a variety of client devices.
Su said that Google will use AMD Radeon graphics in the future as it tries to push the edge of computing forward via the cloud. Games that are played via the cloud can run in the data center. Video of the game is streamed to the user’s machine, where the images can be played in high resolution, regardless of the power of the user’s machine.
Su made the announcement at CES 2019, the big tech trade show in Las Vegas. But she did not elaborate on how the companies are collaborating.
The second-generation Vega-based graphics AMD Radeon VII chip uses a 7-nanometer manufacturing process to get a 27 percent to 62 percent boost on graphics-related benchmarks, Su said.
It has one terabyte per second of memory bandwidth, 25 percent faster performance at the same power, and 60 compute units that operate at up to 1.8 gigahertz.
“AMD appears to be making a lot of progress in cloud game streaming with its Google Project Stream announcements and Microsoft’s veiled commitment,” said Patrick Moorhead, analyst at Moor Insights and Strategy, in an email. “I will be digging into the levels of exclusivity of both these announcements. The cloud game streaming market is small now but will grow as the experience is improved and invested in by giants Microsoft and Google.”
There are many discussion threads on Reddit about this announcement...
AMD to partner with Google's game streaming platform as the sole provider of graphics power in the data center.
https://www.reddit.com/r/Amd/duplicates/b30a0u/amd_to_partner_with_googles_game_streaming/
Full Google Stadia Announcement:
Google GDC 2019 Gaming Announcement
Google
Streamed live 5 hours ago
Gather around as we unveil Google’s vision for the future of gaming at #GDC19.
Live 3/19 at 10AM PDT.
-
So, 1usmus has the first version of the Ryzen memory overclocking guide finished:
https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Tweaking_Overclocking_Guide/ -
-
That video showed that with Polaris, had clocks been left much lower, voltage would have been very low. But due to performance requirements, they needed to push the clocks further, blowing the efficiency curve.
So I don't expect much to be left on the table with their products, although I do expect a decent performance increase. ...
Sent from my SM-G900P using Tapatalk -
AMD's CEO was at Google's big streaming video-game unveiling, and it may hint big plans for the future (AMD, GOOG)
Jonathan Garber, Mar. 20, 2019, 11:06 AM
http://forum.notebookreview.com/threads/google-stadia.828080/#post-10883931 -
AMD Threadripper 1900X Is On Sale for $300
by Zhiye Liu March 20, 2019 at 8:56 AM
https://www.tomshardware.com/news/amd-ryzen-threadripper-1900x-deal-sale,38869.html
"The AMD Ryzen Threadripper 1900X, which originally debuted at $549 (~£415.67), can now be purchased for a mere $300 / £274.79.
The price cut comes as the U.S. chipmaker is clearing the path for its upcoming third-generation Threadripper chips that are slated to launch later this year.
The Threadripper 1900X is an entry-level Threadripper processor that sports eight cores, 16 threads and 20MB of combined cache (L2+L3).
The chip runs with a 3.8GHz base clock and a 4GHz boost clock. However, the processor can hit 4.2GHz with the help of AMD's Extended Frequency Range (XFR) feature.
For help seeing if this is the right CPU for you, check out our CPU buying guide. In addition, below is a chart detailing how the CPU's specs compare to those of the Intel Core i9-9900K and AMD Ryzen 7 2700X."
Motherboard Vendors Prep for AMD Ryzen 3000 With BIOS Updates
by Zhiye Liu March 20, 2019 at 11:20 AM
https://www.tomshardware.com/news/motherboard-bios-update-amd-ryzen-3000-cpus,38872.html
"...It's apparent that support for Zen 2 is coming on multiple fronts. HWiNFO, a popular system information and diagnostics utility, recently released the changelog for its forthcoming version 6.03 build 3710. Among the list of upcoming changes, there is mention of improved support for Zen 2.
A recent roadmap detailing AMD's plans for 2019 confirmed that the chipmaker will launch its Ryzen 3000-series chips at the middle of the year. Although no concrete date was given, speculation around the hardware world has been running wild. Some think that AMD will announce the new processors on May 1 to commemorate the company's 50th anniversary.
Others think that the chipmaker will announce them during Computex, which starts on May 28, and launch them shortly after. In one way or another, AMD is poised to shake up the mainstream processor market with the Ryzen 3000-series processors."
-
And with that speed of the CPU with the number of threads and 8GB+ of memory, it would be roughly enough to do a small to mid-size business worth of traffic switching, if you go by pfsense estimates.
Granted, we are still talking about $600-800 in NICs, $300 for the CPU, $120-240 on RAM, $350 for a MB, up to $70 on an SSD, whatever the daughter boards to split out the PCIe lanes cost, plus a power supply for $80 (unless getting a dual-redundant PSU for a server rack), a case, and $50-100 for a cheap GPU. It would be a beast of a router/switch for about $1700.
When they move to pcie 4.0, that will be better value.
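Summing the quoted ranges is a quick sanity check on the ~$1700 figure. The prices are copied from the post above; the case and the PCIe daughter boards are left out because no price was given for them.

```python
# Bill-of-materials check for the router/switch build described above.
# (low, high) USD ranges taken from the post; single prices use low == high.
parts = {
    "NICs":        (600, 800),
    "CPU":         (300, 300),
    "RAM":         (120, 240),
    "Motherboard": (350, 350),
    "SSD":         (0, 70),     # "up to $70"
    "PSU":         (80, 80),
    "Cheap GPU":   (50, 100),
}

low_total = sum(lo for lo, _ in parts.values())
high_total = sum(hi for _, hi in parts.values())
print(f"Total: ${low_total}-${high_total}, "
      f"midpoint ~${(low_total + high_total) // 2}")
# -> Total: $1500-$1940, midpoint ~$1720
```

That midpoint lands right around the quoted $1700 once a case and the daughter boards are added on the low end.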
Sent from my SM-G900P using Tapatalk
-
Polaris on GloFo nodes operates efficiently at around 1200MHz; past that point, you encounter much larger increases in power demand.
The RX 580, for instance, had higher base clocks than the 480, and of course the 590 really went to town with clocks.
And since 12nm was a minor revision of 14nm (still a low-power node), it wasn't really suited to raising clocks, only to dropping power consumption at existing clocks.
As for oc headroom on 7nm... depends.
I don't think we can use 14/12nm as guidelines because those were low power process nodes... on the other hand, with 7nm+ being mature enough for production, it may mean that plain 7nm high perf. node has reached better yields... and if that's the case, AMD might just use up all the OC headroom.
If AMD can do that, then I don't see the problem as it saves users the trouble for doing it themselves (besides, if AMD does that, it basically ensures the silicon can run at those frequencies for extended periods).
Yes, undervolting and overclocking is doable, and in that particular case, I'd use that... but not if you're going to push the CPU or GPU with higher voltages and clocks for extended periods.
Furthermore, overclocking only yields limited increases in performance.
Intel's overclocked CPUs ended up dying after running for prolonged periods of time... but this may have been due to how much the CPUs were overclocked and overvolted. -
-
It would also be nearly impossible to accurately segregate the CPUs that actually died from prolonged overclocking from those that were destroyed by the haphazard ignorance of an idiot at the keyboard. Big difference there, but again, I would expect the "all in" sample size to be so small as to be statistically irrelevant even before considering the true cause of the failure.
-
-
AMD Radeon™ Software for Developers GDC 2019 Updates
Blog Post created by alexander.blake-davies on Mar 19, 2019
https://community.amd.com/community...deon-software-for-developers-gdc-2019-updates
"Every advance in gaming, whether on consoles, PCs, or the cloud, is only as good as the games delivered on it. That is why you’ll find AMD’s strong commitment to game developers reflected in every move we make in the gaming world. It’s also why our Radeon™ Software team gets so excited for the annual Game Developers Conference (GDC). It's a one-of-a-kind opportunity to share our latest and greatest tools, tech, and support resources directly with the passionate developer community.
"The Game Developer Conference is first and foremost about supporting the development community with everything it needs to make the next wave of incredible gaming experiences possible. At this GDC, AMD is doubling down on its commitment to open standards-driven games development by providing developers with the tools they need to bring their AAA visions to life. Gaming is moving in exciting new directions, both on consoles and PCs, as well as in gaming driven from the cloud. AMD is out in front of these developments, and ready with the support developers need to make the most of these opportunities."
– Andrej Zdravkovic, Corporate Vice President for Software
AMD has long distinguished itself by maintaining a strong commitment to open standards. We continue to believe that the freedom and flexibility made possible by open standards fosters gaming innovation in ways that proprietary solutions cannot. We are very proud to support developers who share that vision with GPUOpen, our website for developers stocked with valuable software and tools based on open standards that they can use to help maximize the performance of their games. For example, dedicated tools like the Radeon™ GPU Profiler help accelerate development cycles and optimize games for next-gen APIs such as DirectX® 12 and Vulkan®.
There’s a lot more to share this GDC, with the following updates to our Radeon Software for Developers tools being released on GPUOpen and presented during the AMD-sponsored sessions at the conference tomorrow, Wednesday, March 20.
Radeon™ GPU Profiler 1.5
GDC 2019 Session: “AMD GPU Performance Revealed”, Room 3001, West Hall, Moscone Center
The Radeon™ GPU Profiler (RGP) is our low-level optimization tool for DirectX 12, Vulkan, and OpenCL™ that runs on Windows® and Linux® and provides detailed graphics and compute timing and occupancy information on AMD Radeon™ GPUs.
With RGP, developers can easily visualize precisely how their game or application is utilizing the GPU so they can fully optimize it for Radeon™ graphics cards, helping deliver a better experience to end-users using AMD.
At GDC we’ll be showing three new features that we are adding in the RGP 1.5 update. Instruction timing, available for all shader stages, lets you see instruction durations to find out what part of your program is “hot.” Shader ISA lets you see shader code in the pipeline state and user market display helps you better understand what the GPU is working on.
The Radeon GPU Profiler 1.5 update is coming soon with planned availability in April 2019.
LEARN MORE
Radeon™ GPU Analyzer 2.1
GDC 2019 Session: “AMD GPU Performance Revealed”, Room 3001, West Hall, Moscone Center
The Radeon™ GPU Analyzer (RGA) is our offline compiler and shader performance analysis tool for DirectX®, Vulkan, OpenGL®, and OpenCL that runs on Windows and Linux which also integrates into popular third-party developer tools like RenderDoc, Shader Playground, CodeXL, and Pyramid.
RGA improves developer efficiency by providing an integrated shader code editing, compilation, and analysis environment outside of a game engine that helps quickly validate different optimization strategies. As with the Radeon GPU Profiler, the optimizations RGA helps achieve can deliver better end-user experiences on Radeon graphics.
The updated 2.1 version of RGA being presented at GDC has a new GUI interface for Vulkan and OpenCL analysis and the ability to use the shader compiler directly from the installed Radeon Software driver – rather than the one included with the tool.
Radeon GPU Analyzer 2.1 will be available on March 20, 2019.
LEARN MORE
Microsoft® PIX AMD-specific GPU Data Support
GDC 2019 Session: “AMD GPU Performance Revealed”, Room 3001, West Hall, Moscone Center
Microsoft® PIX is the premier tool for debugging and analyzing DirectX 12 game performance on Windows® 10, and it has been updated for GDC 2019 so that developers who primarily use PIX for DX12 debugging and analysis can better optimize their games for Radeon graphics.
In addition to providing AMD GPU-specific per-command performance data and DirectX 12 stage wave occupancy data, the updated version of Microsoft PIX, 1903.12, now includes AMD GPU-specific high-frequency counter data.
The high frequency counter data is based on AMD streaming performance metrics (SPM) which are heavily relied upon by console developers to understand the precise performance impact of overlapping command execution on the GPU.
Microsoft PIX 1903.12 is available now.
LEARN MORE
OCAT (Open Capture and Analytics Tool) 1.4
GDC 2019 Session: “AMD GPU Performance Revealed”, Room 3001, West Hall, Moscone Center
OCAT is our lightweight open source capture and performance analytics tool with support for DirectX® 11, DirectX 12, and Vulkan. It runs in parallel with your game, sampling available DXGI presentation metrics to provide an accurate view of frame timing, pacing, and delivery.
OCAT can be used by developers to help discover and fix issues in frame pacing, and by end-users for detailed benchmarking of their system’s performance in DX12 and Vulkan games.
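To make the frame-pacing idea concrete, here is a minimal sketch (not OCAT's actual code) of how frame times and pacing statistics can be derived from a list of presentation timestamps, similar in spirit to what OCAT samples from DXGI present events. All names and the injected stutter are illustrative.

```python
import math

def frame_stats(present_times):
    """Compute frame-time statistics (in ms) from presentation timestamps (in seconds)."""
    deltas = sorted((b - a) * 1000.0 for a, b in zip(present_times, present_times[1:]))
    avg_ms = sum(deltas) / len(deltas)
    # 99th-percentile frame time: spikes here indicate poor frame pacing
    p99 = deltas[math.ceil(0.99 * len(deltas)) - 1]
    return {
        "avg_frame_ms": avg_ms,
        "avg_fps": 1000.0 / avg_ms,
        "p99_frame_ms": p99,
    }

# A 100-frame capture at a steady 60 FPS with one injected ~50 ms stutter frame
times = [i / 60.0 for i in range(100)]
times[50:] = [t + 1.0 / 30.0 for t in times[50:]]  # delay every frame after the hitch
stats = frame_stats(times)
```

Note how a single stutter barely moves the average FPS but dominates the 99th-percentile frame time, which is why percentile metrics matter for pacing analysis.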
For the OCAT 1.4 update, we’ve added an audible indicator that capturing is taking place, for when you don’t want to use the in-game overlay or when compatibility issues with certain titles prevent its use. We’ve also updated the in-game overlay to include a rolling frame time graph and to display which graphics API is currently in use. Lastly, a bar can be added to the overlay that changes color with each frame to aid post-process video analysis of the capture.
OCAT 1.4 will be available on March 20, 2019.
LEARN MORE
AMD Radeon FreeSync™ 2 HDR samples
GDC 2019 Session: “A Blend of GCN Optimisation and Color Processing”, Room 3001, West Hall, Moscone Center
AMD Radeon FreeSync™ 2 HDR 1,2 technology raises the bar for gaming displays, enabling an exceptional user experience when playing HDR games, movies, and other content.
To help game developers properly set up their games to work with Radeon FreeSync 2 HDR, AMD will post a series of in-depth technical blogs, starting at GDC 2019, along with sample code showing how to set up, activate, and render with Radeon FreeSync 2 HDR enabled on AMD Radeon GPUs with the required Radeon FreeSync 2 HDR display attached.
The first blog planned is an introduction to the series and covers color spaces, with follow-up blogs covering tone mapping and gamut mapping, and ending with how to use the provided sample code to enable Radeon FreeSync 2 HDR in your application.
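As a taste of what a tone-mapping blog covers, here is an illustrative sketch only, not AMD's sample code: a simple extended Reinhard tone-mapping operator, the kind of step used when compressing HDR scene luminance into a display's output range. The white-point value is an assumed example parameter.

```python
def reinhard_tonemap(luminance, white_point=4.0):
    """Map an HDR luminance value into [0, 1] using the extended Reinhard operator.

    Values at the chosen white point map exactly to 1.0 (full display brightness);
    brighter inputs compress smoothly instead of clipping.
    """
    l = luminance
    return (l * (1.0 + l / (white_point * white_point))) / (1.0 + l)

# Brighter HDR inputs compress toward the display's peak rather than clipping
for hdr in (0.1, 1.0, 4.0):
    print(f"{hdr:5.1f} -> {reinhard_tonemap(hdr):.3f}")
```

Real FreeSync 2 HDR rendering also involves querying the display's actual luminance range and gamut so the operator can target the panel directly, which is what the sample code series is meant to demonstrate.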
The Radeon FreeSync 2 HDR Color Spaces blog will be posted on March 20, 2019.
LEARN MORE
AMD TrueAudio Next (TAN)
GDC 2019 Session: “Powering Spatial Audio on GPUs Through Hardware, Software and Tools”, Room 3001, West Hall, Moscone Center
AMD TrueAudio Next (TAN) is our SDK for GPU-accelerated high-performance audio signal processing for realistic spatial audio, now supported in Steam® Audio. At GDC 2019, we are presenting with Valve to show how the latest Steam Audio Beta 17 released in February now supports dual real-time dedicated compute queues (RTQs) using TAN for accelerated audio convolution, and AMD Radeon™ Rays for accelerated real-time audio ray tracing.
This update to Steam Audio using GPU-accelerated TAN and Radeon Rays enables higher-order ambisonics spatialized audio for immersive and realistic audio environments with a low impact on the application’s Radeon graphics performance 3. This means developers can incorporate amazing GPU-accelerated spatial audio into their experiences without having to worry about compromising visual fidelity.
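The core operation TAN accelerates on the GPU's compute queues is audio convolution. As a hedged, minimal CPU sketch (not TAN's API), here is FFT-based convolution of a dry signal with a room impulse response; the signals and names are illustrative.

```python
import numpy as np

def convolve_reverb(dry, impulse_response):
    """Convolve a dry signal with a room impulse response via the FFT (fast convolution)."""
    n = len(dry) + len(impulse_response) - 1       # length of the full convolution
    nfft = 1 << (n - 1).bit_length()               # next power of two for the FFT
    spectrum = np.fft.rfft(dry, nfft) * np.fft.rfft(impulse_response, nfft)
    return np.fft.irfft(spectrum, nfft)[:n]        # trim the zero-padding

dry = np.array([1.0, 0.0, 0.0, 0.0])               # a single click
ir = np.array([1.0, 0.5, 0.25])                    # a decaying "room" echo
wet = convolve_reverb(dry, ir)                     # the click followed by its echoes
```

Real-time engines use partitioned convolution to bound latency, and TAN's contribution is running that work on dedicated real-time GPU compute queues so it doesn't contend with rendering.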
Steam Audio Beta 17 with dual RTQ TrueAudio Next and Radeon Rays support is available now.
LEARN MORE
Alexander Blake-Davies is a Software Product Marketing Specialist for Radeon™ Software for Developers at AMD’s Radeon Technology Group. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied. GD-5
Footnotes
1. FreeSync 2 HDR does not require HDR capable monitors; the driver can set the monitor in native mode when FreeSync 2 HDR supported HDR content is detected. Otherwise, HDR content requires that the system be configured with a fully HDR-ready content chain, including graphics card, graphics driver and application. Video content must be graded in HDR and viewed with an HDR-ready player. Windowed mode content requires operating system support. GD-105
2. Requires a monitor and AMD Radeon™ graphics, both with FreeSync support. See www.amd.com/freesync for complete details. Confirm capability with your system manufacturer before purchase. GD-127
3. For more information about AMD TrueAudio Next’s impact on Radeon graphics performance, please see https://steamcommunity.com/games/596420/announcements/detail/1743360777226556930
AMD News from GDC 2019
By News Guy, March 20, 2019
https://babeltechreviews.com/amd-news-from-gdc-2019/
Atari VCS Getting Its APU Upgraded to AMD Ryzen + Vega, Will Be Faster, Cooler and More Efficient
By Alessio Palumbo, Mar 19 2019
https://wccftech.com/atari-vcs-apu-upgraded-amd-ryzen-vega/
https://www.indiegogo.com/projects/atari-vcs-game-stream-connect-like-never-before#/updates/all -
AMD Ryzen 3000 "Zen 2" BIOS Analysis Reveals New Options for Overclocking & Tweaking Techpowerup.com | March 22, 2019
AMD will launch its 3rd generation Ryzen 3000 Socket AM4 desktop processors in 2019, with a product unveiling expected mid-year, likely on the sidelines of Computex 2019. AMD is keeping its promise of making these chips backwards compatible with existing Socket AM4 motherboards.
At CES 2019, AMD unveiled more technical details and a prototype of a 3rd generation Ryzen socket AM4 processor. The company confirmed that it will implement a multi-chip module (MCM) design even for their mainstream-desktop processor, in which it will use one or two 7 nm "Zen 2" CPU core chiplets, which talk to a 14 nm I/O controller die over Infinity Fabric. The two biggest components of the I/O die are the PCI-Express root complex and the all-important dual-channel DDR4 memory controller. We bring you never-before-reported details of this memory controller.
The downside of this approach is that the memory controller is no longer physically integrated with the processor cores. The 3rd generation Ryzen processors (and all other Zen 2 CPUs) hence have an "integrated-discrete" memory controller: physically located inside the processor package, but not on the same piece of silicon as the CPU cores. AMD isn't the first to come up with such a contraption. Intel's 1st generation Core "Clarkdale" processor took a similar route, with CPU cores on a 32 nm die and the memory controller plus an integrated GPU on a separate 45 nm die. -
Both Intel and AMD need smaller process technology for the die shrinks needed to implement these goals; otherwise there isn't enough physical space within current socket size limitations to break the die up into modules. -
The ignorance of tech progression just amazes me. Putting down the current process is like asking why Intel didn't offer the current 9900s at the introduction of the Core CPU family. As time goes on things will undoubtedly improve, so give it some time.....
-
Google Stadia - Gaming REVOLUTION
AdoredTV
Published on Mar 22, 2019
http://forum.notebookreview.com/threads/google-stadia.828080/page-2#post-10885213 -
The Smallest Ryzen Yet! $150 Asrock Mini Gaming PC
Hardware Unboxed
Published on Mar 23, 2019
-
Why Raja Koduri Left AMD & Intel's vision For GPUs | #IntelOdyssey
RedGamingTech
Published on Mar 22, 2019
During Intel's GDC Odyssey event, Raja Koduri took to the stage to explain why he decided to leave AMD for Intel, and Intel's current vision for GPU computing. Intel's Odyssey event went into very few technical details concerning the company's next-generation GPUs (we'll further cover the hardware based on the white paper soon); instead, Intel's desire was to explain the reasons they are involving themselves in the dedicated GPU space.
theexmann 1 day ago
"He didn't say specifically why he left AMD, but did say that Intel had 1000s of engineers which implies AMD doesn't. Ironic that he talks about security when it was announced a couple of weeks ago that Intel CPUs are now susceptible to yet another security flaw called Spoiler."
Digital Intellect 1 day ago
"This is just an incoherent ramble to justify that he left for the money. He wasn't chained to AMD; the manner of his departure was just so untidy. Also, him mentioning Intel and security in the same breath is outright ludicrous."
theexmann 1 day ago
"I can't believe that Raja was going on about how Intel respects the gamer/user and how they have pushed Intel technology forward. That's a load of crap. If that was really true Intel still wouldn't have been producing only 4 core processors and charging a small fortune for anything above 4 cores. Raja, it has always been AMD who listens to the gamer/user. It's AMD who brought the 6 and 8 core processors to the average game player and user at reasonable prices. Intel didn't do that! But they did have to respond to AMD's Ryzen. Raja, you should be ashamed of yourself for giving Intel credit where credit is NOT DUE!" -
It might take a few years for Intel dGPUs to establish themselves. Aside from enterprise users who used Xeon Phi Knights Landing and the like, which were very powerful in their time (I was dumbfounded by their performance in the HPC field), Intel axed those products and AMD/Nvidia took over. -
Sentry 2.0 - ITX Ryzen 2700 build -- From DR ZĄBER
Level1Techs
Published on Mar 24, 2019
Level1Techs Article: https://level1techs.com/article/dr-ząber-sentry-20-mini-itx-pc-build
-
Deal | AMD seemingly slashes prices for Zen+ performance Ryzen 7 CPUs ahead of Zen 2 launch
https://www.notebookcheck.net/AMD-s...en-7-CPUs-ahead-of-Zen-2-launch.415002.0.html
"Prices for second-generation AMD Ryzen CPUs have dropped, with US$50 being shaved off a performance-level Ryzen 7 2700 CPU on Newegg.
A number of CPUs based on the Zen+ microarchitecture have been discounted, leading to speculation that AMD is making space in its inventory for the upcoming Zen 2-based Ryzen chips."
AMD Slashing Ryzen Prices Ahead of Zen 2 Launch??
UFD Tech
Published on Mar 27, 2019
-
-
AMD Ryzen 5 3500U Vega 8 Benchmark Tests - 11 videos
TechEpiphany
Published on Mar 26, 2019
AMD Ryzen 5 3500U Vega 8 Test, Resident Evil 2
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.