Radeon VII: Fast & Stable Premiere Editing (part 2 of 2)
Level1Techs
Published on Mar 4, 2019
EposVox's Channel: https://www.youtube.com/eposvox
Faster Adobe Premiere: Edit PC & Render PC -- Networked Rendering (Part 1 of 2)
Level1Techs
Published on Feb 28, 2019
TLDW: https://forum.level1techs.com/t/adobe...
-
abaddon4180 Notebook Virtuoso
So, after servicing my loop and changing things around (cleaning the CPU block, which got gunked up with dye that fell out, etc.), my 1950X is now running 4.2GHz all-core at 1.36875V. It doesn't go above 68°C in most benchmarks (the exceptions are Sandra Financial, which hits 69°C, and the Blender benchmark suite, which can push it up there), and the VRMs don't go above 84°C peak (usually in the 70s or lower for most benches).
So, 17 minutes and 50.46 seconds. As you can see, the sensors for the VRM go wonky sometimes on this board. I do not know if it is the sensor itself or the software polling at issue.
The CPU-Z bench is 10423.2 MT and 488.2 ST.
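As a side note, the multi-thread scaling those CPU-Z scores imply can be checked with a quick sketch (the 16-core count is the 1950X's spec; the per-core-uplift interpretation is my own, not anything CPU-Z reports):

```python
# Rough scaling check for the CPU-Z scores above. The 1950X is a
# 16-core/32-thread part; a multi/single ratio above the core count
# reflects the extra throughput SMT adds on top of the physical cores.
st_score = 488.2
mt_score = 10423.2
cores = 16

scaling = mt_score / st_score   # overall multi-thread scaling factor (~21.35x)
per_core = scaling / cores      # average uplift per physical core (~1.33x)

print(f"scaling: {scaling:.2f}x over 1T, {per_core:.2f}x per core")
```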
I got rid of the bench OSes, so this isn't optimized, and it uses the HPET timer in Win 10, which can impact performance in some testing.
Either way, with the upcoming Ryzen 3000 chips hitting the high-4GHz range and potentially 5GHz, it clearly shows that HWBot may need to waive some of the timer-related issues (if they aren't resolved), or relax restrictions to allow submissions from systems that cannot reduce the BCLK below 100MHz but do have the option of disabling spread spectrum. Otherwise, AMD's market share gains may make it so that people stop using HWBot because of those restrictions. TBH, that is why I got rid of my bench OSes: Windows 7 is too insecure at this point and loses support altogether next year. So if I cannot submit my benches to them, why bench? Honestly, it isn't worth buying Intel and paying the premium just for a hobby.
Last edited: Mar 5, 2019 -
-
Meanwhile, considering Su's comment on AdoredTV's Twitter a while back thanking him for following the company so closely, I'd be willing to suggest his leak reflects their targets (not final clocks).
Either way, if they get anywhere close to those cores at those speeds and prices, it will be a "red wedding" type scenario.
Also, on the rumor mill, did you see this one:
-
I have a foot injury now, limiting me to the living room and a small laptop. I have been using my 3-year-old 11.6" netbook with a Z8300 and it is excruciatingly slow. I am not sure how long I'm stuck in the chair, either.
I can't normally get to the 1800X, so it's time for a new laptop. I have ordered the Huawei MateBook D 14: a 2500U with 8GB RAM and a 256GB SSD. I figure this will be an improvement, and now I'll be on AMD.
Last edited: Mar 5, 2019 -
abaddon4180 Notebook Virtuoso
-
I am only planning an upgrade to a 1TB NVMe drive. The stock 256GB drive is just a SATA SSD and not all that fast. Then M$ Office and dual boot to Linux.
I am surprised there are not more AMD laptop offerings out there. I guess that will change once 7nm gets to mobile in a year or two. -
Also, if that 8-core that was demoed was 65W TDP, it would make a hell of a gaming laptop in the DTR space, most likely.
-
abaddon4180 Notebook Virtuoso
Mobile Zen+ looks pretty good.
https://www.notebookcheck.net/Our-f...-i5-8300H-a-run-for-their-money.412698.0.html -
12nm, though. AMD needs 7nm to start really hitting Intel where it hurts. As far as RAM goes, 8GB is fine for the next couple of years, until I can get hold of a 7nm mobile chip.
Last edited: Mar 5, 2019 -
And I cannot be sure, but AdoredTV might have indicated that the Zen 2 8c/16t that matched/slightly exceeded the i9 9900K under those conditions did so at half the power draw.
So, technically, at the time of the demo, the Zen 2 might have been at 45W... but if AMD is targeting existing TDP envelopes, and considering the fact the clocks hadn't been finalized... we could see 65W TDP parts.
Though, if AMD offers a Zen 2 at 45W with the same performance as the i9 9900K, it would be excellent for both the desktop and mobile space... but 65W works too, as Asus and Acer both used the 1700 and 2700 in their productivity/gaming laptops. Although, it wouldn't be a bad idea to also include an 8c/16t Zen 2 at 65W with a Navi iGP (for better battery life on the go). -
What was shown was an Intel 9900K running at 4.7GHz. It's already been shown that, to stay within its rated TDP, the 9900K would need to be clocked at around 4.2-4.3GHz.
What he explained was that the wattages shown in the demo were total system draw at the wall. You then have to estimate the PSU's inefficiency and what goes to the RAM, the graphics card, and the motherboard (including chipset).
After all of that, he showed the 8-core was likely a 65W chip, with the system pulling 134W while the Intel system pulled around 190W. When you factor in the 4.7GHz clock on the 9900K, that would be closer to 110-120W, or 45-55W more than the AMD Zen 2 sample shown.
Does that make more sense now?
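The back-of-the-envelope estimate described above can be sketched like this (the 90% PSU efficiency and ~45W platform overhead are illustrative assumptions, not measured figures from the demo):

```python
# Estimate CPU package power from wall draw, per the reasoning above:
# strip out PSU losses, then subtract the rest of the platform
# (RAM, GPU at light load, motherboard/chipset). Both constants
# below are illustrative assumptions, not measurements.
PSU_EFFICIENCY = 0.90   # assumed PSU efficiency at this load
PLATFORM_WATTS = 45     # assumed non-CPU DC draw

def estimate_cpu_watts(wall_watts):
    dc_watts = wall_watts * PSU_EFFICIENCY  # power delivered past the PSU
    return dc_watts - PLATFORM_WATTS        # what's left for the CPU

print(f"{estimate_cpu_watts(134):.1f} W")  # Zen 2 system: plausible 65W-class part
print(f"{estimate_cpu_watts(190):.1f} W")  # 9900K system: in line with 4.7GHz all-core
```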
-
All I did was ask a question and posit a hypothesis based on the fact that the Zen 2 system drew less power, as was demonstrated in the demo. -
Edit: scratch that, AMD has the Radeon VII in stock for $699, but with a 1 year warranty, yikes.
For just $709.99 after the promo code on Newegg you can get an overclocked triple fan 2080 with an "A" chip including 3 games. It's also in stock and ready to ship, and with a 3 year warranty.
https://www.newegg.com/Product/Product.aspx?Item=N82E16814932066&Description=rtx 2080&cm_re=rtx_2080-_-14-932-066-_-Product -
When undervolted, the RVII pulls less power than the 2080 (plus, most RVIIs - and AMD GPUs in general - ship overvolted from the factory, and have Wattman in the drivers to help optimize the GPU).
Undervolting the RTX line is not exactly 'easy', considering that NV locked voltage and frequency together.
However, one of the primary reasons why RVII is more power demanding than NV RTX line is because it has far greater compute capabilities.
In case you might have missed it, AMD increased compute hardware on RVII vs Vega 64.
Look at the amount of FP64 and FP32 compute on the RVII vs Vega. The FP64 alone is ridiculously high for a consumer product... and that takes power (the fact that they crammed all of that compute in there, raised clocks, and doubled the VRAM at the same time while pulling less than 300W is actually amazing - even more so when you consider that the Vega 64 easily consumes MORE than 300W in gaming).
Point being: GCN is a compute heavy uArch.
Compute hardware is power demanding.
If NV tried the same thing, they would have the same issues with power consumption - but because they are a bigger company, they can afford to make specialized GPUs for specific industries.
All AMD did in the case of the RVII was effectively repurpose a data center GPU (the MI50, intended for AI) for the consumer space.
Also, the RVII beats the 2080 Ti in productivity/professional workloads and even keeps up with the Titan in a few things.
When you take into account that compute hardware is in far greater demand/use vs real-time ray tracing and DLSS (both of which can be done on general compute anyway, as industry devs and AMD both confirmed), you're actually getting MORE bang for buck with the RVII, as you can do pro work in addition to gaming at greater efficiency levels. (Why do you think Polaris was so popular? Its compute capabilities were on par with the 1070/1080, and it could mine at those levels with a lower power draw and price than NV GPUs - its efficiency went up even further when undervolted - but mining alone was not its sole strength.)
I'm not disputing that GCN is old and has low ROP and texture unit counts (which hampers the GPU in some games), but given that AMD is actually 'keeping up' on the gaming end and beats NV in compute on a consumer product (several tiers above) for the same price, I think they aren't doing badly for a company with far fewer resources.
But in the case of Zen 2... the point was to illustrate how a lower-power (and apparently mid-range) CPU effectively tied/beat Intel's latest and greatest - which leaves us to question what kind of performance we can expect from a similarly TDP-rated CPU... and what we could see used in the laptop space (where OEMs are usually obsessed with TDP and cut corners in general - even more so with AMD hardware).
Last edited: Mar 7, 2019 -
Undervolting is not the answer, because not all cards will undervolt, and from reading posts it seems there is a large variance in undervolts and default voltages. Sorry, that isn't stock and cannot be used in a power consumption review. If that is how it was intended to be used, then AMD should have done it at the factory. They didn't, for one reason or another.
TechPowerUp shows peak gaming at 226W for the 2080 vs 313W for the Radeon VII. That isn't even close to similar power consumption.
Last edited: Mar 7, 2019 -
tilleroftheearth Wisdom listens quietly...
Just curious, what is the total power consumption for the same load/output over an hour (as an example)?
Peak, minimum and even avg can be misleading. The total actual power used isn't (for a given load).
And of course, with GPU's we're assuming they're at least comparable minimum fps...
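The "total for a given load over an hour" figure being asked for is energy rather than power - sampled draw integrated over the run. A minimal sketch, with a made-up sample trace:

```python
# Total energy (watt-hours) for a run, computed from power samples
# taken at a fixed interval - the "total actual power used" figure
# asked about above. The trace here is invented for illustration;
# real numbers would come from an inline power meter's log.
def energy_wh(samples_watts, interval_s):
    # each sample is assumed to cover one full interval
    return sum(samples_watts) * interval_s / 3600.0

# one hour of samples at 10-second intervals, flat 250W load:
trace = [250.0] * 360
print(energy_wh(trace, 10))  # 250.0 Wh
```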
-
What they show is inline power consumption - but the card is way louder. The original driver sucked, but many complaints were fixed in the new driver. With the difference in performance, the general lack of DLSS and RTX support in games, etc., I estimate that if this card is $35-50 cheaper than the 2080 being considered, it should be in the running - especially if you are going to remove the cooler (which is way louder than Nvidia's) and water cool the card, which does OC well under water.
That also varies by the specific games you want to buy and play. A couple of games were at nearly 2080 Ti levels, but that's only one or two games. Generally, I think it is a 5% delta between the RVII and the 2080, but it may have been 7%.
But those videos should have the power draw information comparing Nvidia directly against AMD Radeon.
-
tilleroftheearth Wisdom listens quietly...
Thanks for trying to answer my question. I won't ever have time to look at 1-hour videos to get that answer.
Let me restate what I've been saying on this forum since day one. I don't play games. Never have.
Videos are not written information. Where are the numbers, please?
I really don't get that they show the inline power consumption 'but way louder' either.
Total power for a given load for a given time frame (an hour is a good minimum time frame for me).
-
-
GN's articles are better signposted, and it's easier to find things in them than in their YT vids:
https://www.gamersnexus.net/hwreviews/3437-amd-radeon-vii-review-not-ready-for-launch
As for power consumption, 2080 vs Radeon VII:
That was on the shoddy launch drivers, but I don't think power use has changed. -
I'm not sure if the actual power draw was affected, but it may have been streamlined somewhat.
After all, the 19.2.3 drivers did increase battery life on Ryzen mobile, so it's possible 'some' power modifications were introduced for the RVII.
But yes, this is the graph I was referring to when I noted that power consumption in games is pretty similar between the RVII and the 2080 (and definitely tips in AMD's favor when undervolted) - and that in itself says something, considering just how powerful the RVII is from a compute point of view relative to the 2080. -
I've posted a number of updated reviews and reddit links and comments in those posts.
As with any new release of hardware, drivers + firmware + software updates follow and clean things up.
With the OC'ing headroom now available - even higher memory throughput plus core boost is giving enough oomph to pass the 2080 in more games.
As always, the Nvidia GameWorks BS needs to be disabled - and tune the AMD tessellation settings to 8x/16x, away from unlimited processing, and FPS is much better.
Fun stuff. AMD has them in stock @ $699 for ordering right now:
https://www.amd.com/en/products/graphics/amd-radeon-vii
-
tilleroftheearth Wisdom listens quietly...
Nobody has a duty here but to be civil to each other.
If you don't want to support your statements by clarifying the previous 'info' you thought you had given, you didn't have to respond at all.
I stated my reasons (don't know what a 'caddy response' is?), certainly was not being disrespectful.
There is nothing to look up for me. Like I initially stated, I was just curious.
-
Undervolting works on pretty much all RVII cards and the majority of Vega and Polaris GPUs. Yes, the range to which you can undervolt will vary, but pretty much all of them will be able to do so.
The reason AMD doesn't ship every GPU with lower voltages from the factory is that it adds time and cost to test each GPU to find its lowest stable voltage threshold - and as you may know, AMD doesn't exactly have the same resources as Nvidia.
AMD 'did' introduce an 'auto-undervolt' option in the latest drivers... but manual undervolting works better, as users can usually undervolt to a much larger degree with far better results.
Also, bennyg did post Gamers Nexus' power draw testing in gaming. -
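The manual undervolting process described above amounts to a simple downward search. A sketch, where `is_stable_at` is a hypothetical stand-in for the manual "set the voltage in Wattman, run a stress test" step (not an AMD API):

```python
# Sketch of manual undervolting as described above: step the core
# voltage down, stress test at each step, and stop at the last
# stable value. is_stable_at(mv) is a hypothetical stand-in for
# "set the voltage in Wattman, run a stress test, report pass/fail".
def find_lowest_stable_mv(stock_mv, step_mv, is_stable_at):
    mv = stock_mv
    while is_stable_at(mv - step_mv):
        mv -= step_mv          # that step passed; keep going lower
    return mv                  # last voltage that tested stable

# Simulate a card whose silicon happens to be stable down to 950mV:
print(find_lowest_stable_mv(1100, 25, lambda mv: mv >= 950))  # 950
```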
-
Imagine the mind share those poor RTX owners are losing every day - wondering when the next patch or game with RTX features is going to release, 7 months now and the 3rd game (Anthem? / Tomb Raider?) hasn't dropped yet.
It's gotta be getting to them, even if they don't admit it.
Then the RTX mobile Max-Q GPUs have their RT performance halved, making them useless for RTX features even at 1080p - even the Max-Q 2080 performs just above a desktop 2060 in RT.
After all that, only about 1/4 to 1/3 of the full RTX feature set is implemented in any one game; none have the full reflections, GI, and shadow features implemented, because even the 2080 Ti couldn't drive all that at 1080p 60fps.
So 2nd, 3rd, and 4th generation RTX GPU's need to arrive before full features can be implemented in a single game, and that's likely 3-5 years out at the normal "doubling" of GPU performance, if that is even attainable moving forward at the same time scale.
IDK, I don't see RT being a "thing" worth the mind share it takes right now, or costs in overpriced Nvidia RTX GPU's, it's just not worth the trouble - the games look different, but do they play better? Is gameplay positively affected? Do I like the visual changes? Nope to all of that.
If you don't have to have the top performance, maybe get a nice used Vega 56 and overflash it with Vega 64 firmware and tune it, spend a lot less and wait for the next generation AMD / Nvidia GPU's?
Too bad Nvidia threw this RTX crap onto the table; the non-RTX expanded-performance GPUs we could have had might have been worth the extra $. I guess we'll never know now. -
tilleroftheearth Wisdom listens quietly...
bennyg, thank you.
Looks to me that total power is very, very close. This is/was a non-issue in my view.
-
If water cooling, the OC is very impressive on the AMD cards. So grabbing something like a Bykski block, or waiting for EK, would be a nice deal.
-
Yeah, IDK - the 3-fan air-cooled Radeon VII with new drivers, firmware, and tuning software seems to get everything out of it that is available - although I guess we haven't really seen water-blocked setups in action yet.
https://twitter.com/EKWaterBlocks/status/1103717158692048898
Radeon VII Full Cover Water Blocks by EKWB | Teaser on Twitter
Submitted 5 hours ago by T1beriu
https://www.reddit.com/r/Amd/comments/ayfr4d/radeon_vii_full_cover_water_blocks_by_ekwb_teaser/
Still getting reports from people around the world finding the Radeon VII for sale in their country, so shipments must be continuing.
Finally got this in Turkey for just $860 and better yet it’s just a day before my birthday! Best present ever!
Submitted 10 hours ago by arinc9
https://www.reddit.com/r/Amd/comments/aycakg/finally_got_this_in_turkey_for_just_860_and/
And, tuning a Vega 56 as a Vega 64 is still a happening thing:
Vega 56 w/ 64 BIOS undervolting results after 1.5 years of ownership
Submitted 1 day ago * by 907Shrake
https://www.reddit.com/r/Amd/comments/axt1e8/vega_56_w_64_bios_undervolting_results_after_15/
"Card purchased off of /r/Hardwareswap in Fall 2017 for $410, when the mining craze was getting even more nutty. Sapphire Reference Vega 56, hasn't been re-pasted.
I decided to try the latest drivers with a MSi Afterburner fan curve and it seems I can run the card below 1V with 1150 MHz HBM2 and 1550ish MHz average using a decently aggressive fan curve.
- One run of FireStrike with the settings visible
- Decided to do a FS Stress Test to see if it is in fact, stable while sounding like a lower pitched hair dryer
Also, I'd love to see if any other Vega owners have stuck with their V56 or V64 while avoiding the allure of Radeon VII and RTX cards and I'd love to see your results from tinkering, too!"
Last edited: Mar 7, 2019 -
Edit: https://www.gamersnexus.net/guides/...rplay-overclocking-results-liquid-cooling-mod
Last edited: Mar 7, 2019 -
And other stock-tuned, air-cooled Radeon VII owners have also gotten over 2GHz without water cooling, and seem to have hit a limit imposed by the GPU (firmware?) that might not be exceeded by cooling further on water - but water might be quieter overall, using a 360mm radiator and quieter fans. -
Now, I can't remember if HU has done their OC numbers yet on the cards or not. But it does make a compelling argument seeing the performance on water.
-
I think the Radeon VII, tuned, is a great GPU to buy right now, while they are in stock.
Navi might outperform it, or maybe not - AMD seems to believe Navi is going to be a mid-range GPU, at least initially - but either way the Radeon VII is still going to be the *first* 7nm GPU. -
But Vega - which had known issues with being power-hungry and clocking low - moving to where Nvidia is on performance and power efficiency shouldn't be ignored. Think of what a design made for 7nm will do for efficiency.
Also, Nvidia is looking to move to Samsung 7nm, which is optimized for lower power but is less dense, IIRC. So we will have to see what effect that has on Nvidia while AMD uses the higher-performance TSMC 7nm, then EUV next year, giving about 35% more density.
-
Even with 7nm boost Nvidia could spend those improvements on more RTX doubling down and not get much more rasterization performance improvements - I mean why would they if RT is the "future", right?
It's gonna be a wild ride the next couple of years, with AMD / Nvidia - with AMD a process node ahead for the foreseeable future on both CPU and GPU, hopefully catching some tail-wind on GPU's to get further out there in the market place.
AMD needs to hit a tip over point in market share where people think of AMD GPU's like the new Ryzen CPU's. It's crazy how people waste $ on Nvidia GPU's when AMD models that perform as well or better are at the same price or lower.
Nvidia has to fall over a few more times I suppose, Jensen and the Nvidia gang won't let us down, they'll continue to screw up their good thing at the same rate moving forward.
Intel may stumble into town with a cartload of GPUs asking for directions sometime next year; that should be fun to watch too.
Last edited: Mar 8, 2019 -
If you take into account gaming and compute together, that 5-7% gaming delta doesn't really sound like much (or anything)... so, wouldn't that actually make RVII a far better deal? -
-
-
Stock 2080 Ti XC Ultra vs Overclocked +150mhz/+800mhz memory. This is with programs open in the background, so less than ideal performance testing but the numbers still show GN is a little off.
This is 4K/High settings/DX11 just like they report. -
Edit: Here is their test setup:
So, you are comparing a 9900K, likely at 5GHz or above, to an 8086K at 5GHz. You do not control for RAM speed or timings. You don't control for the motherboard being used. You don't control for cooling or ambient temp (by cooling, I mean specifically the in-case or test-bench temp that may affect boost clocks of the GPU during testing), etc. Those factors were controlled for in their testing between the different cards. You changed those variables, then say you cannot trust their results because the cited result differs from theirs, without examining WHY it is different.
Edit 2:
This means it has NOT been tested with game updates, which may have changed the FPS in the interim. It also shows that, when overclocked, yours roughly comes in line with 82FPS, whereas they present both stock and overclocked 2080 Ti information in the graph.
But as explained and shown, there are variables you did not control for in your accusation that the numbers are not realistic.
Last edited: Mar 8, 2019 -
My 2080 Ti XC Ultra was running stock power (no slider movements), fans on auto, everything default in Afterburner. You are correct that ambient temps, and therefore GPU temps, will change GPU Boost, but I looked after the run and saw clock speeds around 1905-1925MHz under load. Most 2080 Ti cards in this tier run at this speed or higher out of the box. This was checked and accounted for.
It would seem the biggest oversight by them here is the very old Nvidia driver they are using. They didn't retest the game, which also likely saw updates and possible performance improvements. They didn't account for Windows updates since their initial testing, etc. Those results are therefore not valid.
Last edited: Mar 8, 2019
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.