The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Vasudev likes this.
  2. abaddon4180

    abaddon4180 Notebook Virtuoso

    Reputations:
    1,229
    Messages:
    3,412
    Likes Received:
    39
    Trophy Points:
    116
    Sleep is just much more convenient for me as I keep apps open and have to step away from my desk fairly often. I want to stick with the new drivers because performance and stability are better, but I did not have the brightness issue with the old drivers. I am going to try to install the new ones again and hopefully that will fix it.
     
    hmscott likes this.
  3. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    So, after servicing my loop and changing things around, cleaning the CPU block which got gunked up with dye that fell out of suspension, etc., my 1950X is now running 4.2GHz all-core at 1.36875V and doesn't go above 68C for most benchmarks (the exceptions are Sandra Financial, which hits 69C, and the Blender benchmark suite, which can push it up there), and the VRMs don't go above 84C peak (usually in the 70s or lower for most benches).
    So, 17 minutes and 50.46 seconds. As you can see, the sensors for the VRM go wonky sometimes on this board. I do not know if it is the sensor itself or the software polling at issue.

    The CPU-Z bench is 10423.2 MT and 488.2 ST.

    I got rid of the bench OSes, so this isn't optimized, and it uses the HPET timer in Win 10, which can impact performance in some tests.

    Either way, with the upcoming Ryzen 3000 chips hitting high-4GHz and potentially 5GHz clocks, HWBot may need to waive some of its timer restrictions (if the underlying issues aren't resolved), or relax them enough to allow submissions from systems that cannot reduce the BCLK below 100MHz but do have the option of disabling spread spectrum, because AMD's market share gains may mean people simply stop submitting there due to those restrictions. TBH, that is why I got rid of my bench OSes. Windows 7 is too insecure at this point and is about to lose support altogether next year. So if I cannot submit my benches to them, why bench? It isn't worth buying Intel and paying the premium for a hobby, if I'm being honest.
     
    Last edited: Mar 5, 2019
    hmscott likes this.
  4. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Leaks, salt shaker required

     
    hmscott, Raiderman and ajc9988 like this.
  5. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Did you see I'm getting 4.2 stable at 1.368V? Definitely a golden sample. Working way better after I did a teardown and reconfigured my loop.

    Meanwhile, considering Su's comment on AdoredTV's Twitter a while back thanking him for following the company so closely, I'd be willing to suggest his leak is their targets (not final clocks).

    Either way, if they get anywhere close to those cores at those speeds and prices, it will be a "red wedding" type scenario.

    Also, on the rumor mill, did you see this one:


    Sent from my SM-G900P using Tapatalk
     
  6. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    Isn't HPET disabled on AMD default BIOS settings? I reckon they use Dynamic ticks or TSC for counting ticks?
     
  7. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    No, it is enabled by default and many AMD boards don't have the option to turn it off (although I think the HEDT boards do). I've played with HPET on and off in the BIOS to switch between the three timers depending on the BCD settings in Windows (BIOS on + useplatformclock true = HPET, BIOS off + useplatformclock true = RTC, useplatformclock false = iTSC).
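    (If you want to sanity-check which timer Windows is actually using, here's a minimal Python sketch that reads the QueryPerformanceCounter frequency via ctypes. The frequency-to-timer mapping in the comments is an assumption based on typical Windows 10 values, so treat it as a rough guide, not something verified on this board.)

    # Windows-only sketch: read the QPC frequency and guess the backing timer.
    import ctypes
    from ctypes import wintypes

    def qpc_frequency_hz():
        # QueryPerformanceFrequency is a standard kernel32 call.
        freq = wintypes.LARGE_INTEGER()
        ctypes.windll.kernel32.QueryPerformanceFrequency(ctypes.byref(freq))
        return freq.value

    f = qpc_frequency_hz()
    print("QPC frequency:", f, "Hz")
    # Rough mapping (assumed typical Windows 10 values):
    #   ~10,000,000 Hz -> invariant TSC (useplatformclock not set / false)
    #   ~14,318,180 Hz -> HPET forced (bcdedit /set useplatformclock true, HPET on in BIOS)
    #   ~3,579,545 Hz  -> ACPI PM timer fallback (the "RTC" case described above)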

    Sent from my SM-G900P using Tapatalk
     
  8. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    HPET is really slow on modern CPUs, which is why I steer clear of it, especially in GPUPI. I was benchmarking a lot and my school project was loading very slowly, or sometimes so fast that even profilers couldn't catch telemetry data. After I switched to TSC everything became alright! Even on most boards HPET is enabled and can't be disabled, but the OS can bypass it and use its own timers.
     
    ajc9988 likes this.
  9. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Well, Windows I think has HPET off by default. So even though the BIOS has HPET on, it would default to iTSC instead of HPET. But programs can change the clock used on install without telling the user if given administrative privileges. It really is a kooky setup.

    Sent from my SM-G900P using Tapatalk
     
    Vasudev likes this.
  10. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    I have a foot injury now limiting me to the living room and a small laptop. I have been using my 3-year-old 11.6" netbook with a Z8300 and it is excruciatingly slow. I am not sure how long I am stuck in the chair either.

    I can't normally get to the 1800X, so time for a new laptop. I have ordered the Huawei Matebook 14 D, 2500U with 8GB RAM and a 256GB SSD. I figure this will be an improvement, and now it will be AMD.
     
    Last edited: Mar 5, 2019
    hmscott, ajc9988 and Vasudev like this.
  11. abaddon4180

    abaddon4180 Notebook Virtuoso

    Reputations:
    1,229
    Messages:
    3,412
    Likes Received:
    39
    Trophy Points:
    116
    That's a great little notebook. I got one for my sister's graduation and she loves it. It is actually nicer-looking up close than the Macbook Air and much more powerful.
     
    hmscott, Vasudev and ajc9988 like this.
  12. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    I am only planning an upgrade to a 1TB NVMe drive. The stock 256GB is just a SATA SSD and not all that fast. Then M$ Office and dual boot to Linux.

    I am surprised there are not more AMD laptop offerings out there. I guess that will change once 7nm gets to mobile in a year or two.
     
    Vasudev, hmscott and ajc9988 like this.
  13. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Yeah, Renoir will have Zen 2 and possibly Vega 7nm or Navi.

    Also, if that 8-core that was demoed was 65W TDP, that would make a hell of a gaming laptop in the DTR space, most likely.

    Sent from my SM-G900P using Tapatalk
     
  14. abaddon4180

    abaddon4180 Notebook Virtuoso

    Reputations:
    1,229
    Messages:
    3,412
    Likes Received:
    39
    Trophy Points:
    116
    It does kind of suck that you can't upgrade the RAM but 8GB is enough for the average user that the laptop is targeting.
     
  15. abaddon4180

    abaddon4180 Notebook Virtuoso

    Reputations:
    1,229
    Messages:
    3,412
    Likes Received:
    39
    Trophy Points:
    116
  16. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    12nm though. AMD needs 7nm to start really hitting Intel where it hurts. As far as RAM goes, 8GB is fine for the next couple of years until I can get a hold of a 7nm mobile.
     
    Last edited: Mar 5, 2019
    Vasudev likes this.
  17. Raiderman

    Raiderman Notebook Deity

    Reputations:
    742
    Messages:
    1,004
    Likes Received:
    2,434
    Trophy Points:
    181
    I really don't like the new price breakdown as opposed to the one released earlier. I sure like the $499 price point of the 3850X, not the $761 :(
     
    Vasudev likes this.
  18. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    The $761 is due to tariffs and taxes at the location of the retailer. $500-560 is estimated without those included. So don't worry about that. If you are from a country without VAT, the number may seem enormous. But if in the US, don't worry, it will be reasonable.
     
    TANWare and Raiderman like this.
  19. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Wasn't the system power draw 45W lower on Zen 2 than the system with the i9 9900K?
    And I cannot be sure, but AdoredTV might have indicated the Zen 2 8c/16t that matched/slightly exceeded the i9 9900K under those conditions did so at half the power draw.
    So, technically, at the time of the demo, the Zen 2 might have been at 45W... but if AMD is targeting existing TDP envelopes, and considering the fact the clocks hadn't been finalized... we could see 65W TDP parts.

    Though, if AMD offers a Zen 2 at 45W with the same performance as the i9 9900K, it would be excellent for both the desktop and mobile space... but 65W works too, as Asus and Acer both used the 1700 and 2700 in their productivity/gaming laptops... although it wouldn't be a bad idea to also include an 8c/16t Zen 2 at 65W which also has a Navi iGPU (for better battery life if you're on the go).
     
    hmscott likes this.
  20. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    No, it wasn't. Stop with the lies. Intel doesn't run at its TDP. It runs higher.

    What was shown was an Intel 9900K running 4.7GHz. It's already been shown that to run within its TDP, the 9900K would need to be clocked around 4.2-4.3GHz.

    What he explained was that the wattages shown in the demo were total system draw from the wall. You then have to estimate what inefficiencies there are with the PSU, and what goes to the RAM, the graphics card, and the motherboard (including chipset).

    After all of that, he showed the 8-core was likely a 65W chip, with the system pulling 134W, while the Intel system pulled around 190W. When you look at the 4.7GHz clock on the 9900K, that would be closer to 110-120W pulled, or 45-55W more than the AMD Zen 2 sample shown.
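    (As a rough back-of-envelope sketch of that estimate - the PSU efficiency and the combined GPU/RAM/board draw below are illustrative assumptions, not numbers from the demo:)

    # Rough wall-power breakdown sketch; figures besides the two wall readings
    # are illustrative assumptions, not measurements from the demo.
    PSU_EFFICIENCY = 0.90      # assumed PSU efficiency at this load
    OTHER_COMPONENTS_W = 55.0  # assumed GPU-at-low-load + RAM + board/chipset draw

    def estimated_cpu_power(wall_draw_w):
        dc_power = wall_draw_w * PSU_EFFICIENCY   # power actually delivered past the PSU
        return dc_power - OTHER_COMPONENTS_W      # what's left is roughly the CPU package

    for label, wall in (("Zen 2 demo system", 134.0), ("9900K system", 190.0)):
        print(label, "-> ~%.0f W CPU package (estimated)" % estimated_cpu_power(wall))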

    Does that make more sense now?

    Sent from my SM-G900P using Tapatalk
     
    hmscott likes this.
  21. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Stop with the nonsensical accusations; you basically just called me a liar.

    All I did was ask a question and posit a hypothesis based on the fact that the Zen 2 system drew less power, as was demonstrated in the demo.
     
    hmscott, Talon and tilleroftheearth like this.
  22. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Who cares about TDP anyway? Look at the laughable Radeon VII: its power consumption compared to Nvidia parts on old 12nm is far higher. Much more than an extra 45W, and yet people still buy it.

    Edit: scratch that, AMD has the Radeon VII in stock for $699, but with a 1 year warranty, yikes.

    For just $709.99 after the promo code on Newegg you can get an overclocked triple fan 2080 with an "A" chip including 3 games. It's also in stock and ready to ship, and with a 3 year warranty.

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814932066&Description=rtx 2080&cm_re=rtx_2080-_-14-932-066-_-Product
     
    hmscott likes this.
  23. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    According to independent testing, RVII power consumption in games is pretty similar to the 2080, and it pulls less than 300W even though it's rated at that.
    When undervolted, the RVII pulls less power than the 2080 (plus, most RVII and AMD GPUs in general ship overvolted from the factory - and have Wattman in the drivers themselves to help optimize the GPU).
    Undervolting the RTX line is not exactly 'easy' considering that NV locked both voltages and frequencies together.

    However, one of the primary reasons why RVII is more power demanding than NV RTX line is because it has far greater compute capabilities.
    In case you might have missed it, AMD increased compute hardware on RVII vs Vega 64.
    Look at the amount of FP64 and FP32 on the RVII vs Vega. The FP64 alone is ridiculously high on the RVII for a consumer product... and that takes power (the fact they crammed all of that compute in there and raised clocks at the same time, while also doubling the VRAM and pulling less than 300W, is actually amazing - even more so when you consider that Vega 64 easily consumes MORE than 300W in gaming).

    Point being: GCN is a compute heavy uArch.
    Compute hardware is power demanding.
    If NV tried the same thing, they would have same issues with power consumption - but because they are a bigger company, they can afford to make specialized GPU's for specific industries.

    All AMD did in the case of RVII was effectively repurpose a data center GPU (MI50) intended for AI, for the consumer space.

    Also, RVII beats 2080ti in productivity/professional workloads and even keeps up with the Titan in a few things.

    When you take into account that compute hardware is in far greater demand/use vs real-time raytracing and DLSS (both of which can be done on general compute anyway, as industry devs and AMD both confirmed), you're actually getting MORE bang for the buck with the RVII, as you can do pro stuff in addition to gaming at greater efficiency levels (why do you think Polaris was so popular? Its compute capabilities were on par with the 1070/1080 and it could mine at those levels for a lower power draw and price than NV GPUs - its efficiency went up even further when undervolted - but mining alone was not its sole strength).

    I'm not disputing that GCN is old and has a low number of ROPs and texture units (which hampers the GPU in some games), but given that AMD is actually 'keeping up' on the gaming end and beats NV in compute on a consumer product (several tiers above) for the same price, I think they aren't doing badly for a company that has far fewer resources.

    But in the case of Zen 2... the point was to illustrate how a lower-power (and what appeared to be mid-range) CPU effectively tied/beat Intel's latest and greatest (which leaves us to question what kind of performance we can expect from a similarly TDP-rated CPU)... and what we could effectively see being used in the laptop space (where the OEMs are usually obsessed with TDP and cut corners in general - even more so when it comes to AMD hardware).
     
    Last edited: Mar 7, 2019
    Raiderman and hmscott like this.
  24. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    So you’re saying all of the mainstream reviewers got it wrong on power consumption?

    Undervolting is not the answer because not all cards will undervolt, and reading posts it seems there is a large variance in undervolts and default voltages. Sorry, that isn't stock and cannot be used as a power consumption review. If that is how it was intended to be used, then AMD should have done that at the factory. They didn't, for one reason or another.

    TechPowerUp shows peak gaming power of 226W for the 2080 vs 313W for the Radeon VII. That isn't even close to similar power consumption.
     
    Last edited: Mar 7, 2019
  25. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Just curious, what is the total power consumption for the same load/output over an hour (as an example)?

    Peak, minimum and even avg can be misleading. The total actual power used isn't (for a given load).

    And of course, with GPUs we're assuming they're at least comparable in minimum fps...

     
    bennyg likes this.
  26. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681



    What they show is inline power consumption, but the card is way louder. The original driver sucked, but many complaints were fixed in the new driver. With the difference in performance, the lack of DLSS and RTX in games generally, etc., I estimate that if this card is $35-50 cheaper than the 2080 being considered, it should be in the running, especially if you are going to remove the cooler (which is way louder than Nvidia's) and water cool the card, which does OC well when water cooled.

    That also varies by the specific games you want to buy and play. A couple of games were nearly at 2080 Ti levels, but those are only like one or two games. Generally, I think it is a 5% delta between the RVII and the 2080, but it may have been 7%.

    But those videos should have the power draw information mapping Nvidia directly against AMD Radeon.

    Sent from my SM-G900P using Tapatalk
     
    Raiderman, hmscott and Talon like this.
  27. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Thanks for trying to answer my question. I won't ever have time to look at 1-hour videos to get that answer. :rolleyes:

    Let me restate what I've been saying on this forum since day one. I don't play games. Never have.

    Videos are not written information. Where are the numbers, please?

    I really don't get that they show the inline power consumption 'but way louder' either. ;)

    Total power for a given load for a given time frame (an hour is a good minimum time frame for me).

     
  28. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Not my duty to give you information laid out in the way you want. You want it, go get it yourself. I took the time to post where the info can be found. Your caddy response is a sign of disrespect. As such, look it up yourself then!

    Sent from my SM-G900P using Tapatalk
     
    hmscott likes this.
  29. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    Raiderman, hmscott and Deks like this.
  30. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The newer drivers fixed various bugs, improved overclocking and undervolting, and introduced more stability and some performance gains.
    I'm not sure if the actual power draw was affected, but it may be possible that it has been streamlined somewhat.
    After all, the 19.2.3 drivers did increase battery life on Ryzen mobile, so it's possible 'some' power modifications were introduced for the RVII.
    But yes, this is the graph I was referring to when I noted that power consumption in games of the RVII and 2080 is pretty similar (and definitely goes in AMD's favor when undervolted) - and in itself, that actually says something considering just how powerful the RVII is from a compute point of view relative to the 2080.
     
    hmscott likes this.
  31. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yes, power use has changed: the voltage curve works, people are getting better performance at lower power with quieter fans + fan tuning, and everyone is happy. :)

    I've posted a number of updated reviews and reddit links and comments in those posts.

    As with any new release of hardware, drivers + firmware + software updates follow and clean things up.

    With the OC'ing headroom now available - even higher memory throughput + core boost is giving enough oomph to pass the 2080 in more games.

    As always, the Nvidia Gameworks BS needs to be disabled - and tune the AMD tessellation settings to 8x/16x away from unlimited, and FPS is much better.

    Fun stuff. AMD has them in stock @ $699 for ordering right now:
    https://www.amd.com/en/products/graphics/amd-radeon-vii
     
    ajc9988 likes this.
  32. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Nobody has a duty here but to be civil to each other.

    If you don't want to support your statements by clarifying the previous 'info' you thought you had given, you didn't have to respond at all. ;)

    I stated my reasons (don't know what a 'caddy response' is?), certainly was not being disrespectful.

    There is nothing to look up for me. Like I initially stated, I was just curious.

     
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Mainstream reviewers seem to have focused on the rated 300W TDP, and generally had issues with the press-release drivers (which were solved by drivers released a few days after launch).

    Undervolting pretty much works on all RVII cards and the majority of Vega and Polaris GPUs. The range to which you might be able to undervolt will vary, yes, but pretty much all of them will be able to do so.

    The reason AMD doesn't ship every GPU with lower voltages from the factory is because it adds time and cost to test each GPU to find its lowest stable voltage threshold - and as you may know, AMD doesn't exactly have the same resources as Nvidia.

    AMD 'did' introduce the 'auto-undervolt' option with the latest drivers... but manual undervolting works better, as users can usually undervolt to a much larger degree with far better results.

    Also, bennyg did post Gamers Nexus power draw testing in gaming.
     
    hmscott likes this.
  34. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    I still stand by my $35-50 delta. So I'd grab the Zotac. But at $809 normally, this card is a better deal. The harder decision is something like an EVGA XC Black Edition at $730. At that point, I cannot say, other than liking the unlocked power delivery of AMD. Once you get above $750 on a 2080, the Radeon VII should be more compelling. This assumes getting the $700 Radeon VII. At $840, the RVII loses the value proposition. But that is my assessment.

    Sent from my SM-G900P using Tapatalk
     
    Raiderman and hmscott like this.
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Staying away from the whole RTX debacle is worth missing out on any savings from getting a lesser Nvidia GPU. AMD at any added cost / price - within reason - just to avoid RTX is worth it.

    Imagine the mind share those poor RTX owners are losing every day - wondering when the next patch or game with RTX features is going to release, 7 months now and the 3rd game (Anthem? / Tomb Raider?) hasn't dropped yet.

    It's gotta be getting to them, even if they don't admit it.

    Then the RTX mobile Max-Q GPUs have their RT performance halved, making them useless for RTX features even at 1080p - even the Max-Q 2080 performs just above a desktop 2060 in RT.

    After all that, only about 1/4 to 1/3 of the full RTX feature set is implemented in any one game; none have the full reflections, GI, shadows, ??? features implemented, because even the 2080 Ti couldn't drive all that at 1080p 60fps.

    So 2nd, 3rd, and 4th generation RTX GPU's need to arrive before full features can be implemented in a single game, and that's likely 3-5 years out at the normal "doubling" of GPU performance, if that is even attainable moving forward at the same time scale.

    IDK, I don't see RT being a "thing" worth the mind share it takes right now, or costs in overpriced Nvidia RTX GPU's, it's just not worth the trouble - the games look different, but do they play better? Is gameplay positively affected? Do I like the visual changes? Nope to all of that.

    If you don't have to have the top performance, maybe get a nice used Vega 56 and overflash it with Vega 64 firmware and tune it, spend a lot less and wait for the next generation AMD / Nvidia GPU's?

    Too bad Nvidia threw this RTX crap on to the table, the non-RTX expanded performance GPU's we could have had might have been worth the extra $. I guess we'll never know now.
     
    Raiderman and Deks like this.
  36. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Raiderman likes this.
  37. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Who said anything about RTX or DLSS? The average game performance delta without it is still 5-7% at stock. Multiply 5% by 700 and you get $35. Multiply 7% by 700 and you get $49. So my price assessment is performance per dollar. RTX and DLSS are mostly no-shows, so they are not factored in at all.
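    (A minimal sketch of that performance-per-dollar math, using the figures from this thread purely as illustrative inputs:)

    # Price discount needed to offset a given average-performance deficit.
    # The $700 reference price and the 5-7% deltas are the figures from this
    # thread, used only as illustrative inputs.
    REFERENCE_PRICE = 700.0

    def price_equivalent_discount(perf_delta, reference_price=REFERENCE_PRICE):
        return perf_delta * reference_price

    for delta in (0.05, 0.07):
        print("%.0f%% slower -> needs to be ~$%.0f cheaper"
              % (delta * 100, price_equivalent_discount(delta)))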

    If water cooling, the OC is very impressive on the AMD cards. So if grabbing like a bykski block or waiting for EK, that would be a nice deal.

    Sent from my SM-G900P using Tapatalk
     
    Raiderman and hmscott like this.
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, I'd rather avoid the RTX " Cooties" and get an unafflicted GPU. Completely disabling the RTX mind virus by avoiding it altogether. ;)

    Yeah, IDK, the 3-fan air-cooled AMD Radeon Vega VII with new drivers, firmware, and tuning software seems to get everything out of it that is available - although I guess we haven't really seen the water-blocked setups in action yet.

    https://twitter.com/EKWaterBlocks/status/1103717158692048898

    Radeon VII Full Cover Water Blocks by EKWB | Teaser on Twitter
    Submitted 5 hours ago by T1beriu
    https://www.reddit.com/r/Amd/comments/ayfr4d/radeon_vii_full_cover_water_blocks_by_ekwb_teaser/

    Still getting reports from people around the world finding the AMD Radeon Vega VII for sale in their country, so shipments must be continuing. :)

    Finally got this in Turkey for just $860 and better yet it’s just a day before my birthday! Best present ever!
    Submitted 10 hours ago by arinc9
    https://www.reddit.com/r/Amd/comments/aycakg/finally_got_this_in_turkey_for_just_860_and/

    And, tuning a Vega 56 as a Vega 64 is still a happening thing:

    Vega 56 w/ 64 BIOS undervolting results after 1.5 years of ownership
    Submitted 1 day ago * by 907Shrake
    https://www.reddit.com/r/Amd/comments/axt1e8/vega_56_w_64_bios_undervolting_results_after_15/

    "Card purchased off of /r/Hardwareswap in Fall 2017 for $410, when the mining craze was getting even more nutty. Sapphire Reference Vega 56, hasn't been re-pasted.

    I decided to try the latest drivers with a MSi Afterburner fan curve and it seems I can run the card below 1V with 1150 MHz HBM2 and 1550ish MHz average using a decently aggressive fan curve.
    I'm thinking about investing in an AIO cooling solution or go through the hassle of Morpheusing my card and skipping Navi. What do you guys think?

    Also, I'd love to see if any other Vega owners have stuck with their V56 or V64 while avoiding the allure of Radeon VII and RTX cards and I'd love to see your results from tinkering, too!"
     
    Last edited: Mar 7, 2019
    Raiderman likes this.
  39. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Go to the first GN video I posted above. He had the core at over 2000MHz on a modded 360mm rad, IIRC.

    Edit: https://www.gamersnexus.net/guides/...rplay-overclocking-results-liquid-cooling-mod

    Sent from my SM-G900P using Tapatalk
     
    Last edited: Mar 7, 2019
    hmscott likes this.
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, the GN "custom" hybrid water-cooled videos are fun - and I do watch them - as they show future potential, but it's a "hack"; the EK block and other products are a better real-world solution.

    And other stock, tuned-on-air AMD Radeon Vega VII owners have also gotten over 2GHz already without water cooling, and seem to have hit a limit imposed by the GPU (firmware?) that might not be exceeded by cooling further on water - but it might be quieter overall using a 360mm radiator and quieter fans. :)
     
    ajc9988 likes this.
  41. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Regardless of it being a hack, check out the performance numbers. Overclocked on water, it beat a stock 2080 Ti on Far Cry 5. You also will see it trade blows with an overclocked 2080 regularly. This is why I mentioned the quiet and the bit extra you get with water.

    Now, I can't remember if HU has done their OC numbers yet on the cards or not. But it does make a compelling argument seeing the performance on water.

    Sent from my SM-G900P using Tapatalk
     
    hmscott likes this.
  42. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, the same great results are being seen by the guys on tuned stock air-cooled Vega VIIs that are getting 2000MHz or more, and yes! the tuned 2000MHz+ Radeon Vega VII is outperforming the 2080 Ti in some tasks, as well as outperforming / matching the 2080 in more games.

    I think the AMD Radeon Vega VII tuned is a great GPU to buy right now, while they are in stock.

    The Navi might outperform it, or maybe not - AMD seems to believe Navi is going to be a mid-range GPU - at least initially - but either way the AMD Radeon Vega VII is still going to be the *first* 7nm GPU. :D
     
    Deks likes this.
  43. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    From all leaks, the Navi released this year will be closer to GTX 1080/RTX 2070 performance, but at a lower price. The large Navi die, or Arcturus, is next year. Having to tape out again really was a setback.

    But Vega, which had known issues with being power hungry and clocking low, moving to roughly where Nvidia is on performance and power efficiency shouldn't be ignored. Think of what a design made for 7nm will do on efficiency.

    Also, Nvidia is looking to move to Samsung 7nm, which is optimized for lower power but is less dense, IIRC. So we will have to see what effect that has on Nvidia while AMD uses the higher-performance TSMC 7nm, then EUV next year giving about 35% more density.

    Sent from my SM-G900P using Tapatalk
     
    hmscott likes this.
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yup, I wonder what problem - or opportunity too good to pass up - was found?
    And, it's the first production turn of a 7nm GPU, improvements in both performance and power savings will hopefully continue.
    Yeah, after the boondoggle of RTX this generation I'm worried Nvidia will flip out even further with their first 7nm generation of GPU's, too many "what will they screw up next" opportunities for Nvidia to go into the weeds again.

    Even with 7nm boost Nvidia could spend those improvements on more RTX doubling down and not get much more rasterization performance improvements - I mean why would they if RT is the "future", right?

    It's gonna be a wild ride the next couple of years, with AMD / Nvidia - with AMD a process node ahead for the foreseeable future on both CPU and GPU, hopefully catching some tail-wind on GPU's to get further out there in the market place.

    AMD needs to hit a tip over point in market share where people think of AMD GPU's like the new Ryzen CPU's. It's crazy how people waste $ on Nvidia GPU's when AMD models that perform as well or better are at the same price or lower.

    Nvidia has to fall over a few more times I suppose, Jensen and the Nvidia gang won't let us down, they'll continue to screw up their good thing at the same rate moving forward.

    Intel may stumble into town with a cart load of GPU's asking for directions sometime next year, that should be fun to watch too. :)
     
    Last edited: Mar 8, 2019
  45. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Correct me if I'm wrong, but that 5-7% delta is only in relation to games. What about the compute side of the RVII (which has massive FP64, double the RAM, and 1TB/s of bandwidth, among other things)?
    If you take into account gaming and compute together, that 5-7% gaming delta doesn't really sound like much (or anything)... so wouldn't that actually make the RVII a far better deal?
     
  46. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    Ya, no.
     
  47. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Look at the graph

    Sent from my SM-G900P using Tapatalk
     
  48. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    (attached screenshot: Far Cry 5 benchmark at 4K)

    Stock 2080 Ti XC Ultra vs overclocked +150MHz core/+800MHz memory. This is with programs open in the background, so less than ideal performance testing, but the numbers still show GN is a little off.

    This is 4K / High settings / DX11, just like they report.
     
  49. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    That is not controlling for all test rig factors, such as the CPU used and its speed, RAM speed, hardware variance, etc. Can you show that, controlling for all those factors and ambient temps, their numbers are not correct? If so, contact them. Otherwise, you are using anecdotal evidence, without other factors controlled for, to try to discredit a larger set of data.

    Edit: Here is their test setup:
    (attached image: GN test setup)

    So, you are comparing a 9900K, likely at 5GHz or above, to an 8086K @ 5GHz. You do not control for RAM speed or timings. You don't control for the motherboard being used. You don't control for cooling or ambient temp (by cooling, I mean specifically the in-case or test bench temp that may affect boost clocks of the GPU during testing), etc. Those factors were controlled for in their testing between the different cards. You changed those variables, then say you cannot trust their results because the cited result is different than theirs, without an examination of WHY it is different.

    Edit 2:
    (attached screenshot from GN's review)
    This means it has NOT been tested with game updates which may have changed the FPS in the interim period. It also shows that when overclocked, it roughly comes in line with 82FPS, whereas they present both stock and overclocked 2080 Ti information in the graph.

    But as explained and shown, there are variables you did not control for in your accusation that the numbers are not realistic.

    Sent from my SM-G900P using Tapatalk
     
    Last edited: Mar 8, 2019
    Talon likes this.
  50. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,694
    Trophy Points:
    331
    My 9900K is at 5GHz for gaming, but this is 4K high settings and the bottleneck is on the GPU. GN themselves have shown that at just 1440p normal settings, an 8700K @ 5GHz (which is what they used) performs the same as a 9900K at 5.2GHz in Far Cry 5. I don't remember ever seeing a motherboard make any notable difference in gaming performance when taken from the same tier with both fully functional and operating as intended. As far as RAM speed and timings go, sure, my RAM is faster than their 3200MHz CL16, but that is unlikely to make up the large variance there.

    My 2080 Ti XC Ultra was running stock power (no slider movements), fans on auto, everything default in Afterburner. You are correct that ambient temps and therefore GPU temps will change GPU boost, but I looked after the run and saw clock speeds around 1905-1925MHz under load. Most 2080 Ti cards around this tier run at this speed or higher out of the box. This was checked and accounted for.

    It would seem the biggest oversight by them here is the very old Nvidia driver they are using. They didn't actually retest the game, which has also likely seen updates and possible performance improvements since. They didn't account for Windows updates since their initial testing, etc. Those results are therefore not valid.
     
    Last edited: Mar 8, 2019
    Papusan and ajc9988 like this.