The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.

  1. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    We both have valid points and I'll concede that we both have some statements that make us look 'absurd' too.

    I don't care about what is coming down the pipe. I don't care if I'm using 14nm or 10nm silicon if the performance is better than what I had last year (and it always is, otherwise I wouldn't buy). I do care about moving to a new platform and seeing drops in responsiveness I haven't seen in years though (and so should anyone else that cares for a balanced platform).

    There are always 'gotchas' with new platforms which I've mentioned before, but here is the latest.

    See:
    https://www.tomshardware.com/news/ryzen-burnout-amd-board-power-cheats-may-shorten-cpu-lifespan

    There are many more anecdotal stories, not worth repeating here, about general board/port failures and other 'gotchas' that make taking the jump to a new platform (yeah; even after almost 4 years, this is still AMD trying to make things work, in my view) riskier, with very little motivation $$$-wise or for increased performance that shows up mainly in still-born benchmarks.

    Nobody forces us to buy every new offering even from the manufacturer we support by default. I buy when the productivity increase is tangible.

    I don't see this situation changing so much over the next couple of years (Intel vs. AMD). I know of a few AMD users that are replacing their platforms (or parts for them) for the first or second time already in a year (yeah, still under warranty but warranty means little to me when it means zero productivity until a replacement/fix is up and running again). Note this is not upgrading items. This is replacing failed parts.

    You have very good points and I can see that if all else was equal the scales would have tipped to AMD a long time ago. That simply is not what I see though.

    And right now, the past is still a good predictor of the future. AMD will leapfrog Intel and a little while later Intel will have an 'answer' that will match or possibly leapfrog AMD once again. Either way, the overall balance won't have changed. Not when dependability, reliability, and stability are important (for equivalent or slightly more $$$).
     
    ajc9988 likes this.
  2. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Now, that is one thing I did not say, but which I agree with Linus of LTT on in his "Why I still love Intel" video: if you want plug and play, Intel just works (to quote Jensen).

    Anyone here knows enough to tinker and overclock. That's fine. But if you don't want to fine-tune, Intel really works on making sure the software, drivers, etc. just work.

    Now, I won't blame AMD for board partners' failures. That is on the partners. And early boards were not as robust. And if you are throwing something overpowered into a cheap board, I blame EBKAC (error between keyboard and chair).

    If it is memory support on early boards, I blame AMD and their early memory controller on Zen and Zen+.

    If it is a chipset oddity issue, who I blame depends on the board.

    So I don't really care about the one-off situations, but it is worth noting that there are times to blame AMD.

    But now, in the X570 and B550 lines, AMD motherboards often are MORE ROBUST than Intel's. That doesn't mean the firmware is as good, but in terms of what is on the board, Z490 is only now catching up in some ways (but not all).

    Sent from my SM-G975U1 using Tapatalk
     
    tilleroftheearth likes this.
  3. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Yeah, sounds about right. :)

    Still, an AMD platform begins with AMD for me and most users. The quality issues are partly theirs too.

    Intel 'just working' is a huge benefit that they've managed to hang on to. I value this more and more as my energies are better spent elsewhere. The 'cool' tech jumps to 10nm, 7, 5, and 3nm will come when they do. I hope the 'just works' part never becomes passé (and I'll eagerly be waiting for AMD to consistently offer that too).
     
    ajc9988 likes this.
  4. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681



    And aside from the video by GN showing that article is trash, here, for example, is what I meant by MB quality for Intel chips. I don't blame Intel for it. And GN has shown MB mfrs playing with BCLK, etc., to get an edge in the past, which is somewhat what was being discussed in the AMD article. Here, some of the Z490 boards are just bad. So be careful casting aspersions on AMD for MB faults. Glass houses and all.



    Sent from my SM-G975U1 using Tapatalk
     
    Last edited by a moderator: Jun 10, 2020
    Papusan, raz8020 and saturnotaku like this.
  5. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    AMD B550 Motherboards Prices Unveiled, More Expensive Than B450 But Also Feature Rich wccftech | Today

    Prices for almost all AMD B550 chipset-based motherboards by all major manufacturers have been revealed. Featuring support for 3rd Gen Ryzen CPUs and beyond, the B550 boards come with new and enhanced features over their B450 predecessors but also come at higher prices, some of which even surpass X470 pricing.

    ----------------------------------------------------------------------------------

    For those who hoped AMD would let you get cooler-running machines... nothing will get better as long as the notebook manufacturers keep on destroying their own products.

    [​IMG]
    Dell G5 15 SE with AMD Ryzen 7 4800H records above 100 °C average CPU temperatures leaving the Asus TUF Gaming A15 as the better Ryzen 4000 gaming laptop option
    The new all-AMD Dell G5 15 SE (5505) gaming laptop has been put through its paces in a new video from Jarrod’sTech, mainly to discover how hot the device runs. Worryingly, the AMD Ryzen 7 4800H APU inside reached average temperatures of over 100 °C during stress testing. It seems the Asus TUF Gaming A15 with the same Renoir APU offers a better cooling solution.



     
    Deks, raz8020, ajc9988 and 1 other person like this.
  6. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    TUF A15 runs cooler? Really?
    Hmmm... a fellow Aussie reviewer, probably from the other side of the country from Jarrod, commented in another video review that the TUF A15 runs hot! Ironically, both configurations seem to be the same, except perhaps for a different air-conditioned environment? I did notice that in some of Jarrod's videos, his assessment is done in a 22-ish degree Celsius room, which is typically colder than most office environments.
     
  7. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Pretty much.
    I'm sick of OEMs putting inadequate cooling into all AMD laptops (or laptops in general) [hugs his Predator Helios 500 with 2700 and Vega 56, which is one of the coolest and quietest laptops on the market].

    The G5 15 SE cooling looks like it was designed for a laptop with half the power requirements of the 4800H and 5600M.
    Unacceptable.

    Both the 5600M and 4800H are quite efficient and powerful.
    But neither can reach or maintain its proper performance in the G5.

    Some 'experiment' this turned out to be.
    Are Dell engineers that incompetent, or have they made a real mistake?
    Lol... I'm thinking this was intentional.
    Dell would need to test the machine to see how it performs, and they had ample time to put in proper cooling, but they chose not to.

    In fact, the whole design of the G5 is really lacking, what with the screen blocking some of the air vents, but the main problem is the cooling assembly, which is inadequate.
     
  8. rlk

    rlk Notebook Evangelist

    Reputations:
    146
    Messages:
    607
    Likes Received:
    316
    Trophy Points:
    76
    That's what the rage for thin, light laptops is going to do. A thin laptop -- and that largely means the main chassis, since the keyboard and display can only be thinned so much, particularly if it's only 14 or 15" -- simply doesn't have the surface area on the back to allow for enough airflow to dissipate the heat a fast processor with lots of RAM and a fast GPU will inevitably generate. Of course, if the cooling design is subpar for the dimensions available, that's only going to make matters worse.

    The Helios 500 looks like it's designed properly for a fast machine.
     
  9. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    I've watched this latest clip three times...

    ...and just couldn't help noticing how high the GPU's temperature gets at its clock rate when compared to Nvidia's competitors.
    And if the heat pipes are linked to one another for "better" cooling purposes...
    [​IMG]
    ...more heat generated from the GPU gets transferred to the CPU side!
    I think we might need reviewers to do some kind of "smoke test" to determine how the airflow really helps keep the system cool.
     
  10. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    Other thin gaming gear is designed for portability. The Helios is designed bulky to be used... as a desktop replacement. So are the smaller-screen Clevo NH5# models and their variants.
     
  11. senso

    senso Notebook Deity

    Reputations:
    560
    Messages:
    1,645
    Likes Received:
    788
    Trophy Points:
    131
    Heatpipes only move heat from the hot side to the cold side; shared-heatpipe heatsinks are worse than isolated heatpipes for each half, but even that design should run cooler than this, because almost all current laptops use the shared-heatpipe design.

    The explanation can be as simple as crap-tier stock paste and a warped heatsink.
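    The shared-vs-isolated heatpipe point can be illustrated with a toy lumped thermal-resistance model (every number below is invented for illustration, not measured from any laptop): with a shared fin stack, each chip's steady-state temperature also rises with the *other* chip's power.

```python
# Toy steady-state thermal model: T = T_ambient + P * R, where R is an
# effective thermal resistance (degC per watt). All values are hypothetical.

def steady_state_temp(power_w, r_degc_per_w, t_ambient=25.0):
    """Steady-state temperature of a lumped thermal mass."""
    return t_ambient + power_w * r_degc_per_w

# Isolated heatpipes: each chip only heats through its own path.
cpu_isolated = steady_state_temp(45, 1.2)    # 45 W CPU  -> ~79 degC
gpu_isolated = steady_state_temp(100, 0.7)   # 100 W GPU -> ~95 degC

# Shared heatsink: both loads also pass through a common fin stack,
# so each chip's temperature rises with the *combined* power.
r_shared = 0.4
cpu_shared = steady_state_temp(45, 0.8) + (45 + 100) * r_shared   # ~119 degC
gpu_shared = steady_state_temp(100, 0.3) + (45 + 100) * r_shared  # ~113 degC

print(cpu_isolated, gpu_isolated, cpu_shared, gpu_shared)
```

    Bad paste or a warped heatsink simply raises the effective R, which scales every temperature up; the model is crude, but it shows why a shared design needs a beefier common fin stack just to break even.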
     
  12. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    Bob of All Trades just shared a tweak, fiddling with the Registry, that can bring down the heat by up to 20 degrees! What's more interesting, he mentions somewhere that the tweak works for high-performance CPUs? Hmmm~~~
     
  13. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    Same heat problems in Jarrod's Tech review unit as well (two in a row). Either a quality problem or a trash cooling design. I think it's the latter. 100°C is the new gold standard.
     
    Deks, raz8020 and tilleroftheearth like this.
  14. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    To be fair, the Acer Helios 500 with Intel/NV isn't particularly great either and requires the Intel CPU to be undervolted to behave as it should (mainly because of its ability to run away from its TDP by a large amount).
    Acer specifically revamped the cooling for both the 2700 and Vega 56.
    The 2700 stays fairly close to its TDP (as does Zen 2; in fact, both the 2700 and 3700X reach the same power consumption under boost clocks), and the Vega 56 was TDP-limited in the BIOS to 120W (but it doesn't consume that much at stock when maxed out, hence it can be further overclocked to reach 2070 Max-Q levels of performance at just about 118W). Still, we've seen desktop replacements before with higher temperatures than the PH517-61 (2700/V56)... or 17" laptop designs which were thinner, yes, and had slightly higher temps than the AMD Acer.

    A 15" laptop like the G5 15 SE is not particularly 'thin and light'.
    It has decent thickness... however, the overall design of the chassis (along with the cooling assembly) seems to be excessively BAD in regards to heat management and thermal performance.

    DELL just made a very bad decision on the cooling (which prevents both the CPU and GPU from behaving as they should).

    Ideally, any 15" laptop should allow both the CPU and dGPU to be fully stressed simultaneously for extended periods of time and NOT exceed 80-85 degrees C.
    It's more than possible to construct such a cooling assembly without going overboard on thickness either.

    OEMs are just too lazy to get off their rear ends and design proper cooling for the individual hw (as they should) and test it extensively to make sure it can be maxed out virtually indefinitely and not register anywhere close to 95 degrees Celsius (never mind over 100).


    Looking at the Eluktronics Matrix laptop with 4800H and RTX 2060, now THAT is the kind of design/cooling to use for 4800H and 5600M:


    Also, the G5 15 SE is THICKER and overall BIGGER than the Eluktronics RP-15 (which manages much lower temperatures on both CPU and GPU when both are MAXED OUT - and the 2060 goes to 110W, which is 10W higher than the 5600M maximum).

    Nope, I'm gonna say that DELL messed things up BIG TIME in regards to cooling for 4800H/5600M
     
    Last edited: Jun 14, 2020
  15. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    100c cannot possibly be the new 'golden standard'.
    At those temperatures, the surrounding components inside laptops would probably be far more susceptible to failure (along with the CPU and dGPU themselves), and more often than not, consumer hw (such as CPUs and GPUs) operating at those high temperatures cannot perform as it should.

    The OEM's shouldn't be allowed to get away with this kind of nonsense.
     
  16. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    You forgot to add in the cost aspect. Cheaper parts mean the earnings/margins will be higher. I'm sure they could do better if they spent $10 more on parts for each model.
    upload_2020-6-15_4-16-21.png
    upload_2020-6-15_4-19-10.png
     
    Last edited: Jun 14, 2020
    tilleroftheearth, raz8020 and ajc9988 like this.
  17. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Case in point, OEMs CAN design cheap laptops of decent quality if they bother to do so.
    Too much corner-cutting with zero regard for quality... and that $10 more usually translates to $30-$50 more for the consumer, because OEMs love to over-inflate everything (even though it probably wouldn't have added much, if anything, to the overall cost).
     
    Papusan likes this.
  18. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Remember, Dell is the OEM that didn't want to expand AMD offerings for Epyc 2 AT ALL. They wound up having to expand them by 3X because Epyc 2 was that good, and Intel just didn't really have anything to compete with the 64-core offerings, among the rest of the server line.

    Sent from my SM-G975U1 using Tapatalk
     
  19. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    Dell was one of the few OEMs that have supported AMD over the last years. One of the few who put AMD graphics in their models before AW went fully BGA.

    See also... https://twitter.com/AzorFrank/status/1268657743482757122?s=20

    It's a brand new technology and to "@dell credit they jumped on it first"
     
    raz8020 likes this.
  20. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Yeah, it just looks to me like Dell is too lazy (or incapable) when it comes to designing decent laptop cooling, so they are using those specs to justify ridiculous temperatures which more often than not can cause problems.
    Sorry, but I'm not accepting their poor standards.
     
    raz8020 likes this.
  21. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    They may have decided to 'support AMD', but you have to admit their 'execution' in said department is lacking (to say the least).
    It wouldn't surprise me if they intentionally did this to 'discredit' AMD. Showcase a laptop with all-AMD hw, but oh wait, look, it's going up to 100 degrees C...
    What do people usually say when they see that and relatively poor performance?
    'AMD is hot, loud and a mess'... and that's pretty much it.

    Of course, in light of zero evidence to support the idea that DELL did this intentionally to discredit AMD, it's a mere hypothesis - plus, DELL was called out on this when the temps were compared to other units with better cooling designs, but some idiots would indeed jump on the 'AMD sucks' bandwagon anyway without examining the details.
     
    raz8020 likes this.
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    As I said... Dell was one of the very few outside Apple to support graphics cards from AMD. A cheaper choice. None of the other OEMs offered AMD graphics in laptops around 2013-2015.
    upload_2020-6-15_4-34-40.png

    The flock of sheep (all other OEMs) will follow after.
     
    electrosoft and raz8020 like this.
  23. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    When it came to the server side, Azor did not call the shots. Literally, Dell's head of that department said they would not expand Epyc offerings. He had to come out and eat his words. I don't care if Azor liked AMD and is now working for them. You are wrong about the server side.

    As to now, really couldn't care less. Since I went back to building desktops and am looking at building a water cooled server rack, I'd pick up some cheap laptop whenever my current P770ZM dies.

    I'm pricing out a water block, another 4u and 2u chassis, a wheel based server rack, rails, etc. Home servers and figuring the best way to run cables to other rooms for HDMI and USB is what I'm looking into. With 20in fans for the 480mm rads.

    https://www.itpro.co.uk/server-storage/33538/dell-set-to-triple-its-amd-server-offering

    Before this, the same VP was emphatic saying they would not expand offerings. This is what I was getting at.

    Sent from my SM-G975U1 using Tapatalk
     
    raz8020 and Papusan like this.
  24. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Hopefully not all of them.
    Acer could very well decide to make yet another version of Helios 500 with Zen 2 or Zen 3 and RDNA 2 for example (but this time with all RAM slots on the easily accessible side)... or hopefully (at least) release a BIOS update for PH517-61 so it could support Zen2/3.

    Eluktronics certainly isn't following DELL.
    The RTX 2060 in the Eluktronics is boosting to 110W (10W more than the 5600M's rated maximum), and max temps don't seem to exceed 85 degrees Celsius, whereas the 4800H goes up to 88 degrees Celsius during maxed-out workloads (I think).

    Asus, on the other hand, does allow the 4800H and HS versions to go up to 90 degrees Celsius - the MSI Bravo 17, though, allows the 4800H to go up to 95 degrees Celsius.
    Asus is certainly better than DELL, but those high temps aren't going to do the motherboard and other components any favors in the long run (same for the MSI Bravo).

    Mind you, they're doing basically the same with Intel CPU's... but DELL is certainly giving Intel more options.

    What are the chances DELL would issue a recall of existing units they sold and retrofit the G5 15 SE with proper cooling?
    My educated guess would be 'very low' to 'non-existent'. At best they may issue a BIOS update which would prevent the CPU from boosting (which would certainly help the thermals - but obviously, you lose performance as a result - in fact, the 5600M in the DELL is already only drawing 70-80W, not the 90-100W it should - Azor himself confirmed it).
     
    Papusan likes this.
  25. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    On a different note, here's Navi 12 with HBM2. Custom built for Apple, though, and it seems to be in mobile-only flavor.
    (I'd post this in the AMD RDNA GPU thread, but I found it hasn't had a post for a few months.)
     
    ajc9988 likes this.
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Hm...
    That's just a 5600M with 8GB of HBM2 and lower boost clocks than the standard 5600M, resulting in a 50W TBP (which is really good).
    It may be voltage optimized too... I wonder what its performance is like.
     
  27. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    AMD cut costs to increase the bottom line. Yeah, why not go down the same road as Intel.

    AMD Ryzen 7 3800XT and Ryzen 9 3900XT Will Not Get Coolers Included guru3d.com By Hilbert Hagedoorn on: 06/16/2020 08:26 AM
    [​IMG]
    Most previous Ryzen 3000 models received some sort of WRAITH cooler, however, the upcoming Ryzen 7 3800XT and Ryzen 9 3900XT will not get boxed with a cooler.
     
    ajc9988 likes this.
  28. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,615
    Trophy Points:
    931
    The AMD Ryzen XT series skimps on bundled coolers and clock speeds pcworld.com | Today

    The AMD Ryzen XT series will offer moderate clock speed boosts over the prior generation, but the cooler goes away for all but one model.

    Coming almost a year after the original mind-blowing Ryzen launch, the XT-series' mere 4-percent clock boost is likely one big letdown to many hardware fans. Intel took flak for a slow drip of single-digit clock bumps over many generations of new CPUs.

    Why did AMD take away the cooler?
    While the official word is “you should buy aftermarket anyway,” the most obvious reason AMD is changing its mind is that the coolers aren’t really free—they’re bundled with the CPU, but AMD paid a price for them. By losing the stock cooler, the company can increase revenue. The savings isn’t passed on to the customer, unfortunately, as the launch prices for the Ryzen 3000XT parts are essentially the same as they were for the Ryzen 3000X chips which included the cooler.

    Yeah, AMD has changed course... first higher MB prices, then trying to cut compatibility, then removing the bundled cooler, and now a wimpy 4% boost increase. Soon not much more can be done to increase revenue. Maybe increase the prices further?
     
    Last edited: Jun 16, 2020
    tilleroftheearth likes this.
  29. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The main thing is that this release won't see much of a performance increase (if any). I don't see the point in having people spend money on this refresh (just get the original Zen 2 or wait for Zen 3 instead).
    The refresh would in fact bring a negligible performance difference (not worth the price).
    I don't understand why AMD decided to refresh Zen 2 so close to the Zen 3 release and with such a minor bump in clocks... if it's just a way to increase revenue... well, that's one way of explaining it - but it still doesn't make sense so close to Zen 3 (which hasn't been delayed, according to the latest AMD news).

    If the refresh offered a moderate boost in performance (say 15%), then perhaps... but the overall cost of this refresh would need to be LOWER for the consumer, not the same.
    Even Zen+ was done on a new node (basically a revised 14nm node), which resulted in price drops across the board.

    As for higher MB prices... uhm, isn't that determined by the OEMs, and not AMD itself?
    Some B550s are more expensive than X570s.
     
  30. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    One note on the MB prices: some of the B550 boards are significantly better than B450 and Z490 boards. The bill of materials and feature sets are more robust. There are a couple of videos on this from the past week or so. This makes the X570 and B550 Tomahawk, and the Gigabyte B550 Aorus, which has the same VRM as GB's $700 board at around the $280 price point (I think it was; need to recheck the Hardware Unboxed video) AND has more Gen 4 PCIe than any other B550 board, excellent buys.

    For Zen 1, they ported Intel's low-end designs over. Now, Intel's boards are lower on features in some ways. So not quite something to shout about. Plus, with Samsung's new Gen 4 PCIe NVMe drive, you have to have a Z490 or an AMD board, X570 or newer.

    Sent from my SM-G975U1 using Tapatalk
     
  31. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Initial tests shows impressive performance gains with new 5600M AMD GPU in 16-inch MacBook Pro

    https://9to5mac.com/2020/06/17/init...ith-new-5600m-amd-gpu-in-16-inch-macbook-pro/

    50W TBP for that GPU.
    Impressive results given the low power requirements and running on battery... but I'm not entirely certain how it compares to RTX 2060 laptop, etc.

    Although, I do question why this version isn't available for Windows laptops.
    Or why isn't AMD optimizing the regular 5600M in a similar way?
     
  32. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    With the refresh you have to look at scaling to Zen 3. If they were to have a 15% improvement on the refresh, where then would the additional 15% for Zen 3 come from?
     
  33. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Yeah, the refresh won't be 15%. The base clock bump on them is like 5-7%, but boost is only like 2%. So for the extra price, it isn't worth it.
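    For reference, a quick sketch of that uplift arithmetic, using the X vs. XT launch clocks as widely reported at the time (treat the exact figures as approximate; notably, the base clocks were unchanged, so the whole uplift is the roughly 2-4% boost bump):

```python
# Uplift arithmetic for the Zen 2 XT refresh, using the launch clocks (GHz)
# as reported at the time -- treat the exact figures as approximate.
specs = {
    "3600X -> 3600XT": ((3.8, 4.4), (3.8, 4.5)),
    "3800X -> 3800XT": ((3.9, 4.5), (3.9, 4.7)),
    "3900X -> 3900XT": ((3.8, 4.6), (3.8, 4.7)),
}

for name, ((base_x, boost_x), (base_xt, boost_xt)) in specs.items():
    base_gain = (base_xt / base_x - 1) * 100
    boost_gain = (boost_xt / boost_x - 1) * 100
    print(f"{name}: base +{base_gain:.1f}%, boost +{boost_gain:.1f}%")
```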

    Also, Jay of JayzTwoCents said in a video today that, after the rumor of a delay, an AMD rep told them the rumor is crap and it is on for this fall.

    Almost makes you wonder who started the rumor.

    In any case, the 15% IPC plus the latency reduction, Zen 3 will be really good.

    Some people here mistake banking on compatibility with a MB with whether or not you can believe in the performance gains coming. You can trust the performance. AMD has shown that. Just not on whether or not you can use the same board.

    Cannot wait for this fall to grab a cheap used 9900K and mod my Z170 board (I have the board, so Zen 3 being so good may force prices down to reasonable levels, along with large corps firing people in October because the restriction on paycheck loans is over, so people will dump hardware). But I like projects like that! And I own a Maximus VIII Extreme already.

    That should hold me over to Zen 4 for my other systems.

    Sent from my SM-G975U1 using Tapatalk
     
  34. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    The other thing is that having even 4% better performance makes them a better option for a new build now, and a more viable one once Zen 3 is here too.
     
    ajc9988 likes this.
  35. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Relatively simple:

    A Zen 2 refresh would need to get a 10-15% increase in performance, mostly from clock increases (and maybe some further adjustments to the existing uArch), to make it a noteworthy buy... otherwise, a 4-5% increase in performance is not exactly useful, and can easily be reached/surpassed with existing Zen 2 CPUs if they're mildly overclocked.
    The Zen 2 refresh is also made on the same 7nm process as Zen 2, which provides minimal improvements in frequency scaling only a year on (especially if AMD wanted to keep the existing TDP).

    Zen 3 is a new uArch, and we can likely expect an IPC increase of 15-20%, possibly more (which is expected of a completely new architecture); it is also manufactured on an IMPROVED 7nm process.

    So, the Zen 2 refresh would need a notable performance increase from frequency improvements (better yields and whatnot on the existing 7nm) - but since this isn't happening, it's only a 4-5% increase.

    Zen 3 = performance increase from new uArch IPC improvements (with some frequency increases due to the improved 7nm process).

    I know there are limits to what AMD can do, but as I said, this 'refresh' seems pointless so close to the Zen 3 release.
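    The arithmetic behind that comparison: performance scales roughly as IPC × frequency, so the two gains compound multiplicatively. A minimal sketch (the percentages are just the rough figures from this discussion, not measurements):

```python
# Performance scales roughly as IPC x frequency, so gains compound
# multiplicatively (all percentages here are illustrative, from the
# discussion above, not measurements).
def perf_gain(ipc_gain, freq_gain):
    """Combined percent gain from an IPC gain and a frequency gain."""
    return ((1 + ipc_gain) * (1 + freq_gain) - 1) * 100

refresh = perf_gain(0.00, 0.04)  # clocks-only refresh: ~4%
zen3 = perf_gain(0.15, 0.03)     # 15% IPC + small clock bump: ~18%
print(f"refresh ~{refresh:.1f}%, Zen 3 ~{zen3:.1f}%")
```

    Even with almost no extra clock headroom on the same node, a 15% IPC gain alone beats the refresh's clocks-only bump several times over, which is why the uArch change matters more than the frequency bin.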
     
  36. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    What makes you think Zen 3 would force prices of Intel HW down?
    I mean, sure, that's how things USUALLY work... however, Intel didn't exactly show signs of 'budging' too much on the price of their hw despite Zen 2 success (in fact, their 10900K is more expensive than 3900x, and requires twice the power consumption to match it).

    Also, it doesn't necessarily mean that Zen 3 will be expensive.
    It will retain existing core counts with new uArch resulting in much better IPC and some frequency increases due to using better manuf. process.
    If anything, I think Zen 3 prices should be comparable, if not same as Zen 2 (though Zen 3 refresh may have thrown that possibility into the dumpster).

    But I guess we'll see.

    It's also possible AMD got a bit greedy based on Zen 2's success and is releasing the Zen 2 refresh (some SKUs without coolers) at the same price as Zen 2 to increase revenue... and from that point they could increase prices further on Zen 3 (even though, technically, the Zen 2 refresh shouldn't have happened; Zen 2 would have moved down a tier on pricing, whereas Zen 3 would have been priced the same as Zen 2 was).
     
  37. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Secondary market. Had nothing to do with Intel's pricing. It is people trying to sell hardware to buy new hardware.

    Further, the reason to put it out was that they binned the chips and have to sell them before Zen 3. The binned chips were planned in case Intel had something in the 10th series. Three new chips at specific segments, each against a specific Intel offering: the new 3600XT against Intel's 10600K, the 3800XT against the 10700K (with the 3700X at almost the price of the 3600XT and the 3800X where the 3700X used to be), then the 3900XT against the 10-core at that price point, with the 3900X much cheaper, at 9900K/10700K pricing.

    It is clear they just wanted a chip at each price point. And likely also to clear out binned chips.

    Intel may have passed on the rumor of AMD delaying, suggesting people just buy Intel chips for gaming now, not wait for AMD to get another jump in competitiveness, especially with that roadmap showing Intel may not get Rocket Lake out this fall.



    Sent from my SM-G975U1 using Tapatalk
     
    Last edited: Jun 25, 2020
  38. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
  39. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Vasudev and etern4l like this.
  40. Vlad_The_Unknown

    Vlad_The_Unknown Notebook Guru

    Reputations:
    8
    Messages:
    65
    Likes Received:
    21
    Trophy Points:
    16
    etern4l likes this.
  41. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    If it is not production silicon we can test, then...
     
  42. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Meanwhile, Intel's inability to satisfy demand is even worse.
    https://marketrealist.com/2019/12/intels-cpu-shortage-impact-chip-giants-pc-makers/#:~:text=Then, in August 2018, Intel,and 10nm chip production delays.

    XMG is complaining too much (and besides, AMD has been delayed by a mere month), and XMG is stupidly attacking AMD by implying 'Intel is still better for gaming'.
    Excuse me?
    When equalized at the same TDP, Renoir easily surpasses Intel in both single-threaded and multithreaded tasks while being much more efficient.
    Yes, Intel can boost a single core higher, and that gives it a marginal edge (3%), but it quickly loses it, as the CPU cannot possibly sustain that boost under those conditions... meanwhile, Zen 2 has virtually no issues sustaining its boost clocks and stays much closer to its TDP than Intel does.

    Furthermore, the idiots at XMG used a paltry 15" chassis and stuffed a 12- or 16-core Zen 2 CPU rated for 65W inside.
    Acer at least had the sense to use a 17" chassis and build really good cooling (equal to or even better than most desktops) for the Ryzen 2700 (also 65W) and V56 (120W) in the PH517-61.

    Also, I asked XMG about using AMD GPUs, and they blatantly said they have zero plans to use Navi mobile GPUs because NV's are typically 'better' (even though they're not).
    Mobile Navi seems at least the same as, or BETTER than, NV's equivalent offerings in both performance and efficiency.
    Drivers haven't been a particular problem in that department either.
     
    Last edited: Aug 6, 2020
    Vasudev and etern4l like this.
  43. Ed. Yang

    Ed. Yang Notebook Deity

    Reputations:
    86
    Messages:
    751
    Likes Received:
    199
    Trophy Points:
    56
    Vasudev likes this.
  44. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    One thing that intrigues me about RDNA 2 is how much more powerful it can turn out to be.
    AMD mentioned that RDNA 2 comes with 50% more performance per watt.
    Also, it's been confirmed that both Zen 3 and RDNA 2 will be manufactured on the 7nm+ EUV process after all (which allows for 20% higher density and a 15% increase in efficiency, or a 10% increase in performance, over the existing 7nm).

    The next are just some fanciful musings of my own (don't consider this accurate) :

    This could mean that an RDNA 2 part at 40 CUs would have the same performance as a 2080 Ti at 115W (at 4K).
    48CU RDNA 2 would be 20% more powerful at 161W
    60CU RDNA 2 would be 50% more powerful at 230W
    72CU RDNA 2 would be 80% more powerful at 299W
    80CU RDNA 2 would be 100% more powerful at 345W.

    That's assuming RDNA 2 gets a 10% IPC boost and AMD drops frequencies by about 20% - essentially, the IPC enhancements preserve the performance (otherwise, the above power consumption values go all the way up to 540W for the 80CU version). If there's no IPC boost, then frequencies would still need to drop about 20% (which would drop performance by about 10% across the board) for the above TDP values to hold, resulting in the 48CU RDNA 2 being 10% more powerful, the 60CU 40% more, the 72CU 70% more, and the 80CU 90% more powerful than the 2080 Ti - at 4K.

    However, if the claimed increase in performance per watt also includes node enhancements, then that would certainly affect the above values... by how much, I'm not entirely sure.
    48CU RDNA 2 would in that case be more like equivalent of 2080ti at 4k and 161W (with IPC improvement of 10%... if not, then 10% slower than 2080ti).
    60CU RDNA 2 would be about 25% faster at 230W (with IPC improvement of 10%... if not, then 15% faster)
    72CU RDNA 2 would be about 50% faster at 299W (with IPC improvement of 10%... if not, then 40% faster)
    80CU RDNA 2 would be about 80% faster at 350W (with IPC improvement of 10%... if not, then 70% faster).

    And that's assuming AMD even decides to ship 72-80CU GPUs (there's also the fact that AMD will have ray tracing on these new GPUs).
    Instead of being twice as big as the 5700 XT, an 80CU GPU would be about 80% larger on the EUV node... and probably expensive.

    I could be just dreaming up ridiculous numbers... but if the rumors turn out accurate... then this could be in the ballpark perhaps?

    EDIT: these TDP values would probably only hold if AMD did nothing to optimize voltages from the factory (it's possible they will finally improve on this - but not certain - and if they don't, manual undervolting could further drop power consumption by about 20%).
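    The guesswork above boils down to simple linear scaling; here's a minimal sketch of the method (the baseline constants - the 5700 XT's 40 CUs and 225W board power - and the full +50% perf/watt claim are assumptions from this post, not measured numbers):

```python
# Back-of-envelope RDNA 2 scaling. Baseline figures and the +50%
# perf/watt gain are this post's assumptions, not measured data.
BASE_CUS = 40          # Navi 10 (5700 XT) CU count
BASE_POWER = 225.0     # 5700 XT typical board power, watts
PERF_PER_WATT = 1.5    # AMD's claimed RDNA 2 perf/watt gain over RDNA 1

def rdna2_estimate(cus, ipc_gain=0.0):
    """Return (relative_performance, estimated_watts) vs. the 5700 XT,
    assuming performance scales linearly with CU count (plus any
    hypothetical IPC gain) and the full +50% perf/watt claim holds."""
    perf = (cus / BASE_CUS) * (1 + ipc_gain)
    watts = perf * BASE_POWER / PERF_PER_WATT
    return perf, watts

# e.g. a hypothetical 80CU part at 5700 XT clocks:
perf80, watts80 = rdna2_estimate(80)   # 2x the performance at ~300W
```

    This naive model lands near, but not exactly on, the wattages guessed above, since the post also folds frequency drops and node effects into its numbers.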
     
    Last edited: Aug 10, 2020
    Vasudev likes this.
  45. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Yes
     
    raz8020 and Papusan like this.
  46. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Well, scratch what I said.
    Looks like I misread it; AMD will not be using 7nm+ (EUV) for RDNA 2 and Zen 3 after all.
    N7P will be used instead, and as such there are no density improvements over the existing 7nm... just some efficiency improvements (of about 10%, or about a 7% performance improvement, apparently).
     
  47. senso

    senso Notebook Deity

    Reputations:
    560
    Messages:
    1,645
    Likes Received:
    788
    Trophy Points:
    131
    Then you add the buggy mess that is AMD's drivers, and you subtract 50% performance because the drivers don't even know how to use the hardware...
     
    raz8020 likes this.
  48. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,278
    Likes Received:
    8,814
    Trophy Points:
    931
    A 30% performance loss is realistic, and it will take many years - I mean many years - until AMD drivers are mature enough to tackle even already deprecated/EOL Nvidia drivers.
    Back then, Nvidia cards were too expensive to buy, so I went with AMD to save some cash, and after 8 years the performance is better. With Nvidia, driver performance should stay relatively similar, with +/-5% variance (if Nvidia deems AMD has better performance after a driver update). You can almost skip updating Nvidia drivers unless there's a new game or a forced Windows 10 upgrade to the latest version.
     
    raz8020 likes this.
  49. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Aren't you exaggerating?
    To say the drivers don't know how to use the hardware is simply a fallacy.
    Where is the case of AMD drivers not using GPUs to their fullest extent (let alone losing 50% performance)?
    Sure, AMD's initial drivers weren't always the best, but they stepped up their game by releasing far higher-quality drivers at much shorter intervals (once or twice a month), and their Zen profits have probably been funneled into driver development too.

    I cannot recall a single case where drivers caused a 50% performance loss... or even 30%, or where it took 'many years' to recover the performance.
    10-20% perhaps... and NV has a similar track record in this regard (not to mention that their drivers aren't the most stable either - something that apparently isn't overtly talked about).

    AMD positions its products in the market and that's usually the performance we can expect (which is achieved with the drivers of that time).
    Driver improvements are always welcome and if they happen to improve performance of a GPU years down the line unexpectedly, so be it.
    AMD, for example, cannot magically give you more performance if the developer coded the game for Nvidia GPUs (I'd like to say the same applies to AMD-coded games, but history shows that since games coded for AMD use open-source features, the performance enhancements CAN be matched by NV via drivers... and yes, while AMD may hold an advantage in those games, the differences are not as pronounced as in games coded for NV).

    I sincerely doubt AMD or NV can easily produce first-release drivers that max out GPU performance without faults (especially when you factor in the differences between people's desktop systems)... not unless they start using AI to analyze the hardware and tailor the driver code to each individual system - and I doubt either is doing that in any significant capacity yet.
     
  50. senso

    senso Notebook Deity

    Reputations:
    560
    Messages:
    1,645
    Likes Received:
    788
    Trophy Points:
    131
    There are lots of reports of wonky drivers on the AMD subreddit, and attempts to explain everything away with the 'YoUr RaM iS BaD' excuse, or 'your PSU is crappy' even if it's some Platinum 1000W PSU.

    And then there is my personal experience with a couple of AMD GPUs: high power draw, drivers that were hit or miss 3-4 years ago, and a Catalyst layout (or whatever they call their thing) that was awful to navigate; AMD's saving grace was the price, and nothing more.
    Sorry to say, but AMD/ATI always left a bad impression after I tried to use them in desktops.
     
    Last edited: Aug 11, 2020
    Vasudev, raz8020 and Papusan like this.