The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD GPUs up to 20-30% faster than Nvidia when paired with low-end CPU

    Discussion in 'Gaming (Software and Graphics Cards)' started by yrekabakery, Mar 12, 2021.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    TL;DW: The Nvidia driver increases CPU load in DX12/Vulkan (and properly multithreaded DX11?) games compared to the AMD driver, resulting in reduced performance when CPU-bound. The theory is that Nvidia does software thread scheduling in its driver, versus AMD's hardware scheduling, to improve performance in lightly threaded DX11 games at the cost of increased overhead.

    Digital Foundry saw the same behavior when testing Doom Eternal (Vulkan) on an Xbox One X APU. Going from an RTX 2060 to an RX 6800 XT at CPU-bound settings doubled the framerate.
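
    To make the mechanism concrete, here's a toy model (a minimal sketch with invented numbers; "driver_overhead_ms" just stands in for the extra per-frame CPU work attributed to the software scheduler): frame time is roughly the max of CPU and GPU time per frame, so extra driver CPU work hides behind the GPU when GPU-bound and costs frames directly when CPU-bound.

        # Toy model (made-up numbers): frame time = max(CPU, GPU) per frame.
        def fps(cpu_ms, gpu_ms, driver_overhead_ms=0.0):
            return 1000.0 / max(cpu_ms + driver_overhead_ms, gpu_ms)

        # GPU-bound (high settings): the overhead hides behind the GPU.
        print(fps(cpu_ms=6, gpu_ms=16))                         # ~62 FPS
        print(fps(cpu_ms=6, gpu_ms=16, driver_overhead_ms=3))   # still ~62 FPS

        # CPU-bound (low settings / weak CPU): the overhead costs frames.
        print(fps(cpu_ms=10, gpu_ms=4))                         # 100 FPS
        print(fps(cpu_ms=10, gpu_ms=4, driver_overhead_ms=3))   # ~77 FPS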
     
    thewizzard1, Vasudev and JRE84 like this.
  2. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Hardware Unboxed trying to get banned again.
     
  3. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    This is really only going to be relevant for gaming with low graphics settings at low resolutions. It doesn't really seem relevant unless you're all about competitive FPS and, for some reason, are still on an old CPU. I definitely respect the research, though.
     
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Except that wasn't the case??? With the lower-end CPUs, WD:L and H:ZD were limited to less than 100 FPS on average, with minimums closer to 60, on the Nvidia cards. On a high refresh display, those drops are noticeable, not to mention the noticeable frametime stuttering when CPU-bound at low framerates. BTW, the i3-10100 performs about the same as the i7-6700K and 7700K, which are still relatively popular among gamers who stay on the same CPU for several generations and just upgrade the GPU.

    The Nvidia driver overhead probably explains why I was getting such poor performance in the large modes in CoD Modern Warfare/Warzone and Black Ops Cold War, which are DX12-exclusive, on my previous overclocked i5-8600K with all settings on lowest. In Cold War especially, it was a nearly constant stuttery 90-100 FPS with maxed-out CPU utilization.

    [screenshot: Black Ops Cold War]
     
    Vasudev, hfm and BrightSmith like this.
  5. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I'm not saying it's not happening, but if you look at the 1080p Ultra and higher numbers, the higher-tier GPUs pull away convincingly on stronger CPUs, over and above the lesser GPUs on those same better CPUs. Your example is a 6c/6t part? I would think that would perform worse than a 4c/8t part in most gaming workloads. No one is pairing that CPU with an RTX 3070+ or RX 6800+. I watched the video, and I had to rewatch it to make sure I didn't misinterpret the data. You can basically see the 5700 XT hitting a brick wall at 1080p Ultra in more than one game they test, where the 3070/3090/6900 is still scaling fine. Anyone building a desktop system with these higher-tier GPUs is going to be using a far better CPU than an i3-10100 or 1600X; the only reason to show this is to make a case for the research. The real-world application isn't there.

    It was however somewhat interesting to me as someone who uses an eGPU with lesser CPUs, but the bottlenecks for eGPU systems are still elsewhere. It would be interesting to repeat this data again once we start seeing Radeon mobile parts in shipping laptops.
     
  6. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Four hyperthreaded cores are slower than six physical cores.

    [attached images]

    The 6900 scales fine; the 3090 does not. In WD:L, the 3090 is CPU-bottlenecked in every scenario except 1440p Ultra with the 5600X.

    Again, you're missing the point. Desktop gamers generally do not upgrade their CPU/platform as often as they do their GPU. Someone pairing a new RTX 3070 with the enormously popular 3600/3600X is gonna find their performance no better than the 5600 XT, and much worse than the 5700 XT, both of which should be far slower cards than the 3070. And even if they did have a new 5600X, their 3070 is still no better than a 5700 XT. That is a problem.

    [chart: Horizon Zero Dawn]
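
    To make the logic explicit, a minimal sketch (the numbers are placeholders, not from the video): at a fixed CPU, if a much faster GPU barely raises FPS, the CPU (or the driver) is the limit.

        # Rough bottleneck check (placeholder numbers, not from the video).
        def looks_cpu_bound(fps_slower_gpu, fps_faster_gpu, tolerance=0.05):
            gain = (fps_faster_gpu - fps_slower_gpu) / fps_slower_gpu
            return gain < tolerance

        print(looks_cpu_bound(96, 98))   # True  -> faster GPU gained nothing
        print(looks_cpu_bound(60, 90))   # False -> GPU-bound, upgrade helped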
     
    Last edited: Mar 13, 2021
    hfm likes this.
  7. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I can cherry-pick graphs out of the research as well: instead of looking at 1080p Medium in Watch Dogs, let's look at 1440p Ultra.
    [chart: Watch Dogs: Legion, 1440p Ultra]
     
  8. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Way to miss the point again.
     
    hfm likes this.
  9. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Dunno if it helps anyone, but I managed to mostly fix constant micro-stutters/FPS drops by reverting to the Microsoft driver, which includes just the display driver, control panel, and HD audio (HDMI) driver, with no PhysX or other bloatware. I had to use WUMT x64 to scan my system after DDU'ing the 430.xx Dell driver; the stock Nvidia 440.xx and 460.xx drivers all had micro-freezes and high battery drain in Optimus mode on my 980M with a 6700HQ.
    This reduced battery drain from 45W to 7-12W. Also, the Nvidia GPU utilization tray icon seems to be buggy, and I had to uncheck it to fix the dGPU freezing on the Windows desktop.
     
  10. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    So everyone is playing on old CPUs with $550-1500 MSRP (3070 through 3090/6900) GPUs at 1080p medium settings to realize gains over competing GPUs? I don't get it. If you're not cranking it up to High/Ultra at 1080p (discounting that 1440p is the fastest-growing segment) with a GPU that expensive, you're either a serious competitor in esports FPS titles or... I can't think of another reason. I crank everything up as high as I can go, and I only have a 2070 constrained by a 15W three-year-old CPU over a TB connection. I don't know why I would go to medium settings when the FPS gain there is meaningless in all but esports. Every single case where they showed the Radeon outperforming the Nvidia cards was 1080p at lowered settings... on a GPU that can handle better. It's in the name of finding a thing that most people won't encounter. But if you are in that situation, it's good information; I never countered that.

    I'm not arguing that the research hasn't proven that when wildly CPU-constrained (cranking out as many frames as possible at low resolution and lowered settings), the evidence is that Radeon cards/drivers are more efficient. But I am arguing that at the price points of those cards, it seems weird to pair them with a super inexpensive CPU when something that would make better use of the cards from either vendor is a fraction of the cost of the video card, even at the $550-650 price point. And a lot of AMD boards can take a better Ryzen CPU without upgrading the motherboard (just drop in a 3800XT), unlike on the Intel side.

    Maybe I'm just in the camp of people who think spending another $300 on a CPU when you're spending $750 on a 3080 makes more sense, and I am out of touch with the multitudes of people who are using old CPUs with brand-new expensive video cards.
     
    JRE84 likes this.
  11. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181

    Not everyone games at ultra... the gains are real. I say medium is the sweet spot, and I personally game on all low with some settings on medium... it looks identical to ultra.

    But now, since I lost my 1060 laptop, I game at ultra with GeForce Now, which is like a desktop 2080... streaming is the future, so who cares about hardware these days.
     
    hfm likes this.
  12. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Not everyone plays at ultra settings at 60Hz, even in AAA titles. High refresh rates aren't just for eSports; besides, those games have very low CPU/GPU requirements.
     
    JRE84 likes this.
  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
  14. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    It's good that they bring this to the limelight so Nvidia can fix it. Imagine how much performance our laptops are leaving on the table.
     
    Vasudev likes this.
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It could probably be improved, but I’m not sure if Nvidia can completely fix it on current GPUs if they lack the hardware scheduling that AMD has.
     
    Vasudev likes this.
  16. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431


    All this even becomes more confusing when you use an eGPU. The RDNA2 GPUs (probably RDNA as well) show horrible performance over TB with any driver newer than April 2020.

    https://egpu.io/forums/pc-gaming/amd-egpu-driver-bad-performance-benchmarking/#post-94039

    https://egpu.io/forums/builds/2021-...6900-xt-32gbps-m-2-adt-link-r43sg-win10-2004/

    In the end, I like the testing, and they did a great job. I think their console CPU limitation angle is probably spot on for why AMD took the approach they did. I could definitely see situations where people would benefit from this data and stay away from Nvidia GPUs if they are not going to upgrade their CPU. It's super interesting how confusing it made upgrading for those without a certain level of CPU performance.

    I think it just comes down to what your goals are, what CPU you use, and what upgrades you're going to do in the near future if you are only upgrading the GPU. For me, Nvidia GPUs are a better option due to TB. My 2070 completely destroys that 6900XT over TB unless it's on old drivers, even with a bad CPU limit. My CPU is currently thermal throttling at 10W even undervolted, because I need to reapply TIM, and it still crushed it. I also tend to crank details up as far as I can go without dropping below ~40-50fps, which favors Nvidia. I also play 99.9% single-player, story-based games, where you want to crank everything up.
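
    For rough context on why TB is the bigger bottleneck (back-of-envelope only, using nominal figures): TB3 tunnels at most a PCIe 3.0 x4 link, versus x16 for a desktop slot.

        # Back-of-envelope PCIe bandwidth: TB3 eGPU link vs. desktop slot.
        # PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding.
        gbps_per_lane = 8 * 128 / 130          # ~7.88 Gbps usable per lane
        for name, lanes in [("TB3 eGPU (x4)", 4), ("desktop slot (x16)", 16)]:
            print(f"{name}: ~{lanes * gbps_per_lane:.0f} Gbps")
        # -> ~32 Gbps nominal over TB3 (real-world eGPU throughput is lower)
        #    vs. ~126 Gbps for a desktop x16 slot.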
     
    Last edited: Mar 15, 2021
    BrightSmith and yrekabakery like this.
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Bottleneck with NVIDIA and DirectX 12? MSI GeForce RTX 3090 SUPRIM against MSI Radeon RX 6900XT Gaming X and its own drivers igorslab.de
    First of all, I have to preface today's post with a small paragraph, because I want to take away the touch of sensationalism. Nevertheless, you have to write about it, and you have to test it dispassionately beforehand. The video from Hardware Unboxed didn't even surprise me that much, because we were able to make very similar observations on lags and latencies in a current test project (thanks to Fritz Hunter!) and were initially downright desperate at the inconsistency of some of the measurement data.
     
    Vasudev likes this.
  18. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    So the 6800XT/6900XT beating the 3080/3090 at lower resolutions in reviews (usually using top CPUs) was because of driver overhead after all. :biglaugh:
     
    Vasudev likes this.
  19. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    More testing

     
    Vasudev, hfm, Papusan and 1 other person like this.
  20. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Jesus... do you guys realize what this means for eGPU?
     
  21. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I was waiting for him to address that no one aside from competitive users chasing frames on old CPUs is going to hit this, since based on the framerates I'm seeing in all these graphs, no one is going to use 1080p medium... just crank it to ultra. Too bad the 1440p results were Patreon-only. Still good data, but not very relevant even for me with a weaker CPU and eGPU (as Radeon is just HORRID on eGPU with any driver newer than almost 12 months old).
     
  22. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Well, 1080p med seems like a good fit for midrange cards like the 2060 and 5600XT in AC:V if you want to stay above 60 FPS. Interesting that the 5600XT significantly beats the 2060 with all CPUs there.

    [chart: Assassin's Creed Valhalla, 1080p medium]
     
    Vasudev, JRE84 and hfm like this.
  23. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    This conclusion (CPU usage) seemed like the simplest way to state it. YouTube timestamps never seem to work in the embeds; it's at 23:36.

     
    Vasudev and yrekabakery like this.
  24. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    As a former i5-8600K user, I can answer that with an emphatic no. I saw so many games with 100% or close to 100% CPU usage.

    [screenshots: Black Ops Cold War, The Witcher 3, and others showing near-100% CPU usage]
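
    If you want to check this on your own system, a minimal sketch that logs overall CPU utilization for ~30 seconds while a game runs (uses the third-party psutil package):

        # Log overall CPU utilization while the game runs.
        # Requires: pip install psutil
        import psutil

        samples = [psutil.cpu_percent(interval=1.0) for _ in range(30)]
        print(f"avg {sum(samples) / len(samples):.0f}%  peak {max(samples):.0f}%")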
     
    Vasudev likes this.
  25. unlogic

    unlogic Notebook Evangelist

    Reputations:
    24
    Messages:
    310
    Likes Received:
    56
    Trophy Points:
    41
    So you’re planning to buy an AMD GPU next time?

    I guess AMD cards are a better fit for gamers who prefer to keep their CPU & GPU for a long period of time.

    AMD users have also claimed that new AMD drivers always improve the performance of their older-generation graphics cards. I don't know the truth (data & evidence) behind this claim.
     
    Vasudev likes this.
  26. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Depends on how RDNA3 vs. Lovelace shakes out.
     
    unlogic and Vasudev like this.
  28. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    I was a little bit shocked to see mid-range CPUs becoming outdated when paired with Turing or Ampere GPUs, losing max FPS.
    I'm planning to buy an AMD CPU+GPU next year. There isn't much stock available, and most of it is overpriced. Linux support is better on AMD/Intel than on Nvidia. I lost my mind after upgrading the Nvidia drivers on Linux and they booted into a black screen yet again, so I switched to the Intel iGPU and disabled the Nvidia GPU.
     
    jc_denton, unlogic and hfm like this.