AMD GPUs up to 20-30% faster than Nvidia when paired with low-end CPU

Discussion in 'Gaming (Software and Graphics Cards)' started by yrekabakery, Mar 12, 2021.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,383
    Messages:
    3,326
    Likes Received:
    3,503
    Trophy Points:
    331




    TL;DW: The Nvidia driver increases CPU load in DX12/Vulkan (and properly multithreaded DX11?) games compared to the AMD driver, which results in reduced performance when CPU-bound. The theory is that Nvidia does software thread scheduling in its driver, versus AMD's hardware scheduling, to improve performance in lightly threaded DX11 games at the cost of increased overhead.

    Digital Foundry saw the same behavior when testing Doom Eternal (Vulkan) on an Xbox One X APU: going from an RTX 2060 to an RX 6800 XT at CPU-bound settings doubled the framerate.
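
    The underlying logic is simple enough to sketch: each frame costs some CPU time (game logic plus driver work) and some GPU time, and the slower side sets the framerate, so extra driver overhead only shows up once the CPU side dominates. A toy model with made-up millisecond costs, just to illustrate:

    [CODE]
    # Toy bottleneck model: framerate is capped by the slower of the
    # CPU side (game logic + driver submission) and the GPU side.
    # All millisecond costs below are made-up illustrative numbers.

    def fps(game_ms: float, driver_ms: float, gpu_ms: float) -> float:
        """Frames per second under a simple max-of-stages model."""
        frame_ms = max(game_ms + driver_ms, gpu_ms)
        return 1000.0 / frame_ms

    GAME_MS = 7.0  # hypothetical CPU cost of game logic per frame

    # GPU-bound case (think 4K/Ultra): driver overhead is hidden.
    print(fps(GAME_MS, driver_ms=2.0, gpu_ms=16.0))  # ~62.5 FPS
    print(fps(GAME_MS, driver_ms=0.5, gpu_ms=16.0))  # ~62.5 FPS

    # CPU-bound case (1080p/Medium on a slow CPU): driver overhead
    # directly lowers the framerate ceiling.
    print(fps(GAME_MS, driver_ms=2.0, gpu_ms=4.0))   # ~111 FPS
    print(fps(GAME_MS, driver_ms=0.5, gpu_ms=4.0))   # ~133 FPS
    [/CODE]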
     
    thewizzard1, Vasudev and JRE84 like this.
  2. Kevin

    Kevin Egregious

    Reputations:
    3,258
    Messages:
    10,704
    Likes Received:
    1,693
    Trophy Points:
    581
    Hardware Unboxed trying to get banned again.
     
  3. hfm

    hfm Notebook Prophet

    Reputations:
    2,070
    Messages:
    4,971
    Likes Received:
    2,660
    Trophy Points:
    231
    This is really only going to be relevant for gaming at low graphics settings and low resolutions. It doesn't seem relevant unless you're all about competitive FPS and for some reason are still on an old CPU. I definitely respect the research, though.
     
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,383
    Messages:
    3,326
    Likes Received:
    3,503
    Trophy Points:
    331
    Except that wasn't the case??? With the lower-end CPUs, WD:L and H:ZD were limited to less than 100 FPS on average, with minimums closer to 60, on the Nvidia cards. On a high refresh display, those drops are noticeable, not to mention the noticeable frametime stuttering when CPU-bound at low framerates. BTW, the i3-10100 performs about the same as the i7-6700K and 7700K, which are still relatively popular among gamers who keep the same CPU for several generations and just upgrade the GPU.

    The Nvidia driver overhead probably explains why I was getting such poor performance in the large modes in CoD Modern Warfare/Warzone and Black Ops Cold War, which are DX12-exclusive, on my previous overclocked i5-8600K with all settings on lowest. In Cold War especially, it was a near-constant stuttery 90-100 FPS with maxed-out CPU utilization.
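
    Side note for anyone who wants to check this on their own system: the stutter shows up in the frametime log, not just the average. A minimal sketch, assuming a PresentMon-style CSV with a MsBetweenPresents column (the log file name is made up, and "1% low" here means the average FPS over the slowest 1% of frames, which is only one common definition; other tools compute it differently):

    [CODE]
    # Sketch: compute average FPS and 1% lows from a PresentMon-style
    # frametime log. Assumes a CSV with a "MsBetweenPresents" column.
    import csv

    def fps_stats(path):
        """Return (average FPS, 1% low FPS) from a frametime log."""
        with open(path, newline="") as f:
            frametimes = [float(row["MsBetweenPresents"])
                          for row in csv.DictReader(f)]
        frametimes.sort()  # ascending: slowest frames end up at the end
        avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
        worst = frametimes[-max(1, len(frametimes) // 100):]  # slowest 1%
        low_1pct = 1000.0 / (sum(worst) / len(worst))
        return avg_fps, low_1pct

    avg, low = fps_stats("coldwar_frametimes.csv")  # hypothetical log file
    print(f"avg: {avg:.1f} FPS | 1% low: {low:.1f} FPS")
    [/CODE]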

    [Screenshot: Black Ops Cold War, 2020]
     
    Vasudev, hfm and BrightSmith like this.
  5. hfm

    hfm Notebook Prophet

    Reputations:
    2,070
    Messages:
    4,971
    Likes Received:
    2,660
    Trophy Points:
    231
    I'm not saying it's not happening, but if you look at the 1080p Ultra and higher numbers, the higher-tier GPUs pull away convincingly on stronger CPUs, over and above the lesser GPUs on those same better CPUs. Your example is a 6c/6t part? That would perform worse than a 4c/8t part in most gaming workloads, I would think. No one is pairing that CPU with an RTX 3070+ or RX 6800+.

    I watched the video, and I had to rewatch it to make sure I didn't misinterpret the data. You can basically see the 5700 XT hitting a brick wall at 1080p Ultra in more than one game they test, where the 3070/3090/6900 is still scaling fine. Anyone building a desktop system with these higher-tier GPUs is going to use a far better CPU than an i3-10100 or 1600X; the only reason to show this is to make a case for the research. The real-world application isn't there.

    It was, however, somewhat interesting to me as someone who uses an eGPU with lesser CPUs, but the bottlenecks for eGPU systems are still elsewhere. It would be interesting to see this testing repeated once we start seeing Radeon mobile parts in shipping laptops.
     
  6. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,383
    Messages:
    3,326
    Likes Received:
    3,503
    Trophy Points:
    331
    4 Hyper-Threaded cores are slower than 6 physical cores.


    The 6900 scales fine, but the 3090 does not. In WD:L, the 3090 is CPU-bottlenecked in every scenario except 1440p Ultra with the 5600X.

    Again, you're missing the point. Desktop gamers generally don't upgrade their CPU/platform as often as their GPU. Someone pairing a new RTX 3070 with the enormously popular 3600/3600X is gonna find its performance no better than a 5600 XT, and much worse than a 5700 XT, both of which should be far slower cards than the 3070. And even with a new 5600X, the 3070 is still no better than a 5700 XT. That is a problem.
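
    For what it's worth, the "brick wall" is easy to formalize: if dropping resolution and settings barely raises a GPU's framerate, the CPU and driver are the limit, not the GPU. A toy check with hypothetical numbers:

    [CODE]
    # Sketch: flag a CPU/driver bottleneck from resolution scaling.
    # If lowering the GPU load (resolution/settings) barely raises FPS,
    # the CPU side is the limiter. All numbers below are hypothetical.

    def cpu_bound(fps_light: float, fps_heavy: float,
                  tol: float = 0.10) -> bool:
        """True if FPS at light GPU settings is within tol of heavy settings."""
        return (fps_light - fps_heavy) / fps_heavy < tol

    # Hypothetical results for one GPU on the same CPU:
    print(cpu_bound(fps_light=105.0, fps_heavy=100.0))  # True: hit the wall
    print(cpu_bound(fps_light=160.0, fps_heavy=100.0))  # False: still scaling
    [/CODE]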

    [Chart: Horizon Zero Dawn benchmark]
     
    Last edited: Mar 13, 2021
    hfm likes this.
  7. hfm

    hfm Notebook Prophet

    Reputations:
    2,070
    Messages:
    4,971
    Likes Received:
    2,660
    Trophy Points:
    231
    I can cherry-pick graphs out of the research as well. Instead of looking at 1080p Medium in Watch Dogs, let's look at 1440p Ultra:
    [Chart: Watch Dogs, 1440p Ultra]
     
  8. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,383
    Messages:
    3,326
    Likes Received:
    3,503
    Trophy Points:
    331
    Way to miss the point again.
     
    hfm likes this.
  9. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    10,316
    Messages:
    10,985
    Likes Received:
    8,350
    Trophy Points:
    931
    Dunno if it helps anyone, but I managed to somewhat fix constant micro-stutters/FPS drops by reverting to the Microsoft driver, which includes just the display driver, control panel, and HD audio (HDMI) driver, with no PhysX or other bloatware. I had to use WUMT x64 to scan my system after DDU'ing the 430.xx Dell driver; the stock Nvidia 440.xx and 460.xx drivers all had micro-freezes and high battery drain in Optimus mode on my 980M with a 6700HQ.
    That reduced battery drain from 45W to 7-12W. Also, the Nvidia GPU utilization tray icon seems to be buggy, and I had to uncheck it to stop the dGPU freezing on the Windows desktop.
     
  10. hfm

    hfm Notebook Prophet

    Reputations:
    2,070
    Messages:
    4,971
    Likes Received:
    2,660
    Trophy Points:
    231
    So everyone is playing on old CPUs with $550-1500 MSRP GPUs (3070 through 3090/6900) at 1080p medium settings to realize gains over competing GPUs? I don't get it. If you're not cranking it up to High/Ultra at 1080p (discounting that 1440p is the fastest-growing segment) with a GPU that expensive, you're either a serious competitor in esports FPS titles or... I can't think of another reason. I crank everything up as high as it can go, and I only have a 2070 constrained by a 15W, three-year-old CPU over a Thunderbolt connection. I don't know why I would drop to medium settings when the FPS gain there is meaningless in all but esports. Every single case where they showed the Radeon outperforming the Nvidia cards was at 1080p with lowered settings, on a GPU that can handle better. It's just in the name of finding a thing that most people won't encounter. But if you are in that situation, it's good information; I never countered that.

    I'm not arguing that the research hasn't shown that when wildly CPU-constrained (cranking out as many frames as possible at low resolution and lowered settings) the Radeon cards/drivers are more efficient. But I am arguing that at the price points of those cards, it seems weird to pair them with a super inexpensive CPU when something that would make better use of the cards from either vendor costs a fraction of the video card, even at the $550-650 price point. And a lot of AMD boards can take a better Ryzen CPU without upgrading the motherboard, just drop in a 3800XT, unlike on the Intel side.

    Maybe I'm just in the camp of people who think spending another $300 on a CPU when you're spending $750 on a 3080 makes more sense, and I'm out of touch with the multitudes of people using old CPUs with brand-new expensive video cards.
     
    JRE84 likes this.