"Not Having a Ray Tracing GPU in 2019 is Just Crazy!" - Nvidia CEO

Discussion in 'Gaming (Software and Graphics Cards)' started by hmscott, Aug 17, 2019.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    The performance difference in Witcher 3 between HairWorks preset high (visually comparable to 64x tessellation factor in the AMD driver) and low (comparable to 8x) is around 10% on my Pascal GPU, which is hardly a crippling performance loss. Again, it’s not Nvidia’s fault that AMD’s tessellation performance is weak. Put down the tinfoil hat.

    As for the visual difference, I do notice the difference in the roundness of the hair strands between high and low. It’s visually similar to the difference between higher and lower poly count 3D geometry. But as with most discussions centering around the merits of higher visual settings in games, YMMV.

    As verified by independent testers like Battle(non)sense, FreeSync monitors do not work as well as G-Sync monitors with the physical FPGA module inside. The refresh rate on FreeSync monitors is less stable in adjusting itself to match the in-game frame rate, which results in more stutter. This is also apparent when comparing mobile G-Sync, which is based on the same Adaptive-Sync standard as FreeSync, side-by-side with G-Sync monitors that have the module. The monitor with the G-Sync module is noticeably smoother.
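    The refresh-matching point above can be illustrated with a toy simulation (my own sketch, not from the thread; the stutter metric and the steady 45 FPS input are assumptions for illustration). With a fixed-refresh display, each frame waits for the next refresh boundary, so a 45 FPS game shows an uneven 16.7/33.3 ms cadence; an ideal variable-refresh display shows each frame as soon as it is ready:

    ```python
    # Toy model (illustrative only): fixed refresh vs. ideal variable refresh.
    import math

    def present_times(frame_times, refresh_hz=None):
        """Return on-screen presentation times in ms.
        refresh_hz=None models an ideal variable-refresh display
        (frame shown the moment it is ready); a fixed refresh_hz
        rounds each frame up to the next refresh boundary."""
        shown = []
        for t in frame_times:
            if refresh_hz is None:
                shown.append(t)
            else:
                period = 1000.0 / refresh_hz
                shown.append(math.ceil(t / period) * period)
        return shown

    def judder(times):
        """Standard deviation of frame-to-frame intervals, in ms.
        Zero means a perfectly even cadence."""
        gaps = [b - a for a, b in zip(times, times[1:])]
        mean = sum(gaps) / len(gaps)
        return (sum((g - mean) ** 2 for g in gaps) / len(gaps)) ** 0.5

    # Frames rendered at a steady 45 FPS (about 22.2 ms apart).
    frames = [i * 1000.0 / 45 for i in range(60)]
    print(judder(present_times(frames)))      # ideal VRR: even cadence, judder ~0
    print(judder(present_times(frames, 60)))  # fixed 60 Hz: alternating short/long gaps
    ```

    A real Adaptive-Sync panel sits between these two extremes, which is the thread's point: how closely the refresh tracks the frame rate determines how much of that judder remains.
    
    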

    Nvidia did not claim this. In fact, they allow DXR on GPUs without dedicated RT hardware (GTX 1060 and faster) via a compute-shader fallback, but as expected the quality is lower and the performance is vastly worse.

    Control has by far the most robust real-time ray tracing implementation to date in a AAA title. This is why cranking all the ray tracing settings to the max is so expensive; it is very much a forward-looking game. You can watch the Digital Foundry video and judge for yourself whether the visual differences are noticeable or not:

    This is wrong. AMD’s GPUs lose to their non-RTX Nvidia performance equivalents at all levels. 5700XT/R7 loses to 1080 Ti, V64 loses to 1080, and 580 loses to 1060.

    saturnotaku likes this.
  2. hfm

    hfm Notebook Prophet

    I agree with everything that person said to a tee. All of it. It's a very well reasoned argument, and they are correct. A lot of features we take for granted, from the birth of these cards in 1996 to now, were spearheaded on new metal where the game devs had to catch up, or we had to go through a couple of gens to get the full scope of what it meant.

    Like I've been saying all along, Nvidia KNEW full well people would complain to no end about this, but they had to do it to keep pushing things forward. If AMD is already launching RT features next year, they were well on their way to adding it to their designs as well. It takes MANY YEARS to get this stuff from the design phase to actual shipping product. AMD was working on it a while back too if they are launching stuff in 2020.
    Prototime likes this.
  3. hertzian56

    hertzian56 Notebook Geek

    Idk man, maybe they're better now, but when I had an HD 6850 the drivers seemed whack to me and not as well maintained as Nvidia's; that was way back then, though. Thermally, what I said about the ones I mentioned stands: they just run way too hot. My Quadro K4000M is an example of a card crippled by Nvidia, though I think the newer Quadros are okay; I'm thankfully out of that hole. The latest drivers for my M4000 are pretty old, and Win10 handles all that now, which is one good thing about Win10, because it was a pain getting drivers for such an old card on Win7. The drivers don't even recognize it as an M4000; I think it says 7700M series HD or something like that in the AMD utility that Win10 auto-installed. AMD's website is a mess, what a maze to find older drivers, or drivers at all, when I tried it for my M4000.
  4. Kevin

    Kevin Egregious

    So basically you will accept it when AMD starts selling GPUs with dedicated ray tracing hardware built into the die, only because AMD isn't Nvidia.

    You also falsely believe Nvidia is using some proprietary implementation, when they are just using Microsoft's open DirectX Raytracing API (DXR), which is exactly what AMD is going to use as well.
    hfm likes this.
