Nvidia Thread

Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    3,481
    Messages:
    5,935
    Likes Received:
    4,729
    Trophy Points:
    431
    I figured we'd see that kind of behaviour with RTX on, in terms of low GPU usage, for the reasons that yrekabakery states above this post, but I did wonder whether GPU-Z and other tools would actually report it as such, or whether they would take the RT cores into account. It would be interesting to have monitoring of both the RT cores and the GPU cores for their percentage utilisation - as a user I'd like to know how much they're separately being used.
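    Out of curiosity, here's a minimal sketch of what the public tooling can actually report today, using NVML through the pynvml Python bindings (assuming they're installed). NVML only exposes one aggregate GPU utilisation figure plus the memory controller - there's no separate RT or Tensor core counter, which is exactly the gap I mean:

    # Minimal NVML polling sketch (assumes the pynvml package is installed).
    # NVML reports a single aggregate GPU utilisation figure -- no public
    # counter exists for RT or Tensor core usage.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        for _ in range(10):  # sample once a second for ten seconds
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            print(f"GPU: {util.gpu}%  memory controller: {util.memory}%")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()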
     
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    5,076
    Messages:
    17,819
    Likes Received:
    21,841
    Trophy Points:
    931
    I am pointing out how the RT cores are mismatched to the GPUs they're attached to: the RTX features bottleneck performance and waste the GPU you paid 2x $$$$ to buy.

    Nvidia should have included RTX only on GPUs whose performance matched, so that RTX doesn't bottleneck the GPU.

    A single "RTX 1060" would have made much more sense, instead of hobbling all the top GPUs with RTX BS.

    That "RTX 1060" would have performed exactly the same at 1080p with RTX ON as the 2080 Ti's RTX features do.

    WTH was Nvidia thinking, reducing the RT / Tensor core counts from the 2080 Ti on down the line as well? Every one of the RTX 20 series GPUs could have fully driven the 2080 Ti's count of RT / Tensor cores, and their GPU cores would still have sat partly idle with RTX ON.

    The RTX 20 series is a total screw-up of a release: mismatched RTX configurations ruining performance and wasting silicon real estate, then overcharging the poor saps who buy them for the privilege of owning RTX-gimped GPUs.

    Slowly it will dawn on the saps that they've been had, and when they try to unload their RTX "white elephants" they will be shocked to find no one wants them, even at heavily reduced prices. o_O
     
  3. yrekabakery

    yrekabakery Notebook Deity

    Reputations:
    540
    Messages:
    1,670
    Likes Received:
    1,571
    Trophy Points:
    181
    You sound like such an expert on GPU architecture, you should be working at Nvidia. /s
     
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    5,076
    Messages:
    17,819
    Likes Received:
    21,841
    Trophy Points:
    931
    Given the power and load reduction with RTX ON (GPU cores idling at 60% utilization), I can't wait to see the power-draw side effects of full-performance 4K DLSS -> full-power GPU + full-power Tensor cores.

    Maybe they'll need to turn on the RT cores to hobble the GPU, so it can slow down enough to run within the power spec?

    Where are all of those Nvidia AI-driven DLSS driver updates with rule-sets for 4K DLSS support?

    NVIDIA's Saturn V DGX-based supercomputing cluster was supposed to be up and running, ready to deliver those rule-sets and driver updates without developer assistance. "Just send us your game."

    WTH is up with DLSS? Problems getting it working? :D
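    If anyone wants to measure those power side-effects themselves, here's a minimal logging sketch using NVML through the pynvml bindings (assuming they're installed) - nvmlDeviceGetPowerUsage reports board power in milliwatts, so you can watch the draw while toggling RTX / DLSS in-game:

    # Minimal board-power logger (assumes the pynvml package is installed).
    # Leave it running while toggling RTX/DLSS to compare power draw.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        while True:
            mw = pynvml.nvmlDeviceGetPowerUsage(handle)   # milliwatts
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            print(f"{mw / 1000:.1f} W  |  GPU {util.gpu}%")
            time.sleep(1)
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()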
     
    Last edited: Dec 7, 2018
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    5,076
    Messages:
    17,819
    Likes Received:
    21,841
    Trophy Points:
    931
    The "big patch" reduces the ray-trace processing to improve FPS, so while faster it's only because there is less load on the RT cores. Reports of the same high input lag added with RTX ON after the patch, and frame drop with hitching crap response locking up for seconds in DX12.

    Battlefield V Ray Tracing Performance Revisited, Has Nvidia Fixed RTX?
    Hardware Unboxed
    Published on Dec 6, 2018


    demos1132 2 hours ago
    "Patch has not yet fixed the price... i'll pass. >___"

    Samsonite Dove 23 hours ago (edited)
    "Leaves. There's a massive difference in the amount of leaves blowing with the wind on the ground. Pre-patch there's loads of them, post-patch there's none, and if there are some, they're mostly static. Note I wrote this comment after the first two scenes shown. So Tim is definitely wrong that we wouldn't notice. It's such a glaringly obvious massive degradation in visual quality it's impossible to miss in my opinion."

    Battlefield 5 RTX Update - HUGE Performance Boost!
    HardwareCanucks
    Published on Dec 5, 2018
    Battlefield 5 got its Tides of War update today and it is supposed to improve performance of NVIDIA's RTX ray tracing by a huge amount. But does this patch really boost framerates by that much? Let's find out!
    deenycest10710 1 day ago
    "Good point about the memory usage, Dimitri. RTX is exciting and adds a lot of visual flair and we should expect even more performance improvements down the line. But maybe this first generation of RTX cards are already obsolete. If an expensive 2080/2070(less expensive) will run out of vram then our new shiny cards were inadequately equipped from nvidia. A LOT of memory compression and improvements will still have to be made by nvidia and game developers utilizing RTX."

    Did NVIDIA REALLY Fix DXR in Battlefield V?
    Joker Productions
    Published on Dec 6, 2018
    Is DXR Ray Tracing FIXED on Battlefield with the latest patch and driver from NVIDIA? Let's find out!
    Nathan A. Torres 11 hours ago
    "Let's fix Ray tracing performance by removing Ray tracing LOL"

    Bigdaddy Watt 7 hours ago
    "i'm still not buying this crap"

    50 FPS More With Battlefield V RTX Overture Update!

    Tech of Tomorrow
    Published on Dec 6, 2018
    Battlefield V Tides of War Chapter 1 Overture DXR Update

    michael jovi 14 hours ago (edited)
    "have a chip dedicated to RT and still lose half of the performance in 4k makes it clear that this technology is not yet ready for launch"

    michael jovi 14 hours ago
    "4k 30fps???.. no thanks."

    Towdeee 10 hours ago
    "Am I the only one who sees no difference between RTX modes?"

    stan464 8 hours ago
    "Towdeee gimmick flop"

    BF V RTX Blind Test - Can You Tell The Difference?
    Optimum Tech
    Published on Nov 23, 2018
    Jaz 1 week ago
    "Holy crap i was way wrong. I thought the right side (B) looked better and had shinier rock surfaces and clearer water, also better lighting on the boat on the shore scene so it had to be the RTX on...i was wrong"

    Gaming Tech UK 1 week ago
    "Only difference I see is RTX ripping your wallet and ruining your framerate."

    Ram Gopal 1 week ago
    "I legit thought right side was RTX ON, if that's what you can get with a 1080Ti without RTX I'm ok with it."

    John Petrov 1 week ago
    "Probably the most useful comparison of RTX/GTX published. Thank you."

    Declan Gallagher 1 week ago
    "Outside of reflections from water sources I can't tell ****."

    10 series GPUs are doing just fine in games; no need to spend 2x-3x for 20 series RTX BS...

    GTX 1060 vs GTX 1070TI vs GTX 1080 vs GTX 1080TI | Tested 13 Games |
    For Gamers
    Published on Dec 6, 2018
     
    Last edited: Dec 7, 2018
  6. Bobbert9

    Bobbert9 Notebook Enthusiast

    Reputations:
    15
    Messages:
    39
    Likes Received:
    22
    Trophy Points:
    16
    I have no reason to trade up my 1080 Ti for this fluff. The way I see it, screen-space lighting and rasterized reflections make for a "forced" shiny world. Humans like shiny things, so yeah, it looks "better" in a lot of those scenes. Plus it plays a lot better, and that's what's gonna matter in a fast-paced game where everyone is trying to kill you. Now maybe in Prey, Dishonored, etc. it would be better placed. Anyway, the ray-traced world might look less shiny because the real world isn't really as shiny as we would like it to be, but hey, games are an escape from the real world anyway, so why not have it look really shiny? (Now I've got that song "Shiny" from Moana going through my head!)
     
    hmscott likes this.
  7. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    3,481
    Messages:
    5,935
    Likes Received:
    4,729
    Trophy Points:
    431
    They can't really keep the same number of RT / Tensor cores all the way from the RTX 2080 Ti down to, say, the RTX 2060, because that would mean the top-tier card performs the same in BF V with ray tracing on as the RTX 2060, and that's just not enough market differentiation - it would severely put 2080 Ti owners' noses out of joint if their card performed the same as an RTX 2060 when using the card's headline feature (ray tracing).
     
  8. Talon

    Talon Notebook Virtuoso

    Reputations:
    931
    Messages:
    2,753
    Likes Received:
    2,794
    Trophy Points:
    181
    No, the key feature of a 2080 Ti is being able to play at 1440p 144Hz in newer AAA titles. No other card can provide this performance, aside from the Titan cards. RTX/DXR is just icing on the cake for owners. A GTX 1080 Ti and the laughable options from AMD can't even come close.
     
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    5,076
    Messages:
    17,819
    Likes Received:
    21,841
    Trophy Points:
    931
    The GPU price differentiation is already safe within the realm of rasterization performance.

    The RTX features should be an on/off effect, not something scaled by reducing the RT / Tensor core count. There was no need to double-scale performance by cutting the RT / Tensor core count as well.

    Scaling the ray-tracing effect down the SKUs doesn't present RT performance as "better"; it starts out "bad" and gets worse the further down the SKUs buyers go. It's doubling down on presenting the RTX features as a failure.

    The 2080 Ti, with the maximum RT / Tensor core count, is the "least worst" ray-tracing result, and cutting the RT / Tensor cores further and further as you go down the line of GPUs makes ray tracing worse and worse.

    It's not giving developers a solid base of RTX GPUs to develop for; now they have to target the least performant RTX GPU, fracturing the installed base of RTX GPUs.

    With one stable level of RTX feature performance across all of the new RTX GPUs, Nvidia would have avoided this failing and shipped a much larger, performance-stable pool of RTX-enabled GPUs for developers to target.

    IDK why this wasn't obvious to Nvidia; they should have seen that RTX needed to be a performance-stable pool of GPUs to offer to developers.

    Nvidia can still come out with one RTX card that pairs the 2080 Ti's RT / Tensor core count with a performance-matched GPU, and make it so inexpensive that everyone with a non-RTX GPU can add RTX features to their PC by installing it as a second card.

    If Nvidia had standardized the RT / Tensor core count across all RTX GPUs at the 2080 Ti's count, that RTX add-on "2nd GPU" would simply be the lowest-priced model in the RTX line.

    Standardizing RTX feature performance would create a lower barrier to entry for RTX hardware, letting everyone turn RTX ON in new RTX games.

    Nvidia would have sold many times as many RTX-enabled GPUs into the market, rapidly creating a vastly greater pool of RT-enabled PCs.

    Nvidia totally screwed up the RTX GPU release.
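    For concreteness, here's a quick sketch of the actual Turing counts (figures from Nvidia's published specs; note that on Turing each SM carries 64 CUDA cores, 8 Tensor cores, and 1 RT core, so the RT / Tensor counts are tied to the shader count per SKU):

    # Turing SKU counts, per Nvidia's published specs.
    # Each Turing SM = 64 CUDA cores + 8 Tensor cores + 1 RT core.
    turing = {
        # SKU:         (SMs, CUDA cores, Tensor cores, RT cores)
        "RTX 2080 Ti": (68, 4352, 544, 68),
        "RTX 2080":    (46, 2944, 368, 46),
        "RTX 2070":    (36, 2304, 288, 36),
    }

    rt_baseline = turing["RTX 2080 Ti"][3]
    for sku, (sms, cuda, tensor, rt) in turing.items():
        # RT cores relative to the 2080 Ti: a rough ceiling on relative
        # DXR throughput if the RT cores are the bottleneck.
        print(f"{sku}: {rt} RT cores ({rt / rt_baseline:.0%} of the 2080 Ti)")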
     
    Last edited: Dec 8, 2018
  10. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    5,076
    Messages:
    17,819
    Likes Received:
    21,841
    Trophy Points:
    931
    Nvidia's 20 series release announcement barely mentioned standard GPU performance; Nvidia spent the entire presentation on RTX features.

    Nvidia positioned the 20 series as the RTX rebirth of gaming GPUs, setting Nvidia up as the only solution for ray tracing going forward.

    Measured by what Nvidia sold everyone as the reason for this series of GPUs (ray tracing and DLSS), Nvidia has failed miserably.
     