Nvidia Thread

Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    680
    Messages:
    1,449
    Likes Received:
    1,115
    Trophy Points:
    181
    hmscott likes this.
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,852
    Messages:
    20,203
    Likes Received:
    25,018
    Trophy Points:
    931
    That's what I expected. :)

The 2xxx series are RTX, and the GTX 11xx series are normal GPUs - pure non-RTX GPUs with lower power requirements to fit into laptops.

The huge RTX die - nearly doubled in size to add the RT and Tensor cores - brings higher power and cooling requirements that can't be scaled down Max-Q style, and ray-tracing FPS is already poor even with the 2080 Ti at full power, so Nvidia might as well either disable those sections or leave them out of laptop GPU dies and non-RTX desktop GPUs.

But it is confusing when we've already seen GTX 2060/2050 GPUs mentioned. Maybe the GTX 11xx series is for laptops only?

Still many questions to be answered... like will there be desktop versions of the GTX 2080/GTX 1180 and GTX 2070/GTX 1170, or will there only be high-end GTX laptop versions?

More confusing is that they are making an RTX 1080 / RTX 1070... I can't imagine how slow those are going to be at ray-tracing @ 1080p, given the full-power 2080 Ti isn't holding a firm 60fps @ 1080p.

    "Perhaps the most intriguing tidbit from the announcement is that the aforementioned Sky models will support future GPUs such as the "GTX 1180, GTX 1170, RTX 1080, RTX 1070" due to their versatile MXM 3 slots. Had Eurocom mentioned these GPUs before Gamescom, then we would have been quick to label them as placeholder names. However, the reseller is explicitly mentioning these GPU names almost two full days after the public reveal of the desktop RTX series in Cologne.

    It's possible that Nvidia will introduce a different naming convention yet again for its next generation of laptop GPUs. At best, the diverging names could simply be an attempt by the chipmaker to better distinguish between its laptop and desktop GPUs since current mobile Pascal GPUs have the exact same names as their desktop counterparts. At worst, however, we could be seeing a relatively minor refresh for mobile gamers."
     
    Last edited: Aug 22, 2018
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,852
    Messages:
    20,203
    Likes Received:
    25,018
    Trophy Points:
    931
    DLSS doesn't have to be perfect, only good enough
    https://www.reddit.com/r/nvidia/comments/99g245/dlss_doesnt_have_to_be_perfect_only_good_enough/

    Tech_AllBodies 3 points 4 hours ago
    "The idea is it's better/cheaper computationally than rendering a native higher resolution and/or other forms of MSAA.

    i.e. you could run 1080p + DLSS on a 4K monitor and it'd look nearly as good as native 4K. Or you could run 1440p + DLSS and it'd look better than 1440p with 4x MSAA, and also run faster.

    Latency is just the time it takes to render and deliver the frame. So the idea is you'll take advantage of DLSS to do MUCH less render time (because you'll render at lower res and/or turn off all AA), then add DLSS as post processing, but the whole thing will take less time than doing that level of quality natively, while looking 95%+ as good."
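    As a rough illustration of that render-budget argument, here is some back-of-envelope arithmetic (every cost below is an assumed number for the sake of the example, not a measured DLSS figure):

    ```python
    # Frame-time sketch of "render at lower res + DLSS upscale" vs native 4K.
    # All costs are illustrative assumptions, not measured DLSS numbers.

    def pixels(width, height):
        return width * height

    native_4k_px = pixels(3840, 2160)   # 8,294,400 pixels
    internal_px = pixels(1920, 1080)    # 2,073,600 pixels (exactly 1/4 of 4K)

    # Assume shading cost scales with pixel count, and native 4K takes
    # 16.6 ms per frame (60 fps).
    native_4k_ms = 16.6
    render_ms = native_4k_ms * internal_px / native_4k_px  # ~4.15 ms at 1080p
    dlss_ms = 3.0  # assumed fixed upscale/post-process cost on the Tensor cores

    total_ms = render_ms + dlss_ms
    print(f"native 4K: {native_4k_ms:.1f} ms/frame, 1080p + DLSS: {total_ms:.2f} ms/frame")
    ```

    With those assumed costs the 1080p + DLSS path comes in well under the native 4K frame time, which is the whole pitch: spend a quarter of the shading work, pay a fixed upscale cost, and hope the result looks "95%+ as good".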

    DLSS could work in tandem with DSR.
    https://www.reddit.com/r/nvidia/comments/99idv2/dlss_could_work_in_tandem_with_dsr/

    RSF_Deus 2 points an hour ago
    "Ah, I think I understand what you mean, for example DSR at 4K, but game set to 50% resolution scale instead of 100% but with DLSS to get an "AI upscaler" ?"


    So, is DLSS a "cheat" - not really rendering at 4k - or is DLSS padding frames to fluff up the FPS numbers? At what resolution were those Nvidia-published RTX 2080 benchmarks using DLSS actually rendered? :D
     
    Last edited: Aug 23, 2018
    Robbo99999 likes this.
  4. mitchega

    mitchega Notebook Consultant

    Reputations:
    23
    Messages:
    103
    Likes Received:
    130
    Trophy Points:
    56
    I think it's safe to assume the actual renders are at 4k per Nvidia's graph. I just wish they had included a baseline for the quality settings and AA utilized.
     
    hmscott likes this.
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,852
    Messages:
    20,203
    Likes Received:
    25,018
    Trophy Points:
    931
    Why would you think it's safe to assume that? :)

    They show 4K DLSS at 2x the performance of the 1080 - a huge jump just from changing the anti-aliasing method to DLSS. That's not possible without changing some other variable, like the resolution.

    Otherwise simply disabling anti-aliasing would cause the performance to jump up even higher, and that's also not possible.

    The non-DLSS sections of the performance graph bars make sense, the DLSS jump in performance doesn't, unless the rendering resolution or some other quality setting(s) was also dropped to reduce the load on the GPU, increasing the FPS.

    It could be that DLSS is generating "additional" frames other than those rendered by the GPU, padding / inflating the FPS numbers. Not real rendered FPS, but DLSS "fluffed" FPS.

    Something is going on, maybe someone can point me to the docs that explain it all?
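    That objection can be put in simple frame-time terms. A hypothetical sketch (the AA cost shares below are assumptions for illustration, not measurements): if AA is only part of each frame's time, then even removing it entirely puts a hard ceiling on the speedup.

    ```python
    # If frame_time = base_render + aa_cost, swapping AA for something free
    # can speed things up by at most 1 / (1 - aa_fraction).
    # The aa_fraction values are assumed, purely for illustration.

    def max_speedup_from_dropping_aa(aa_fraction):
        return 1.0 / (1.0 - aa_fraction)

    for aa_fraction in (0.10, 0.25, 0.50):
        s = max_speedup_from_dropping_aa(aa_fraction)
        print(f"AA at {aa_fraction:.0%} of frame time -> at most {s:.2f}x faster")
    ```

    A 2x gain would require the old AA pass to have consumed fully half of every frame - far more than typical AA costs - so something else (internal resolution, quality settings, or padded frames) would have to be changing too.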
     
    Last edited: Aug 23, 2018
  6. Donald@HIDevolution

    Donald@HIDevolution Company Representative

    Reputations:
    12,818
    Messages:
    9,110
    Likes Received:
    5,298
    Trophy Points:
    681
    From the http://forum.notebookreview.com/thr...7-with-6-core-coffee-lake-cpus.804068/page-17 thread.
     
    Vistar Shook, bennyg, Papusan and 2 others like this.
  7. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,023
    Messages:
    6,608
    Likes Received:
    5,717
    Trophy Points:
    681
    Well, in those charts DLSS is actually providing anything from a 25% to 55% increase in performance over non-DLSS Turing - not quite the 50-100% you mention. But yeah, I was gonna say that TAA is not a massive performance hit, although TAA x2 (x3 if it exists, not sure) has a bigger hit if I recall correctly. So yes, maybe that indicates a more expensive form of AA was being used in the control group. Because, like you intimate, DLSS doesn't magically increase fps - it just replaces any existing, more expensive forms of AA - so DLSS won't increase frame rates above a no-AA control group, for instance.
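    Those 25-55% figures can be run backwards through a simplified frame-time model (purely illustrative - it assumes DLSS post-processing is near-free, which it isn't in practice): if swapping the old AA for a free replacement yields speedup s, the old AA pass must have been taking about 1 - 1/s of each frame.

    ```python
    # Assuming frame_time = base + aa_cost and the DLSS replacement is ~free,
    # a measured speedup s implies the old AA pass took (1 - 1/s) of the frame.
    # Illustrative model only; real DLSS has its own nonzero cost.

    def implied_aa_fraction(speedup):
        return 1.0 - 1.0 / speedup

    for s in (1.25, 1.55):
        print(f"{s:.2f}x gain -> old AA pass ~{implied_aa_fraction(s):.0%} of frame time")
    ```

    Under that model, 25-55% gains would mean the control runs spent roughly 20-35% of each frame on AA - a lot for plain TAA, which fits the suspicion that a heavier AA setting (or a different internal resolution) was used in the non-DLSS baseline.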
     
    Last edited: Aug 23, 2018
    hmscott likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,852
    Messages:
    20,203
    Likes Received:
    25,018
    Trophy Points:
    931
    Geforce RTX 2080 vs GTX 1080 Comparison!
    The Good Old Gamer
    Published on Aug 22, 2018
    Due to backlash over the Gamescom demos, Nvidia has decided to give us a first look at the performance difference between the RTX 2080 and GTX 1080.

    Geforce RTX: The Cost of Progress
    The Good Old Gamer
    Published on Aug 22, 2018
    Nvidia had to choose with their GeForce RTX lineup: advance technology, or improve performance. They made their choice, and now the community seems divided.
     
    Dr. AMK likes this.
  9. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,009
    Messages:
    1,944
    Likes Received:
    4,116
    Trophy Points:
    281
    RTX 2080 TWICE The Performance Of GTX 1080??
     
    hmscott likes this.
  10. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,852
    Messages:
    20,203
    Likes Received:
    25,018
    Trophy Points:
    931
    People are MAD about NVIDIA RTX... and I think this is why
    JayzTwoCents
    Published on Aug 22, 2018
    Ever since NVIDIA held its press keynote on Aug 20, 2018, there has been a lot of discussion about RTX, and most viewers are angry - I think I know why... tell me how you feel about this situation in the comments below.
     
    Aroc likes this.