Next gen Laptop GPU timeline

Discussion in 'Gaming (Software and Graphics Cards)' started by shinryu744, Aug 14, 2018.

  1. BruceEdwards

    BruceEdwards Notebook Enthusiast

    Reputations:
    42
    Messages:
    47
    Likes Received:
    25
    Trophy Points:
    26
I believe we'll see some monstrous gaming laptops with 2080s (e.g. the Titans), but I also believe we may have seen the last of SLI (or equivalent dual-GPU tech) laptops. I can't see how they can bridge the gap in required power draw for a pair of 2080s and a decent CPU.

    I hope to be proven wrong!
     
  2. hfm

    hfm Notebook Virtuoso

    Reputations:
    1,777
    Messages:
    3,780
    Likes Received:
    1,539
    Trophy Points:
    231
    I dunno, the estimated TDP on the 2080 is pretty high.. It exceeds the 1080 by a decent margin it seems... we'll see I suppose! :) I'm sure if it can be done we will see it happen.
     
  3. Kevin

    Kevin Egregious

    Reputations:
    3,128
    Messages:
    10,302
    Likes Received:
    1,282
    Trophy Points:
    581
    Of course there will be. The RTX 2080 Max-Q was already leaked.

    The days of us getting fully clocked, full-bore desktop cards like Pascal are probably over, but there's no way Nvidia will leave hundreds of millions on the table.
     
  4. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,020
    Messages:
    5,817
    Likes Received:
    372
    Trophy Points:
    251
    Is ray tracing the next big thing?
    I really don't see it. I can't see what the big deal is about improved reflections. Maybe I'm just unimpressed, but that's just me.
     
  5. hfm

    hfm Notebook Virtuoso

    Reputations:
    1,777
    Messages:
    3,780
    Likes Received:
    1,539
    Trophy Points:
    231
    It's more than that, it's lighting, the whole nine yards. We've been able to fake these things well enough though. It'll take some years before it's commonplace.
     
  6. Prototime

    Prototime Notebook Evangelist

    Reputations:
    116
    Messages:
    494
    Likes Received:
    671
    Trophy Points:
    106
    I hope not, though that has been my biggest fear since Max-Q was first announced. With Pascal, we finally, finally got desktop GPUs in notebooks. What a joke it'll be if Nvidia gives us that for only a single generation.

    Wouldn't surprise me though.
     
  7. yrekabakery

    yrekabakery Notebook Deity

    Reputations:
    660
    Messages:
    1,936
    Likes Received:
    1,930
    Trophy Points:
    181
    Except it's not the whole nine yards. Current RTX is hybrid raster-RT rendering. It'll be a long, long time before hardware catches up to the point where everything can be ray traced at the resolutions and frame rates gamers demand.

     
  8. hfm

    hfm Notebook Virtuoso

    Reputations:
    1,777
    Messages:
    3,780
    Likes Received:
    1,539
    Trophy Points:
    231
    I'm just saying it's not just reflections.. well.. all ray tracing is just light reflecting off of and being manipulated by objects.. again, just reflecting and diffusing. It can increase realism above and beyond the tricks implemented today.. it's just going to take a while to get there without making a lot of sacrifices and optimization decisions..

    I applaud Nvidia for taking the first step even if it's of limited practicality right now.. the first step is always the hardest.
     
  9. franzerich

    franzerich Notebook Evangelist

    Reputations:
    4
    Messages:
    493
    Likes Received:
    113
    Trophy Points:
    56
    Let the speculations begin...

    Since there was a rumor that the mobile GPUs will be announced at CES 2019, let us speculate on how the low-end and mid-range GPUs might look in terms of basic hardware configuration. I asked the magic ball and it gave me the following information, based on existing data and guesswork:

    TU102 (RTX 2080Ti) has 6 GPCs with each having 6 SM Units
    TU104 (RTX 2080) has 6 GPCs with each having 4 SM Units
    TU106 (RTX 2070) has 3 GPCs with each having 6 SM Units

    Which can result in:
    TU107 (GTX 2060) has 2 GPCs with each having 6 SM Units
    TU108 (GTX 2050) has 2 GPCs with each having 4 SM Units

    Each SM having 128 cores makes a MAXIMUM of:
    TU102 (RTX 2080Ti) = 6x6x128 = 4608 cores
    TU104 (RTX 2080) = 6x4x128 = 3072 cores
    TU106 (RTX 2070) = 3x6x128 = 2304 cores
    TU107 (GTX 2060) = 2x6x128 = 1536 cores
    TU108 (GTX 2050) = 2x4x128 = 1024 cores

    There is also a chance that the GTX 2050 is still only 1x6x128 = 768 cores. However, as shader core counts are increased throughout the whole series, that will probably apply to the mid-range chips as well, so 2x4x128 makes more sense. Of course, the GTX 2050 Ti would get the maximum number of cores (1024) and the regular GTX 2050 slightly fewer (896). It would be a bummer if they gave the regular GTX 2050 only 768 cores, and it is also unlikely, since the GTX 2050 Ti (1024 cores) would then be 33% faster.

    Then the low-end series:
    There could be a TU109 (GT 2030), which might be 1x4x128 (maximum 512 cores) or 1x6x128 (maximum 768 cores). To be honest, I don't think they'll make the GT 2030 as strong as the GTX 1050 (640 cores), but slightly slower, which can be achieved with 1x4x128 = 512 cores. A GT 2030 with 512 cores would be exactly half of the GTX 2050 Ti (1024 cores), the same as in the current generation, where the GT 1030 (384 cores) is exactly half of the GTX 1050 Ti (768 cores).
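    For anyone who wants to check the arithmetic, here's a minimal Python sketch of the cores = GPCs x SMs-per-GPC x 128 math above. The TU107/TU108/TU109 configurations are this post's guesswork, not confirmed specs:

    ```python
    # cores = GPCs x SMs-per-GPC x cores-per-SM (128 for Turing).
    # Entries marked "speculative" are guesses from this thread, not known silicon.
    CORES_PER_SM = 128

    configs = {
        "TU102 (RTX 2080 Ti)": (6, 6),
        "TU104 (RTX 2080)":    (6, 4),
        "TU106 (RTX 2070)":    (3, 6),
        "TU107 (GTX 2060?)":   (2, 6),  # speculative
        "TU108 (GTX 2050?)":   (2, 4),  # speculative
        "TU109 (GT 2030?)":    (1, 4),  # speculative
    }

    for chip, (gpcs, sms_per_gpc) in configs.items():
        cores = gpcs * sms_per_gpc * CORES_PER_SM
        print(f"{chip}: {gpcs} GPCs x {sms_per_gpc} SMs x {CORES_PER_SM} = {cores} cores")

    # Sanity checks on the ratios argued above:
    assert 2 * 4 * CORES_PER_SM == 1024            # GTX 2050 Ti (max config)
    assert 1 * 4 * CORES_PER_SM == 1024 // 2       # GT 2030 = half a 2050 Ti
    assert round(1024 / 768 - 1, 2) == 0.33        # 1024 vs 768 cores ~ 33% faster
    ```

    Swap in different (GPC, SM) pairs to test other guesses.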

    So, what do you think?
     
    Last edited: Nov 1, 2018
  10. yrekabakery

    yrekabakery Notebook Deity

    Reputations:
    660
    Messages:
    1,936
    Likes Received:
    1,930
    Trophy Points:
    181
    I think you copy-pasted this from some article without citing the original source. :p
     
    derpsauce likes this.