How will Ampere scale on laptops?

Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. Kunal Shrivastava

    Kunal Shrivastava Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    7
    Trophy Points:
    6
    With the RTX 3080 and 3090 drawing 300-400 watts on 3rd-party boards, will the performance delta between laptops and desktops be even greater than it was with Turing? Assuming Nvidia sticks with the Samsung 8nm node and not 7nm, it's hard to imagine massive gains, especially since laptops can't accommodate much more than 200 watts. That seems to be the upper limit on Alienware/MSI/Clevo machines, because anything higher is very hard to cool.
    It makes sense that the Pascal generation had the smallest gap between laptops and desktops, given that the GTX 1080 FE had a 180-watt TDP.
    My guess is that gains on laptops will be a lot more modest, around 30-40% over Turing, and maybe 50-60% better RTX performance with the 2nd-gen RT cores, but not the 2x claimed on desktops.
    All this is for the 180-200 W variants, which are rare because the majority of laptops opt for Max-Q, where gains might be even lower. Ampere might not get enough breathing room in thin-and-light laptops to have a sizeable impact over Turing.
    Considering that Turing still had a manageable TDP yet some RTX 2080 Max-Q laptops were flat-out beaten by overclocked desktop RTX 2060s, I'm not overly optimistic about the mobile RTX Ampere launch. Maybe Nvidia should go back to the mobility naming scheme rather than mislead people into paying for desktop performance.
    Thoughts?
     
    Last edited: Sep 7, 2020
  2. etern4l

    etern4l Notebook Deity

    Reputations:
    1,163
    Messages:
    1,960
    Likes Received:
    1,392
    Trophy Points:
    181
    30-60% is still a good improvement compared with the "Super" upgrade. Also, I wonder whether there are diminishing returns in terms of performance per watt. If so, a GPU running at 400 W wouldn't be twice as fast as one running at 200 W.
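    As a rough back-of-the-envelope (and this is an assumed model, not measured data): dynamic power scales roughly with V²·f, and clocks scale close to linearly with voltage near the top of the voltage/frequency curve, so power grows roughly with the cube of clock speed. A quick Python sketch of what that implies:

    Code:
    def relative_perf(power_ratio, exponent=3.0):
        """Estimated performance ratio for a given power ratio,
        assuming power ~ clock^exponent (the cube law is an assumption)."""
        return power_ratio ** (1.0 / exponent)

    print(relative_perf(2.0))  # ~1.26: doubling 200 W -> 400 W buys only ~26% more clock
    print(relative_perf(0.5))  # ~0.79: halving the power budget costs only ~21% of clock speed

    Under that model a 400 W card is nowhere near twice as fast as the same silicon at 200 W, which is exactly the diminishing-returns effect above.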
     
  3. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    85
    Messages:
    357
    Likes Received:
    150
    Trophy Points:
    56
    I think Clevo and Alienware can handle a 300 W GPU easily, so I'm optimistic about seeing 250+ W GPUs in the next gen of gaming laptops.
     
    seanwee likes this.
  4. Kunal Shrivastava

    Kunal Shrivastava Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    7
    Trophy Points:
    6
    My dm3 can barely cool a 180 W GTX 1080 (90 degrees and throttling on Kryonaut: 1625-1810 MHz). For power delivery I think gallium nitride (GaN) chargers are the future, but I don't think even the best laptop vapour chambers can dissipate 300 W of heat. Unless there is a breakthrough in laptop liquid cooling or some such, I just can't see 300 W happening on laptops.
     
    DreDre, BlueBomber and hfm like this.
  5. Deks

    Deks Notebook Prophet

    Reputations:
    1,196
    Messages:
    5,081
    Likes Received:
    1,994
    Trophy Points:
    331
    There are already 'diminishing returns' in laptop GPUs on the NV side.
    It's been a month or so, but I saw an article on the price/performance ratio of mobile GPUs and how much performance you actually get relative to desktop GPUs.

    The 'cutoff point' was the RTX 2060/2070 and the 5600M/5700M (though AMD wasn't featured in that article, which is strange considering their mobile Navi GPUs have identical specs to their desktop counterparts, with only slightly lower clocks and memory bandwidth, which would put them within 5-10% of each other).

    So, with Ampere's very large power requirements on the desktop, it remains to be seen how performance will scale at lower power envelopes.

    Vega 56 initially had very large power demands on desktop, but we know this was because its voltage was set too high from the factory (AMD does this to increase the number of usable dies)... and when Acer put Vega 56 into the PH517-61, they retained 95% of its performance while dropping its TDP down to 120 W (the stock-clocked version in my laptop doesn't even reach the full 120 W, which means I can easily overclock both the core and the HBM and gain 10% or more performance before hitting the 120 W limit; that's very efficient).
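    To put rough numbers on that (taking ~210 W as the desktop Vega 56's stock board power, which is a ballpark assumption on my part):

    Code:
    # Perf-per-watt comparison for the Vega 56 example above.
    # The 210 W desktop board power is an assumed ballpark figure.
    desktop_perf, desktop_power = 1.00, 210.0
    laptop_perf, laptop_power = 0.95, 120.0  # 95% of performance at 120 W

    ratio = (laptop_perf / laptop_power) / (desktop_perf / desktop_power)
    print(round(ratio, 2))  # ~1.66x the performance per watt

    That works out to roughly 1.66x the performance per watt, mostly just from lowering voltage and clocks.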

    Something similar could happen here... however, NV (to my knowledge) isn't known for overvolting their GPUs out of the factory to increase the number of usable dies like AMD does.

    That said, mobile GPUs are usually better binned than desktop versions, so NV will likely just be dropping clocks and voltages by certain amounts to fit the lower power budgets... not sure how much of a performance drop there will be, but we'll see.
     
    Last edited: Sep 6, 2020
  6. Kunal Shrivastava

    Kunal Shrivastava Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    7
    Trophy Points:
    6
    Aren't the Tensor and RT cores the 'notoriously power-hungry' part?
     
  7. JRE84

    JRE84 Notebook Deity

    Reputations:
    289
    Messages:
    1,295
    Likes Received:
    660
    Trophy Points:
    131
    Basically, 320 W can be shrunk down.
     
  8. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    85
    Messages:
    357
    Likes Received:
    150
    Trophy Points:
    56
    That's kinda weird. Maybe you live in a pretty hot place? Where I live the ambient temperature is somewhere between 20 and 30°C, and at stock clock speeds my GTX 1080 draws 200+ W (I think 210 W is the maximum I've seen in my system) while the cooling system keeps the temperature around 80°C without throttling. Also, based on what people report about the Area-51m R2, that laptop keeps the RTX 2080 Super under 80°C while drawing 200+ W. That's why I'm optimistic. When using SLI in my system at 100% load, both GPUs draw 200+ W each (400+ W in total) and the system is able to keep temperatures under 85°C without thermal throttling.
     
    seanwee and DreDre like this.
  9. Deks

    Deks Notebook Prophet

    Reputations:
    1,196
    Messages:
    5,081
    Likes Received:
    1,994
    Trophy Points:
    331
    It also comes down to how OEMs implement cooling in laptops.
    Bear in mind that you won't always see top-notch cooling on laptops that use all-AMD hardware, for example (my Acer PH517-61 is an exception)... OEMs prefer to put more quality into Intel/NV machines for some reason, despite AMD having equally or more efficient hardware.
     
    seanwee likes this.
  10. Kunal Shrivastava

    Kunal Shrivastava Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    7
    Trophy Points:
    6
    Well no, ambient temperatures here are quite pleasant :) You probably have a vapour chamber on your SLI laptop, but even then I'm surprised you can stay at 85 on dual cards. From what I've read, dual-1080 laptops could be sold as portable nuclear reactors.
     