RTX 3080 trumps 2080ti by a whopping margin

Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.

  1. Cylix101

    Cylix101 Notebook Consultant

    Reputations:
    72
    Messages:
    236
    Likes Received:
    130
    Trophy Points:
    56
    JRE84 likes this.
  2. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    To be fair, RDNA 2 for example already delivered a 50% increase in performance per watt over the previous generation... and AMD has stated RDNA 3 should do the same (pure uArch changes, independent of the manufacturing node).
    AMD is also slated to release consumer RDNA 3 GPUs with chiplets late next year, at least on the upper high-end models (mid-range ones will apparently still be monolithic), whereas NV will be behind AMD on that front by a year. AMD is also releasing MI200 accelerators for data centers that use MCM late this year... I was thinking of waiting until late next year (or early 2023) to replace what I have, depending also on whether professional software will include GPU acceleration support for AMD GPUs, because right now that's excessively sparse.
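    Those per-generation claims compound, which is worth spelling out. A quick sketch (the 50% figures are AMD's marketing claims for RDNA 2 and RDNA 3, not measured results):

    ```python
    # Back-of-envelope: compounding perf-per-watt gains across generations.
    # The 50% per-gen figure is AMD's claim, not a measurement.
    def compound_gain(per_gen_gain: float, generations: int) -> float:
        """Total multiplier after applying the same per-generation gain repeatedly."""
        return (1 + per_gen_gain) ** generations

    # Two consecutive +50% perf/W jumps (RDNA -> RDNA 2 -> RDNA 3):
    total = compound_gain(0.50, 2)
    print(f"Cumulative perf/W multiplier: {total:.2f}x")  # 2.25x
    ```

    So if both claims hold, RDNA 3 would sit at roughly 2.25x the original RDNA's efficiency.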
     
    JRE84 likes this.
  3. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    A 5-phase VRM with 30 A power stages is already a hard physical limit at ~150 W. In that scenario shunting wouldn't accomplish anything; the over-current protection would just trip every time a phase goes over 30 A. What's the VRM config on BGA systems for the 3080 mobile?
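    The ~150 W ceiling falls out of simple arithmetic: phase count times per-phase current limit times core voltage. A sketch (the ~1.0 V GPU core voltage is an assumption for illustration; real boards derate and won't run every phase at its limit continuously):

    ```python
    # Rough VRM power ceiling: phases x per-phase current limit x core voltage.
    # vcore=1.0 V is an illustrative assumption, not a measured value.
    def vrm_limit_watts(phases: int, amps_per_phase: float, vcore: float = 1.0) -> float:
        return phases * amps_per_phase * vcore

    print(vrm_limit_watts(5, 30))   # 150.0 W -> the hard limit mentioned above
    print(vrm_limit_watts(6, 50))   # 300.0 W -> a 6-phase/50 A board has far more headroom
    ```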

    The only reason laptops are so far behind desktops is the thin-and-light trend. It's just not possible to dissipate the 320 W of a desktop GPU in a laptop that is 5 mm thick, and the battery life would be like 15 minutes. My 3080 Ti is three slots tall; that's like three thin-and-light laptops, lol. There used to be a niche for thick desktop-replacement laptops. That niche is gone now.

    Dynamic Boost would be a great idea if the laptop GPU could take, say, 200 W in normal mode and up to 300 W with Dynamic Boost. But as it is now, it just takes what, an extra 30 W from the CPU budget, for a total of 165 W?

    Hopefully with AMD FSR the GTX series will get a nice boost in performance now, since Nvidia DLSS is RTX-only. But to be honest there isn't much of a difference between DLSS and FSR; they look equally good. FSR 2.0 will be even closer to DLSS 2.0.

    What pisses me off is the naming scheme. Why not call the mobile 3080 just a 3060 Ti (which is how it performs), or at least a 3080M? The desktop 3080 and the mobile 3080 are worlds apart in everything.

    Rumor is the 4090 is going to perform 2x the 3090, at the cost of a 450 W TDP vs 350 W. It's kind of frustrating how every generation desktop GPUs get a bigger performance increase over the last gen while laptops get a smaller one.
     
    JRE84, seanwee, krabman and 3 others like this.
  4. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    426
    Messages:
    1,002
    Likes Received:
    766
    Trophy Points:
    131
    Diminishing returns over previous generations until a real breakthrough comes, if any does. Those max power requirements for desktop cards, wow. At least we can get all sorts of real info from people who took the plunge, and from benchmarks. I saw that Samsung, I think, is now mass-manufacturing 4K/90 Hz laptop screens; it would be nice to get those across the board at this point, since 4K has been out a long time now. 2K should already be a standard in gaming-oriented laptops by now, though.

    https://www.notebookcheck.net/Samsu...nd-4K-OLED-displays-for-laptops.562346.0.html
     
    JRE84 likes this.
  5. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
    I agree with most of your post but the part quoted above may not be correct: The professor tells me that they'll be back when the semiconductor shortage ends.
     
    JRE84 likes this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,614
    Trophy Points:
    931
    Here is one more that comes with too weak a power adapter. Pretty disgusting. And Razer is well known for bloated batteries after a while.

    "We're able to record a maximum draw of 235 W from the medium-sized (~17 x 7 x 2.5 cm) 230 W AC adapter. The charging rate will slow significantly when running stressful loads as a result."

    Yep, forget those 4 extra bins on the CPU.

    https://www.notebookcheck.net/Razer...-130-W-TGP-GeForce-RTX-graphics.561887.0.html
     
    JRE84 and seanwee like this.
  7. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    666
    Messages:
    1,920
    Likes Received:
    1,110
    Trophy Points:
    181
    On the 3080 GP/GE 66/76 the VRMs are 6-phase with 50 A power stages; the 3070 models have 5 phases.



    With MUX switches you can have long battery life without compromising performance nowadays. And even current cooling systems in medium-sized laptops (GP66) can handle shunted 200 W GPUs like a champ (temps in the mid 60s). The problem is manufacturers/Nvidia not putting more faith into laptop cooling systems and cranking up the TDP.



    The best combo nowadays is to bypass the CPU TDP in the BIOS (IMON slope + offset) combined with a shunt mod. Then the only limiting factor is how much power the manufacturer lets the laptop draw.



    I'll be upgrading to a "4080" laptop when it comes out, and I've already given up hope that laptops will match desktops for the next few years at least. I believe we're in the power-hungry GPU phase again, a la Fermi, and hopefully Hopper will be what Kepler was to Fermi and bring power draws back in check.

    That said, if I get 4K 120 Hz performance in AAA games I'll be satisfied, I suppose. And considering the 4090 is rumored to be 2x the 3090 at just 100 W more, there will still be a substantial uplift in performance per watt, which means at current laptop TDPs, 3090 performance is well within reach.
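    The perf/W claim is easy to sanity-check. Taking the rumored numbers at face value (2x performance, 350 W -> 450 W, both unconfirmed, plus the assumption that performance scales roughly linearly when power-limited down to laptop TDPs):

    ```python
    # Efficiency gain = performance ratio divided by TDP ratio.
    # Inputs below are rumored figures, not confirmed specs.
    def perf_per_watt_uplift(perf_ratio: float, old_tdp: float, new_tdp: float) -> float:
        return perf_ratio / (new_tdp / old_tdp)

    uplift = perf_per_watt_uplift(2.0, 350, 450)
    print(f"perf/W uplift: {uplift:.2f}x")  # ~1.56x
    ```

    A ~1.56x efficiency jump is what would put desktop-3090-class performance in reach at a ~150 W laptop power budget, if the rumors and the linear-scaling assumption both hold.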
     
    Last edited: Sep 21, 2021
    JRE84 and Clamibot like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,839
    Likes Received:
    59,614
    Trophy Points:
    931
    I don't think we will see xx80 desktop graphics cards below 320 W anymore. But gaming laptops... they will get thinner, with equivalent changes in the cooling.
     
    JRE84 likes this.
  9. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    666
    Messages:
    1,920
    Likes Received:
    1,110
    Trophy Points:
    181
    While there will be thinner laptops, it's nearly impossible for medium-sized laptops to go away, and those are enough for me. DTR TDP is just a shunt mod away.
     
  10. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76



    I don't know about MUX switches, but if, say, a laptop needs 200 W total to play games, and the most common batteries are 90 Wh, then the battery will last less than half an hour while gaming; that's physics and the law of conservation of energy. For the battery to last longer there has to be some kind of power limit while on battery. If on battery the laptop can game at 100 W, it will last a bit less than an hour. But you have to take the performance loss; it can't be any other way.
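    The runtime math here is just capacity divided by draw (ignoring conversion losses and non-GPU drain, which only make it worse):

    ```python
    # Battery runtime is capacity over draw -- no MUX switch changes that.
    # Ignores conversion losses, so real runtimes would be slightly shorter.
    def runtime_minutes(battery_wh: float, draw_watts: float) -> float:
        return battery_wh / draw_watts * 60

    print(runtime_minutes(90, 200))  # 27.0 min at full gaming load
    print(runtime_minutes(90, 100))  # 54.0 min with a 100 W on-battery cap
    ```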

    About the cooling: GPUs aren't that difficult to cool down; the thing is, a video card like the desktop 3080 needs around 300 W at stock settings and can go as high as 350+ W while boosting. I don't think a mid-size laptop can cool that much power; a desktop replacement probably can. A desktop-replacement laptop would use around 400 W total while playing games, but that's less than 15 minutes on a 90 Wh battery. Of course I don't care about battery mode, but most people do.
     