How will Ampere scale on laptops?

Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    1,266
    Messages:
    2,029
    Likes Received:
    1,486
    Trophy Points:
    181
    The thing is that all those higher image quality options mostly result in increased GPU load, at the same time applying downward pressure on the FPS. Any reduction in FPS would further reduce the load on the CPU.
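    To put rough numbers on that (purely illustrative figures, not measurements from any particular game), here's a minimal sketch of the relationship, assuming a roughly fixed CPU cost per frame:

    Code:
        # Rough frame-budget model: the CPU work per frame (game logic,
        # draw-call submission) is roughly fixed, so CPU utilisation
        # scales with frame rate. All numbers are illustrative.

        def cpu_utilisation(cpu_ms_per_frame: float, fps: float) -> float:
            """Fraction of one core-equivalent kept busy at a given FPS."""
            frame_time_ms = 1000.0 / fps
            return cpu_ms_per_frame / frame_time_ms

        CPU_COST_MS = 8.0  # assumed per-frame CPU cost, in milliseconds

        for fps in (120, 60, 40):
            print(f"{fps:>3} FPS -> CPU busy {cpu_utilisation(CPU_COST_MS, fps):.0%}")
        # 120 FPS -> CPU busy 96%
        #  60 FPS -> CPU busy 48%
        #  40 FPS -> CPU busy 32%

    So turning the eye candy up and letting the FPS drop actively unloads the CPU.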

    The way to use up all those cores would be to apply them to non-graphics tasks that wouldn't actually be better done by the GPU. Simulations, perhaps. Of course, this part of the code would have nothing to do with adjustable graphics settings. It would have to run on all platforms, or the game would be undeliverable to slower hardware, hence the need to scale any such design ambitions to console reality (or just do a PC exclusive, usually where some additional rationale exists, e.g. a mouse and keyboard being required). Additionally, simulations might require a lot of RAM, which, again, is typically in short supply on consoles and on lower-end PCs, to which this argument applies equally.

    In summary, a modern gaming laptop or desktop is very unlikely to be bottlenecked by the CPU in the vast majority of games. Hope that makes sense.
     
    Last edited: Sep 20, 2020
    hfm and JRE84 like this.
  2. seanwee

    seanwee Notebook Deity

    Reputations:
    213
    Messages:
    1,392
    Likes Received:
    578
    Trophy Points:
    131
    Funnily enough, moving forward, shunt modding will be more useful on laptops than on desktops.



    A 2% performance improvement with all 5 shunt resistors modded. The performance cap is Vrel and VOp, so basically the cards are bumping up against Nvidia's set voltage limit.
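    For anyone wondering why the mod works at all: the card's telemetry measures current as the tiny voltage drop across each shunt (sense) resistor, so soldering an equal-value resistor in parallel halves the drop and the card under-reports its power draw. A minimal sketch of the arithmetic, with assumed (not measured) values:

    Code:
        # Shunt-mod arithmetic: telemetry reads V = I * R_shunt. Paralleling
        # the shunt with an equal resistor halves the sensed voltage, so the
        # boost algorithm sees half the real power. Values are illustrative.

        R_SHUNT = 0.005   # ohms, assumed stock sense resistor
        R_MOD   = 0.005   # ohms, resistor soldered in parallel
        I_LOAD  = 25.0    # amps through the rail (illustrative)

        r_eff = (R_SHUNT * R_MOD) / (R_SHUNT + R_MOD)  # 0.0025 ohm
        scale = (I_LOAD * r_eff) / (I_LOAD * R_SHUNT)
        print(f"reported power scales by {scale:.2f}x")  # 0.50x

    Which is also why the gain here is only 2%: with the perf cap reading Vrel/VOp rather than Pwr, the extra power headroom has almost nothing left to buy.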
     
    hfm likes this.
  3. hfm

    hfm Notebook Prophet

    Reputations:
    1,971
    Messages:
    4,590
    Likes Received:
    2,303
    Trophy Points:
    231
    Yep, that was exactly my point where I stated that a 4C/8T 15W CPU is still serving me just fine for most gaming purposes. But that doesn't mean I don't want to see this multi-core usage proliferate, as it's the only way to move forward. We're not going to be hitting 6-7GHz sustained any time soon for the normal gamer. People overclocking to the edge of stability are an extreme edge case.

    My CPU is starting to show its weaknesses in a select group of games though. As long as I can maintain something close to 60FPS I'm OK, as I don't play any competitive multiplayer. I'm pretty much 100% SP games where it's mostly slower paced. If I cared about competitive MP games I wouldn't be using this setup; I'd have a loud and heavy notebook with a high refresh panel, or I'd find a way to talk my wife into letting me jam a desktop into the office desk area (with probably no success lol, maybe someday we'll buy a bigger place and I'll have my own office room). All I need is a new iteration of something like the Gram 17 with a 20% boost in CPU for me to be OK with it. It's really the GPU where I need the muscle, so I can crank up some more details and maintain 1600p 60fps.

    I'm even OK with lower than 60 in some games. RDR2 is hovering around 40ish with some decent detail cranked, and it's good enough not to notice it 95% of the time. But the CPU is hovering around 60-85%, sometimes a little higher. It's teetering on the edge. :)
     
    Last edited: Sep 20, 2020
    Prototime likes this.
  4. seanwee

    seanwee Notebook Deity

    Reputations:
    213
    Messages:
    1,392
    Likes Received:
    578
    Trophy Points:
    131
    Here's an unpopular opinion: I actually welcome the high TDP of the Ampere GPUs with open arms.

    Why? Because it forces laptop makers to innovate and create new and better cooling systems to handle the higher-TDP parts. That is, provided Nvidia doesn't stick a pacifier in their mouths, severely gimp performance and call it a day.

    Either way, they can't sell products that are only marginally better than last gen, so I expect TDPs to go up across the board. I'd love to see 300W thin-and-lights and 450W medium-sized laptops; perhaps this is the generation where gallium nitride AC adapters hit the mainstream.

    And with the beefed-up laptop designs, we might finally see true desktop parity in laptops with 5nm RTX 4000 GPUs, maybe even an xx80 Ti/xx90 GPU in laptops. Laptop manufacturers aren't going to go backwards and discard all their prior innovations; improved designs will be carried forward and even built upon.
     
    unlogic and etern4l like this.
  5. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    121
    Messages:
    1,021
    Likes Received:
    382
    Trophy Points:
    101
    ...for Intel, sure.

    https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-8700+@+3.20GHz&id=3099

    https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+5+4600H&id=3708

    https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+9+4900HS&id=3694

    ...The whole reason I went with the G14 was to circumvent CPU bottlenecks in next-gen ports. Outrunning the next-gen consoles is just like outrunning a bear: you don't have to outrun the bear, you just have to outrun the person behind you. ...I just needed to be ahead of Ryzen 2 8C/16T @ 3.5GHz.

    ...but, seriously, I'd say that all mobile CPUs paired with Ampere should match or beat the i7-8700 ...the 10850H matches it now, so an 11750H should get there or better.

    https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-10850H+@+2.70GHz&id=3734
     
    JRE84 likes this.
  6. Papusan

    Papusan JOKEBOOKs Sucks! Dont waste your $$$ on FILTHY

    Reputations:
    31,854
    Messages:
    26,611
    Likes Received:
    49,039
    Trophy Points:
    931
    Passmark's Performance Test isn't reliable or trustworthy. I wouldn't buy hardware on the basis of their results. And there is a reason this benchmark won't get the green flag at HWBOT.

    One of many.
    Userbenchmark should no longer be used after they lowered ...
     
    Last edited: Sep 21, 2020
    etern4l and seanwee like this.
  7. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    121
    Messages:
    1,021
    Likes Received:
    382
    Trophy Points:
    101
    It is and it isn't.

    I knew they nerfed multicore and reweighted tests, but knowing that, and understanding that AMD is better than its scores suggest ...when its gimped score is still trouncing the other CPU, I can still make some use of it.
     
  8. Deks

    Deks Notebook Prophet

    Reputations:
    1,196
    Messages:
    5,105
    Likes Received:
    2,008
    Trophy Points:
    331
    You and other 'informed' people can, but the majority of people are NOT that informed, and they think it's a viable metric.
    Sorry, but if you nerf multithreaded performance, you throw off the scales and the test becomes pointless as a way to relay the needed information (which also calls their other tests into question).
     
    Last edited: Sep 22, 2020
    hfm and etern4l like this.
  9. hfm

    hfm Notebook Prophet

    Reputations:
    1,971
    Messages:
    4,590
    Likes Received:
    2,303
    Trophy Points:
    231
    Yeah, those tests are not great. You shouldn't have to be armed with a bunch of caveat info to decode them.
     
    etern4l likes this.
  10. Papusan

    Papusan JOKEBOOKs Sucks! Dont waste your $$$ on FILTHY

    Reputations:
    31,854
    Messages:
    26,611
    Likes Received:
    49,039
    Trophy Points:
    931
    Today's question will be... how far behind will the 3080 notebook cards be vs. the desktop cards? Forget seeing 20GB of GDDR6X VRAM in notebooks...
    http://forum.notebookreview.com/threads/nvidia-thread.806608/page-250#post-11048082

    Report: Why The GeForce RTX 3080's GDDR6X Memory Is Clocked at 19 Gbps tomshardware.com

    Blame the heat...

    German publication Igor's Lab has launched an investigation into why Nvidia chose 19 Gbps GDDR6X memory for the GeForce RTX 3080 and not the faster 21 Gbps variant. There are various claims, but it's not entirely clear how exactly some of the testing was conducted, or where the peak temperature came from.

    According to the results, the hottest GDDR6X memory chip had a Tjunction temperature of 104C, resulting in a delta of around 20C between the chip and the bottom of the board. Another interesting discovery is that adding a thermal pad between the backplate and the board helps drop the board temperature by up to 4C and the Tjunction temperature by around 1-2C.
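    For context on what the two speed grades are worth, peak bandwidth scales linearly with the per-pin data rate across the RTX 3080's 320-bit bus:

    Code:
        # Peak bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8
        BUS_WIDTH_BITS = 320  # RTX 3080

        for gbps in (19, 21):
            print(f"{gbps} Gbps -> {gbps * BUS_WIDTH_BITS / 8:.0f} GB/s")
        # 19 Gbps -> 760 GB/s
        # 21 Gbps -> 840 GB/s

    So the heat headroom cost the card roughly 80 GB/s of peak memory bandwidth.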
     