Inconsistency in mobile GPU specs and real core frequencies

Discussion in 'Gaming (Software and Graphics Cards)' started by Niaphim, Jan 23, 2020.

  1. Niaphim

    Niaphim Notebook Consultant

    Reputations:
    9
    Messages:
    184
    Likes Received:
    82
    Trophy Points:
    41
    Hello everyone,

    Hope this is an appropriate section to post this topic.
    I've noticed something that bothers me. Let's take the GTX 1650 Max-Q for example - the one that I can verify myself as I have a laptop with a 1650 Max-Q.
    Official specifications say its frequency should be about 1020-1245 MHz; the "normal" mobile version should boost up to 1560 MHz. This is taken from the official Nvidia website
    https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1650/
    but could easily be found on other sites - for example,
    https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1650-Max-Q-GPU-Benchmarks-and-Specs.418763.0.html
    However, when I'm using my laptop I see my core clock constantly around the 1700-1800 MHz mark, which can't be some temporary boost - I'm talking about sessions of an hour or more.
    For example, in this video the core frequency goes anywhere from 1200-1300 MHz in less demanding games up to 1800 MHz at the 12:30 mark in AC Odyssey (link below)


    Another example - RTX 2060 (laptop version)
    Official core frequency: 960-1200 MHz (boost)
    https://www.notebookcheck.net/NVIDIA-GeForce-RTX-2060-Laptop-Graphics-Card.384946.0.html
    And in this video, for example, the core frequency sits consistently around the 1400 MHz mark.


    So I want to understand - is this false information, or is there some boost mode nobody talks about?
    It is most likely NOT related to overclocking: at least for the 1650 Max-Q I can confirm these numbers with stock settings.

    /edit It happens that both videos I took as examples are of MSI laptops. It is not an MSI thing either, as similar numbers show up for other laptops, like in this video of a Lenovo laptop (RTX 2060):
     
    Last edited: Jan 24, 2020
  2. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,457
    Messages:
    6,339
    Likes Received:
    3,912
    Trophy Points:
    431
    Max-Q GPUs are bound by power limits. Depending on how much power a game requires, the GPU will boost all the way up like a desktop GPU, provided there is power budget left. It all depends on how hard a game taxes the GPU.

    For example, in a super taxing game like Metro, I usually run around 1100-1200 MHz, with jumps to 1300 MHz - always held down by the power limit. But in easier-to-run games like Dying Light, I can get all the way to 1800 MHz at times, with fps close to 144, but then the power limit hits hard and the fps jumps around along with the core clocks.

    These chips/cores were designed with high clocks in mind, but laptop models, and specifically Max-Q parts, are severely power-constrained, so there is no "game clock" standard; the GPU will always try to run at its best as long as temperature, power, and fps limits allow.
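    The pattern described above can be sketched in a few lines of Python - a toy model with made-up numbers, not Nvidia's actual boost algorithm (real GPU Boost also factors in voltage and temperature):

    ```python
    def boost_clock(base_mhz, max_mhz, power_limit_w, watts_per_mhz, load):
        """Toy model: the clock climbs from base toward max until the
        estimated board power would exceed the power limit.
        `load` scales how much power the current game draws per MHz."""
        clock = base_mhz
        while clock + 15 <= max_mhz:  # boost in 15 MHz bins, like real GPUs
            next_clock = clock + 15
            if next_clock * watts_per_mhz * load > power_limit_w:
                break  # power-limited: hold the lower clock
            clock = next_clock
        return clock

    # Hypothetical 1650 Max-Q-like numbers (35 W limit, illustrative only):
    light = boost_clock(1020, 1860, 35, 0.019, load=1.0)   # easy game
    heavy = boost_clock(1020, 1860, 35, 0.019, load=1.45)  # demanding game
    print(light, heavy)  # 1830 1260
    ```

    With these illustrative numbers, a light load boosts to around 1830 MHz while a heavy load is pinned near 1260 MHz - close to the "official" boost clock - which matches the behaviour described above.
    
    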
     
    Niaphim likes this.
  3. Niaphim

    Niaphim Notebook Consultant

    Reputations:
    9
    Messages:
    184
    Likes Received:
    82
    Trophy Points:
    41
    Thank you, I think I'm starting to understand.

    So does that mean the frequencies on these cards are actually "backwards", counterintuitive? What I mean is: when there is spare power - so basically no need for such a high clock - the card manages to hit those higher values. And in the opposite case you mentioned, if a game taxes the GPU at its maximum capacity, it can't reach those higher boost clocks due to the power limit and stays at what is given as the official boost clock.

    Is that correct?
     
  4. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,024
    Messages:
    36,385
    Likes Received:
    4,433
    Trophy Points:
    681
    @ryzeki is accurate on all points. Look up Nvidia's GPU Boost 4.0 technology; it can send the clocks way high. I have found that Nvidia's last two generations of GPUs, especially the current crop (GTX 10/RTX 20), spend nearly all their time above their rated boost clock, even the Max-Q versions.

    That said, Nvidia's reference clocks on its site apply only to desktop cards. With laptops, it's the wild west, and Max-Q doesn't help. It's a situation where you must find real-world benchmarks to assess how the laptop performs.

    If you're interested in why your GPU is limited, download and install GPU-Z, check the box to log data, then run a game for a bit. Check the log afterward; it will tell you what is limiting your GPU - power, temperature, etc. You can also see your real frequencies as they are sustained over time, plus a bunch of other stats.
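    As a sketch of what reading that log can look like: GPU-Z writes its sensor log as CSV, and a few lines of Python can summarize the average clock and which limiter ("PerfCap Reason") was hit. The column names below are an assumption based on recent GPU-Z versions - check the header row of your own log file and adjust.

    ```python
    import csv
    from collections import Counter
    from io import StringIO

    # A few sample rows in GPU-Z's sensor-log style (column names are an
    # assumption; check the header of your own log file).
    sample_log = """\
    Date , GPU Clock [MHz] , GPU Temperature [C] , PerfCap Reason []
    2020-01-24 20:00:01 , 1770.0 , 64.0 , Pwr
    2020-01-24 20:00:02 , 1815.0 , 64.0 ,
    2020-01-24 20:00:03 , 1740.0 , 65.0 , Pwr
    2020-01-24 20:00:04 , 1350.0 , 67.0 , Pwr
    """

    def summarize(log_file):
        """Return (average core clock, counter of PerfCap reasons)."""
        reader = csv.DictReader(log_file, skipinitialspace=True)
        clocks, reasons = [], Counter()
        for row in reader:
            row = {k.strip(): v.strip() for k, v in row.items()}
            clocks.append(float(row["GPU Clock [MHz]"]))
            if row["PerfCap Reason []"]:  # empty means no limiter hit
                reasons[row["PerfCap Reason []"]] += 1
        return sum(clocks) / len(clocks), reasons

    avg, reasons = summarize(StringIO(sample_log))
    print(f"average core clock: {avg:.0f} MHz")  # average core clock: 1669 MHz
    print(f"limits hit: {dict(reasons)}")        # limits hit: {'Pwr': 3}
    ```

    Here "Pwr" would mean the power limit was the cap for three of the four samples, which is the typical story on Max-Q parts.
    
    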

    Charles
     
    ryzeki and Niaphim like this.
  5. Niaphim

    Niaphim Notebook Consultant

    Reputations:
    9
    Messages:
    184
    Likes Received:
    82
    Trophy Points:
    41
    This is exactly why I am confused. Real-world tests are the only reliable source of information as far as I can see.
    To put my original question in context: I've been researching the recently announced ASUS G14 - it will come with either a 1660 Ti Max-Q or an RTX 2060 Max-Q. The 1660 Ti Max-Q is clocked about 100 MHz higher than most previously released versions (1435 vs. 1335 MHz) at the same power (60 W), so its performance is fairly predictable. On the other hand, the new RTX 2060 Max-Q runs at 65 W compared to 80-90 W, but is announced at 98 MHz higher than its "normal" counterparts (1298 vs. 1200 MHz). So I struggle to estimate where it will land in terms of performance, considering that in-game frequencies are all over the place.
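    The trade-off can be framed with quick back-of-the-envelope arithmetic (rough first-order numbers only, since real performance depends on how often the power limit clamps those clocks):

    ```python
    # Clock and power figures quoted above for the ASUS G14 options,
    # versus typical earlier Max-Q / mobile parts.
    gtx_1660ti_mq = {"clock": 1435, "old_clock": 1335, "tdp": 60, "old_tdp": 60}
    rtx_2060_mq   = {"clock": 1298, "old_clock": 1200, "tdp": 65, "old_tdp": 80}

    for name, gpu in [("1660 Ti Max-Q", gtx_1660ti_mq),
                      ("RTX 2060 Max-Q", rtx_2060_mq)]:
        clock_gain = gpu["clock"] / gpu["old_clock"] - 1
        power_cut  = gpu["tdp"] / gpu["old_tdp"] - 1
        print(f"{name}: {clock_gain:+.1%} rated clock, {power_cut:+.1%} power budget")
    ```

    The rated clock goes up roughly 8% in both cases, but the 2060 Max-Q also loses almost 19% of its power budget versus the 80 W variant, which is exactly why its performance in power-limited (heavy) games is hard to call in advance.
    
    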
     
  6. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,024
    Messages:
    36,385
    Likes Received:
    4,433
    Trophy Points:
    681
    I'm not disagreeing. Real-world tests are the only way to know precisely how well a system performs. Best to let someone else buy/review the product so you don't have to guess.

    If heat isn't a problem for the GPU, then the power limit will probably be the limiting factor, so try to find out what it is. Beyond that, pick the worst-performing laptop with the GPU you're looking at and treat that as the worst-case scenario.

    Charles
     
    ryzeki likes this.