Nvidia Optimus reduces gaming performance

Discussion in 'Gaming (Software and Graphics Cards)' started by yrekabakery, May 8, 2019.

  1. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,820
    Likes Received:
    569
    Trophy Points:
    131
    Makes next to no difference whether it's direct on the CPU or not, unless you're for some reason hammering the DMI bandwidth while using the eGPU.
     
  2. Stooj

    Stooj Notebook Deity

    Reputations:
    172
    Messages:
    798
    Likes Received:
    644
    Trophy Points:
    106
    Almost everything important is funneled through the DMI link, so it wouldn't be all that hard to come up with scenarios that create issues.

    Keep in mind that you don't necessarily have to saturate the x4 DMI link to cause an impact. You would only need to create latency conditions. Off the top of my head, texture streaming could do this.
     
  3. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,820
    Likes Received:
    569
    Trophy Points:
    131
    Thankfully it's not that simple; texture streaming in fact doesn't cause issues, at least in the latest games. I've had no issues. Yes, almost everything is funneled through it, but c'mon, what uses 40Gb/s, or 3200MB a second, besides an SSD or TB3 GPU?
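    For reference, a quick back-of-the-envelope check of those figures in Python, assuming DMI 3.0 (which is electrically a PCIe 3.0 x4 link); this is arithmetic, not a benchmark:

        # Theoretical DMI 3.0 bandwidth under the assumptions above:
        # 4 lanes at 8 GT/s each, with 128b/130b line encoding.
        lanes = 4
        gt_per_s = 8e9        # transfers per second per lane
        encoding = 128 / 130  # useful bits per transferred bit

        bytes_per_s = lanes * gt_per_s * encoding / 8
        print(f"DMI 3.0 theoretical peak: {bytes_per_s / 1e9:.2f} GB/s")
        # -> ~3.94 GB/s theoretical; real-world throughput after protocol
        #    overhead is commonly quoted near the 3200MB/s figure above.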
     
  4. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    93
    Messages:
    960
    Likes Received:
    336
    Trophy Points:
    76
    My MSI GT72 with the 980M doesn't have Optimus, and whatever extra it cost me was worth it. I'm sure I'll be making a request thread for the short list of gaming laptops with manual iGPU/dGPU switching in about two years.
     
  5. heretofore

    heretofore Notebook Geek

    Reputations:
    10
    Messages:
    96
    Likes Received:
    34
    Trophy Points:
    26
    Thanks to yrekabakery for bringing this issue to our attention.
    I did a Google search for more info.

    In Windows, go to "Display Settings" or "Display Properties".
    It will show which display is connected to which graphics processor.

    windows_display_settings.jpg
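    As a rough aside, you can also enumerate the adapters Windows sees from the command line; a minimal Python sketch using the wmic tool (this only lists the GPUs, the per-display mapping itself is shown in the Settings UI as above):

        # List the video adapters Windows recognizes via wmic.
        import subprocess

        out = subprocess.run(
            ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
            capture_output=True, text=True,
        ).stdout
        print(out)  # e.g. an Intel iGPU and an NVIDIA GeForce dGPU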
     
    intruder16 and yrekabakery like this.
  6. bobbie424242

    bobbie424242 Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    29
    Trophy Points:
    26
    Note that on Linux (with an Optimus setup) you can turn the dGPU off entirely with bbswitch, so it consumes 0W (instead of about 8W at idle) and idles cooler as well (about -10°C on my Quadro P600). I do not think this is possible on Windows.
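    A minimal sketch of what toggling the card with bbswitch looks like, assuming the bbswitch kernel module is loaded and you have root (it exposes the /proc/acpi/bbswitch interface):

        # Toggle the dGPU through bbswitch's /proc interface (Linux, root).
        BBSWITCH = "/proc/acpi/bbswitch"

        def dgpu_state():
            # The file contains a line like "0000:01:00.0 OFF"
            with open(BBSWITCH) as f:
                return f.read().split()[-1]

        def set_dgpu(on):
            # Writing "ON"/"OFF" powers the card up/down
            with open(BBSWITCH, "w") as f:
                f.write("ON" if on else "OFF")

        set_dgpu(False)  # fully power down the dGPU (0W)
        print("dGPU is now", dgpu_state())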
     
  7. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    700
    Messages:
    2,136
    Likes Received:
    2,123
    Trophy Points:
    181
    In an Optimus setup the dGPU definitely does not consume 8W at idle. That’s what a power-hungry GPU like the GTX 1080 consumes without Optimus.
     
  8. bobbie424242

    bobbie424242 Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    29
    Trophy Points:
    26
    I did more tests and you might be right: running Xorg on the proprietary NVIDIA driver, I see about +3W at idle versus using the iGPU with the dGPU switched off via bbswitch.
    However, running Xorg on the iGPU with the dGPU still powered on, I see +8W. And whenever the dGPU is on (idle or not), the increase in temperature is significant.
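    A rough way to reproduce this comparison on Linux is to sample the battery's discharge rate from sysfs while idle and unplugged. This sketch assumes a battery named BAT0 that exposes power_now in microwatts; some firmwares expose current_now/voltage_now instead:

        # Average the battery discharge rate over a few samples.
        import time

        def read_watts(bat="BAT0"):
            # power_now is reported in microwatts
            with open(f"/sys/class/power_supply/{bat}/power_now") as f:
                return int(f.read()) / 1_000_000

        samples = []
        for _ in range(5):
            samples.append(read_watts())
            time.sleep(2)
        print(f"Average idle draw: {sum(samples) / len(samples):.1f} W")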
     
    yrekabakery likes this.
  9. Stooj

    Stooj Notebook Deity

    Reputations:
    172
    Messages:
    798
    Likes Received:
    644
    Trophy Points:
    106
    Windows shuts off the Nvidia GPU automatically based on the driver rules. That being said, many users unknowingly use sub-optimal settings (from a power-saving perspective) which result in the Nvidia GPU being enabled and used for basic apps like browsers. E.g., lots of people immediately change the "Preferred Graphics Processor" from Auto to High Performance, which forces any GPU-accelerated app, no matter how mundane, to fire up the Nvidia GPU.

    Furthermore, bbswitch cannot work around situations where multiple monitors are in use. Because the external display ports are almost always connected directly to the Nvidia GPU (in the case of Clevo), this will cause the Nvidia GPU to activate regardless.

    The Nvidia proprietary driver ships with the nvidia-prime tool to switch the entire desktop session to the iGPU or the dGPU. By default it is in dGPU mode, so you will see increased idle power.
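    On Ubuntu, the nvidia-prime package provides the prime-select command for this; a minimal Python sketch wrapping it (other distros use different tools, switching needs root, and it only takes effect after logging out and back in):

        # Query/switch the PRIME profile via Ubuntu's prime-select.
        import subprocess

        def prime_query():
            # Prints the active profile, e.g. "intel" or "nvidia"
            return subprocess.run(["prime-select", "query"],
                                  capture_output=True, text=True).stdout.strip()

        def prime_switch(profile):
            subprocess.run(["sudo", "prime-select", profile], check=True)

        print("Current PRIME profile:", prime_query())
        # prime_switch("intel")  # drop the session back to the iGPU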
     
  10. bobbie424242

    bobbie424242 Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    29
    Trophy Points:
    26
    I'm not certain the Windows driver can switch the dGPU off entirely so that it uses 0W (which is what bbswitch does). In my experience, with the dGPU idle, total wattage is always higher on Windows than on Linux with the dGPU switched off via bbswitch. On that Linux setup (iGPU only, dGPU off), total laptop power consumption at idle is as low as 2-3W (everything else being quite optimized in that regard).

    Yes, that's also the case on my P72: the external ports are wired to the dGPU, and that's the only reason I use it. If Lenovo had offered a configuration without a dGPU, I would have taken it.

    In my observations, the dGPU draws 8W at idle unless either the Xorg "nvidia" driver is also loaded (in which case it drops to about 3W) or it is switched off with bbswitch (0W).
     