1440p Ultra or 4K High/Medium?

Discussion in 'Gaming (Software and Graphics Cards)' started by Prototime, Nov 2, 2020.

Poll: 1440p Ultra or 4K High/Medium?

  1. 1440p Ultra: 83.3%
  2. 4K High/Medium: 16.7%
  1. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
So the 3080 does 1440p at 165 Hz?
     
  2. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
Yep. The toughest game I have installed is Cyberpunk, which will run in the mid-60s with everything maxed and DLSS ultra. Most games run higher frames and are often at the refresh rate. As an example, Metro Exodus is what I'm playing now, and it runs from a low of 130 up to the refresh, again maxed. QHD has been viable with mobile offerings since the GTX 1080 found its way into laptops; I had that in my last laptop along with a QHD panel. That one was a 120 Hz refresh, and games that came out when it was new typically ran around 90 frames.

I don't think it would matter to people who feel refresh is king; they're going to want their Hz. I'm not in that group though. I notice very little difference once the game plays smoothly and like to go for mo' purty. There's no single right answer, it's just a preference thing. As soon as 4K can pull off similarly smooth gameplay I'll dump QHD.
     
    Prototime and JRE84 like this.
  3. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
Oh OK, cool bro.

I bought a 1080p 165 Hz 32-inch monitor because it was $309.99 CAD... if I had unlimited money I would have gone for a 1440p monitor and a new laptop. Getting a 1440p monitor when your card is basically an RX 540 doesn't make sense, as you can't push most games past 30-40 fps... and no, most people don't own a GTX 1080 through RTX 3080.
     
    krabman likes this.
  4. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
    I get it, doin what's right for you, it's all good.
     
    JRE84 likes this.
  5. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
It honestly depends on the game. Generally, if the game doesn't have high-quality assets (pre-2012 releases), a higher resolution is not worth it. It won't show any extra detail; it will only produce a slightly cleaner image. You can get the same result with good anti-aliasing and sharpening.

2560x1080: 18-20% GPU usage

    Call of Duty  Modern Warfare 3 Screenshot 2021.10.09 - 19.26.06.75.jpg

5120x2160: 93-97% GPU usage

    Call of Duty  Modern Warfare 3 Screenshot 2021.10.09 - 19.25.29.73.jpg


On the other hand, with modern games, ultra settings are in most cases not worth it either.

Assassin's Creed Valhalla, max details at 2560x1080 @ 55-85 fps, 97-99% GPU usage

    Assassin's Creed Valhalla Screenshot 2021.10.09 - 17.05.17.28.jpg

Also Assassin's Creed Valhalla, medium shadows and medium volumetric clouds at 5120x2160 (~4x supersample) @ 35-50 fps, 97-99% GPU usage

    Assassin's Creed Valhalla Screenshot 2021.10.09 - 17.05.52.27.jpg

Here, extra samples on the volumetric clouds or extra AA passes over the shadowed edges don't make any noticeable visual difference whatsoever.

What I can appreciate are the higher-fidelity textures that become apparent on the tree bark and on Eivor himself, and things like each blade in the bushes being properly defined rather than the blurry TAA mess they were at native res.

To me this adds a lot more depth and immersion to the game than ultra shadows (5-7 fps loss) or ultra clouds (another 5-7 fps loss).

Applying a sharpening pass and playing with the exposure, contrast and clarity filters (not captured by the Nvidia screenshot tool for some reason) makes such a drastic difference to the TAA in this game that it looks like a remaster.

What's more, medium volumetric clouds and shadows at this higher resolution already surpass the ultra-high sample counts at my monitor's lower native resolution! So technically, I'm already playing at "higher" than highest. That's also why performance does not scale linearly with the resolution change.
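As a rough sanity check on that (my own back-of-the-envelope numbers using the figures above, assuming a purely pixel-bound workload, which Valhalla clearly is not):

```python
# Naive expectation if frame time scaled linearly with pixel count,
# versus what was actually observed above. Purely illustrative:
# the "expected" figures assume every cost scales with resolution.

native_pixels = 2560 * 1080
super_pixels = 5120 * 2160                 # ~4x supersample
scale = super_pixels / native_pixels       # 4.0x the pixels

native_fps = (55, 85)                      # max details at native (observed)
expected_fps = tuple(round(f / scale) for f in native_fps)
observed_fps = (35, 50)                    # medium shadows/clouds at 4x (observed)

print(f"pixel ratio: {scale:.1f}x")
print(f"naive linear prediction: {expected_fps} fps")   # (14, 21)
print(f"actually observed:       {observed_fps} fps")
# The gap comes from dropping the per-pixel-expensive settings (shadows,
# volumetric clouds) and from per-frame costs that don't grow with resolution.
```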

Compared to the PS5 version of this game (which drops to 1080p in places), Valhalla on my PC looks a generation apart.
     
    Last edited: Oct 9, 2021
  6. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
If you need to use DLSS, then arguably lower res + Quality DLSS >> higher res + lower-quality DLSS, even if it's supersampled.

    2560x1080 native with DLSS Quality 55-80 FPS:
    Cyberpunk 2077 2560.jpg

    3840x1620(2x) with DLSS Balanced 35-65 FPS:
    Cyberpunk 2077 3840.jpg

    5120x2160(4x) with DLSS Ultra Performance 30-40 FPS:
    Cyberpunk 2077 5120.jpg

Here the cleanest image is at my native res, with the added benefit of a better frame rate.
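For context on why the native + Quality shot comes out cleanest, here is a minimal sketch using the commonly cited DLSS 2.x per-axis scale factors (Quality ~0.667, Balanced ~0.58, Ultra Performance ~0.333); treat the factors and the arithmetic as rule-of-thumb assumptions, not anything measured from these screenshots:

```python
# Approximate internal render resolution for each of the three setups above,
# using commonly cited DLSS 2.x per-axis scale factors (assumed values, not
# constants pulled from the game).

DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution DLSS roughly renders at before upscaling to the output size."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

setups = [
    (2560, 1080, "Quality"),            # native + DLSS Quality
    (3840, 1620, "Balanced"),           # 2x supersample + DLSS Balanced
    (5120, 2160, "Ultra Performance"),  # 4x supersample + DLSS Ultra Performance
]

for w, h, mode in setups:
    iw, ih = internal_res(w, h, mode)
    print(f"{w}x{h} + {mode}: renders ~{iw}x{ih}")
# ~1708x720, ~2227x940 and ~1705x719: the 4x supersampled Ultra Performance
# case starts from about the same internal pixel count as native + Quality,
# but the upscaler has to reconstruct far more output detail per source pixel.
```

If those factors are roughly right, the supersampled Ultra Performance path isn't feeding the upscaler any more real pixels than native + Quality, which lines up with what the screenshots show.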
     
    Last edited: Oct 9, 2021
    JRE84 likes this.
  7. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
Kunal Shrivastava... you, sir, are what's right with NBR... thank you so very much.

Keep it up and NBR might make a comeback... people like proof on interesting subjects.
     
    Kunal Shrivastava likes this.
  8. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
Well, it looks like 1440p is all you'll ever need for a monitor... 4K is overkill for computers, and 1080p for TVs. Watch this; it's really interesting, and it has me wondering why I bought a 4K TV.
     
    krabman likes this.
  9. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
There are a lot of factors. HDR, for example: you really notice that, and you tend to end up getting it and 4K together. A computer can be one of the best places to use 4K if you view it near-field, because you can take advantage of the resolution by getting more on screen, but you have to get the monitor sizing right.

This is a situation where there are wrong answers but no single right answer across all use cases.
     
  10. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
It was only from a purely pixel-density standpoint... I think 1440p for computers and 1080p for TVs is enough. If you want 4K or 8K, go for it; at normal viewing distances your eyes just won't see the difference... ignorance is bliss, I suppose.
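For anyone who wants to put rough numbers on the "your eyes won't see it" argument, here is a small sketch; the screen sizes, viewing distances and the ~60 pixels-per-degree figure often quoted for 20/20 vision are all rule-of-thumb assumptions, not measurements:

```python
import math

# Horizontal pixels per degree of visual angle at the screen centre, for a few
# hypothetical size/distance combinations, compared with the ~60 px/deg often
# quoted as the limit of 20/20 vision. All inputs below are assumptions.

def pixels_per_degree(h_pixels: int, diag_in: float, aspect: float,
                      distance_in: float) -> float:
    """Pixels per degree for a flat panel of the given aspect, viewed head-on."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)   # physical width
    view_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / view_deg

cases = [
    ("27in 1440p monitor @ 28in", 2560, 27, 16 / 9, 28),
    ("27in 4K monitor    @ 28in", 3840, 27, 16 / 9, 28),
    ("65in 1080p TV      @ 9ft",  1920, 65, 16 / 9, 108),
    ("65in 4K TV         @ 9ft",  3840, 65, 16 / 9, 108),
]

for label, px, diag, aspect, dist in cases:
    print(f"{label}: {pixels_per_degree(px, diag, aspect, dist):.0f} px/deg")
# Roughly 56, 84, 65 and 131 px/deg: at couch distance a 65in 1080p TV is
# already near the acuity limit, while at desk distance 4K still buys headroom.
```

Of course these numbers swing a lot with panel size and distance, which is really what the disagreement in this thread comes down to.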
     