What is the point of 4K on a small display?

Discussion in 'Hardware Components and Aftermarket Upgrades' started by techlife95, Jul 2, 2017.

  1. HTWingNut

    HTWingNut Bacon

    Reputations:
    21,465
    Messages:
    35,109
    Likes Received:
    9,012
    Trophy Points:
    931
    Perfect! Yes, I agree. My 1080 Ti at 3440x1440 is still a far cry from 4K resolution, and it still struggles to hold 60 FPS. I had the 980 Ti, which is why I bumped to the 1080 Ti. While it's an improvement, it's still not where I want it to be, unfortunately. So for me, 4K for gaming is just not there yet.
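
    Quick back-of-the-envelope numbers on that (a minimal sketch using only the nominal resolutions, nothing measured from a specific panel):

    # Rough pixel-count comparison: how much more work 4K is per frame.
    resolutions = {
        "1080p":     (1920, 1080),
        "3440x1440": (3440, 1440),
        "4K UHD":    (3840, 2160),
    }

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name:>10}: {pixels / 1e6:5.2f} MP ({pixels / (3840 * 2160):.0%} of 4K)")

    # Approximate output:
    #      1080p:  2.07 MP (25% of 4K)
    #  3440x1440:  4.95 MP (60% of 4K)
    #     4K UHD:  8.29 MP (100% of 4K)

    So 3440x1440 is only about 60% of the pixels the card has to push at 4K, which is why the jump still hurts even on a 1080 Ti.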

    I find it comical how the Xbox One X is being touted as a "4K gaming machine". Then again, that means lower-quality graphics and 30 FPS.
     
    Prototime likes this.
  2. Tanner@XoticPC

    Tanner@XoticPC Company Representative

    Reputations:
    194
    Messages:
    1,984
    Likes Received:
    1,756
    Trophy Points:
    181
    I'm hoping it actually has a positive effect on 1080p high-refresh gaming, since I know a lot of users still aren't making the 4K switch and are opting for lower-res 120Hz panels for their TVs instead.
     
  3. Falkentyne

    Falkentyne Notebook Evangelist

    Reputations:
    200
    Messages:
    433
    Likes Received:
    286
    Trophy Points:
    76
    This actually depends.
    Some TVs will downscale to 1080p with no loss in image quality at all, just higher PPI. However, MANY monitors *WILL* still interpolate 4K down to 1080p (when there should be no interpolation at all), causing a noticeable loss of image quality. I haven't seen a PC monitor that can downscale 1440p to 1080p with no interpolation at all; even monitors with decent scalers (e.g. BenQ) will still interpolate.
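
    To put a number on the "no interpolation" case (a minimal sketch; the check and the function name are purely illustrative, not anything a monitor's scaler exposes):

    # A panel can duplicate pixels cleanly only when its resolution is an exact
    # integer multiple of the input resolution in both dimensions.
    def integer_scale(source, panel):
        sw, sh = source
        pw, ph = panel
        if pw % sw == 0 and ph % sh == 0 and pw // sw == ph // sh:
            return pw // sw      # each source pixel becomes an exact NxN block
        return None              # scaler has to blend/interpolate, which blurs

    print(integer_scale((1920, 1080), (3840, 2160)))  # 2 -> lossless 2x2 mapping
    print(integer_scale((1920, 1080), (2560, 1440)))  # None -> 1.33x, must interpolate

    That's why 1080p on a 4K panel can in principle be pixel-perfect, while 1080p on a 1440p panel never can be; whether a given scaler actually does the clean 2x2 mapping is another matter.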
     
    tilleroftheearth likes this.
  4. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,178
    Messages:
    11,573
    Likes Received:
    1,606
    Trophy Points:
    631
    Nah, not more important on the 'recording' end. At least equal when we're talking about computer monitors.

    You're forgetting that an O/S is the 'recording'/'rendering' engine for everything we see: a 4K monitor of sufficient quality makes those long bouts on the computer that much less painful (on the eyes).

     
  5. heretofore

    heretofore Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    9
    Trophy Points:
    16
    Recording and rendering are two very different things.
    In video, recording resolution and bit rate are paramount.
    Record video at 480p or 480i, and it will look "similar" on either a 1080p display or a 4K display.
    Record video at 4K and it "should" look significantly more detailed when viewed on a 4K display than on a 1080p display.
    The 1080p display has far fewer pixels, which means a lot of tiny details may not be visible (or will be less clear) when viewing 4K video on a 1080p display.
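
    Rough numbers, for what they're worth (a sketch using nominal frame sizes only):

    # The recording, not the panel, caps the detail once it is the
    # lower-resolution link in the chain.
    sources  = {"480p": 854 * 480, "1080p": 1920 * 1080, "4K": 3840 * 2160}
    displays = {"1080p panel": 1920 * 1080, "4K panel": 3840 * 2160}

    for sname, spix in sources.items():
        for dname, dpix in displays.items():
            detail = min(spix, dpix)  # a panel can't show detail the source never had
            print(f"{sname} video on {dname}: ~{detail / 1e6:.1f} MP of real detail")

    A 480p recording carries roughly 0.4 MP of detail on either panel, which is why it looks about the same on both; only a 4K source lets the 4K panel pull ahead (8.3 MP vs 2.1 MP).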

    I'm not just some guy who hangs out on computer forums all day.
     
  6. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,178
    Messages:
    11,573
    Likes Received:
    1,606
    Trophy Points:
    631
    Maybe someone can find the proper terms. I too understand images (moving or otherwise).

    Try to read my response in the context in which it was meant.

    The O/S is presenting the desktop, the fonts and all other items at the resolution the screen is capable of - not the other way around.

    A higher resolution screen is not just about pixels. It has other attributes that make it better too (again: all other things being equal).

    As for how 'clear' a display is (regardless of input quality) - that is much more about the 'quality' and intrinsic 'design' side of the equation than about mere additional pixels.

    With each jump in resolution, there had to be other factors that were first 'fixed' before the resolution itself became important (or marketable). That is why a quality 4K monitor is inherently superior to anything below it. Those 'extra' enhancements just weren't needed (or maybe noticeable enough) at the lower specs.

    Given a recording at 'x' resolution - and the same size and 'quality' screens - one 1080p, one 4K - the 1080p screen will be inferior, though most viewers may not be able to tell why.

    Given a recording at the same 'x' resolution - and different size screens (but again; of equal quality) - with each viewed at the same effective distance (i.e. arc angle) - the 1080p screen may or may not seem inferior - depending on the eyesight of the viewer.
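
    The arc-angle point can be put in numbers with the usual angular-resolution arithmetic: pixels per degree = horizontal pixels / horizontal field of view in degrees. A minimal sketch, assuming a 15.6" 16:9 notebook panel viewed from 50 cm (both figures are just illustrative):

    import math

    def pixels_per_degree(h_pixels, diagonal_in, distance_cm, aspect=(16, 9)):
        aw, ah = aspect
        width_cm = diagonal_in * aw / math.hypot(aw, ah) * 2.54    # physical width
        fov_deg = 2 * math.degrees(math.atan(width_cm / (2 * distance_cm)))
        return h_pixels / fov_deg

    print(f"1080p: {pixels_per_degree(1920, 15.6, 50):.0f} px/deg")
    print(f"4K:    {pixels_per_degree(3840, 15.6, 50):.0f} px/deg")

    That works out to roughly 50 vs 100 px/deg; the often-quoted limit for normal (20/20) vision is on the order of 60 px/deg, which is exactly why some viewers see the difference at that distance and others don't.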

    Our senses do not like stepped input - they are very tuned to an analogue world - even if most consumers can't tell one way or another (it's called training/educating your 'ears', 'eyes', 'touch', 'smell', 'taste'). When our digital tools become more analogue in nature (at least on the output side; for monitors) - that is when the 'tool' disappears and it seems we are interacting with our data, directly.

    One thing I've found so far: we are still so far, far away from that reality. But each and every advancement made shows how clearly bad the older 'tools' really were. To me, it's a double whammy. I can only imagine being born today among such advancements as we now have. Instead of having to experience them through the senses of an old, worn-out and tired body that I'm left with. ;)

    What all our digital tools do is 'copy' the nature around us. It is that 'copying' part that is so hard to do, faithfully, to the original.

     
  7. heretofore

    heretofore Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    9
    Trophy Points:
    16
    What I am trying to tell you is that "input quality" is much, much more important than the display resolution/quality.
    If "input quality" (aka the original recording) looks terrible, then what you see on any display will look terrible.
    If the original recording is high resolution and high bit rate, then what you see on screen should look good, both at 1080p and at 4K.
    When the resolution of the original recording exceeds 1080p, then the 4K display has an advantage.
    If the resolution of the recorded video is 1080p or less, then the 4K display has little (or no) advantage.
    This assumes the 1080p display and 4K display are equal in all ways except resolution.

    Don't focus on the O/S.
    Video can be viewed on a display, without a PC, by directly connecting the camera to the display.
    By the way, cameras and microphones make recordings; computers make renderings.
     
  8. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,178
    Messages:
    11,573
    Likes Received:
    1,606
    Trophy Points:
    631
    And I am telling you that 4K+ is not just about video - and neither is this thread (directly).

    The 'recording' isn't the topic of this thread, the quality of 4K vs. 1080p displays is. ;)

    And when 4K or other high(er) resolutions can be seen on a handheld device with a display of 6" or less, having a 4K display on a notebook at 15/17 inches makes for a better overall experience.

    I agree 100% that the input source quality is the most important part of the chain. But you keep ignoring the fact that an O/S isn't built to a specific resolution; it scales effectively infinitely with the display it is attached to. So, in this case, the input source quality (i.e. the O/S) is already above the resolution of any display device we have access to in our notebooks.
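
    One way to see the "scales with the display" point: with DPI scaling, the desktop keeps the same logical layout while fonts and UI are rasterized at the panel's full pixel density. A minimal sketch of that arithmetic (the 150%/200% factors are just common notebook scaling settings, used here as an assumption):

    # Logical layout size vs the physical pixels used to draw it.
    def effective_layout(physical, scale):
        w, h = physical
        return (round(w / scale), round(h / scale))

    panel = (3840, 2160)                 # 4K notebook panel
    print(effective_layout(panel, 1.5))  # (2560, 1440) logical layout at 150%
    print(effective_layout(panel, 2.0))  # (1920, 1080) logical layout at 200%

    Either way the UI occupies the same on-screen size it would on a 1080p panel, but every glyph is drawn with 2.25x (at 150%) or 4x (at 200%) as many physical pixels behind it.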

    Note too that your assumption that 1080p and 4K displays are 'equal in all ways' is also in error. I've already covered why.

     
  9. heretofore

    heretofore Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    9
    Trophy Points:
    16
    I know what the thread topic is.
    Displays are not only for personal computers. Televisions are displays too.
    I can use a display without a personal computer and without windows/mac/linux/android/etc.
    If the O/S changes/alters the image on the display, this has nothing to do with the display itself.
    O/S = input source quality??? O/S scales infinitely with the display? Huh? Sorry, I don't speak this language.
    input source = original recording (i.e. photo/video from camera or image on storage drive)
    quality of input source depends mainly on camera (if image is from camera).
    Let's not confuse this topic by including software effects into the discussion.

    I also know that 1080p and 4k displays are not the same.
    For simple comparison reasons, I assumed all variables are equal except resolution.
    This makes the comparison simpler and less confusing, even though it's a false assumption.
     
  10. techlife95

    techlife95 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    So did anyone learn something from the video I posted? Does anyone agree with the guy in the video?
     