The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, to ensure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    1440p Ultra or 4K High/Medium?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Prototime, Nov 2, 2020.

  1. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    So a 3080 does 1440p at 165 Hz?
     
  2. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Yep. The toughest game I have installed is Cyberpunk, which will run in the mid 60s with everything maxed and DLSS ultra. Most games run at higher framerates and are often at the refresh. As an example, Metro Exodus is what I'm playing now, and it runs from a low of 130 up to the refresh, again maxed. QHD has been viable with mobile offerings since the GTX 1080 found its way into laptops; I had that in my last laptop along with a 2K panel. That one was a 120 refresh, and games that came out when it was new were typically running around 90 frames.

    I don't think it would matter to people who feel refresh is king; they're going to want their HD. I'm not in that group, though: I notice very little difference once the game plays smoothly, and I like to go for mo purty. It's not a right-or-wrong thing, just a preference thing. As soon as 4K can pull similarly smooth gameplay, I'll dump QHD.
     
    Prototime and JRE84 like this.
  3. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Oh ok, cool bro..

    I bought a 1080p 165 Hz 32-inch monitor because it was $309.99 CAD... if I had unlimited money I would have gone for a 1440p monitor and a new laptop... getting a 1440p monitor when your card's basically an RX 540 doesn't make sense, as you can't push most games past 30-40... and no, most people don't own a 1080-3080.
     
    krabman likes this.
  4. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I get it, doin what's right for you, it's all good.
     
    JRE84 likes this.
  5. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    It honestly depends on the game. Generally, if the game doesn't support high-quality assets (pre-2012 releases), a higher resolution is not worth it. It won't show any extra detail; it will only produce a slightly cleaner image. You can get the same results with good anti-aliasing and sharpening.

    2560x1080: 18-20% GPU usage

    Call of Duty  Modern Warfare 3 Screenshot 2021.10.09 - 19.26.06.75.jpg

    5120x2160: 93-97% GPU usage

    Call of Duty  Modern Warfare 3 Screenshot 2021.10.09 - 19.25.29.73.jpg


    On the other hand, with modern games, ultra settings are also not worth it in most cases.

    Assassin's Creed Valhalla, max details at 2560x1080 @ 55-85 fps, 97-99% GPU usage

    Assassin's Creed Valhalla Screenshot 2021.10.09 - 17.05.17.28.jpg

    Also Assassin's Creed Valhalla, medium shadows and medium volumetric clouds at 5120x2160 (~4x supersample) @ 35-50 fps, 97-99% GPU usage

    Assassin's Creed Valhalla Screenshot 2021.10.09 - 17.05.52.27.jpg

    Here, extra samples on volumetric clouds or extra AA passes over the shadowed edges don't make any noticeable visual difference whatsoever.

    What I can appreciate are the higher-fidelity textures that become apparent on the tree bark and on Eivor himself, and things like each blade in the bushes being properly defined rather than the blurry TAA mess it was at native res.

    To me this adds a lot more depth and immersion to the game than ultra shadows (5-7 fps loss) or ultra clouds (another 5-7 fps loss).

    Applying a sharpening pass and playing with the exposure, contrast, and clarity filters (not captured by the Nvidia screenshot tool for some reason) makes such a drastic difference to the TAA in this game that it looks like a remaster.

    What's more, medium volumetric clouds and shadows at this higher resolution already surpass the ultra-high sample counts at my monitor's lower native resolution! So technically, I'm already playing at "higher" than highest. That's also why the game is not scaling linearly with resolution changes.

    Compared to the PS5 version of this game (which drops to 1080p in places), Valhalla on my PC looks a generation apart.
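    As a quick sanity check on those usage numbers, here is a minimal sketch of the pixel math (the assumption that a GPU-bound load scales roughly linearly with pixels per frame is mine):

    # Pixels per frame at the two resolutions above.
    # Assumption: a GPU-bound load scales ~linearly with pixel count.
    base = 2560 * 1080  # native ultrawide, ~2.76 MP
    ss4x = 5120 * 2160  # 4x supersampled, ~11.06 MP

    print(f"native: {base / 1e6:.2f} MP")
    print(f"4x SS:  {ss4x / 1e6:.2f} MP ({ss4x / base:.0f}x the pixels)")
    # 4x the pixels lines up with GPU usage jumping from ~20% to ~95%
    # in the MW3 shots; Valhalla doesn't scale as cleanly because the
    # medium settings at 4x offset part of the added resolution cost.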
     
    Last edited: Oct 9, 2021
  6. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    If you need to use DLSS, then arguably lower res + Quality DLSS >> higher res + crappy DLSS, even if it's supersampled.

    2560x1080 native with DLSS Quality, 55-80 FPS:
    Cyberpunk 2077 2560.jpg

    3840x1620 (2x) with DLSS Balanced, 35-65 FPS:
    Cyberpunk 2077 3840.jpg

    5120x2160 (4x) with DLSS Ultra Performance, 30-40 FPS:
    Cyberpunk 2077 5120.jpg

    Here the cleanest image is at my native res, with the added benefit of a better framerate.
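    A small sketch of why that is, using the commonly published per-axis DLSS render scales (Quality ~2/3, Balanced ~0.58, Ultra Performance ~1/3; the exact values are an assumption here):

    # Internal render resolution for each DLSS mode used above,
    # based on the commonly published per-axis scale factors.
    SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Ultra Performance": 1 / 3}

    configs = [
        (2560, 1080, "Quality"),
        (3840, 1620, "Balanced"),
        (5120, 2160, "Ultra Performance"),
    ]

    for w, h, mode in configs:
        s = SCALES[mode]
        print(f"{w}x{h} {mode}: renders from ~{round(w * s)}x{round(h * s)}")
    # 2560x1080 Quality and 5120x2160 Ultra Performance both start from
    # roughly 1707x720 -- the same input pixels -- but the 4x target has
    # to reconstruct far more detail on the way up, so native + Quality
    # produces the cleaner image at a better framerate.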
     
    Last edited: Oct 9, 2021
    JRE84 likes this.
  7. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Kunal Shrivastava... you, sir, are what's right with NBR... tysvm

    Keep it up and NBR might make a comeback... people like proof on interesting subjects.
     
    Kunal Shrivastava likes this.
  8. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Well, it looks like 1440p is all you will ever need for a monitor... 4K is overkill for computers... 1080p for TVs. Watch this, really interesting, and it has me wondering why I bought a 4K TV..
     
    krabman likes this.
  9. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    There are a lot of factors. HDR, for example: you really notice that, and you end up getting it and 4K together. A computer can be one of the best places to use 4K if it's near-field, because you can take advantage of the resolution by getting more on screen, but you have to get the monitor sizing right.

    This is a situation where there are wrong answers but no single right answer across all use cases.
     
  10. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    That was from a purely pixel-density standpoint... I think 1440p for computers and 1080p for TVs is all you need... if you want 4K and 8K, go for it; your eyes just won't see it... ignorance is bliss, I suppose.
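    For what it's worth, you can put numbers on "your eyes won't see it" with pixels per degree (PPD). A minimal sketch, assuming the common ~60 PPD threshold for 20/20 vision and a hypothetical 27-inch monitor viewed at 60 cm:

    import math

    # Pixels per degree (PPD) for a monitor at a given viewing distance.
    # Assumption: ~60 PPD is roughly where 20/20 vision stops resolving
    # individual pixels.
    def ppd(h_res, v_res, diag_in, dist_cm):
        diag_px = math.hypot(h_res, v_res)
        ppi = diag_px / diag_in                                   # pixels per inch
        inches_per_degree = (dist_cm / 2.54) * math.tan(math.radians(1))
        return inches_per_degree * ppi

    for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
        print(f'{name} at 27"/60 cm: {ppd(w, h, 27, 60):.0f} PPD')
    # ~34, ~45, and ~67 PPD respectively: at a typical desk, 4K sits
    # near the 60 PPD threshold rather than past it, so panel size and
    # distance decide whether the extra pixels are actually visible.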
     
  11. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,273
    Messages:
    5,258
    Likes Received:
    11,621
    Trophy Points:
    681
    Unless you do something else besides gaming.
     
  12. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I've been on to the pixel density thing for decades; it's widely known in home theater circles. Balancing the screen size with the optimum viewing distance is one of the first things that will come up when setting up your home theater. Before you get married to the THX chart, you should be aware that there are other, equally expert sources that recommend 1.5 times the diagonal as the viewing distance. You also have many other factors that have to be accounted for, such as the need to provide a view to multiple seating locations, and you should remember there is another chart you can look up that defines the distance beyond which all benefit of a resolution increase is generally lost. Just because you don't get all of it does not mean you don't benefit from some of it, and some people would like to get as much out of it as they can even if they can't get all of it.

    Are people sitting 15 feet away from their 48-inch 4K TV getting the benefit of that 4K? No, of course not; at least not all of it. Even in those cases, HDR is not subject to viewing distance, and many experts would argue it's more important than 4K in terms of seen and felt improvement.

    Home theater viewing distances also don't apply in all, or even many, viewing situations with PCs, where the monitor is often near-field, HDR is not in effect, and picture quality may take a back seat to getting more documents on screen anyway.

    There is no one right answer, JRE; you can argue it any way you want, but that won't change. At best you can define a window within which you should sit to see a benefit from the increase in resolution without getting so close that you can resolve individual pixels, which makes the picture appear noisy. Even that window does not take into account HDR, seating area, and uses where screen real estate is more important than picture quality.
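    To make that window concrete, here's a minimal sketch, assuming the standard 1-arcminute (20/20) acuity rule of thumb; the 48-inch TV and 15-foot distance are the figures from the post:

    import math

    # Rough "window" for seeing 4K's benefit over 1080p on a TV,
    # using the 1-arcminute (20/20) visual acuity rule of thumb:
    #  - farther than the 1080p acuity distance, even 1080p pixels
    #    blur together, so 4K adds nothing
    #  - closer than the 4K acuity distance, you start resolving
    #    individual pixels (the "noisy" end)
    ARCMIN = math.radians(1 / 60)

    def acuity_distance_ft(diag_in, h_res, aspect=16 / 9):
        width = diag_in * aspect / math.hypot(aspect, 1)  # screen width, inches
        pitch = width / h_res                             # pixel pitch, inches
        return pitch / math.tan(ARCMIN) / 12              # distance, feet

    near = acuity_distance_ft(48, 3840)  # ~3.1 ft
    far = acuity_distance_ft(48, 1920)   # ~6.2 ft
    print(f'48" TV: 4K pays off between ~{near:.1f} and ~{far:.1f} ft')
    # At 15 ft you are well past the far edge, so the resolution
    # benefit is essentially gone -- though HDR still shows.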
     
    killkenny1 likes this.
  13. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Science..

    Pixels..

    Distance..

    Convo..

    Am I missing something?

    So the convo is that distance doesn't matter? :p

    I get your point, but I was just sharing a video about the distances... not much to argue.
     
    Last edited: Oct 19, 2021
  14. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    The chart isn't where you're going wrong; it's your conclusions that didn't hold up to objective scrutiny, and you shouldn't replace my comments with conclusions I never made. At no point did I say viewing distance does not matter. I'd say I'm ready to tap out here: once someone starts pulling out rhetorical devices like misdirection and various logical fallacies, you know it's time to walk away.
     
    killkenny1 likes this.
  15. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    It's all good, man... I guess my point was that I didn't have a point, just data.
     