The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Is Gaming in 960x540 possible? How is the interpolation?

    Discussion in 'Gaming (Software and Graphics Cards)' started by franzerich, Dec 7, 2017.

  1. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    There are lower-end GPUs like the 940MX, MX150, and GTX 1050 which could gain a lot of performance by running games at 960x540. 960x540 is also the resolution whose pixels map onto the (mostly default) 1920x1080 by a straight integer factor.

    Now I have heard that Nvidia lets you create custom resolutions in the control panel, which can "force" some games to run at that resolution.

    But can someone with actual first hand experience answer the following questions?

    a) Is forcing a 960x540 resolution possible in common 3D titles?
    b) Is it possible on a laptop screen? (I've read that the Intel HD graphics may sometimes prevent it)
    c) Is the image quality sharper than downscaling to 1280x720?
     
    Starlight5 likes this.
  2. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    I'd rather play at 30fps @ native 1080/768 than 120fps @ anything lower. Really, non-native resolution reduced to that extent just results in a horrible blurry mess that is more akin to torture than entertainment.
     
    Ryan Russ likes this.
  3. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    You didn't understand my question:

    Scaling from 1920x1080 down to 960x540 is a ratio of 2:1 in both width and height. This means each 1x1 pixel at 960x540 can be projected exactly onto a 2x2 block of pixels at 1920x1080, so the rendered image should look crisp and clear.

    Scaling to a resolution like 1280x720, by comparison, is a ratio of 1.5:1, and that's why the rendered image looks blurry.

    The question is whether the image is really sharper at 960x540 than at 1280x720, or whether they still use cr4ppy interpolation algorithms which make the 960x540 image more blurred than necessary.
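
    To make the 2:1 case concrete, here's a minimal sketch in plain Python (purely illustrative, not anything the driver actually exposes) of nearest-neighbour 2x upscaling: every source pixel becomes an exact 2x2 block in the output, so no blended colours are introduced.

    Code:
    def upscale_2x_nearest(src):
        """src is a list of rows, each row a list of pixel values."""
        dst = []
        for row in src:
            doubled = [p for p in row for _ in range(2)]  # repeat each pixel twice horizontally
            dst.append(doubled)
            dst.append(list(doubled))                     # repeat the whole row twice vertically
        return dst

    tiny = [["A", "B"],
            ["C", "D"]]
    for row in upscale_2x_nearest(tiny):
        print(row)
    # ['A', 'A', 'B', 'B']
    # ['A', 'A', 'B', 'B']
    # ['C', 'C', 'D', 'D']
    # ['C', 'C', 'D', 'D']

    A bilinear or bicubic scaler would instead average neighbouring pixels, which is where the blur comes from; at a non-integer ratio like 1.5:1 there is no way to avoid that blending.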
     
    Last edited: Dec 7, 2017
    Starlight5 likes this.
  4. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    At that resolution, even with proper scaling, it's just too low to be even remotely crisp. Even at 14-15" it's just unbearable to look at anything below 1366x768 without having pixels the size of cookies on your screen.
     
  5. LanceAvion

    LanceAvion Notebook Deity

    Reputations:
    131
    Messages:
    736
    Likes Received:
    168
    Trophy Points:
    56
    Isn't 540p roughly the native resolution of the PS Vita? If that's the case, I suppose you can comfortably game at that resolution - on a 5" screen. For a proper laptop screen/monitor, I'd say 540p is rather unsightly, to say the least.
     
  6. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    It would be very kind if someone could just try it out... force a 960x540 resolution via the Nvidia Control Panel, test whether it applies in the game, and then report whether it's sharper than 1280x720. This would be much more helpful than any speculation.

    I would do it myself if I had a laptop with sufficient specs, but I don't. My rig is ancient... the next purchase is "around the corner" (I've been saying that for months...), so I just hoped someone could clarify this beforehand, so I could be more flexible in my choice of low- to mid-range laptops.
     
  7. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,705
    Trophy Points:
    431
    Running 720p on my 2560x1440 external monitor looks pretty bad. Doing the same on a native 1080p display will be worse.
     
    don_svetlio likes this.
  8. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    Ok, for your screen with a native resolution of 2560x1440 the question is:

    Does 1280x720 look sharper/more crisp than 1600x900 or 1920x1080?
     
  9. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    No, no it does not. There is no real magic here, anything non-native will be blurry and the lower it goes, the worse it gets.
     
  10. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    That is bad... very bad. That means they apply the same interpolation algorithm even when they could map pixels exactly 1:4.

    blargh :eek:... where is that puke emoticon when you need it?
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I've done that on lower-end laptops with 1080p LCDs. No, it's not going to be crisp, but it's half the resolution in each dimension, and it obviously improves framerates. I prefer 40-50 FPS at 960x540 over 25-30 FPS at 1080p. I've also run at 800x450.
     
    JRE84 and franzerich like this.
  12. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Nearest-neighbor upscaling in the drivers is something that would be really useful for this use case, as well as for 720p on a 1440p display, 1080p on 4K, 1440p on 5K, etc. But nobody has implemented it after all these years.
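
    For what it's worth, the integer-scaling candidates are easy to enumerate. A hypothetical helper (not part of any driver API, just a sketch) that checks whether a render resolution divides evenly into a native one:

    Code:
    def integer_scale_factor(render, native):
        """Return the whole-number scale factor if one exists, else None."""
        rw, rh = render
        nw, nh = native
        if nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh:
            return nw // rw   # e.g. 2 for 960x540 -> 1920x1080
        return None           # non-integer ratio: the scaler has to blend pixels

    pairs = [((960, 540), (1920, 1080)),    # 2
             ((1280, 720), (2560, 1440)),   # 2
             ((1920, 1080), (3840, 2160)),  # 2
             ((1280, 720), (1920, 1080))]   # None, 1.5x
    for render, native in pairs:
        print(render, "->", native, ":", integer_scale_factor(render, native))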
     
    franzerich likes this.
  13. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    Yeah... I've dug through the internet in the meantime and found a 45-page thread on the Nvidia forums requesting this scaling feature. It's ridiculous that it hasn't been implemented yet. Really, really disappointing...
     
    JRE84 likes this.
  14. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I can tell you from experience that 720p will still be sharper and look better.
     
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Some of you guys have pretty high tolerance. 900p is the absolute lowest I can stand on a 1080p screen.
     
  16. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Just for your experiment, I booted Crysis with a custom resolution of 960x540 and took a screenshot, then set it to 720p and took another.

    I compared the two, and 540p wasn't even close to 720p in image quality. It's a blurry mess.
     
    don_svetlio likes this.
  17. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    When you have a low-spec machine and it's the difference between being able to play and not play, it's not that horrible. Yes, 960x540 would look awful on a 17" or bigger display, but on smaller displays it's not such a big deal. Like this video I made a few years back of Tomb Raider running on a Transformer T100 at 800x450.

     
    franzerich likes this.
  18. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    Interesting. It seems to suffer more from the low-quality effects and blobby textures than from the low resolution. I watched the same scene on low settings in Full HD and it didn't look much better (because of the same low-quality effects and blobby textures).
     
  19. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Take some screenshots and scale them in an image editor to see for yourself. This might be a matter of personal preference to some degree, but I feel it's safe to say I would go with 720p scaled to 1080p instead of 540p scaled to 1080p every time, no matter the scaling algorithm (unless the frame rate is really unbearable). Blurred pixels or not, the amount of meaningful information you get from scaled 540p is much lower.
     
    don_svetlio likes this.
  20. StormJumper

    StormJumper Notebook Virtuoso

    Reputations:
    579
    Messages:
    3,537
    Likes Received:
    488
    Trophy Points:
    151
    The math doesn't always work out that cleanly.

    Where did you hear that?



    That would depend on the monitor's native resolution first. Going smaller makes it worse - I've seen regular monitors where someone switched the scaling down and all the text and the whole display were so blurry it was like being blind. So there is a limit that can't be crossed without making it worse.

    And, as you mentioned, some hardware might not let you go there, either by design or by chipset limits. Some games might also block you from doing it.

    Downscaling makes it worse if it isn't the native resolution.
     
  21. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    Ok, I have tested a screenshot of Rise of the Tomb Raider at 960x540, upscaled it to 1920x1080 [with and without interpolation], and must admit the results are barely different.

    - The upscaled [non-interpolated] picture has a few more crisp edges and corners, but also looks more "noisy" (because of the visible pixels on every single object in the scene). This is "bad" noise, though, because it has nothing to do with the fidelity of an original FHD picture.
    - The upscaled [interpolated] picture (cubic or sinc3 interpolation) has less crisp edges, but looks smoother.

    In the end, both are roughly equal in quality. In fact, I'd say the [interpolated] one is actually more comfortable to look at (contrary to my initial assumption). My imagination was obviously wrong in expecting "superior image quality" from the [non-interpolated] one.
    It's as you said: so much image information is lost that it doesn't matter whether it's interpolated or not. By this reasoning, 1280x720 must also be better than 960x540 (and it is - even if only slightly). I now understand why no one bothers with integer scaling... it's barely worth it.

    Note that I haven't tried it in an actual game myself, but used a screenshot available in various resolutions from the internet and upscaled the ones I wanted to compare.
    Attachments: Native FHD · 960x540 upscaled to FHD · 960x540 upscaled to FHD (non-interpolated) · 1280x720 upscaled to FHD

    Compared to the native FHD screenshot, all the others are visibly worse (note that you can hardly see the differences in the attachment slideshow; you have to download all the pictures and inspect them at native resolution). The 960x540 one, however, is not that much worse than 1280x720. So if you are forced to reduce the resolution anyway, 540p is definitely an option to consider.
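
    For anyone who wants to repeat the comparison without hunting for multi-resolution screenshots, here is a rough sketch of the same workflow using Pillow. It simulates the low-resolution render by downscaling a native 1080p screenshot first; "native_fhd.jpg" is just a placeholder filename, not one of the attachments above.

    Code:
    from PIL import Image  # pip install pillow

    native = Image.open("native_fhd.jpg")  # placeholder: any 1920x1080 screenshot

    def simulate(render_size, upscale_filter, out_name):
        """Downscale to the render resolution, then upscale back to native 1080p."""
        low = native.resize(render_size, Image.LANCZOS)  # stand-in for the GPU rendering at low res
        back = low.resize(native.size, upscale_filter)   # stand-in for the display/driver scaler
        back.save(out_name)

    simulate((960, 540), Image.NEAREST, "540p_nearest.png")   # non-interpolated: exact 2x2 blocks
    simulate((960, 540), Image.BICUBIC, "540p_bicubic.png")   # interpolated: smoother but blurrier
    simulate((1280, 720), Image.BICUBIC, "720p_bicubic.png")  # non-integer 1.5x, always blended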
     
    Last edited: Dec 15, 2017
  22. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    It's pretty straightforward: the lower the resolution, the fuzzier and blurrier the image will be. Interpolation is irrelevant in any and all circumstances.
     
    don_svetlio likes this.
  23. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Bottom line is, if you have potato hardware, deal with what you're given and scale down or don't play.
     
    bennni and don_svetlio like this.
  24. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    The Tomb Raider screenshot you used was not a good one. If you're forced to run your game at such a resolution due to GPU performance limitations, chances are you can't afford any high-quality AA, so we should use screenshots with no AA or cheap post-AA instead. The PR image you used was also delivered as a lossy JPG and was likely compressed twice (there's a third-party watermark).

    Here's a no-AA example from Destiny 2. The downsampling was always done with nearest neighbour.

    This is an extreme case, full of hard edges and near-vertical thin bars. In many games most scenes won't look as bad.
     
    Last edited: Jan 11, 2018
    franzerich likes this.
  25. pipyakas

    pipyakas Notebook Guru

    Reputations:
    14
    Messages:
    56
    Likes Received:
    40
    Trophy Points:
    26
    I played Dark Souls 3 and Nier: Automata at 800x450 on my laptop with a GT 840M GPU, and while the "blurry pixelated mess" is quite obvious, the jump in performance is worth it every single time.
    Of course it depends on the game/genre: I played The Witcher 3 at 720p, and even though performance can drop to 20fps, it's nowhere near unplayable.

    Most of the time FXAA is feasible, even on the lowest tier of GPU performance, and in some games (e.g. Rise of the Tomb Raider) it cleans up a lot of the jaggies. Some games that include an internal resolution scaler (DOOM 2016, Rainbow Six Siege) also include great AA solutions that don't have the same performance cost as MSAA or even SMAA while still producing acceptable IQ.
    This is something higher-end GPU users could never appreciate, because the difference between 100 and 200 fps, or even 50 and 80 fps, is nowhere near as dramatic as going from 20 to 50 fps.
     
    Last edited: Dec 14, 2017
  26. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    As much as we hope it would, interpolation doesn't work like that. It'll still look like a blurry mess, the same as 1080p looks on a 4K monitor.
     
  27. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Actually, 1080p on a 4K monitor just looks a touch soft... nothing like 768p on a 1080p screen... Don't forget 1080p still has millions of pixels in that small area.
     
  28. KING19

    KING19 Notebook Deity

    Reputations:
    358
    Messages:
    1,170
    Likes Received:
    779
    Trophy Points:
    131
    Even I can't imagine running any game at 960x540 with my 750M GPU, as it would be a blurry mess on my 17" laptop with a 1080p screen. It might be less blurry if you have a 768p screen and/or a smaller laptop, but other than that it's not worth it. Sure, it could be playable, but it won't be enjoyable IMO.
     
  29. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    That nearest-neighbour version certainly looks crisp. A lot of jagged edges, but definitely a clear picture. There's also a nice retro feel to it :D. This option would definitely be nice to have. Besides, 540p is only a quarter of the pixels of 1080p, which would also let you bump up various graphics settings in the game.
     
    Last edited: Jan 3, 2018
  30. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Wow, that's rough, running at DVD quality in 2018...

    So what's the verdict, 720p or 540p - which looks better? I'm thinking 720p, because it's a higher resolution and has more pixels per square inch; seems like a no-brainer.
     
  31. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    I just found out that the Crysis demo (from 2007) includes 960x540 in its graphics settings, so anyone can easily test how it looks on an FHD screen. It's very useful because you can see how it works out for vegetation.

    Turns out 540p looks horrible (on a 15.6"). And... 720p looks better. Even if 540p had nearest-neighbour scaling, I doubt it would look satisfying. Too much pixelation, and the vegetation becomes clumpy. It destroys the whole atmosphere. It feels like every resolution other than native looks bad...
     
    Last edited: Feb 15, 2018
  32. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    Did anyone manage to run Overwatch in 960x540 full-screen somehow?
     
  33. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Try making a custom resolution in CRU or your GPU control panel.
     
    Starlight5 likes this.
  34. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    @yrekabakery I did, but OW doesn't let me select it as an option. I am able to run lower resolutions in windowed & borderless modes, but not in full-screen.
     
    Last edited: Feb 15, 2018
  35. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Oh I totally forgot, OW has a built-in scaler option. 1920x1080 at 50% render scale is equivalent to 960x540.
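
    For reference, the render-scale arithmetic (it applies per axis, which is why 50% of 1920x1080 lands exactly on 960x540) works out like this:

    Code:
    def internal_resolution(output, render_scale):
        """Internal render resolution for an output resolution and per-axis render scale."""
        w, h = output
        return round(w * render_scale), round(h * render_scale)

    print(internal_resolution((1920, 1080), 0.50))  # (960, 540)
    print(internal_resolution((960, 540), 0.50))    # (480, 270)
    print(internal_resolution((1920, 1080), 0.25))  # (480, 270), same internal load as the line above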
     
  36. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    @yrekabakery well, it still doesn't provide enough FPS, otherwise I wouldn't ask. FPS is good at 960x540 borderless (all other settings at the absolute minimum), but I've read that borderless windowed causes additional input lag in Overwatch compared to fullscreen - and that's exactly what I'm trying to improve further.
     
  37. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Fullscreen 1920x1080 with 50% render scale should provide the exact same FPS as 960x540.
     
    Starlight5 likes this.
  38. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    @yrekabakery was talking about input lag, not frame rate.

    BTW, mobile screens usually don't have scalers, so it can be slightly slower, as the GPU has to do an additional resampling step. For (e)DP, the per-frame data transmission is not locked to the scanning clock, and a 540p frame would take less time to go through the wires. Even if there is a hardware scaler in the screen controller, running it could introduce additional input lag.
     
    Starlight5 likes this.
  39. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    I need Overwatch to run at 960x540 with 50% render scale - in other words, 1920x1080 with 25% render scale.

    That's the only setting that produces enough FPS and thus allows me to aim without much input lag. While I can run Overwatch at this resolution in windowed or borderless windowed mode, I don't know how to enforce it in fullscreen - hence my asking for help.
     
  40. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Actually, I was talking about frame rate. 1920x1080 at 50% render scale should produce the same FPS as 960x540 at 100% render scale.

    Christ. So effectively 480x270? At that point, I think you have bigger problems to worry about than input lag. Like being able to see. What potato GPU are you attempting to play on? OW should be playable at 540p even on semi-modern integrated graphics.
     
    Starlight5 likes this.
  41. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    @yrekabakery HD 520 crippled by single-channel RAM. The alternative is 1024x768 (with 50% render scale, of course) in full-screen, which produces a terrible image, and the FPS often drops below 60, which is very bad for input lag. 960x540 looks much better and cleaner because of the matching aspect ratio, and runs faster - though landing headshots as McCree or Soldier isn't easy, because it really is hard to tell where the character's head is.

    FWIW I can run Overwatch at much higher settings - but that means being stuck at 30 fps and, consequently, tremendous input lag, which makes it impossible to aim properly with most characters. Bottom line: it's either a stable 60+ fps or nothing. I'd hook up an eGPU, but the machine doesn't have TB3, so it'll have to wait until I get a new one, either with TB3 or an Intel-AMD SoC - I don't fancy gaming behemoths or unreliable consumer junk with whining coolers; and there's always a chance I'll end up buying an (8th-gen) machine with an iGPU and no TB3 again, but with dual-channel RAM this time (32GB is quite sufficient for my workflow, unlike a dGPU, which I can only use for games).
     
    Last edited: Feb 16, 2018
  42. franzerich

    franzerich Notebook Evangelist

    Reputations:
    11
    Messages:
    497
    Likes Received:
    121
    Trophy Points:
    56
    After some additional testing, I noticed that Crysis is not the best example to test this with. Some vegetation looks bad regardless of the resolution you're using, e.g. palm tree leaves clump together or become ugly blotches beyond a distance of 50-100m. The LOD is not very good in this game. I think I've seen better quality in more recent titles. I probably just had some faulty memory of "photorealistic" in my head...
     
  43. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    I thought some open-ground scenes in the first-gen Operation Flashpoint looked near photorealistic when it was freshly released. And I feel absolutely ashamed that I did.

    [image: Operation Flashpoint screenshot]
     
    franzerich likes this.
  44. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    Compared to some of the other stuff out around that time, that did look photorealistic.
     
  45. Ryan Russ

    Ryan Russ Notebook Consultant

    Reputations:
    34
    Messages:
    152
    Likes Received:
    65
    Trophy Points:
    41
    TBQH I see this asked quite a lot, and the answer is no. The reason is that pixels are closer to hexagons than squares, so an increase in resolution from 960x540 to 1080p makes it look worse. In theory it would work because of the 1x1 to 2x2 mapping, but in practice it doesn't; and that's just the first reason.
    [image: close-up of an LCD pixel array]
    Notice the sides of the LCD array.

    Also, look at 4K footage of MGSV: Ground Zeroes. You get pop-in of detail because 4K can actually portray enough detail vs 1080p.

    [image: 4K screenshot]
    vs
    [image: 1080p screenshot]

    And a comparison shot:

    https://international.download.nvid...n-cross-section-7-3840x2160-vs-1920x1080.html
     
  46. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Fixed low-resolution retro games, or modern retro-style indie games using pixel art, can look quite good with nearest-neighbor upsampling instead of the usual blurry bilinear filter. So no, it doesn't always look worse.

    4K pushes the LOD range back, increasing detail in the distance. This is engine-dependent, not something inherent to high-resolution/SSAA rendering. It doesn't happen in all games.
     
  47. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    In this context we are combining 4 physical pixels into 1, not cutting 1 into 4. It doesn't matter how the subpixels are arranged as long as each logical pixel still has 4 of those in a 2x2 grid that is close to square. Unless you're running subpixel-dependent rendering which is rare outside GUI text, the pixel combination works the same way.

    Pentile screens (which you should avoid anyway) are a different story.
     
    Last edited: Feb 27, 2018