There are lower-end GPUs like the 940MX, MX150, and GTX 1050 that could gain a lot of performance if they ran games at 960x540. Conveniently, 960x540 is also the resolution whose pixels map onto the (mostly default) 1920x1080 by a straight integer factor.
Now I have heard that Nvidia lets you create custom resolutions in the control panel, which can "force" some games to run at that resolution.
But can someone with actual first hand experience answer the following questions?
a) Is forced downscaling to 960x540 possible in common 3D titles?
b) Is this downscaling possible on a laptop screen? (I've read that Intel HD graphics may sometimes prevent it)
c) Is the image quality sharper than downscaling to 1280x720?
-
don_svetlio In the Pipe, Five by Five.
I'd rather play at 30fps @ native 1080/768 than 120fps @ anything lower. Really, non-native resolution reduced to that extent just results in a horrible blurry mess that is more akin to torture than entertainment.
Ryan Russ likes this. -
You didn't understand my question:
Scaling from 1920x1080 to 960x540 is a ratio of 2:1 in both width and height. This means every pixel of the 960x540 image can be projected exactly onto a 2x2 block of pixels at 1920x1080, so the rendered image should still look crisp and clear.
Scaling to a resolution like 1280x720, by comparison, gives a non-integer ratio of 1.5:1, and that's why the rendered image looks blurry.
The question is whether the image really is sharper at 960x540 than at 1280x720, or whether cr4ppy interpolation algorithms are still used, which make the 960x540 image more blurred than necessary. Last edited: Dec 7, 2017. Starlight5 likes this. -
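To make the 2:1 argument concrete, here is a minimal NumPy sketch (illustrative only, not from any post in this thread) of what pure integer scaling would do: every 960x540 source pixel is duplicated into an exact 2x2 block, with no blending and no fractional pixel positions.

```python
import numpy as np

# A synthetic 960x540 RGB frame stands in for a real game frame.
frame_540p = np.random.randint(0, 256, size=(540, 960, 3), dtype=np.uint8)

# Integer (nearest-neighbour) upscale: repeat every row and every column twice,
# so each source pixel maps exactly onto a 2x2 block of the 1080p output.
frame_1080p = frame_540p.repeat(2, axis=0).repeat(2, axis=1)

assert frame_1080p.shape == (1080, 1920, 3)
```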
don_svetlio In the Pipe, Five by Five.
At that resolution, even with proper scaling, it's just too low for it to be even remotely crisp. Even at 14-15" it's just unbearable to look at anything sub 1366x768 without having pixels the size of cookies on your screen.
-
Isn't 540p the native resolution of the PSVita? If that's the case I suppose you can comfortably game at that resolution, if the screen is 5". For a proper laptop screen/monitor, I'd say 540p is rather unsightly, to say the least.
-
It would be very kind if someone could just try it out... forcing a 960x540 resolution via the Nvidia Control Panel, testing whether it actually applies in the game, and then reporting whether it's sharper than 1280x720. This would be much more helpful than any speculation.
I would do it myself if I had a laptop with sufficient specs, but I don't. My rig is ancient... the next purchase has been "around the corner" for months, so I just hoped someone could clarify this beforehand, so I could be more flexible in my choice of low- to mid-range laptops. -
saturnotaku Notebook Nobel Laureate
Running 720p on my 2560x1440 external monitor looks pretty bad. Doing the same on a native 1080p display will be worse.
don_svetlio likes this. -
Ok, for your screen with a native resolution of 2560x1440 the question is:
Does 1280x720 look sharper/more crisp than 1600x900 or 1920x1080? -
don_svetlio In the Pipe, Five by Five.
-
That is bad... very bad. That means they apply the same interpolation algorithm even when they could map each rendered pixel exactly onto 4 screen pixels.
blargh... where is that puke emoticon when you need it?
-
I've done that on lower-end laptops with 1080p LCDs. No, it's not going to be crisp, but it's half the linear resolution and obviously improves framerates. I prefer 40-50 FPS at 960x540 over 25-30 FPS at 1080p. I've also run at 800x450.
JRE84 and franzerich like this. -
yrekabakery Notebook Virtuoso
franzerich likes this. -
Yeah... I dug through the internet in the meantime and found a 45-page thread on the Nvidia forums requesting this scaling feature. It's ridiculous that it hasn't been implemented yet. Really, really disappointing...
JRE84 likes this. -
I can tell you from experience that 720p will still be sharper and look better.
-
yrekabakery Notebook Virtuoso
Some of you guys have pretty high tolerance. 900p is the absolute lowest I can stand on a 1080p screen.
-
Just for your experiment, I booted Crysis with a custom resolution of 960x540, took a screenshot, then set it to 720p, and took a screenshot.
I compared the two, and 540p wasn't even close to 720p in image quality. It's a blurry mess. don_svetlio likes this. -
franzerich likes this. -
Interesting. It seems to suffer more from the low-quality effects and blobby textures than from the low resolution. I watched the same scene on low settings in Full HD and it didn't look much better (because of the same low-quality effects and blobby textures).
-
Take some screenshots and scale them in an image editor to see for yourself. This might be a matter of personal preference to some degree, but I feel it's safe to say I would go with 720p scaled to 1080p instead of 540p scaled to 1080p every time no matter the scaling algorithm (unless frame rate is really unbearable). Blurred pixels or not, the amount of meaningful information you get from 540p scaled is much lower.
don_svetlio likes this. -
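For anyone who wants to try that comparison without booting a game, a rough Pillow sketch along those lines (the filename is a placeholder) could look like this: downsample a native 1080p screenshot to 720p and 540p, then scale both back to 1080p with bilinear filtering, which is roughly what a GPU or display scaler does.

```python
from PIL import Image

src = Image.open("screenshot_1080p.png")  # placeholder: any native 1080p capture

for height in (720, 540):
    width = height * 16 // 9  # 1280 or 960
    small = src.resize((width, height), Image.LANCZOS)  # simulate rendering at the lower resolution
    back = small.resize((1920, 1080), Image.BILINEAR)   # scale back up for the 1080p panel
    back.save(f"upscaled_from_{height}p.png")
```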
StormJumper Notebook Virtuoso
-
- The upscaled [non-interpolated] picture has a couple of crisper edges and corners, but also looks more "noisy" (because of the visible pixels on every single object in the scene). And this noise is a "bad" noise, because it has nothing to do with the "fidelity" of an original FHD picture.
- The upscaled [interpolated] picture (cubic interpolation or sinc3) has less crisp edges, but looks smoother.
In the end both are kinda "equal" in quality. In fact I'd actually say the [interpolated] one is more comfortable to look at (contrary to my initial assumption). I was obviously wrong to imagine "superior image quality" from the [non-interpolated] one.
It's as you said: so much image information is lost that it doesn't matter whether it's interpolated or not. By this reasoning 1280x720 must also be better than 960x540 (and it is, even if only slightly). I now understand why no one bothers with integer scaling... it's barely worth it.
(Note that I haven't tried this in an actual game myself, but used a screenshot available in various resolutions from the internet and upscaled the ones I wanted to compare.)
Native FHD
960x540 upscaled to FHD
960x540 upscaled to FHD (non-interpolated)
1280x720 upscaled to FHD
Compared to the screenshot in native FHD, all the others are visibly worse (note that you can hardly see the differences in the slideshow, but when you download all the pictures and inspect them at native resolution, you will). The 960x540 one, however, is not that much worse than 1280x720. Ultimately, if you are forced to reduce the resolution anyway, 540p is definitely an option to consider. Last edited: Dec 15, 2017 -
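The upscaled variants described above can be approximated with Pillow's resampling filters — NEAREST for the non-interpolated version, BICUBIC and LANCZOS (close in spirit to sinc3) for the interpolated ones. A rough sketch, with a hypothetical 960x540 input file:

```python
from PIL import Image

small = Image.open("screenshot_540p.png")  # placeholder: a 960x540 source image

filters = {
    "nearest": Image.NEAREST,  # pixel duplication, no blending
    "bicubic": Image.BICUBIC,  # cubic interpolation
    "lanczos": Image.LANCZOS,  # windowed sinc, similar in spirit to sinc3
}
for name, resample in filters.items():
    small.resize((1920, 1080), resample).save(f"540p_to_fhd_{name}.png")
```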
It's pretty straightforward: the lower the resolution, the fuzzier and blurrier the image will be.
Interpolation is irrelevant in any and all circumstances. don_svetlio likes this. -
Bottom line is, if you have potato hardware, deal with what you're given and scale down or don't play.
bennni and don_svetlio like this. -
Here's a no-AA example from Destiny 2. The downsampling was always done with nearest neighbour.
- Original 1080p screenshot: link
- 1080p (cropped to center): https://i.imgur.com/18p6WEd.png
- 720p upscaled, spline: https://i.imgur.com/QnePS5W.png
- 540p upscaled, spline: https://i.imgur.com/dqqtOQu.png
- 540p upscaled, nearest neighbor: https://i.imgur.com/kmVom3M.png
- imgur gallery: https://imgur.com/a/egmCd
This is an extreme case, full of hard edges and near-vertical thin bars. In many games most scenes won't look as bad. Last edited: Jan 11, 2018. franzerich likes this. -
I played Dark Souls 3 and Nier: Automata at 800x450 on my laptop with a GT 840M GPU, and while the "blurry pixelated mess" is quite obvious, the jump in performance is worth it every single time.
Of course it depends on the game/genre; I played The Witcher 3 at 720p, and even though performance can drop to 20 fps, it's nowhere near unplayable.
This is just something higher-end GPU users can never appreciate, because going from 100 to 200 fps, or even 50 to 80 fps, is nowhere near as impactful as going from 20 to 50 fps. Last edited: Dec 14, 2017 -
-
Actually, 1080p on a 4K monitor just looks a touch soft... nothing like 768p on a 1080p screen. Don't forget 1080p still packs millions of pixels into a small area.
-
Even I can't imagine dropping any game down to 960x540 with my 750M GPU, as it would be a blurry mess on my 17" laptop with a 1080p screen. It might be less blurry if you have a 768p screen and/or a smaller laptop, but other than that it's not worth it. Sure, it could be playable, but it won't be enjoyable IMO.
-
This option would definitely be nice to have. Besides, 540p is also only a fourth of the pixels of 1080p, and would therefore let you bump up various graphics settings in the game.
Last edited: Jan 3, 2018 -
Wow, that's rough, running at DVD quality in 2018...
So what's the verdict, 720p or 540p, which looks better? I'm thinking 720p because it's a higher resolution and has more pixels per square inch, seems like a no-brainer. -
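For reference, the pixel counts behind that hunch are easy to work out (a quick illustrative snippet, not from the thread):

```python
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900),
               "720p": (1280, 720), "540p": (960, 540)}
full_hd = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / full_hd:.0%} of 1080p)")

# 540p is exactly 25% of 1080p's pixel count; 720p is about 44%,
# i.e. roughly 1.8x as many pixels as 540p.
```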
I just found out that the Crysis demo (from 2007) includes 960x540 in its graphics settings, so anyone can easily test how it looks on an FHD screen. It's very useful because you can see how it works out for vegetation.
Turns out 540p looks horrible (on a 15.6" screen). And... 720p looks better. Even if 540p had nearest-neighbour scaling, I doubt it would look satisfying. Too much pixelation, and the vegetation becomes clumpy. It destroys the whole atmosphere. Feels like every resolution other than native looks bad... Last edited: Feb 15, 2018 -
Starlight5 Yes, I'm a cat. What else is there to say, really?
Did anyone manage to run Overwatch in 960x540 full-screen somehow?
-
yrekabakery Notebook Virtuoso
Starlight5 likes this. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
@yrekabakery I did, but OW doesn't allow selecting it as an option. I am able to run lower resolutions in windowed & borderless modes, but not in full-screen.
Last edited: Feb 15, 2018 -
yrekabakery Notebook Virtuoso
-
Starlight5 Yes, I'm a cat. What else is there to say, really?
@yrekabakery well it still doesn't provide enough FPS, otherwise I wouldn't ask. FPS is good with 960x540 borderless (all other settings are absolute minimum), but I read that borderless windowed causes additional input lag in Overwatch compared to fullscreen - and that's exactly what I'm trying to further improve.
-
yrekabakery Notebook Virtuoso
Starlight5 likes this. -
@yrekabakery was talking about input lag, not frame rate.
BTW, mobile screens usually don't have scalers, so it can be slightly slower as the GPU has to do another resampling step. For (e)DP, the per-frame data transmission is not locked to the scanning clock, so a 540p frame would take less time to go through the wires. Even if there is a hardware scaler in the screen controller, running it could introduce additional input lag. Starlight5 likes this. -
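A very rough back-of-the-envelope illustration of the "less data over the wire" point (the link rate is an assumed figure for illustration only; real eDP timing depends on lane count, blanking, and the panel):

```python
def burst_time_ms(width, height, bpp=24, link_gbps=8.64):
    """Time to push one frame's raw pixel data across the link as a burst.
    8.64 Gbit/s is an assumed figure (~2-lane HBR2 after 8b/10b encoding)."""
    bits = width * height * bpp
    return bits / (link_gbps * 1e9) * 1e3

print(round(burst_time_ms(1920, 1080), 2))  # ~5.76 ms of raw data per 1080p frame
print(round(burst_time_ms(960, 540), 2))    # ~1.44 ms per 540p frame
```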
Starlight5 Yes, I'm a cat. What else is there to say, really?
That's the only setting that produces enough FPS and thus allows me to aim without much input lag. While I can run Overwatch at this resolution in windowed or borderless windowed mode, I don't know how to enforce this setting in fullscreen - and thus ask for help. -
yrekabakery Notebook Virtuoso
Starlight5 likes this. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
@yrekabakery HD 520 crippled by single-channel RAM. The alternative is 1024x768 (with 50% render scale, of course) full-screen, which produces a terrible image, and FPS often drops below 60, which is very bad for input lag. 960x540 looks much better and cleaner because of the matching aspect ratio, and runs faster - though placing headshots as McCree or Soldier ain't easy, because it is indeed hard to tell where the character's head is.
FWIW I can run Overwatch at much higher settings - but that means being stuck at 30 fps and, consequently, tremendous input lag, which makes it impossible to aim properly with most characters. Bottom line, it's either stable 60+ fps or nothing. I'd hook up an eGPU, but the machine doesn't have TB3, so it'll have to wait until I get a new one, either with TB3 or an Intel-AMD SoC - I fancy neither gaming behemoths nor unreliable consumer junk with whining coolers; and there's always a chance I'll end up buying an (8th-gen) machine with an iGPU and no TB3 again, but with dual-channel RAM this time (32GB is quite sufficient for my workflow, unlike a dGPU, which I can only use for games). Last edited: Feb 16, 2018 -
-
franzerich likes this. -
Falkentyne Notebook Prophet
Compared to some of the other stuff out around that time, that did look photorealistic.
-
TBQH I see this asked quite a lot, and the answer is no. The reason is that pixels are closer to hexagons than to squares, so upscaling from 960x540 to 1080p makes it look worse. In theory it would work because of the 1-to-2x2 mapping, but in practice it doesn't, and that's just the first reason.
Notice the LCD array's sides.
And look at 4K footage of MGSV: Ground Zeroes. You get pop-in of detail because you can actually portray enough detail at 4K vs 1080p.
vs
And a comparison shot:
https://international.download.nvid...n-cross-section-7-3840x2160-vs-1920x1080.html -
yrekabakery Notebook Virtuoso
-
Pentile screens (which you should avoid anyway) are a different story. Last edited: Feb 27, 2018
Is Gaming in 960x540 possible? How is the interpolation?
Discussion in 'Gaming (Software and Graphics Cards)' started by franzerich, Dec 7, 2017.