I am not seeing why my example is not good. My last example was meant to show that even though two panels have the same aspect ratio, they can still have scaling issues. I looked back at your 16x9 versus 48x27 comparison, and although they have different resolutions and the same aspect ratio, you picked sizes that don't have scaling issues, since one is exactly three times bigger. So your example would be fine, except that you can't divide one panel by 100 and the other by 40 and then compare them for scaling. (My example was not perfect, but it was close enough, since both sides were divided by about 390.)
But 1920x1080 compared to 1600x900 is not an exact integer difference. The 1920 panel is 1.2 times the resolution in each direction, and it is the point two that causes the problem.
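The arithmetic behind that point can be checked in a couple of lines. This is just my own sketch (the `scale_factor` helper is hypothetical, not something from the thread), comparing the two pairs of resolutions discussed above:

```python
# Hypothetical helper: how many physical pixels map to one logical pixel.
def scale_factor(native, target):
    return native / target

# 48x27 vs 16x9: exact integer ratio, so each logical pixel maps
# cleanly to a 3x3 block of physical pixels. No scaling artifacts.
print(scale_factor(48, 16))      # 3.0

# 1920x1080 vs 1600x900: 1.2 physical pixels per logical pixel,
# a fractional ratio, so pixels must be blended across neighbors.
print(scale_factor(1920, 1600))  # 1.2
```

An integer result means a clean block mapping; anything fractional means interpolation.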
If we can at least agree that aspect ratio and scaling are two different things, then we can focus on scaling in one direction only. Can we agree on this? If not, then I am still not seeing why you bring up aspect ratio when we are talking about scaling.
(Please note, I have been civil, and just hope to clear up the misunderstanding.)
OP, I hope this is helpful. I understand this may be off topic to this thread. But it is on topic to your other thread regarding image quality on these two displays. So it seemed reasonable to answer here. Do you have any questions?
-
and you know what? i'm wrong. a 1920x1080 panel set to display 1600x900 has to scale. you forced me to second-guess myself when you said "an LCD panel has a physically defined number of pixels." can somebody actually clarify: would this be an LCD panel's raster? like the physical grid itself, is that the panel's raster? ajnindlo was also helpful when he mentioned that different displays might use different pixel sizes (which may or may not be true), and noted that individual pixels aren't skipped when displaying an image. he's also 100% correct that HD+ is not an integer factor of FHD, something i consciously ignored and that's very relevant to my mistake. can we forget i said anything about math? (James D, you've been useless all along, as per usual.)
my over-the-top model incorrectly assumed that individual pixels could be skipped to "evenly" reproduce a resolution that's less than the panel's raster (if i'm using that term correctly), or physical pixel grid, but matches the aspect ratio of the panel. i also assumed that a higher PPI would compensate for any skipped pixels. that's where i was hung up. or maybe i've just been confusing LCDs with the way CRT monitors display different resolutions? either way, i stand corrected: there's no way for an LCD panel with a native resolution of 1920x1080 to display any other resolution, up or down, without scaling.
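To see why no skip pattern can work, here is a small illustration of my own (not anything the panel hardware literally does): mapping 1600 logical columns onto 1920 physical columns with the 1.2 ratio, the edges of logical pixels land between physical pixels, so every logical pixel ends up straddling two physical ones.

```python
import math

# Illustrative only: which physical pixels does logical pixel i cover
# when 1600 logical columns are stretched over 1920 physical columns?
def covered_physical_pixels(logical_index, ratio=1920 / 1600):
    start = logical_index * ratio
    end = (logical_index + 1) * ratio
    # First and last physical pixel touched by this logical pixel.
    return math.floor(start), math.ceil(end) - 1

for i in range(5):
    # Each logical pixel touches two physical pixels, overlapping
    # with its neighbors -- blending (scaling) is unavoidable.
    print(i, covered_physical_pixels(i))
```

With an integer ratio (say 3.0), each logical pixel would cover its own exclusive block and no blending would be needed; at 1.2 there is no such clean partition.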
when you argue condescendingly, you have to be prepared to look stupid when you're wrong. and i'm admitting that i was wrong. but if i hadn't tried so hard to make my mistaken point so aggressively and just ducked out, i probably wouldn't have been corrected so thoroughly. so we all win. -
P.S. Admitting that you were wrong is very good. But it looks like you are doing it somewhat wrong (IMO, of course). Off-topic, I know. -
-
I have made brain farts before; it happens. I kind of think of most of these threads as school: either we want to teach, to help, or we want to learn. And sometimes you plan to teach, but you learn. I like that best, as I am still learning (even after 35 years with computers). I also like to think we are all comrades; most of us are here to help, right?
I also want to give props to mattcheau, as it is not easy to admit you made a mistake. If you are not pushing yourself, you never make mistakes. A man who takes ownership of his faults finds ways to fix them and grow; a man who blames others will not grow. -
-
So you're also right
By the way, the impact that reducing the resolution has on actual rendering speed depends on a lot of different things; it doesn't scale linearly with the number of pixels to be rendered. For example, adding a full-screen anti-aliasing filter (ordinary 4x supersampling) multiplies the number of pixels to be rendered by a factor of 4. So reducing the resolution then has a large impact, if you want to keep the filter.
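The factor-of-4 point above is just pixel arithmetic; here is a quick sketch (the `rendered_pixels` helper is my own, purely illustrative):

```python
# 4x supersampling renders 4 samples per output pixel, so the
# pixel workload is simply multiplied by the SSAA factor.
def rendered_pixels(width, height, ssaa_factor=4):
    return width * height * ssaa_factor

base = 1920 * 1080                       # 2,073,600 output pixels
with_ssaa = rendered_pixels(1920, 1080)  # 8,294,400 rendered samples
print(with_ssaa / base)                  # 4.0
```

That is why dropping resolution helps so much when a full-screen filter is kept on: the filter multiplies whatever pixel count you feed it.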
But if you have other solutions for reproducing detail that depend on program logic, such as the kind of indirect filtering you see in Deus Ex and practically every new game that comes out now, the impact of reducing the resolution is very small.
You could make the argument that a higher-resolution image with less supersampling causes fewer artifacts and a more accurate representation of the pixels in the graphics context, and that this is really much more pleasant to look at than a similar-resolution picture with the supersampling. After all, a lot of recent titles have used supersampling not to smooth out inaccuracies from higher detail in the rendering context, but instead to blur out inaccuracies that were in the scene from the beginning. I.e., a leaf looks "kind of real" when it's blurred out, because otherwise it would just be a single static polygon.
As that kind of thing disappears, we'll hopefully see less and less reliance on the full-scene filters that have effectively kept the "standard" from being raised beyond 1280x720...
Anyway. So what suddenly impacts performance in an indirect rendering engine is instead the number of objects in the scene that aren't reduced before the rendering pipeline on the graphics card. Unpredictable things can happen here, such as grass causing overdraw only when waving in the wind, or objects needing more frequent lookups at higher resolution as they are invalidated after physics, just as an example.
But more and more, resolution isn't really the issue, since indirect rendering engines can use processing power outside the 1/60th of a second to prepare data for the rendering pipeline. Some developers are getting very good at this, but sadly, on current tech (non-integrated buses), you can only go so far.
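For the thread question itself, a back-of-the-envelope bound (not a benchmark): the raw pixel-count ratio between the two resolutions caps the speedup you could get when purely fill-rate-bound, and real fps gains are usually smaller because geometry, physics, and CPU work don't shrink with resolution.

```python
# Pixel counts of the two resolutions from the thread title.
full_hd = 1920 * 1080  # 2,073,600 pixels
hd_plus = 1600 * 900   # 1,440,000 pixels

# Upper bound on the fps gain when the GPU is purely pixel-bound:
# ~44% more frames, and in practice usually less than that.
print(full_hd / hd_plus)  # 1.44
```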
How much fps gain with 1600x900 instead of 1920x1080?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by KillWonder, Sep 9, 2013.