The question may not be framed quite right, but basically I want a formula such that a z% reduction in the DPI setting is "equivalent" to an r.x% and r.y% increase in screen resolution (in the x and y dimensions).
Equivalent in the sense that reducing the DPI by z% lets me fit as much on screen as if the x resolution had been increased by r.x%, and similarly for y.
Where is this question coming from?
I used to be on 1920x1200 screens, but was recently forced to go to a 1920x1080 laptop. The newer 18.4" laptop screen is glossy and I guess as a result is "clearer" in suitable lighting; I feel I can make out smaller fonts. So I was just wondering... My current DPI is 89. My old DPI was 96.
-
A decrease in DPI by x% = an increase in what the screen can effectively show of 1 / (1 - x%) - 1.
Ex. a 10% decrease = an increase of 1 / 90% - 1 ≈ 11%.
DPI is linear (it's per inch), and resolution is linear in each dimension (width and height), so the two are just inverses of each other; the same factor applies to both x and y, so r.x = r.y.
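
For what it's worth, here is a small Python sketch of that arithmetic (the function names are mine, made up for illustration; the formulas are just the ones above):

def effective_resolution_gain(dpi_reduction_pct):
    """Percent increase in what fits on screen for a given percent
    reduction in the DPI setting: r = 1 / (1 - z) - 1."""
    z = dpi_reduction_pct / 100.0
    return (1.0 / (1.0 - z) - 1.0) * 100.0

def dpi_reduction_for_gain(resolution_gain_pct):
    """Inverse: percent DPI reduction needed to match a given percent
    gain in effective resolution: z = 1 - 1 / (1 + r)."""
    r = resolution_gain_pct / 100.0
    return (1.0 - 1.0 / (1.0 + r)) * 100.0

if __name__ == "__main__":
    # The 10% example from above: 1 / 0.9 - 1 is about 11.1%.
    print(effective_resolution_gain(10))  # ~11.11

    # Going from 1200 to 1080 vertical pixels loses about 10% of the
    # height, so roughly a 10% DPI reduction fits the same amount vertically.
    print(dpi_reduction_for_gain(1200 / 1080 * 100 - 100))  # ~10.0
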
That being said, you should (ALWAYS??) run your screen/graphics adapter at its full native resolution & color depth, and use the available OS and application tools to adjust DPI and font sizes to your needs. Doing anything else introduces distortions and artifacts that play bloody hell with your eyes.
-
Thanks!