I know that running at an LCD screen's native res looks the best. I was wondering how much worse it would be if, on a 15.6-inch 1080p panel (Asus G51J), I ran it at 1280x800 or 1440x900? I would think that with such a tight pixel density on a 15.6-inch screen, it may not look as bad as a 23-inch desktop monitor scaling down from native.
The reason this is an issue for me is that I don't always want to run games at 1080p. I would rather run them at a lower res so I can get better framerates/increase settings. 1080p would still be handy for desktop/2D use. Any thoughts?
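To put rough numbers on the pixel-density point, here's a quick back-of-the-envelope sketch in Python; the 23-inch size is only an assumed typical desktop monitor for comparison:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch of a panel, from its pixel grid and diagonal size.
    return math.hypot(width_px, height_px) / diagonal_inches

# 15.6" 1080p laptop panel (like the G51J's) vs. an assumed 23" 1080p desktop monitor
print(round(ppi(1920, 1080, 15.6)))  # ~141 PPI
print(round(ppi(1920, 1080, 23.0)))  # ~96 PPI
```

The laptop panel packs roughly 1.5x the pixels per inch, so any softness from interpolation is physically smaller and should be harder to spot at normal viewing distance.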
-
I've seen my girlfriend's Studio 1555 with 15.6" 1080p display running at 1280x720 for TF2 before...it's honestly not too bad. For the record, 1280x720 preserves the 1080p (1920x1080) aspect ratio of 16:9 while 1280x800 does not; it is 16:10. Also, instead of 1440x900 which is also 16:10, try 1600x900 if it's an option (it isn't always).
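If you want to double-check which modes keep the panel's shape, here's a trivial sketch (plain Python, nothing laptop-specific) that reduces each resolution to its aspect ratio:

```python
from math import gcd

def aspect(w, h):
    # Reduce a resolution to its simplest width:height ratio.
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(1920, 1080), (1280, 720), (1600, 900), (1280, 800), (1440, 900)]:
    print(f"{w}x{h} -> {aspect(w, h)}")
# 1920x1080 -> 16:9
# 1280x720  -> 16:9
# 1600x900  -> 16:9
# 1280x800  -> 8:5  (i.e. 16:10)
# 1440x900  -> 8:5  (i.e. 16:10)
```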
-
While playing a game, it's not going to matter much at all. Just stick with the native resolution for text and static images.
-
The GTX 260M in the Asus G51J should be able to handle any game thrown at it at 1080p. It will even run Crysis at 1080p with high settings.
-
-
This is why I still love my old CRT. Does gaming on a non-native resolution really look better than a non-native resolution on the desktop? Does anyone have any pictures showing the side by side difference of native vs non-native resolution in gaming?
-
As long as you stick with the same aspect ratio as your native resolution, it will look just fine. Here's what you do: in the Nvidia Control Panel, there's an option in the left menu where you can adjust scaling/aspect ratio. Select "Use NVIDIA scaling". It will keep the aspect ratio as best it can. So if you play Crysis at 1280x720 (720p) instead of 1920x1080 (1080p), it will still look right (no stretching), and the aspect ratio will be maintained.
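For what "keep the aspect ratio" works out to geometrically, here's a rough sketch (not NVIDIA's actual code, just the fit-with-black-bars math the scaler effectively does):

```python
def fit_preserve_aspect(src_w, src_h, panel_w, panel_h):
    # Scale the source to the largest size that fits the panel without stretching;
    # whatever panel area is left over becomes black bars (letterbox/pillarbox).
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return (out_w, out_h), (panel_w - out_w, panel_h - out_h)

# 720p game on the 1080p panel: same 16:9 shape, fills the screen, no bars
print(fit_preserve_aspect(1280, 720, 1920, 1080))  # ((1920, 1080), (0, 0))
# 1280x800 (16:10) on the same panel: scaled to 1728x1080 with thin side bars
print(fit_preserve_aspect(1280, 800, 1920, 1080))  # ((1728, 1080), (192, 0))
```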
-
LCDs, as well as software in general, are getting better with the "scaling issue", so running stuff at non-native resolutions doesn't really look as bad as it used to.
I personally don't think (or see any reason why) games specifically would scale better than anything else.
Those who assert it looks good would most likely also assert that a lower resolution (one that still maintains the same aspect ratio) looks good on Aero or any other application. The difference is that you don't need to tone down the resolution for that other stuff, since games (along with graphics rendering programs) are really the main demanding things that hog graphics resources at higher resolutions.
Yeah, I miss CRTs too; a good CRT would scale everything almost perfectly, without the grainy problems you get from a fixed pixel density >.> -
I am very happy with my native 1440*900, so I don't think you'll dislike that res for gaming...
-
-
I find that there is a quality loss when not at native; it's not as bad as it used to be, but I'm a visual nut, so I notice right away either way. It comes down to personal preference. -
The only thing I can give you is that you are right about pixel density. A 1920x1080 LCD will generally display a 1280x720 image better than a 1600x900 LCD would, just by virtue of having a higher number of pixels to use in the interpolation.
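Quick numbers on that, if it helps (just the per-axis upscale factor, nothing fancy):

```python
def upscale_factor(render, panel):
    # Linear upscale factor from the rendered frame to the panel's native grid, per axis.
    return panel[0] / render[0], panel[1] / render[1]

# A 1280x720 frame shown on a 1920x1080 panel vs. on a 1600x900 panel
print(upscale_factor((1280, 720), (1920, 1080)))  # (1.5, 1.5)
print(upscale_factor((1280, 720), (1600, 900)))   # (1.25, 1.25)
```

At 1.5x, each rendered pixel spreads over 2.25 panel pixels, so the interpolation has more native pixels to blend across; neither factor is a whole number, which is why some softness is unavoidable either way. -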
Games are essentially video, and when you see two independent frames in quick succession, your mind blurs them together to create the illusion of seamless motion.
Secondly, games tend to use a lot of blur effects anyway. You have motion blur, antialiasing, anisotropic filtering, bloom, HDR, etc., which all blur the image in one way or another. Thus, even at native resolution, many things have some blur on them anyway.
Compare this with text or a static image, which are pixel-perfect, and you get a drastic difference in how easily you can detect scaling on things that are static versus things that are in motion. -
-
Yes, but if you're staring at menus all day, you're doing it wrong.
I didn't say that games are immune to the scaling effect, just that it's much harder to notice in a game than if you were reading/writing a document. -
Yes, but my point is that you can still notice. Life bars, score bars, things like that are static elements which, arguably, you DO have to stare at quite a bit depending on the game, so in the end one could notice the "fuzziness".
So in reality, games don't really downscale better (which is what I was asserting) so much as it is simply more difficult to notice the downscaling "fuzziness" in games. -
Well, I would define "harder to notice" as downscaling better, but now we're just bickering over terminology.