If I have a 1920*1080 screen, will games look good if I play them at 1366*768?
A friend told me they would look just as good as if I were playing on a native 1366*768 screen. Is he right?
-
Yep, same aspect ratio, so there will be little or no distortion
-
Well, LCDs have gotten quite a bit better over time, so the issue of "non-native" resolution isn't as bad as it used to be. Theoretically there should be a better image at native resolution, but downscaling has really improved and, as I've said, so have screens, so I doubt the image will be THAT much better.
IMO it'll still be an enjoyable experience either way -
Technically, this would be upscaling.
But regardless, it should still look pretty good. You might notice a little less sharpness in lines, but as long as your nose isn't on the glass, it should be hard to notice.
-
I think the opposite of everyone who posted here... I have a 1440x900 screen, and when I play at 1280x800 I can easily see that the image is blurry, noticeably less sharp than native...
Now, if you get past the first day, your eyes will accommodate to that "quality" and you won't notice the blurriness (at least not as much as in the beginning). But once you play again at the higher res, you'll see the difference in a second. It's like playing with 0 AA vs 8xAA, there's a world of difference. -
It depends on where this screen is.
If it's a laptop screen 2 inches away from your face, you will notice it more.
If it's a large monitor reasonably far away from you, or a TV, you will notice it less.
Also, I seriously think that some higher-end TVs have better upscaling algorithms than the graphics cards.
So it all depends.
But the short answer is that your friend is INCORRECT. A 1366x768 image stretched across 1920x1080 pixels will not look as crisp as a 1366x768 image on 1366x768 pixels. But it should not look bad, either. You get more serious problems when you mix 16:10, 16:9, 5:4, and 4:3 images. Notice that 1366/768 is approximately equal to 1920/1080. That is key. -
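The "same aspect ratio, but not a clean fit" point above can be checked with a couple of divisions. A throwaway sketch in plain Python, using only the resolutions already named in the thread:

```python
# Aspect ratios of the resolutions discussed in this thread.
for w, h in [(1366, 768), (1920, 1080), (1280, 800), (1440, 900)]:
    print(f"{w}x{h}: aspect ratio {w / h:.3f}")
# 1366x768 -> 1.779 and 1920x1080 -> 1.778: both ~16:9, so no stretching.
# 1280x800 and 1440x900 -> 1.600, i.e. 16:10.

scale = 1920 / 1366
print(f"scale factor: {scale:.3f}")
# ~1.406: each source pixel has to cover about 1.4 screen pixels, so the
# scaler must interpolate, which is where the loss of crispness comes from.
```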
Just for interest, I just tried the Sims 2 on my 1280x800 Compaq at native res and on the M4400's WUXGA screen, and I think they look 99% the same! -
I play native 1440x900 on a 17"; I don't really see a difference from my friend's 1920x*** 17", so it shouldn't be affected too badly.
-
No, the difference I'm talking about is not between two machines at native res... I guess that 1440x900 native vs 1680x1050 native vs 1920x1080 native won't look so different in quality... What I'm talking about is playing at non-native res (which is what the OP is asking)... Playing at 1366x768 on a 1920x1080 screen won't even be close to playing at native...
And Xirurg, don't compare the two laptops; compare 1280x800 and native, both on your M4400, which is the difference the OP is talking about (even though I think, though I haven't tried it, that 1280x800 native vs 1280x800 upscaled should look different) -
-
Dell Studio 15 with HD4570 and 1366*768 screen running a game in native res.
Dell Studio 15 with HD4570 and 1920*1080 screen running a game in 1366*768 res.
According to Xirurg's test they'll look 99% the same. -
No, it will look grainy, and sometimes using a resolution other than native can hurt performance.
-
I have a friend with a 1680x1050 screen, and when he tried 1440x900, it didn't look as sharp as 1440x900 on my screen (native). There was a difference; for me, a big difference...
For example, when I play at 1280x800 the image is blurry, very blurry (compared to native)... If that is how 1280x800 looked natively, or even close to it, I guess people would never use 1280x800...
Now, tianxia came up with a point that might be true: maybe 1440x900 scaled on a 1920x1080 screen looks better than scaled on a 1680x1050 one... But I don't think it will ever be the same... -
-
Yeah I mean, it's not the same for obvious reasons: they physically don't have the same number of pixels covering the same area. Therefore, when changing resolutions, the density of pixels is also different.
I'd think the difference (and therefore how noticeable it is) will depend on the quality of the screens themselves as well, though. -
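To put rough numbers on the pixel-density point, here is a minimal sketch. The 15.6" diagonal is only an assumption for illustration (roughly a Dell Studio 15 class panel); swap in your own screen size:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a panel of the given native resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Illustrative only: a 15.6-inch panel at the two native resolutions in question.
print(f'1920x1080 on a 15.6" panel: {ppi(1920, 1080, 15.6):.0f} PPI')  # ~141 PPI
print(f'1366x768  on a 15.6" panel: {ppi(1366, 768, 15.6):.0f} PPI')   # ~100 PPI
```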
-
Upscaled and downscaled images will always look worse than at native res, with the possible exception of an image upscaled to a res that is an integer multiple of the image (such as a 960x540 image scaled to 1920x1080). The degree to which the degradation is noticeable depends on the scalers sitting between the GPU and screen, the screen itself, the res of the screen, the specific piece of software/content being viewed, and to a large degree, subjective observation. The only way to really know if it is OK for you is to look for yourself.
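To illustrate why the integer-multiple case is the exception: at exactly 2x (960x540 onto 1920x1080) every source pixel can map to a clean 2x2 block of screen pixels, so nothing has to be blended, while at ~1.4x (1366x768 onto 1920x1080) there is no clean mapping and the scaler has to interpolate. A toy nearest-neighbour sketch of the integer case only; real scalers use better filters, this just shows the clean mapping:

```python
def scale_nearest(src, factor):
    """Nearest-neighbour upscale of a 2-D grid of pixel values by an integer factor."""
    out = []
    for row in src:
        stretched = [p for p in row for _ in range(factor)]  # repeat across
        for _ in range(factor):                              # repeat down
            out.append(list(stretched))
    return out

image = [[1, 2],
         [3, 4]]
for row in scale_nearest(image, 2):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
# Every source pixel becomes an exact 2x2 block. With a factor of ~1.4 no such
# exact mapping exists, so pixel values have to be blended and edges soften.
```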
-
Actually, it depends on the game you play.
For example,
Left 4 Dead would not have any distortion at any resolution you set.
The wider the screen, the more you can see and observe.
However, human eyes only have about a 60-degree cone of vision,
so the image outside that 60-degree cone will look distorted.
THIS IS NORMAL AND LOGICAL.
PERSPECTIVE THEORY: CONE OF VISION.
FIFA 09 would be distorted because it does not support all screen sizes; if I'm not mistaken, it only supports up to 1280x800.
Unlike FIFA, Left 4 Dead actually fits any screen size (even screen sizes that do not exist). -
It would be a little less clear, but it's still a nice resolution, and it will look good. - Of course it would look better at 1920x1080, but that's FULL HD, and you don't need that for games
(unless you are a geek who wants EVERYTHING MAXED on a simple laptop).
-
-
Hmm, as for the Dell, the 15" screen should be lower res, methinks. I run 1280xblah on my old Sony Vaio FE21S and it's spectacular (PS: it's a 15.4-inch notebook).
-
There are mixed reviews here, I see.
For me, when I run at lower resolutions, my performance increases.
I try to balance high resolution with performance.
What I notice when I go below 1920x1200:
- generally, any text will be large, grainy, and a bit distorted
- the HUD is large and fat and blocks more of what you would typically see at higher resolutions
- most of a game's textures are not distorted; however, some games with very high-res textures are distorted... they look fat, grainy, and very pixelated
- generally at far distances you will not notice the grainy, pixelated distortion, but sometimes distant objects look blocky.
However, the performance gains from reducing resolution outweigh the few graphical issues. -
-
-
Yes, as alluded to, the other thing you can do is turn down AA and AF and play at native. Having the game render to a higher-res backbuffer can actually have visual effects similar to AA and AF, so you should try that too. But as has been said, definitely go with the highest-res panel you can get. High res is great for everything, not just games.
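The "higher-res backbuffer looks a bit like AA" point is essentially supersampling: render more samples than you display, then average them down. A toy sketch with a 2x2 box filter (real hardware uses better filters, and the pixel values here are made up purely for illustration):

```python
def downsample_2x(src):
    """Average each 2x2 block of a 2-D grid into a single output pixel."""
    out = []
    for y in range(0, len(src), 2):
        row = []
        for x in range(0, len(src[y]), 2):
            block = [src[y][x], src[y][x + 1], src[y + 1][x], src[y + 1][x + 1]]
            row.append(sum(block) / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x comes back with an in-between grey
# along the boundary, which is the softened edge that AA gives you.
print(downsample_2x([[0,   0, 255, 255],
                     [0,   0, 255, 255],
                     [0, 255, 255, 255],
                     [0, 255, 255, 255]]))
# [[0.0, 255.0], [127.5, 255.0]]
```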
-
Running at 13** rather than having a native 13** screen will give reduced quality; I've experienced this when testing lots of different resolutions with games.
I think it really depends on user preference as well, though. I can't stand grain or stretched pixels, to be honest, but I can put up with only having room for a window or two at a time, or using another monitor, full stop, hehe.
-
If it's native, higher resolution will always look better.
It's only a question of whether your rig can handle driving those resolutions at the settings you want to play at.
Remember, there is a reason the quest to increase resolution is never ending.
It's true of home theater screens and it's true of laptop screens.
Simple question-simple answer.
Play at 1920x1080 if you can, Phil. -
Anyway, I know enough. -
Hmm, I've tried 1920 on full-HD 32/42-inch TVs and it looks dreadful in comparison to a laptop screen; large pictures look good, but that's it, icons and text are..
-
TVs aren't optimized for crisp text and other such uses. Plus are you sure the native res on your TV is actually full 1080p? Lots of TVs that say they support 1080p actually mean they accept a 1080p input but scale it to a native lower res. I say that especially because you say you tested on a 32" TV, and full 1080p native 32" TVs are still relatively rare (most seem to be 720p/768p native).
-
If you have an Nvidia GPU, open the Nvidia Control Panel ---> click Change Flat Panel Scaling ---> select Use NVIDIA scaling with fixed aspect ratio ---> click Apply.
I play Crysis at 1400x10xx and it still looks amazing (I have a 1920x1200 screen); not so much when I didn't use fixed aspect ratio scaling.
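Roughly, "scaling with fixed aspect ratio" fits the game's output inside the panel without stretching and pads the leftover area with black bars. A minimal sketch; the resolutions in the example are only illustrative:

```python
def fit_with_aspect(src_w, src_h, panel_w, panel_h):
    """Largest size that fits the panel while keeping the source aspect ratio."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, panel_w - out_w, panel_h - out_h  # width, height, leftover bars

# Illustrative: a 4:3 image on a 16:10 1920x1200 panel gets pillarboxed.
print(fit_with_aspect(1024, 768, 1920, 1200))  # (1600, 1200, 320, 0)
```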
-
Always try to drive the hardware at native resolution. Even though it's the same aspect ratio, 720p on a 1080p screen just doesn't have the clarity 720p would have on a native 720p screen. But eventually your eyes will adjust to it.
To me, this means that having a higher-than-average res on a mid-range gaming notebook isn't really a good thing. If the notebook can't drive games at 1080p, then it should have a lower-res screen option, or you should find another notebook.
If you're not gaming, then just find the highest res you can. -
Hey, I paid for 1200P native resolution, I'm gonna d*** well use it!
-
Haha, my TVs are definitely 1080 (the TV shows its res on screen when an input is plugged in).
It's a shame they are not so crisp, because I would love to play games much "bigger" than 17 inches, haha.
What are people's views on these 600Hz TVs these days? -
As for 600Hz, it's overkill. I have a 120Hz 61" DLP (that displays 1920x1080 quite sharply) and I can't find any problems with it. I can't see what use a 600Hz TV is other than having a bigger number in a bullet point list for selling it. Kind of like having a 15 megapixel snapshot camera... there's no technical benefit to it. -
The only use I see for a 600Hz TV is subliminal messaging
-
Yeah, what is 600Hz even supposed to do for you? It's 10 times the refresh rate of any video signal it is likely to experience. Definitely seems like overkill to me.
-
I'm fairly sure he made a typo...
-
How can a monitor decide whether I can or cannot scale a non-native resolution? Absolutely ridiculous!!!
-
-
In my experience, anything other than native res looks terrible.
-
I say go for the higher-res screen, because the 4570 can't really handle either resolution, so even with the 1366 screen, you'll probably be dropping the resolution to 1280x720 anyway.
-
I prefer to play at native res.
-
A lower native res will yield far better performance for that card.
Naa, it's not a typo; various 600Hz TVs are popping up on the UK market these days.
They might be plasma, though, and I definitely believe that my TVs are full-HD res, although it would suck big time if they were at 13**. I will have to check the specification tonight. -
This argument is so similar to the mp3 bitrate argument that my friend and I have every time we go over each other's mp3 collections...
My friend swears he can tell the difference between 128K, 192K and 256K encoding... he won't listen to anything less than 256K... I can't really tell the difference, and since I spend most of my time listening to mp3s on a laptop or mp3 player that isn't even close to high-end audio equipment, I prefer to have more mp3s at a lower quality... I think I'm limited by my ears anyway...
And that's off-topic, but that's what this discussion reminded me of, so in my dotage, I felt the need to reminisce...
Video scaling has dramatically improved over the last decade with the advent of digital display devices...my first laptop (Toshiba 315 CDT) was 12" 800x600...most games were still 640x480, and they looked fairly hideous on the screen...in fact, I would usually disable scaling and play games (Starcraft) with black borders around the game...
My current laptop, 17" 1920x1200, has decent scaling on the nVidia 7950 GTX...playing at 1920x1200 looks the best...but several (older) games don't offer that resolution...in fact, they don't even offer 16:10 aspects...and the up-scaling to 16:10 and 1920x1200 doesn't really interfere with my enjoyment of the games...I can turn off adapter scaling, and the boxed image does look crisper, but I prefer the immersion of using all 17" of the screen...
My 7950 GTX can't drive other games (World in Conflict, Crysis, Oblivion) at native resolution so I play at 1280x800 and allow up-scaling...I'm sure the quality suffers and my eyes have grown accustomed to it, but the drop in quality doesn't ruin my enjoyment of the game as much as playing Crysis at 5 fps would...I could turn off adapter scaling, but I'm lazy and hate switching depending on what game I want to play....
And as far as home theater, I have a Panasonic projector with a native 1280x720...DVDs look incredible (depending on the source material, not all DVDs are created equal) at 1280x720 due to the projector's excellent scaling (no external scaler or up-converting DVD player needed, thank you...in fact, my DVD player does up-convert but I have it turned off because the projector has a better scaler)...however, HDTV broadcasts in 720p do look better than broadcasts in 1080i...and I bought a Blu-Ray player, and was somewhat disappointed with the results...720x480 standard-def DVDs scaled up look as good on the projector as 1920x1080 Blu-Ray movies scaled down...
And I'm not sure I understand the point of hi-def on a 32" TV unless you're going to sit within 3 feet of the TV...
But to each their own...
The technical answer (or the purist's answer) to the original question is NO...1366x768 video signal scaled to 1920x1080 does not look as good as 1920x1080 displayed natively nor does 1366x768 scaled to 1920x1080 look as good as 1366x768 displayed natively (screen sizes will also have an impact due to dot pitch, but technically, the NO answer stands...)...but, in the end, if your eyes can't tell the difference, and there are other factors involved, does it really matter?