I got a TV that isn't the usual 720 or 1080 resolution, but 1360x758 (or 768, I can't tell). When I output to the TV over VGA (VGA works fine on my 1080p TV, for those who say VGA can't output HD) from my Dell Inspiron 1520, it looks all pixelated, mainly the text. It's not a huge deal, since I only plan on watching stuff on it rather than using it as a monitor, but is there any way I can force a cleaner resolution on the TV?
I have an LG 32" --- 350 Model
Thanks!
-
davepermen Notebook Nobel Laureate
1) try setting it to game mode or something similar. that disables most "optimisations" a tv does, which can help for fonts and such. (but you might want them on for movies, though)
2) have you set your output resolution to 1360x768? or, for example, 1280x720? check if there's a difference. in one case your movie player does the upscaling (depending on the player, you could even disable it and get some small borders), in the other case the tv does the upscaling (and it can fail miserably at that).
other than that, depending on your gpu drivers, you can adjust tv outputs quite a bit. try messing around in there.
but one thing is for sure, the 720->768 scaling is a nasty one. if i computed it right, only every 16th row will be sharp (48 rows in total); all the others are blends, and thus lack detail / get pixelated (depending on the upscaler).
in the horizontal case there are essentially no columns that stay sharp: everything is blurred pixels except the first and the center column (gcd(1366, 1280) is 2, so only two columns align exactly).
so out of your 1-megapixel tv, not even 1k pixels will be actually sharp pixels.
i might have done this completely wrong, though.. but one thing's sure: up/downscalings that are not integer multiples are always nasty, often completely killing any hd out of a picture. -
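The alignment claim above can be checked with a few lines of Python, assuming the simple linear pixel mapping the post describes (output pixel j reads source coordinate j*src/dst); the function name is just for illustration:

```python
from math import gcd

def sharp_pixels(src: int, dst: int):
    """Return the output pixel indices that land exactly on a source
    pixel when scaling src pixels to dst pixels with a linear mapping
    (output j -> source j*src/dst). The count equals gcd(src, dst)."""
    return [j for j in range(dst) if (j * src) % dst == 0]

rows = sharp_pixels(720, 768)    # vertical: 720p content on a 768-row panel
cols = sharp_pixels(1280, 1366)  # horizontal: 1280 columns -> 1366
print(len(rows), rows[1] - rows[0])  # 48 sharp rows, one every 16 rows
print(len(cols), cols)               # 2 sharp columns: [0, 683]
```

Note gcd(720, 768) = 48, so 48 of the 768 rows align exactly (one in every 16), while gcd(1280, 1366) = 2 leaves only two aligned columns.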
1366x768 is what is referred to as 720p. It is very common for laptops and cheap TVs. While 720p is fine for watching video, it is pretty bad for computer usage. Many people, myself included, complain about 15.6" 720p laptop displays looking pixelated. When expanded to larger screens, the much larger pixels stand out even more. It is perfectly normal that your 32" 720p display looks very pixelated when hooked up to your computer.
davepermen, upscaling an image does not really work out like that. The closer the original is to the scaled resolution, the better it will look; it will still be blurred, just to a lesser degree. -
davepermen Notebook Nobel Laureate
no, upscaling does work like that. if you upscale x2, you get the original, crisp image. everything else will be less crisp, as the pixels have to be interpolated to fit.
and no, 1366x768 is NOT 720p. 720p is 1280x720. -
davepermen Notebook Nobel Laureate
well, no, everything other than x2 zooming is much worse than x2 zooming. mathematically, the resolutions closest to each other are the worst; they have the highest loss of detail.
and it doesn't matter what the industry sells. they don't care about picture quality, they care about cheap screens for the masses. and that's what the 1366x ones are: cheap screens for the masses (most people don't even really see more than dvd sharpness anyway, have their aspect ratios all wrong and saturation on max, but never notice that something's wrong) -
-
My understanding of image scaling is that when you scale up, you are making up pixels that don't exist. If the factor is integral, like 2x, that should be fine: just duplicate. Anything non-integral depends on how the algorithm does it. The same goes for scaling down, but scaling down usually gives better quality than scaling up, especially for non-integral factors.
What do you mean by coded for a native resolution of 1366x768?
If I wrote the player, I would simply check for the case where the resolutions are this close, leave some black bands, and not do any scaling (after all, the extra space can still be used for controls). -
ViciousXUSMC Master Viking NBR Reviewer
I imagine you're not giving the TV a proper resolution, or some kind of overscan/underscan is happening that distorts the picture and makes it look blurry.
Even my 1920x1080 TV looks blurry until I manually go into my ATI drivers and disable the overscan that happens by default for some reason.
Some TVs even have overscan/underscan that needs to be turned off; usually something called "PC Mode", or using a VGA input or similar, gets around it. -
davepermen Notebook Nobel Laureate
-
If you scale an image by an integer multiplier, no data is lost. For example, if you scale an image by 2x, you just take a single pixel and make it into a 2x2 cluster of pixels. Therefore, the image is technically just as sharp as it was to begin with.
As for 720p, almost all 720p monitors are 1366x768. The reason for this is overscan. Televisions generally cut off the edges of a picture and then scale the rest of the image to fit the TV. This may be why your TV is blurry when displaying your computer screen. You need to make sure to set your TV to a "computer" mode or something else that ensures 1-to-1 pixel mapping. -
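The 2x2-cluster idea above is exactly nearest-neighbour 2x upscaling, which a short sketch makes concrete (plain lists stand in for an image here):

```python
def upscale_2x(image):
    """Nearest-neighbour 2x upscale: each pixel becomes a 2x2 block,
    so no pixel values are interpolated and no detail is lost."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # duplicate columns
        out.append(doubled)
        out.append(list(doubled))                     # duplicate the row
    return out

print(upscale_2x([[1, 2],
                  [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every output value is an exact copy of a source value, which is why integer-factor scaling keeps the image crisp while fractional factors (like 720 to 768) force interpolation.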
Don't forget, viewing distance also plays into how "sharp" an image appears, up to a certain pixel density. If one scales an image by 2x, the minimum viewing distance at which the image appears "sharp" increases. Once a display's pixel density exceeds roughly 300 PPI in both axes, though, it no longer matters how close you get to the screen, as individual pixels become extremely difficult to discern at that point.
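To put numbers on the viewing-distance point for the 32" TV in this thread, here is a rough sketch. It assumes the common 1-arcminute figure for 20/20 visual acuity; the function names are made up for illustration:

```python
from math import hypot, radians, tan

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return hypot(width_px, height_px) / diagonal_in

def min_sharp_distance_in(density_ppi, acuity_arcmin=1.0):
    """Rough viewing distance (inches) beyond which an eye with the
    given acuity can no longer resolve individual pixels. The
    1-arcminute default is the textbook 20/20 value, an assumption."""
    pixel_in = 1.0 / density_ppi
    return pixel_in / tan(radians(acuity_arcmin / 60.0))

d = ppi(1366, 768, 32)  # the 32" 1366x768 TV: roughly 49 PPI
print(round(d), round(min_sharp_distance_in(d)))  # about 49 PPI, ~70 inches
```

At roughly 49 PPI you would need to sit around six feet back before the pixel grid stops being visible, which matches the complaint that the TV looks pixelated at desk distance.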
-
You and davepermen seem to be arguing from different ends. They're arguing from the source end: given a source of a certain resolution, the best way to view it is at an integer multiple of that resolution. If your source is 683x384, the least blurry panels to view it on are those with integer multiples, i.e. 1366x768 or 2049x1152. You're arguing from the display end: given an existing panel resolution (say, 1366x768), the best image is whatever source resolution comes closest to it.
-
Shouldn't the entire argument be from the point of view of the display end, considering that's the situation in the real world?
-
davepermen Notebook Nobel Laureate
but what i was really talking about is: better to watch 720p in 720p mode (that means with 24-pixel borders on top and bottom). that way you get 100% of the detail of the 720p content: no rescaling, no blurring, no loss of detail. and yes, that is massively better than scaling it to 768 lines. been there, done that.
and just in case: on a 1080p screen (mine is 2m x 1.1m, so it's big), there isn't much difference between watching a dvd (half of 1080p, so scaling by two works fine there) and watching hdtv (720p), as the resulting scaled picture looks about the same in detail.
it's math, nothing else. -
-
This is the reason why, when purchasing a digital camera for example, the question of how many pixels are enough depends on the final print size. You need more pixels (at the source level) if you want to print, say, 12x8 rather than 6x4. In other words, a 12x8 print can look much uglier than a 6x4 print if your source is a 3 MP digital camera -
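The print-size analogy can be checked with quick arithmetic. The sketch below assumes the common 300 DPI rule of thumb for photo-quality prints; the function name is made up:

```python
def pixels_needed(width_in, height_in, dpi=300):
    """Source pixels needed to print at the given size and density.
    300 DPI is a common print-quality rule of thumb, assumed here."""
    w, h = width_in * dpi, height_in * dpi
    return w, h, w * h / 1e6  # width px, height px, megapixels

print(pixels_needed(6, 4))   # (1800, 1200, 2.16): a 3 MP camera suffices
print(pixels_needed(12, 8))  # (3600, 2400, 8.64): 3 MP falls well short
```

A 6x4 print needs about 2.2 MP while a 12x8 needs about 8.6 MP, which is why the same 3 MP source looks fine at one size and ugly at the other, just like a fixed-resolution video on panels of different sizes.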
720 to 758 resolution - all grainy
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Sepharite, Dec 23, 2010.