The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    720 to 758 resolution - all grainy

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Sepharite, Dec 23, 2010.

  1. Sepharite

    Sepharite Notebook Consultant

    Reputations:
    9
    Messages:
    273
    Likes Received:
    1
    Trophy Points:
    31
    I got a TV that doesn't have the usual 720 or 1080 resolution but 1360x758 (or 768, I can't quite tell). When I output to it from my Dell Inspiron 1520 over my VGA cable (VGA works fine on my 1080p TV, for those who say VGA can't output HD), everything looks pixelated, mainly the text. It's not a huge deal, since I only plan on watching stuff on it and not using it as a monitor, but is there any way I can force a cleaner resolution on the TV?

    I have an LG 32" --- 350 model.

    Thanks!
     
  2. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    1) try setting it to game mode or something similar; that disables most of the "optimisations" a tv does and can help with fonts and such. (you might want those optimisations back on for movies, though)

    2) have you set your output resolution to 1360x768, or e.g. 1280x720? check whether there's a difference. in one case your movie player does the upscaling (depending on the player you could even disable it and accept some tiny borders), in the other case the tv does the upscaling (and it can fail miserably at it).

    other than that, depending on your gpu drivers, you can adjust the tv output quite a bit. try messing around in there.


    but one thing is for sure, the 720->768 scaling is a nasty one. if i computed it right, only every 16th row lands exactly on a source row (48 sharp rows out of 768); all the others are blends of neighbouring rows, and thus lose detail / get blurred (depending on the upscaler).

    in the horizontal case it's even worse: gcd(1280, 1366) is 2, so only the first and the center column land exactly on a source column, and every other column is a blend of two source pixels.

    so out of your roughly 1 megapixel tv, not even a hundred pixels end up as actually sharp, unblended pixels.

    i might have done this completely wrong, though.. but one thing's sure: up/downscalings that aren't clean integer ratios are always nasty, and often kill any hd sharpness in a picture.
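
    to make that arithmetic concrete, here's a rough python sketch, assuming the simple mapping where destination index i samples source position i * src / dst (real tv scalers use fancier filters, so treat the exact numbers as illustrative only):

        # count destination pixels that land exactly on a source pixel when
        # upscaling 1280x720 to 1366x768 with a plain i * src / dst mapping
        from math import gcd

        def exact_hits(src, dst):
            # i * src / dst is an integer exactly when i is a multiple of dst // gcd(src, dst)
            step = dst // gcd(src, dst)
            return len(range(0, dst, step)), step

        rows, row_step = exact_hits(720, 768)    # vertical: 720 -> 768
        cols, col_step = exact_hits(1280, 1366)  # horizontal: 1280 -> 1366
        print(f"every {row_step}th row is exact    -> {rows} of 768 rows")
        print(f"every {col_step}th column is exact -> {cols} of 1366 columns")
        print(f"pixels needing no interpolation: {rows * cols} of {768 * 1366}")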
     
  3. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    1366x768 is what is referred to as 720p. It's very common for laptops and cheap TVs. While 720p is fine for watching video, it is pretty bad for computer usage. Many people, myself included, complain about 15.6" 720p laptop displays looking pixelated. On larger screens, the much larger pixels stand out even more. It is perfectly normal that your 32" 720p display looks very pixelated when hooked up to your computer.

    davepermen, upscaling of an image does not really work out like that. The closer the original to the scaled resolution, the better it will look, but it will still be blurred, just to a lesser degree.
     
  4. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    no, upscaling works the same way. if you upscale x2, you get the original, crisp image. everything else will be less crisp, as the pixels have to be interpolated to fit.

    and no, 1366x768 is NOT 720p. that's 1280x720.
     
  5. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    Obviously, but running at half the vertical and horizontal resolution is almost never even possible and is completely ridiculous anyway. When you scale an image, the closer the resolution of the image to the native resolution of the display, the better it will look.

    Technically yes, but 1280x720 is a very rare native resolution these days. Almost everything touted as 720p is in fact 1366x768.
     
  6. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    well, no, everything that isn't x2 zooming is much worse than x2 zooming. mathematically, the resolutions closest to each other are the worst; they have the highest loss in detail.

    and it doesn't matter what the industry sells. they don't care about picture quality, they care about cheap screens for the masses. and that's what the 1366x ones are: cheap screens for the masses (most people don't really see more than dvd sharpness anyway, have their aspect ratios all wrong and saturation on max, and never notice that something's off).
     
  7. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    I'm not so sure. I think your ideas on image scaling are fundamentally incorrect.

    Lol. No reason to get so worked up. The fact of the matter is that these days 720p is code for a native resolution of 1366x768.
     
  8. chimpanzee

    chimpanzee Notebook Virtuoso

    Reputations:
    683
    Messages:
    2,561
    Likes Received:
    0
    Trophy Points:
    55
    A statement like that needs more detail to back it up.

    My understanding of image scaling is that when you scale up, you are making up pixels that don't exist. If it's an integral factor like 2x, that should be OK, you just duplicate. Anything non-integral depends on how the algorithm handles it. The same goes for scaling down, but scaling down usually gives better quality than scaling up, especially for non-integral factors.

    What do you mean by "code for a native resolution of 1366x768"?

    If I wrote the player, I would simply check for the case where the resolutions are that close, leave some black bands, and not do any scaling in that situation (after all, the extra space can still be used for controls).
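
    A rough sketch of that kind of check (hypothetical player logic, not taken from any real player; the "close enough" threshold is made up for illustration), in Python:

        # if the video is already close to the panel's native resolution,
        # show it 1:1 with thin black bands instead of scaling it
        def fit_video(video_w, video_h, panel_w, panel_h, max_border=64):
            """Return (scale, (side_band, top_band)) in pixels."""
            if 0 <= panel_w - video_w <= 2 * max_border and 0 <= panel_h - video_h <= 2 * max_border:
                # 1:1 mapping: no scaling, just small black bands around the picture
                return 1.0, ((panel_w - video_w) // 2, (panel_h - video_h) // 2)
            # otherwise scale to fit, preserving the aspect ratio
            scale = min(panel_w / video_w, panel_h / video_h)
            out_w, out_h = round(video_w * scale), round(video_h * scale)
            return scale, ((panel_w - out_w) // 2, (panel_h - out_h) // 2)

        print(fit_video(1280, 720, 1366, 768))   # unscaled: 43 px side bands, 24 px top/bottom
        print(fit_video(1920, 1080, 1366, 768))  # too big: downscaled to fit the panel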
     
  9. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    I imagine you're not giving the TV a proper resolution, or you have some kind of overscan/underscan happening that is distorting the picture and making it look blurry.

    Even my 1920x1080 TV looks blurry until I manually go into my ATI drivers and disable the overscan that happens by default for some reason.

    Some TVs also have overscan/underscan of their own that needs to be turned off; usually a "PC Mode" setting, or using a VGA input or similar, gets around it.
     
  10. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    i've done enough graphics programming to know that, while it would be nice to be wrong, i'm sadly right here. upscaling that is not x2 (or x4 or whatever) results in a loss of information. if you scale up by just one pixel in height, all the pixels blur into each other, with the exception of the top and bottom rows; all the others end up less detailed than before. if you scale up by 2, nothing gets blurred and no detail is lost. the actual math is more complicated, of course.

    well, i know that retailers suck at labeling and will sell you anything under any name, but technically, only resolutions with 720 pixels of height are 720p. this one is 768p.
     
  11. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    If you scale an image by an integer multiplier, no data is lost. For example, if you scale an image by 2x, you just take each single pixel and turn it into a 2x2 cluster of pixels. The image is therefore technically just as sharp as it was to begin with.

    As for 720p, almost all 720p monitors are 1366x768. The reason for this is overscan. Televisions generally cut off the edges of the picture and then scale the rest of the image to fit the screen. This may also be why your TV looks blurry when displaying your computer screen. Make sure to set your TV to a "computer" mode or something else that ensures 1-to-1 pixel mapping.
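
    A tiny illustration of that 2x2-cluster point (just a numpy sketch, nothing specific to any driver or player):

        import numpy as np

        def upscale_integer(img, factor=2):
            # nearest-neighbour upscale by an integer factor: every pixel becomes
            # a factor x factor block, so no new pixel values are invented or blended
            return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

        img = np.array([[10, 200],
                        [60, 120]], dtype=np.uint8)   # a tiny 2x2 "image"
        print(upscale_integer(img, 2))
        # [[ 10  10 200 200]
        #  [ 10  10 200 200]
        #  [ 60  60 120 120]
        #  [ 60  60 120 120]]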
     
  12. Paralel

    Paralel Notebook Evangelist

    Reputations:
    57
    Messages:
    396
    Likes Received:
    0
    Trophy Points:
    30
    Don't forget that viewing distance also plays into how "sharp" an image appears, up to a certain pixel density. If you scale an image by 2, the minimum viewing distance at which it still appears "sharp" increases, unless the display's pixel density is greater than roughly 300 PPI along both axes; past that point it doesn't matter how close you get to the screen, because individual pixels become extremely difficult to discern.
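
    Back-of-the-envelope numbers for the 32" 1366x768 TV from this thread, assuming the common one-arcminute rule of thumb for when individual pixels stop being resolvable (an approximation, not a hard threshold):

        from math import hypot, radians, tan

        def ppi(width_px, height_px, diagonal_in):
            # pixels along the diagonal divided by the diagonal length in inches
            return hypot(width_px, height_px) / diagonal_in

        def min_sharp_distance_in(ppi_value, arcmin=1.0):
            # distance (inches) beyond which one pixel subtends less than `arcmin` arcminutes
            return (1.0 / ppi_value) / tan(radians(arcmin / 60.0))

        density = ppi(1366, 768, 32)   # roughly 49 PPI
        print(f"{density:.1f} PPI")
        print(f"individual pixels blur together beyond about {min_sharp_distance_in(density) / 12:.1f} feet")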
     
  13. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    I'm talking about what looks better. Just because the pixels all align perfectly doesn't mean the result actually looks better. You are talking about a loss of information, but an image scaled up from 2/3 or even 1/2 of the display's pixels is going to carry a lot more information than any of the resolutions that scale perfectly, such as 1:4. If you really think that displaying a 1280x720 image on a 1366x768 monitor looks worse than displaying a 683x384 image on it, you should probably go see an eye doctor. Just try it for yourself: force a handful of different resolutions on your monitor, all the way down to half the vertical and horizontal dimensions, and see what I mean. The closer to the original, the better it is going to look. Smaller aliased pixels look a lot better than a screen made out of Lego blocks.
     
  14. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    You and davepermen seem to be arguing from different ends. They're arguing from the source end: given a source of a certain resolution, the best way to view that source is at an integer multiple of that resolution, i.e. if your source is 683x384, then the best panel to view it on (meaning the least blurry) is one whose dimensions are integer multiples of it, such as 1366x768 or 2049x1152. You're arguing from the display end: given an existing panel resolution (say, 1366x768), the best image is the one whose resolution is closest to it (whatever source resolution you can find).
     
  15. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    Shouldn't the entire argument be from the point of view of the display end, considering that is the only way it is in the real world?
     
  16. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    well, then i suggest an eye doctor for you, too. i never said upscaling by 2 means lego blocks. but upscaling the 720p image does not reveal more image information than the 384p image (which is rather shocking).

    but what i was really talking about is: better to watch 720p in 720p mode (that means with 24-pixel borders on top and bottom). that way you get 100% of the detail of the 720p content, no rescaling, no blurring, no loss of detail. and yes, that is massively better than scaling it to 768p. been there, done that.

    and just in case: on a 1080p screen (mine is 2m x 1.1m, so it's big), there isn't much of a difference between watching a dvd (half of 1080p, so scaling by two works fine there) and watching hdtv (720p), as the resulting scaled picture looks about the same in detail.

    it's math, nothing else.
     
  17. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    Well, the numbers certainly don't tell the whole story. Still, opening up the subjective viewpoint could simply shift the debate to a whole new level, and a whole new set of standards.
     
  18. chimpanzee

    chimpanzee Notebook Virtuoso

    Reputations:
    683
    Messages:
    2,561
    Likes Received:
    0
    Trophy Points:
    55
    If you fill the extra pixels with something that wasn't there, the end result may not look any better.

    This is the same reason why, when purchasing a digital camera for example, the question of how many pixels are enough depends on the final print size. You need more pixels (at the source level) if you want to print, say, 12x8 rather than 6x4. In other words, a 12x8 print can look much uglier than a 6x4 print if your source is a 3MP camera.
     
  19. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    In this particular case, where the TV is already bought and (presumably) won't be returned, then yes. If it's a more general case where someone might be deciding what resolution to buy, then perhaps not as much, as you then want to consider your source to perhaps determine what resolution is best for you to get.