The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Graphics, Game, Blur, "Native" Res Debate

    Discussion in 'Gaming (Software and Graphics Cards)' started by Matthewrs_Rahl, Jun 27, 2008.

  1. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
To start, I have read the debate going on in this area and don't want a repeat performance of that, ha-ha. Not my intention, I assure you.
    I'd like to address more specifically this notion of "blurring" in-game and out-of-game in regards to lowering from "native" resolution, and to open up the gloss vs matte screen debate in this context as well.

    Let's apply this to a computer with the following specs...
    17" widescreen (Desktop Replacement Laptop)
    Quad-Processor, 2.83GHz, 12MB L2 Cache, 1333MHz FSB.
    Nvidia 8800m GTX SLI (2x 512MB)
    4GB RAM DDR2 800MHz
    7200RPM HDD (3x 200GB) in RAID 5 (so, 400GB HDD Space max)
    Vista 64Bit

    Let's apply the above specs to a computer operating in the following situations/circumstances...
    1. In-Game: Let's use something like "Crysis" (high-graphics-demanding game) versus Civilization/Age-of-Empires/Starcraft2 (high-processor-demanding games).
    2. Common Tasks: Viewing the icons on your desktop, reading text on a webpage (say this VERY PAGE), using MSpaint, movies(HD)/anime.
    3. Environment/Surroundings: Indoors vs Outdoors. Bright vs Low-lighting.

    Things to pay attention to given the above...
1. Gloss vs Matte Screen Debate (emphasizing situation/circumstance #3 outlined above), in regards to the reflective nature of the screens. As well as screen glare, vibrancy, sharpness, depth, hue, saturation, and so forth. Not to mention how well it "mixes" with high-performing (gaming) graphics cards, in regards to showing the realism of detail (e.g. I always found the old CRTs lent some realism by dumbing down the edgy/sharp look of some games, like in Quake/Doom).
2. Blurriness vs "Native" Resolution Lowering Debate, in regards to which situations will pose the most blur problems. Keep in mind, we ALL know that you can only lower your resolution (not raise it) and that lowering from the native resolution will NEVER decrease blurriness. Considering our tasks at hand (emphasizing situations/circumstances #1 & #2 outlined above), how practical is it to lower resolutions without any resulting significant/noticeable screen blurriness? Feel free to bring in technical jargon and so forth, such as "fixed aspect ratios" or "letterboxing", or debate the actual widescreen qualities (e.g. 16:9 vs 16:10). Also, feel free to address the sub-debate on raising DPI to cope with small lettering in high-resolution situations. Some people will swear that blurriness is only noticeable in non-gaming circumstances; if that is the case, feel free to voice that opinion.
3. Screen Resolution vs Graphics Card Debate. This ties into the debate on blurriness (see #2 above). I recommend reading this thread before going any further. The options are WUXGA (1920 x 1200), WSXGA+ (1680 x 1050), WXGA+ (1440 x 900), and WXGA (1280 x 800). WUXGA is available on nearly ALL the 17" widescreen notebooks, with WSXGA+ being optional on about half of them. The other two options (WXGA+ & WXGA) are rare on 17" widescreen notebooks. The debate here centers around situations/circumstances #1 & #2, primarily. In today's age, "standard" screens are becoming rare and I've never seen a CRT screen on a laptop, leaving us with, namely, widescreens. What can we do to get the most "wow" factor when gaming or just viewing movies (possibly of HD quality)? MIND YOU, HD utilizes "i" and "p" numbers, which are DIRECTLY RELATABLE to screen resolutions (e.g. WUXGA = 1920 x 1200 pixels = 1080p). The quality/performance at these resolutions will often be bottlenecked by the graphics card (particularly in gaming), as opposed to the processor. If we used the example graphics card listed at the top of this post, what do YOU think would be the wisest resolution choice? Feel free to bring in concepts such as "more 'landscape' = higher dead pixel probability", "who uses 1920 x 1200 in-game anyway?", "1920 x 1200 is too 'sharp' for playing with MAX performance settings", "1920 x 1200 may be superior now, but 2-3 years from now you'll be wishing you had gotten a lower resolution to maintain MAX performance settings in newly released games", and so forth.

    Please state specifically which question # (1, 2, or 3) you are addressing in your posts, for simplicity. Thanks. :)
     
  2. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    1. Personal preference. I like oranges over apples, but until I become King of the Universe (slated at about 2013 AD), others may prefer apples.

    2. See attached pictures. First is a normal picture at 1280x1024. Second is a 1280x1024 screen displaying at 800x600.

[attached image] [attached image]

    Is detail loss noticeable? Yes. Enough for you to care? Depends on how anal you are.

3. I's and P's have nothing to do with computer monitors. All computers are "progressive scan". Put simply, the better graphics card you have, the better games will play at higher resolutions. Again, this is personal preference. We might as well debate horses v. unicorns again.
     
  3. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
    1. Whaaaaa? Uh, all I can think to say in response to that is...I like mandarin oranges the best? Lol.

2. Great picture choice to display the difference. I definitely saw a difference in quality within the first second (I didn't have to click back-and-forth). I'm not anal about it, though; I'd likely live just fine on 800x600 if I had to. This is addressing out-of-game graphics, however. There is still much more to be said about in-game graphics, text-blurring, DPI setting changes, etc.

3. I was under the impression that I's and P's were the terminology used in place of "pixels" when referring to HD video. I wasn't implying that notebook LCDs utilize "interlacing" like most typical television screens. In fact, I'm pretty sure ALL notebook LCD monitors use "progressive scan", just as MOST HDTVs utilize "progressive scan". It was just a comparison for people to better follow the topic. More can definitely be said about all this. I'm also afraid to ask, but what is the horses v. unicorns debate? This sounds vaguely familiar; the only thing that comes to mind when I think about it is "closed thread", but I don't remember the topic anymore.

    Btw, love your guides Lithus, I've seen 2-3 here and there over the past year and a half (I read more than I write around here, but I like to think I'm still a member of the community, ha-ha, if only minutely so).
     
  4. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    1. Glossy (on a Macbook Pro). Forget all the crap about color accuracy because it is irrelevant on a notebook screen (notebook LCDs are TERRIBLE compared to most Desktop screens, due to 6bit vs 8bit color), or in other words go get an EYEONE and calibrate your screen, regardless of glossy/matte. A well glossed screen doesn't glare nearly as much as an over-glossed screen. Really this point is not that important in the long run.

2. It will be fuzzy looking if you are looking for it (when you lower resolution from native), but as Lithus posted, it all depends on how anal or bothered by it you are. FPS games and other games where there is abundant movement seem to mitigate fuzziness because you can't focus on one spot for very long. So for a gamer, lowering resolution isn't such a bad idea, especially if it improves performance.

    3. If you want to run new games at WUXGA, you need a high-end GPU like the 8800m GTX. Otherwise, even a midrange GPU can accelerate 1080p content on one of these screens, no sweat. If you don't mind lowering resolution in games, you can skimp a little bit on the GPU (though I never recommend integrated graphics).
     
  5. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    From what I learned on the other thread you WILL see a difference between an image on native resolution and its downscaled equivalent using the same aspect ratio. However as Lithus' example demonstrates not everyone is going to care and most people can live with it.

    As for the ideal resolution, I think StormEffect says it best. I still like my WSXGA+ resolution since I can max out older games on my 8600GT while having the option to downscale for more recent ones.
     
  6. vshade

    vshade Notebook Evangelist

    Reputations:
    35
    Messages:
    340
    Likes Received:
    0
    Trophy Points:
    30
    1. I like glossy

2. The blurriness is not that bad if the image has few high-frequency details, and since a lot of high-frequency detail is aliasing, you get a kind of free antialiasing. But if the game has small objects, like an RTS, or even an FPS with a big draw distance, or very high resolution textures, the blurriness will be more annoying.

3. I think the best resolution to match the 8800 graphics is 1680x1050. And 1440x900 is more popular than you think.
     
  7. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
1. As far as the first point, it's really up to you. I like matte on some screens, glossy on others. If it's just a light gloss that increases the color vibrance, I like that better. If it's really reflective, I don't. You'll need to actually look at the screen you want.

    3. As far as a resolution, just look at the laptop or a similar sized laptop (15" 17" 13" whatever) in a store, and figure out what resolution you need for desktop work. Get that resolution. A better GPU is a better GPU. Period. The downside to a powerful GPU is heat and battery life... not related to resolution so much.... a 1920x1200 + 8800 is a fine combination. Running games in lower resolution isn't that big of a deal.


    ----
    2. IMAGE SCALING:

    People ALWAYS get this stuff confused. No one nailed it yet. Hope this helps.

    Lithus is displaying a 1280x1024 image on an 800x600 "screen". What I think he means is that he looked at a 1280x1024 image on his monitor set to 1280x1024 (his native resolution). Then he changed his resolution to 800x600, and looked at the full picture on the full screen again. Alright, this is going to get complicated, but not that technical. We are going to discover why there is no image blurring apparent in those shots. There IS a difference though, we will discuss that too. Three things to consider: what happened to his monitor, what happened to the picture, what happened to the screen shot?

    First, we consider that the image being rendered (even images on the desktop are rendered) is being rendered in 800x600 now. OK. However, that image is being put onto a 1280x1024 monitor. Perfect. That is called upscaling, and that is what we are used to. It makes an image blurry, and also decreases the intensity of jagged edges somewhat.

    Now, what about the picture? It was originally 1280x1024, and now it is being shown on 800x600 pixels. The image must have shrunk! That is called downscaling. Downscaling is the opposite of upscaling, and guess what, it makes an image sharper and increases the intensity of jagged edges. It doesn't cancel out the upscaling, but it does change the picture a bit. You also lose detail when downscaling.

Now consider the screenshots Lithus provided. Unfortunately for this demo, the screenshot your computer takes captures the image before it was upscaled. That means it will erase any visual trace of upscaling (blurriness). You get the original 800x600 image if you take a screenshot of an 800x600 screen on a 1280x1024 monitor. What we WANT for comparison's sake is two 1280x1024 images (representing the entire monitor), regardless of the resolution being displayed on them. You still see the effects of downscaling on the second picture. However, it's not even as prominent as it should be: BOTH pictures are shrunk to about 800x600, so the comparison (sorry) is completely shot.

    Three ideas to discover this for yourself:

    A- look at it in real life
    B- capture the OUTPUT of your monitor (if your monitor even has an output... doubtful)
    C- take a comparison photo (still not perfect, but better than having your computer take a screenshot in this case...)

    (hope this was clear enough to get the point across, wrote it quickly)
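The round trip described above (downscale, then upscale back) can be sketched in a few lines of Python. This is a toy one-dimensional model, not any real scaler's code: real LCD scalers filter in two dimensions, and the function names here are made up for illustration.

```python
def downscale(row, factor):
    """Shrink a row of pixel values by averaging each block of `factor` pixels."""
    return [sum(row[i:i + factor]) // factor
            for i in range(0, len(row), factor)]

def upscale(row, factor):
    """Stretch a row by repeating each pixel `factor` times
    (nearest-neighbour; a real scaler would interpolate, which blurs)."""
    return [p for p in row for _ in range(factor)]

# Fine detail: alternating black/white stripes, one pixel wide.
original = [0, 255, 0, 255, 0, 255, 0, 255]

small = downscale(original, 2)   # the stripes average out to flat gray
back = upscale(small, 2)         # upscaling cannot reinvent the stripes

print(original)  # [0, 255, 0, 255, 0, 255, 0, 255]
print(back)      # [127, 127, 127, 127, 127, 127, 127, 127]
```

The point is that downscaling merges neighbouring pixels irreversibly, so the detail loss (and, with a real interpolating scaler, the smeared look) survives the trip back up to the panel's native pixel count.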
     
  8. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    2. IMAGE SCALING

    Actually, my monitor is 1680x1050. The pictures were manipulated using photoshop. The first picture is a raw 1280x1024 shot, absolutely untouched. The second was scaled down to 800x600, rasterized, and then pulled up to 1280x1024. None of those were actual screenshots, but instead, just a manipulation of some guy's photo.

    Thus, when you receive these pictures, the computer recognizes them both as 1280x1024 shots as I have done all the transforming on my computer prior to uploading them. Thus, this is similar to playing a game at 1280x800 on a 1680x1050 monitor.

    There is actually a good deal of image blurring in the second photo as compared to the first. First notice the reflections in the parrot's eye. Then look at the black parts of the beak in relation to the white hair. It's easiest to notice these differences if you pull both up and quickly switch between the first and the second (yay for tabbed browsing).
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    well done.

that's a great way to do it.

    the only bad part is that both images ended up at 800x600.

    i wonder if you could repost them at full res?
     
  10. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    When it comes to gaming a higher resolution is one of the main things that has a big impact on performance levels.

I have found that it is MUCH better to take a hit in resolution (just make sure it's the same aspect ratio, and scaled via your video card and not your monitor) and then have higher detail and graphics settings than it is to have a higher resolution and then have to lower the detail/graphics settings to maintain a higher fps.

The change in sharpness is about the only loss from changing resolution, but the detail settings could be the difference between realistic lighting and shadows, seeing objects further in the distance, and a ton of other things.
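The "same aspect" caveat above is easy to check with a couple of lines of Python (the `aspect` helper is just for illustration): on a 16:10 panel, pick another 16:10 mode so the scaler doesn't have to stretch or letterbox the image.

```python
from fractions import Fraction

def aspect(width, height):
    """Reduce a resolution to its aspect ratio as an exact fraction."""
    return Fraction(width, height)

native = aspect(1680, 1050)           # WSXGA+ -> 8/5, i.e. 16:10

print(aspect(1280, 800) == native)    # True:  WXGA, a safe lower choice
print(aspect(1024, 768) == native)    # False: 4:3, would stretch or letterbox
```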
     
  11. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
    Lol, keep it coming. I gotta rush out the door (I got dinner plans with the company owner of tigerdirect.com, seriously. I just hope he doesn't try to sell me on a refurbished laptop, ha-ha), but I'm eagerly anticipating some more views/science behind these topics. :)
     
  12. Kittie Rose

    Kittie Rose Notebook Evangelist

    Reputations:
    55
    Messages:
    690
    Likes Received:
    0
    Trophy Points:
    30
    My Gemstone Blue has a mostly beautiful screen, but it's hard to get very good blacks off it. I know that would be a lot of people's measure of a good screen and it annoys me but the screen is just so much more vibrant than my older Zepto's.
     
  13. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    Both pictures are 1280x1024. Or at least that's what it is when I pull it from Picoodle. Just right click, properties and it'll show you the resolution.
     
  14. ronkotus

    ronkotus Notebook Evangelist

    Reputations:
    175
    Messages:
    619
    Likes Received:
    0
    Trophy Points:
    30
    Seems to depend on the browser/settings. Use the URL under the picture to get the full view.
     
  15. andygb40

    andygb40 Notebook Deity

    Reputations:
    99
    Messages:
    755
    Likes Received:
    0
    Trophy Points:
    30
Some screens display scaled images better than others. My old Dell 6400 had no problems downscaling to 1024*768 from its native 1280*800, but when I do the same on my current Znote 6324, 1280*800 from 1440*900, I get drastically increased "blurriness" and it is harder to play games for long due to eye strain.
     
  16. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
Sorry for my delayed response; I was away from the computer. I'll be disappearing for a bit tomorrow as well, as Dell came through with my warranty and is replacing this notebook's motherboard for me. I don't think the motherboard being replaced will fix my problem, but that is another discussion altogether.

    Any more thoughts on all this? It's been fairly small/limited discussion thus far, ha-ha.
Obviously the downscaling from "native resolution" causing blur is the most prominent discussion going on here. However, andygb40 is the first (that I've seen) to discuss in-game blur problems associated with lowering resolution.
    We should have some way of judging everyone's opinion (a poll is far too simple to do any justice) on just HOW MUCH blur people are experiencing and what the primary contributing factor to it is: the task they are performing (word processing, web surfing, gaming, etc.), the screen they purchased, the choice of integrated/dedicated graphics card, and so forth. Everyone seems to share one of the following three opinions when it comes to lowering screen resolution: "I see no difference" (rarest opinion); "Noticeable, but nothing problematic" (most common); "I can't stand it, too much eyestrain, I regret my resolution decision" (less common). This appears to be a very big deal.

Thoughts...opinions...science...jargon...anyone?
     
  17. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
You should not be changing from your native resolution in anything but gaming. If you find text too small, increase the DPI of the font; don't decrease the resolution.

    This only leaves gaming, and if you want a debate, read the stickies. http://forum.notebookreview.com/showthread.php?t=141185
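The DPI advice above can be put in numbers. A font's on-screen height is roughly points × DPI ÷ 72, so raising the OS DPI setting re-renders text with more native pixels (it stays sharp), whereas lowering the resolution stretches fewer pixels across the same glass. A minimal sketch, using the classic Windows settings of 96 and 120 DPI (the `font_px` helper is illustrative, not a real API):

```python
def font_px(points, dpi):
    """Approximate pixel height of a font at a given point size and DPI
    (1 point = 1/72 inch, so pixels = points * dpi / 72, rounded down)."""
    return points * dpi // 72

print(font_px(10, 96))    # 13 px at Windows' default 96 DPI
print(font_px(10, 120))   # 16 px at the "Large size (120 DPI)" setting
```

Same native resolution, same sharpness, noticeably bigger text.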
     
  18. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
I've read that topic in full already (I think I included it in my first post here that people should read it, ha-ha). It was an informative read, but I felt it was far too unstructured. I used that as a springboard to this topic. There was never a real conclusion to that debate and I wanted to renew the topic, but with more stringent parameters. On there, people were discussing both high GPUs and low GPUs. Large LCD screens (17") and small LCD screens (15.4"). LCD or CRT screens. There was too much going on for anyone to follow it completely. And as the polls on there ultimately showed, it was more-or-less neutral at the end, with the exception of WUXGA (1920 x 1200), which was less favorable. Those results were largely due to there not being a "standard" for what screen size and GPU the users had on their notebooks. I think it's impossible to construe anything but general "opinion" from those polls without having a more specific outline. It would be like asking someone if they are Republican or Democrat. You're going to get mixed reviews, but that doesn't mean both parties aren't able to agree on some things (e.g. a lot of Republicans would support abortion, even if some of their representatives/presidents don't). Let's not get political, though; I've seen that cause a trolling topic or two, it was just an example.
    Perhaps I'll place a poll, but I can't be certain everyone is going to read the whole first post before voting (a lot of ppl just vote at the beginning), thus nullifying the whole point of having a poll. :/

    Edit: I took a look and I'm unable to add a poll after-the-fact, so, no go anyhow. :/
     
  19. andygb40

    andygb40 Notebook Deity

    Reputations:
    99
    Messages:
    755
    Likes Received:
    0
    Trophy Points:
    30
I still stand by my point. The only hardware that makes a difference in whether you get blurry images when scaling down to a lower resolution is the LCD itself. If I use an external monitor (CRT type), I have no problems whatsoever scaling down to 1024*768, even if the monitor has a native 1280*1024. However, LCDs are more prone to image blur when you run them at anything lower than their native resolution, but again, how bad this is comes down to whose panel you've got. I'll get some screenshots later to show you what I mean.
     
  20. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
    Cool andy, I look forward to seeing those screen shots. :)
    And yeah, I'm a fan of the old cathode screens myself. When I'm home from school, I prefer the look of the old bulky monitors and have never experienced any blur on them.
As to LCDs, the laptop I am on now is my first experience ever having an LCD of my own, and when I fiddle with the resolutions (native = WXGA+ = 1440 x 900), I see virtually no difference at all. If I get within half a foot of my screen, I might see something, but nothing eye-straining in the least. That is just one experience, however.
     
  21. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    for me, AA settings are more important than res.
    GTA san andreas looks much better on wxga w/ 4*AA than wxga+ w/ 2*aa
     
  22. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
While a bit off-topic: does anyone here know any "general" stores (e.g. Best Buy, Micro Center, Circuit City) that show laptops off for testing pre-purchase? I did a run of some stores earlier today for about 2 hours and was highly disappointed, as only 1 of the 4 stores didn't carry 17" screens yet NONE of the stores had screens featuring WUXGA resolution. Only one of the 4 stores even offered WSXGA+. PFFT. I was hoping to get my own personalized sense of the "blur" and report back to you guys on it, but I failed to find any places. :/
     
  23. Matthewrs_Rahl

    Matthewrs_Rahl Notebook Consultant

    Reputations:
    171
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
Well, no luck with stores. I did, however, find out that Hypersonic-PC is located about 5-10 minutes from my house. I sent their customer service an e-mail to see if I couldn't arrange something with them. With any luck they'll say it's alright.