The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ASUS G73J Screen Resolution

    Discussion in 'ASUS Gaming Notebook Forum' started by masters, Apr 8, 2012.

  1. masters

    masters Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Hi,

    For some reason I am not able to go beyond 1600x900 resolution on my laptop :mad: .
    The graphics card I have is an ATI Mobility Radeon HD 5800 series.
    I am wondering if I should be getting 1920x1200 :confused: .
    Running Windows 7 Home edition.
    Intel Core i7.
    RAM: 6 GB
    Thanks
     
  2. dstrakele

    dstrakele Notebook Consultant

    Reputations:
    66
    Messages:
    199
    Likes Received:
    0
    Trophy Points:
    30
    What is the full model number of your G73J-?? laptop? I believe certain G73 models ship with 1600x900 displays; those indicate "HD+" on the box. Those with 1920x1080 displays have "Full HD" on the box.
     
  3. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    This. If your model number has "BB" after "G73JH-", it is a Best Buy model with the HD+ LCD panel.
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You have a Best Buy model like I do, so the maximum screen resolution is 1600x900. The 6 GB of RAM is a dead giveaway: among G73JHs, only the Best Buy models had 6 GB; all the other submodels had 8 GB.
     
  5. Srikar

    Srikar Notebook Evangelist

    Reputations:
    55
    Messages:
    398
    Likes Received:
    2
    Trophy Points:
    31
    You can easily buy a higher-resolution screen and swap them out. There are threads here on it; check the sticky.
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, if the cost is justified for you, but you would lose the ability to play newer games at native resolution.
     
  7. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    ^This. I've viewed my 1600x900 resolution as a blessing, as the GPU is better suited to it. The 128-bit bus chokes at 1080p in a lot of games, but I'm still running max or close to it on everything ...and probably will continue to for the rest of this console gen.
     
  8. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Well, the card can't run games at max at 1080p, that is for certain, but it still handles most games pretty well at 1080p if you're willing to sacrifice some settings. Besides, I prefer higher resolution over higher settings to a certain extent, and it is awesome when I use my G73 for work.
     
  9. Yiddo

    Yiddo Believe, Achieve, Receive

    Reputations:
    1,086
    Messages:
    4,643
    Likes Received:
    1
    Trophy Points:
    105
    +1

    Resolution is easily brushed aside for higher settings, and I didn't realise how major it was until I moved to 1080p and then looked at a 900p screen a few months later; the difference is massive. If you compare 900p at ultra to 1080p at high, you will take Full HD any day of the week.

    The 5870M runs better on a 900p screen if you intend to play Crysis 2 or Battlefield 3, but for games like World of Tanks or World of Warcraft (mid-range games you play often) that do not require high-end hardware, it is a worthy upgrade.
     
  10. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Aside from those games you mentioned and The Witcher 2, the 5870m still handles itself pretty well and yeah, you don't notice the difference until you go with higher resolutions. In some games, AA isn't even all that beneficial at higher resolutions and AA is one of the main GPU hogs in games.
     
  11. AlwaysSearching

    AlwaysSearching Notebook Evangelist

    Reputations:
    164
    Messages:
    334
    Likes Received:
    15
    Trophy Points:
    31
    Can I ask what specifically looked so much better? I have the 1600 and
    am just curious as I will have to go 1920 when I replace this as no one
    really sells 1600 anymore.

    So my thinking has always been go for good native resolution but best
    performance and crank up the game visually.

    The 1920 is approximately 17% more pixels. Doesn't seem significant
    in terms of visual quality but as I said I don't have one.
     
  12. Yiddo

    Yiddo Believe, Achieve, Receive

    Reputations:
    1,086
    Messages:
    4,643
    Likes Received:
    1
    Trophy Points:
    105
    17% is a lot, though: that is roughly 1/5 more pixels on a screen which is no different in size, and the more pixels you cram into a smaller screen, the better the display looks. Not to mention the extra gadget you can fit on the right side of the screen :) maybe that is just me.

    It is really hard to put into words what is better about it; you really need to see it for yourself. I came from a G73JH BB with a 900p screen and was under the assumption that I didn't need 1080p, and as the 5870M was in it I was happy with 900p, so I stayed put. Since moving to 1080p on my Clevo and comparing it to my friend's G73, the difference really is noticeable. Both AUO glare screens.

    I can't say if it is a worthy upgrade for the price, but looking back now I would have upgraded it if I had the chance again.
     
  13. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    This also falls into personal preference; I know people who prefer lower resolutions with higher settings too. If you have a 1080p external monitor to hook the laptop up to, you could make the comparison. Aside from that, there is also the matter of what else you will do with the laptop: higher resolutions are nice for productivity purposes. At work, I'm on a dual-screen setup for that very reason; more screen real estate.
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    WoT is a terribly optimized game. It's a bit of a running joke what has happened to WoT since the latest 7.2 patch. My frame rate was instantly halved going from 7.1 to 7.2. On my G73JH I can now only play at high settings (nowhere near maxed out) at 900p to keep over 30 FPS at all times. If I run maxed settings at 900p, it becomes impossible to aim my tank when the frame rate consistently dips below 20.

    In fact, I've got Crysis 1 on here right now and, believe it or not, it runs way better and with a more consistent frame rate than WoT even though it has more eye candy. The WoT devs seriously need to make WoT multithreaded and add support for more than one graphics card for owners of multi-GPU systems.
     
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No. 1920x1080 is 44% more pixels than 1600x900. Punch it into your calculator if you don't believe me: (1920*1080 - 1600*900)/(1600*900) = 0.44.

    That's a significantly higher workload for the GPU.

    But on the plus side, I do have to admit that the 1080p screen offers significantly more screen real estate. It's very noticeable in everyday usage.
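    The arithmetic above checks out; as a quick sanity check, the percentage can be reproduced in a few lines of Python:

    ```python
    # Compare total pixel counts of the two panel resolutions.
    hd_plus = 1600 * 900    # 1,440,000 pixels (HD+)
    full_hd = 1920 * 1080   # 2,073,600 pixels (Full HD)

    increase = (full_hd - hd_plus) / hd_plus
    print(f"Full HD drives {increase:.0%} more pixels than HD+")  # 44% more
    ```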
     
  16. KuroLionheart

    KuroLionheart Notebook Deity

    Reputations:
    147
    Messages:
    727
    Likes Received:
    2
    Trophy Points:
    31
    My 900p screen met an unfortunate accident and I finally upgraded to 1080p.

    Wow...what have I been missing? This is way better in every aspect.
     
  17. Yiddo

    Yiddo Believe, Achieve, Receive

    Reputations:
    1,086
    Messages:
    4,643
    Likes Received:
    1
    Trophy Points:
    105
    It's huge, right? I was surprised myself; it's like cookie dough ice cream!
     
  18. Srikar

    Srikar Notebook Evangelist

    Reputations:
    55
    Messages:
    398
    Likes Received:
    2
    Trophy Points:
    31
    What is so bad about dropping the resolution a notch? Are you constantly alt-tabbing while playing games?
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    What does alt-tabbing have to do with anything when you're playing in fullscreen? What I meant is that you should try running a non-native resolution on your LCD, such as 720p on a 900p screen or 900p on a 1080p screen, and see how godawful blurry it is. Dropping the resolution compromises image quality the most on an LCD, so it is better to just reduce the settings while keeping the resolution at native.
     
  20. Srikar

    Srikar Notebook Evangelist

    Reputations:
    55
    Messages:
    398
    Likes Received:
    2
    Trophy Points:
    31
    I do that on games here and there, and no, it doesn't make them "godawful blurry." In fact, I'm currently playing LA Noire at 1600x900 because of how crappy a port it is and it's really not that much different compared to when I ran it at 1920x1080. I mean, really, why keep a lower resolution screen for a select few games when you can run the rest at the max resolution easily and have more space for other tasks?

    I'm also not sure why you say dropping the resolution compromises the image "the most on an LCD," considering CRTs have no native resolution.
     
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I find it very surprising that you can't perceive a noticeable blurriness from running a non-native resolution on the LCD, and it gets worse as the resolution is lowered further. Simply put, the underlying technology of an LCD is very different from that of a CRT and that is why image quality does not degrade when running resolutions below maximum on a CRT.

    An LCD is a digital display with a fixed pixel structure, meaning that running anything under its maximum supported resolution essentially takes a smaller image and stretches it to cover the entire screen. I'm sure you can understand why this contributes to obvious visual glitches and blurring. A CRT, due to its analog nature, can accurately display any number of resolutions below its maximum without noticeable image degradation.

    I said that "dropping the resolution down will compromise the image quality the most on an LCD" because nothing degrades the overall quality of the entire picture like running a non-native resolution will. The general haziness and lack of sharpness in the image is worse, in my opinion, than turning down some of the other graphical settings.

    I originally made the suggestion to the OP to keep the 900p screen because 1080p replacement screens aren't cheap nowadays and if you're playing any games released in the last year or two the lesser screen will definitely boost your performance. With the recent releases of Ivy Bridge and mobile Kepler and Radeon 7000M parts, the G73JH isn't exactly high-end in the laptop world anymore.
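    The scaling argument above can be put in numbers: whenever the panel-to-input ratio is non-integer, source pixels no longer align with physical pixels and the scaler has to interpolate. A small sketch, using only the resolutions discussed in this thread:

    ```python
    # Scale factors an LCD scaler must apply when fed a non-native resolution.
    # Non-integer factors force interpolation, which is what causes the blur.
    panel_widths = {"900p panel": 1600, "1080p panel": 1920}
    input_widths = {"720p": 1280, "900p": 1600}

    for panel, pw in panel_widths.items():
        for inp, iw in input_widths.items():
            if iw >= pw:
                continue  # skip native or oversized inputs
            scale = pw / iw
            kind = "integer" if scale.is_integer() else "non-integer -> interpolated"
            print(f"{inp} on a {panel}: x{scale:.2f} ({kind})")
    ```

    Every combination here (1.25x, 1.50x, 1.20x) is non-integer, which is why none of these downscaled modes look sharp.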
     
  22. Srikar

    Srikar Notebook Evangelist

    Reputations:
    55
    Messages:
    398
    Likes Received:
    2
    Trophy Points:
    31
    Like I said, the drop to the next resolution does not make the image "godawful blurry" as games do upscale quite well. We're not comparing a 640x480 upscaled DVD image to a 1920x1080 blu-ray image. I suppose DVD movies make you blind because if 1080 to 900 is "godawful blurry," the drop in resolution there must be unwatchable.

    On top of that, there are few games lately that chug so much to require the drop in resolution. It's usually just poorly optimized (moar like unoptimized) ports that use the CPU over the GPU. Recent games that I've easily maxed at 1080 are Skyrim, Dungeon Siege 3, Just Cause 2, Mafia II, Borderlands, Arkham City, Deus Ex 3, Kingdoms of Amalur, and Assassin's Creed: Revelations. Only recent games I've played that have really stressed my G73 have been LA Noire (CPU intensive) and The Witcher 2 (which I played maxed at 1600x900 minus ubersampling before my screen upgrade, gotta turn some things down at 1080).
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    My only experience is with running 720p on my 900p screen and I can't stand it, so maybe you're seeing better results running 900p on your FHD screen. It's all just personal preference really, based on what you think looks good and plays well.

    I mostly play first-person shooters and I try to aim for 50-60 FPS at all times to stay competitive; the input lag and slower aiming at 30 FPS just doesn't cut it in a fast-paced shooter. You can probably get by with 30 FPS in those RPGs and third-person action games you play without feeling limited.

    Also, there are quite a few games you didn't list that will bring a G73JH to its knees at higher settings, such as Metro 2033, BF3, Crysis 1, and Crysis 2 with DX11 and hi-res textures. There's no way this ASUS, or any other laptop out there for that matter, will max out said games and maintain 60 FPS at all times.
     
  24. Srikar

    Srikar Notebook Evangelist

    Reputations:
    55
    Messages:
    398
    Likes Received:
    2
    Trophy Points:
    31
    Yeah, I didn't mention graphical stress-test games, two of which aren't really recent at all. Might as well stick with a 720p resolution with that line of thinking.