The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    How to run laptop display @ 100Hz at non-FHD resolutions?

    Discussion in 'Sager and Clevo' started by bloodhawk, Mar 15, 2016.

  1. bloodhawk

    bloodhawk Derailer of threads.

    Anyone know how to run the built-in display at the full overclocked refresh rate at non-FHD resolutions?

    For example, in Counter-Strike: Global Offensive, if I create a 100Hz custom resolution and run the game at 1920x1080, it runs perfectly fine at 100Hz, but if I drop the resolution to something like 1280x960 (4:3 stretched) or 1024x768 (again 4:3 stretched), the display only runs at 75Hz.

    I even tried creating custom 100Hz resolutions for the above-mentioned modes, but even then the display clocks down unless I use FHD.

    Anyone know what's up? I wouldn't really care if I were at home, since I have a separate 144Hz monitor there.

     
  2. Wilmingtech

    Wilmingtech Notebook Enthusiast

    I'll take a non-technical, edumacated guess here -

    Since the lower resolution is not running pixel-for-pixel, the screen's controller now has to process the incoming information to properly scale which color goes to which pixel.

    Because of the heavier workload on the controller, the refresh rate is lowered to accommodate the time it takes to process the additional information.

    That's my best guess.

    Probably no workaround for this one.

    -Sean


    PS - I think "interpolation" is the proper term to use for the scaling that the controller chip does.
     
    Last edited: Mar 16, 2016
  3. bloodhawk

    bloodhawk Derailer of threads.

    So I'm guessing something like changing the pixel clock might help?
     
  4. Wilmingtech

    Wilmingtech Notebook Enthusiast

    Two different things. Pixel clock refers to how many pixels need to be transmitted to complete one frame, times the number of frames in a second (the refresh rate).

    The basic math for pixel clock would multiply horizontal x vertical x frames per second (1920 x 1080 x 60fps) to get the pixel clock. This allows the sending device (graphics card) and the receiving device (monitor) to agree on how many pixels (how much bandwidth) they need to work together.

    It can get a little complicated, as additional pixels are added to compensate for older technologies (such as vertical and horizontal blanking, a holdover from CRTs) and/or timing.

    The pixel clock is an absolute value based on the frame size and frequency/refresh rate. There is a calculation built into the devices, so you never need to change the pixel clock yourself.

    Devices today are smart enough that you just adjust the frequency and the pixel clock is calculated automatically.
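
    If it helps to see that math, here is a rough Python sketch of the calculation. The blanking values are illustrative assumptions on my part (the real totals come from the mode's timing tables), so treat the results as ballpark figures:

    Code:
    # Rough pixel-clock estimate for a video mode. The blanking overhead is an
    # assumed, reduced-blanking-style figure, purely for illustration.
    def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
        h_total = h_active + h_blank   # active width plus horizontal blanking
        v_total = v_active + v_blank   # active height plus vertical blanking
        return h_total * v_total * refresh_hz / 1e6   # total pixels per second, in MHz

    print(pixel_clock_mhz(1920, 1080, 100))  # ~230.9 MHz for 1920x1080 @ 100Hz
    print(pixel_clock_mhz(1280, 960, 100))   # ~142.6 MHz for 1280x960 @ 100Hz

    Notice the stretched 4:3 mode actually needs less bandwidth than FHD at the same refresh rate, so raw bandwidth by itself shouldn't be what forces the drop to 75Hz.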

    Frequency, or frame rate (for digital displays), determines how many times in one second the actual frame is drawn. 60Hz is usually the minimum, and when you get over 120Hz most people can't really tell a difference. Maybe a younger, sharper-eyed person can see a difference.

    It also depends on how the 'content' is created. A video created at 30fps played back at 60Hz gives no real increase in performance. Another example: a video game programmed to redraw the screen at a maximum of 60fps gains nothing when the monitor is set to 120Hz. The display is just doing twice as much work drawing the same image twice.

    Interpolation happens in digital monitors (and in cameras/photography) when the graphics card sends a frame that is smaller (or larger) than the native resolution of the screen, and the screen's controller has to determine how to make the incoming image fit the screen.

    I'm not 100% sure about this, but I believe some hardware allows you to choose whether the screen itself or the graphics card does the interpolation. I'm pretty sure it defaults to the monitor doing the interpolation.

    To have the graphics card (via software) do the interpolation, you would have to tell it to send a constant 1080p signal regardless of the frame/image size that comes into, or is generated by, the card itself.
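
    To make the interpolation part concrete, here is a toy Python sketch of what a scaler has to do when a 1280x960 frame is stretched onto a 1920x1080 panel: pick (or blend) a source pixel for every output pixel. Real scaler chips use bilinear or fancier filtering in hardware; nearest-neighbour is only meant to illustrate the extra per-pixel work, not how any particular panel actually does it.

    Code:
    def upscale_nearest(frame, src_w, src_h, dst_w, dst_h):
        """frame is a flat, row-major list of pixels of length src_w * src_h."""
        out = []
        for y in range(dst_h):
            sy = y * src_h // dst_h          # map each output row back to a source row
            for x in range(dst_w):
                sx = x * src_w // dst_w      # map each output column back to a source column
                out.append(frame[sy * src_w + sx])
        return out

    # Tiny demo: stretch a 4x3 "frame" to 8x6
    src = list(range(4 * 3))
    dst = upscale_nearest(src, 4, 3, 8, 6)
    print(len(dst))  # 48 output pixels, each copied from one of the 12 source pixels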

    So, to bring it back to your original question: even though your graphics card can draw scenes at variable rates, the signaling to the monitor is typically constant.

    So if your video card is drawing frames at 120fps, you are not gaining an advantage with a monitor that is set to 75Hz. The graphics card decides which frames to drop and sends the agreed-upon frame rate (with a hard-set pixel clock/bandwidth predetermined by the handshake between the monitor and the graphics card).

    In your case you may be able to set the graphics output from the card to 1920x1080 @ 100Hz and tell the graphics card to draw at a lower resolution to increase the frame rate.

    But I don't understand what you are trying to accomplish with the lower resolution.

    Sorry for rambling on. Most of this is, again... an edumacated guess, and I'm sure I got some stuff wrong. If nothing else, I hope it gives you a better understanding.

    -Sean
     
  5. bloodhawk

    bloodhawk Derailer of threads.

    WAAAOW lol.

    Thanks for the explanation. I'll give upsampling a shot; it should be possible, given that we are able to use DSR to downsample. I'm out right now, I'll give this a shot later on today and get back to you.
    For most people this won't be an issue, but I have been playing CS for almost 8 years now at lower stretched resolutions, so it irks me beyond belief, and 60-75Hz after being used to 144Hz is very noticeable.
     
  6. Meaker@Sager

    Meaker@Sager Company Representative

    You could create custom resolutions at the lower settings with the higher refresh rate.
     
  7. bloodhawk

    bloodhawk Derailer of threads.

    Did that; even then it defaults to 75Hz in games. I even tried setting the Windows desktop to the lower res @ 100Hz and then running the game, and even after that it defaults to 75Hz in-game.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    At that point you might have to remove the lower refresh rate options in the firmware of the panel.
     
  9. bloodhawk

    bloodhawk Derailer of threads.

    Gotcha, time to install Linux now...
    I have a feeling that the lower resolutions are not hard-coded into the firmware with the overclocked refresh rate. At least that's what desktop monitors (and/or their drivers) do: keep a list of the max/min supported refresh rates for all supported resolutions.
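
    If I do end up poking at this from Linux, the modes the panel itself advertises should be readable straight out of its EDID. Here's a rough Python sketch for decoding the detailed timing descriptors; the connector path is an assumption (it differs per machine), and whether the overclocked 100Hz mode even shows up in there is exactly what I'd want to check:

    Code:
    # Decode the detailed timing descriptors from a panel EDID dump.
    EDID_PATH = "/sys/class/drm/card0-eDP-1/edid"   # assumed connector name, adjust to yours

    def detailed_timings(edid):
        """Yield (width, height, refresh_hz) for each detailed timing descriptor."""
        for off in range(54, 126, 18):              # four 18-byte descriptors in the base block
            d = edid[off:off + 18]
            pclk = (d[0] | d[1] << 8) * 10_000      # pixel clock is stored in 10 kHz units
            if pclk == 0:
                continue                            # descriptor holds a string, not a timing
            h_active = d[2] | (d[4] & 0xF0) << 4
            h_blank  = d[3] | (d[4] & 0x0F) << 8
            v_active = d[5] | (d[7] & 0xF0) << 4
            v_blank  = d[6] | (d[7] & 0x0F) << 8
            yield h_active, v_active, pclk / ((h_active + h_blank) * (v_active + v_blank))

    with open(EDID_PATH, "rb") as f:
        edid = f.read()

    for w, h, hz in detailed_timings(edid):
        print(f"{w}x{h} @ {hz:.1f} Hz")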
     
  10. Meaker@Sager

    Meaker@Sager Company Representative

    They do tend to have a smaller list with internal panels.
     