Anyone know how to run the built-in display at its full overclocked refresh rate at non-FHD resolutions?
For example, in Counter-Strike: Global Offensive, if I create a 100Hz custom resolution and run the game at 1920x1080, it runs perfectly fine at 100Hz. But if I drop the resolution to something like 1280x960 (4:3 stretched) or 1024x768 (again 4:3 stretched), the display only runs at 75Hz.
I even tried creating custom resolutions for those resolutions at 100Hz, but even then the display clocks down unless I use FHD.
Anyone know what's up? I wouldn't really care if I was at home, since I have a separate 144Hz monitor there.
-
Since the lower resolution is not running pixel-for-pixel, the screen's controller now has to process the incoming signal to properly scale which color goes to which pixel.
Because of that heavier workload, the refresh rate is lowered to accommodate the time it takes to process the additional information.
That's my best guess.
Probably not a workaround for this one.
-Sean
PS - I think "interpolation" is the proper term to use for the scaling that the controller chip does.
Last edited: Mar 16, 2016
-
The basic math for a pixel clock multiplies horizontal x vertical x frames per second (1920 x 1080 x 60fps). This lets the sending device (graphics card) and the receiving device (monitor) agree on how many pixels (how much bandwidth) both devices need to work together.
It can get a little complicated, as additional pixels are added to compensate for older technologies (such as the vertical and horizontal blanking a CRT needed) and/or timing.
This pixel clock is an absolute value based on the frame size and frequency/refresh rate. There is a calculation built into the devices, so you never need to set the pixel clock by hand.
Devices today are smart enough that you adjust the frequency and the pixel clock is calculated automatically.
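The math above can be sketched in a few lines. This is a rough illustration only: the 20% blanking overhead is an assumed round figure, not a real timing standard (actual blanking comes from standards like VESA CVT or CVT-RB), and the function name is made up for this example.

```python
# Back-of-the-envelope pixel clock: active pixels x refresh rate, plus a
# blanking overhead. The 20% overhead is an assumption for illustration;
# real timings come from standards like CVT or CVT reduced blanking.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.20):
    active = width * height * refresh_hz           # pixels pushed per second
    return active * (1 + blanking_overhead) / 1e6  # convert Hz to MHz

print(round(pixel_clock_mhz(1920, 1080, 60), 1))   # ~149.3 MHz for 1080p60
print(round(pixel_clock_mhz(1920, 1080, 100), 1))  # ~248.8 MHz for 1080p100
```

You can see why the refresh rate matters so much: going from 60Hz to 100Hz at the same resolution raises the required bandwidth proportionally.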
Frequency or framerate (for digital displays) decides how many times per second the frame is actually drawn. 60Hz is usually the minimum, and above 120Hz most people can't really tell a difference. Maybe a younger person with sharper eyes can.
It also depends on how the content is created. A video recorded at 30fps gains nothing playing back at 60Hz. Likewise, a game programmed to redraw the screen at a maximum of 60fps gains no performance on a monitor set to 120Hz; it's just doing twice as much work drawing the same image twice.
Interpolation happens in digital monitors (and in cameras/photography) when the graphics card sends a frame that is smaller (or larger) than the native resolution of the screen, and the screen controller has to figure out how to make the incoming image fit the screen.
I'm not 100% sure about this, but I believe some hardware lets you choose whether the screen itself or the graphics card does the interpolation; I'm pretty sure it defaults to the monitor doing it.
To have the graphics card (via software) do the interpolation, you would tell it to send a constant 1080p signal regardless of the frame/image size that comes into, or is generated by, the card itself.
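The two scaling behaviours in question (the "4:3 stretched" mode from the first post versus aspect-preserving scaling) can be sketched like this. The function names are illustrative, not from any real API:

```python
# Sketch of the two scaling modes discussed above. Names are made up.
def scale_stretched(src_w, src_h, dst_w, dst_h):
    # "Stretched": the source is interpolated to fill the whole panel,
    # ignoring aspect ratio (what CS players call 4:3 stretched).
    return dst_w, dst_h

def scale_aspect(src_w, src_h, dst_w, dst_h):
    # Aspect-preserving: fit inside the panel, leaving black bars.
    ratio = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * ratio), round(src_h * ratio)

print(scale_stretched(1280, 960, 1920, 1080))  # (1920, 1080)
print(scale_aspect(1280, 960, 1920, 1080))     # (1440, 1080), side bars
```

Either way, every output pixel's color has to be computed from neighbouring source pixels, which is the extra work the scaler chip is doing.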
So to bring it back to your original question - Even though your graphics card can draw scenes at scalable rates, typically the signaling to the monitor is constant.
So if your video card is drawing frames at 120fps, you are not gaining an advantage with a monitor that is set at 75hz. The graphics card decides which frames to drop and sends the agreed upon frame rate (with a hard set pixel clock/bandwidth predetermined by the handshake between the monitor and graphics card).
In your case you may be able to set the graphics output from the card to 1920x1080 @100hz and tell the graphics card to draw a lower resolution to increase the framerate.
But I don't understand what you are trying to accomplish with the lower resolution.
Sorry for rambling on. Most of this is, again... an edumacated guess, and I'm sure I got some stuff wrong. If nothing else I hope it gives you a better understanding.
-Sean
WAAAOW lol.
Thanks for the explanation. I'll give upsampling a shot; it should be possible, given that we can already use DSR to downsample. I'm out right now, I'll give this a shot later today and get back to you.
For most people this won't be an issue, but I have been playing CS for almost 8 years now at lower stretched resolutions, so it irks me beyond belief, and 60-75Hz after being used to 144Hz is very noticeable.
Meaker@Sager Company Representative
You could create custom resolutions at the lower settings with the higher refresh rate.
-
Meaker@Sager Company Representative
At that point you might have to remove the lower refresh rate options in the firmware of the panel.
-
I have a feeling that the lower resolutions are not hard-coded into the firmware with the overclocked refresh rate. At least that's what desktop monitors (and/or their drivers) do: they keep a list of the max/min supported refresh rates for all supported resolutions.
Meaker@Sager Company Representative
They do tend to have a smaller list with internal panels.
How to run laptop display @ 100Hz at non-FHD resolutions?
Discussion in 'Sager and Clevo' started by bloodhawk, Mar 15, 2016.