Also note that with the 1080p displays it's a bit like the silicon lottery: some panels will overclock well and some will not.
-
Meaker@Sager Company Representative
-
Spartan@HIDevolution Company Representative
Personally, I don't notice a difference above 75Hz, so I don't risk it. Very happy with 75Hz; it's so smooth. -
Try a 144Hz display. You will not want to go back to 75.
-
Spartan@HIDevolution Company Representative
I had a PG2788Q, bro, and it was smooth as heck, but the difference wasn't as big as the jump you experience when going from 60Hz to 75Hz. -
Hell no! The jump from 60 to 120/144 is more noticeable.
But it depends on the eye sensitivity lottery as well, haha. -
The key is to sync the refresh rate to the game's average FPS, matching them by limiting FPS to that average.
The 980M SLI usually gives me FPS above 100, so a 100Hz refresh is a natural match.
-
In the latest AAA titles, yes.
But in competitive titles like CS:GO / LoL / Dota 2, the higher refresh rate and high frame rates (150 FPS+) give you a definite edge. -
I'm not saying faster than 100Hz isn't good; I'm saying that anything above 75Hz, up to whatever the panel can OC to (100Hz here), is useful for syncing up with game FPS.
There is no need to stop at 75Hz. My last few laptops could all OC from 75Hz to 100Hz, and running at max stable Hz was stable the whole time. I also verified with the manufacturers that it was safe to do so.
If the panel could do 200Hz stably and safely, I would want to OC to that too.
-
Oh, my bad.
My FHD can do 105Hz, but anything over 90Hz and you start seeing color banding way more prominently; that's another drawback of a 6-bit panel.
I just wish there was a way to program 100Hz into the panel's firmware for resolutions lower than 1080p.
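For anyone curious what a 100Hz custom mode actually asks of the panel, here is a rough Python sketch of the pixel-clock math (the ~12% blanking overhead is an assumption loosely in the spirit of reduced-blanking timings; real CVT-RB modelines will differ):
```python
# Rough pixel-clock estimate for a custom display mode.
# Assumption: a flat ~12% blanking overhead, loosely in the spirit of
# reduced-blanking (CVT-RB) timings. Real modelines differ, so treat
# these numbers as ballpark figures only.

BLANKING_OVERHEAD = 1.12  # assumed overhead, not an exact CVT-RB calculation

def approx_pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float) -> float:
    """Active pixels per frame * refresh rate * blanking overhead, in MHz."""
    return h_active * v_active * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 100), (1600, 900, 100)]:
    print(f"{w}x{h}@{hz}Hz needs roughly {approx_pixel_clock_mhz(w, h, hz):.0f} MHz")
```
The point being: a sub-1080p mode at 100Hz needs a noticeably lower pixel clock than 1920x1080@100Hz, so the timing would in principle be easier; whether the panel's firmware/EDID exposes such a mode is another matter.
-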
Spartan@HIDevolution Company Representative
Doesn't overclocking the display damage it in the long run? -
Not really. I had my Dell U2410 OC'd from 60Hz to 75Hz and it was running fine even after 3 years of constant use.
Some displays take it better than others.
You will know when it's about to go bye-bye on you. -
Spartan@HIDevolution Company Representative
I once read that one should only choose 60, 75, or 100Hz, but I forgot the reason... do you know anything about this?
It seemed that if you choose any other value, like 80Hz, there would be tearing or something...
I will try +100Hz on my screen to see if it can take it. -
Spartan@HIDevolution Company Representative
@bloodhawk
Never mind - the NVCPL says that to enable custom resolutions, I need to disable DSR. -
Well, it must be for syncing reasons as far as I can tell. For example, videos run at roughly 24/25/30 fps (23.976/29.97x), so 75 and 90 will be close multiples. I'm guessing it's something along those lines. I never noticed anything on my 144Hz, though.
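As a quick sanity check of the "multiples" idea, here's a tiny Python sketch (the refresh-rate list and the 2% tolerance are just illustrative assumptions) that flags which refresh rates divide evenly by common video frame rates:
```python
# Flag refresh rates that are near-integer multiples of common video
# frame rates, i.e. where 24/25/30 fps content repeats each frame a
# whole number of times and avoids judder. Tolerance is arbitrary.

VIDEO_FPS = [23.976, 24.0, 25.0, 29.97, 30.0]
REFRESH_RATES = [60, 75, 80, 90, 100, 120, 144]

for hz in REFRESH_RATES:
    clean = [fps for fps in VIDEO_FPS
             if abs(hz / fps - round(hz / fps)) < 0.02]
    label = ", ".join(f"{fps:g}" for fps in clean) or "none"
    print(f"{hz:>3} Hz -> clean multiple of: {label}")
```
80Hz comes out as a clean multiple of nothing common, which fits the "60/75/100" rule of thumb mentioned above (each of those lines up with at least one common video rate).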
-
They are doing multiples of 30-33Hz: 30Hz, 60Hz, 99Hz ≈ 100Hz. 80Hz is an odd frequency if you are vsyncing at 60Hz.
The idea is to turn off vsync for the most part - some games will do a good job of adaptive vsync at your current monitor refresh - and use an FPS limiter like RivaTuner or Nvidia Inspector to set the runtime FPS to match the screen refresh rate.
So if you have a game like Batman that likes to run at around 80-85 FPS, you can set the refresh to 80Hz or 85Hz - try both - and limit the FPS to +1 of the refresh - so 81 FPS or 86 FPS - and that gets rid of tearing.
Very handy when Nvidia's latest driver messes up G-Sync again.
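To make that recipe concrete, here is a minimal sketch (the available refresh rates and the sample FPS numbers are made up for illustration; the actual cap is set in a tool like RivaTuner Statistics Server or Nvidia Inspector):
```python
# Sketch of the "match the refresh rate to the game's typical FPS,
# then cap FPS at refresh + 1" recipe. Purely the arithmetic; the cap
# itself is configured in an external FPS limiter.

AVAILABLE_REFRESH_HZ = [60, 75, 80, 85, 90, 100]  # whatever your panel runs stably

def suggest_settings(avg_game_fps: float) -> tuple[int, int]:
    """Pick the highest refresh rate at or below the average FPS and
    cap the frame rate one frame above it."""
    candidates = [hz for hz in AVAILABLE_REFRESH_HZ if hz <= avg_game_fps]
    refresh = max(candidates) if candidates else min(AVAILABLE_REFRESH_HZ)
    return refresh, refresh + 1

for fps in (62, 83, 110):
    hz, cap = suggest_settings(fps)
    print(f"avg {fps} FPS -> set panel to {hz} Hz, limit FPS to {cap} "
          f"(~{1000 / hz:.1f} ms per refresh)")
```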
-
Thanks for the replies.
Although now I have a dilemma.
4K with G-Sync is a great option since I get both G-Sync for gaming and 4K for general use. I'm not planning to game on 4K.
However with a Full HD G-Sync display I could overclock to 100Hz (assuming I get the right display). Hence it's a choice between 4K and 100Hz.
What would you guys choose between the two options? I'm not sure how I feel downgrading from the current 3K screen to Full HD... -
If you are on 3K right now, you will hate it. The FHD doesn't even come close to the 4K when it comes to colors and pwettyness.
If I had to choose, I would go with 4K. But I play a lot of CS and DOTA, so... -
Depends on your preference while gaming: higher Hz helps in fast-paced competitive scenarios (shooters like CS:GO). For everything else, the color gamut and extra sharpness of the AUO 4K will shine; just keep in mind the extra load on the GPU, power consumption, and Windows scaling issues at 4K.
-
Well I mostly play AAA games such as Witcher 3. Perhaps 60Hz is fine for that...
-
Support.1@XOTIC PC Company Representative
Yeah, 60Hz is still fine for most people, and you could always lower the resolution and still get the benefit of the panel's better colors.
-
An odd question: if you run the 4K LCD at 1920x1080, can it be overclocked to 75+ Hz?
What that means is, if someone wants to game (wanting to OC the LCD), they could change resolution for a short period of time and still get a higher refresh rate, but at the lower resolution.
Just a thought. -
From what I've read, overclocking depends purely on the panel, so changing resolution won't matter. Technically, picture resolution only changes what individual pixels display, and the GPU load. The load on the panel itself is the same.
-
Perhaps I didn't explain this fully.
As of right now, the 4K LCD panels will overclock to something like 63-65Hz. This is due to the number of pixels that need to be pushed at 4K (3840 x 2160). However, if the resolution were set to FHD (1920x1080) on a 4K panel, would the panel be able to OC higher (75+ Hz)?
My point is, those who would like this panel could run 4K mode with scaling. Then, if they wanted to OC higher for gaming or whatever, they could temporarily switch resolution with scaling at 100% and OC the panel during that time. -
As I said earlier, it won't make a difference. Display overclocking capacity depends on the panel used and, in some cases (like the P750DM), on BIOS restrictions. Display resolution doesn't matter, because in laptops the GPU handles the upscaling of the 1080p image to 4K before pushing it to the display through eDP, unlike external monitors, which have their own scalers. Even at 1080p, 3840x2160 pixels still have to be pushed; the difference is that clusters of 4 pixels give the same color output due to the reduced resolution of the image from the GPU. Hence, while GPU load is reduced, the load on the display is unchanged.
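To put rough numbers on that, here is a small Python sketch (blanking intervals ignored, so these are approximate figures) of the pixel throughput the panel itself has to handle, which is the same whether the GPU rendered at 4K or upscaled from 1080p:
```python
# On these laptops the GPU upscales before the eDP link, so the panel
# always receives the full 3840x2160 grid. The panel-side throughput
# therefore depends only on the refresh rate, not on the OS resolution.
# Blanking intervals are ignored; numbers are approximate.

PANEL_W, PANEL_H = 3840, 2160

def panel_megapixels_per_second(refresh_hz: float) -> float:
    """Pixels the panel must refresh each second, in millions."""
    return PANEL_W * PANEL_H * refresh_hz / 1e6

for hz in (60, 65, 75, 100):
    print(f"{hz:>3} Hz -> ~{panel_megapixels_per_second(hz):.0f} Mpx/s at the panel")
```
Dropping the OS resolution to 1080p lightens the GPU's render load, but the panel still has to move roughly 500 Mpx/s at 60Hz either way, which is why the overclock ceiling doesn't budge.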
-
I always thought the signal was the signal (at whatever resolution/frequency) and the panel itself decides on the pixel placement/layout based on the physical layout of pixels.
So if I understand correctly, you're saying the FHD signal is upscaled before it is sent to the panel, and the panel always receives a QFHD signal regardless of the GPU/OS resolution? -
Only for laptop displays (not 100% sure on this).
-
Nope, it can't do more than 65Hz, and that only if you are really lucky with the panel. It's mainly because of the high pixel count and the high pixel clock.
-
What is the main reason the P870DM doesn't support SLI with the desktop 980?
Also, the laptop seems highly customizable; will there be support for new GPUs from Nvidia, such as Pascal? -
Support.1@XOTIC PC Company Representative
Kind of a mix of the size of the graphics cards and the layout of the system, plus power consumption and cooling issues, I believe.
No official news about that as of yet. There hopefully will be, but we probably won't find out more until later down the road, when NDAs are lifted. -
@pat@XOTICPC @Prema @Mr._Fox @Meaker@Sager
Are there VGA power connectors for each MXM slot in the P870DM?
What is the max supported power of each MXM slot (including connectors) when running a single card, and when running SLI (or DX12 multi-adapter)?
-
Spartan@HIDevolution Company Representative
@Prema
Bro, it seems like G-SYNC is not actually functioning when using the Prema VBIOS V2 on my 980 GTX.
At first, I could've sworn that Need for Speed was smoother when I played it (I had the stock VBIOS initially), but then I thought it may be all in my head, since I do see the G-SYNC option in the NVCPL.
Today, I confirmed it: G-SYNC and V-SYNC were both on, then I played a video using Splash Video Player (which can bump videos up to 60 FPS for smooth playback), and when there was motion in the scene I could see slight screen tearing, as if G-SYNC/V-SYNC were off.
I then flashed the stock VBIOS using the same driver (354.35) and the tearing went away.
I don't know how else to confirm this, but I think it's worth a look. -
@Phoenix
That's actually not even possible, as the vBIOS does nothing with regard to G-Sync.
It's all done by the driver... -
So say I bought a P870DM to get away from this BGA crap from Alienware, how would I go about getting the BIOS? PM you? TI Forums? I'm of course referring to one that is not a partner variant like a Sager, etc.
-
Weird. I use SVP with MPC-HC and it works super well. The tearing is probably because the refresh rate is higher than the FPS, or a similar type of desync.
When playing 60 FPS content, I made sure my refresh rate was 60Hz. Even on my 4K display it works flawlessly.
I play a lot of DOTA 2, and with G-Sync it locked to 75Hz on the FHD panel. -
Spartan@HIDevolution Company Representative
Thanks for the technical explanation, guess it was all in my head after all. Rep added t0 j00. -
Spartan@HIDevolution Company Representative
Please update your signature with your new rig, and hopefully remove Alienware.
-
Anybody?
-
Meaker@Sager Company Representative
There are two power ports; the max officially supported configuration is 2x 980M, due to the cooling and power draw.
-
Thanks for the reply. I was actually looking for the theoretical power limits of the MXM slots combined with the connectors, for single and dual card setups, ignoring the cooling capabilities. I think @Prema mentioned somewhere that the motherboard supports 375W.
-
That is true, but the problem is that the extra power phases do not have adequate cooling.
-
Is there any possibility of cooling down the extra power phases with a mod of some sort?
-
Meaker@Sager Company Representative
You could try padding them to connect them to the heat pipes that travel over them, but it won't get you much extra headroom in temperature/power.
-
Hi Phoenix,
First of all, thanks for the guide! Very nice job.
I have one problem: I tried the 4.4 GHz overclock on all cores, but I have an overheating problem. Do you have any suggestions to fix it?
I have a Eurocom X9E with the Prema BIOS and Liquid Ultra on the CPU. -
Spartan@HIDevolution Company Representative
What voltage offset did you set for Core / Cache? -
I followed your instructions; I use the Prema BIOS, not XTU.
-
How high is your max temp in wPrime 1024M and Cinebench 11.5? Some have problems with Liquid Ultra on the CPU due to a warped, not flat, heatsink. The solution is to swap paste or change to a new heatsink.
-
Spartan@HIDevolution Company Representative
I advise you to repaste with IC Diamond. Liquid Ultra is great, but it doesn't make good contact with the heatsink on this laptop due to how the heatsink was designed. When I switched from Liquid Ultra to IC Diamond, my temps went down by a whopping 10°C under load. -
Eurocom repasted my CPU with Liquid Ultra when I sent it to them under warranty; they asked me about upgrading the thermal compound in my X9 because they said it is better for overclocking. Before that I was on IC Diamond. Anyway, I will repaste and run the tests again.
Guys, I need to know something about G-Sync, because they put the G-Sync screen in my Sky X9.
Is there some option to confirm that the GPU is the G-Sync one?
(When I go through the Nvidia settings and activate G-Sync, it automatically turns V-Sync on as well... I think that's not normal... and when I start a game I get an "Nvidia G-Sync on" indicator, so I think they swapped the GPU as well.)
Anyway, I have a few questions:
Is there some program to see if G-Sync is enabled? And do I have to switch off V-Sync in all games?
Also, I now have the Prema BIOS and it works very well. I just want to know why, when I overclock the RAM from 2400MHz to, for example, 2666MHz, the system shuts down and needs a CMOS reset again. I have 8x4 2400MHz Crucial.
One more thing: I want to buy the G.Skill 3000MHz kit when it is released. With the Prema BIOS I won't have any problem with that RAM, right?
I ask because I had a very bad experience with my previous HyperX Impact 2400MHz... always BSODs.
Thanks as always for the help, guys. -
Spartan@HIDevolution Company Representative
If you check in the nVIDIA Control Panel and you find there is an option to enable G-SYNC, then there you have it; otherwise it won't even appear.
Regarding the RAM overclock: what makes you think your RAM is capable of running at 2666MHz? I advise you not to overclock the RAM.
For example, when I had a Samsung 64GB 2133MHz kit, I could overclock it to 2400MHz just fine; it passed all benchmarks and the AIDA64 stability test. But if I played a game, after a while the FPS would suddenly drop to 5 FPS, and only a reboot would fix it. Forget RAM overclocking; it is a very sensitive area and could lead to a lot of background errors which you may not notice, and even data loss.
I currently have the G.Skill 2800MHz 64GB RAM kit, and the Prema BIOS supports RAM speeds up to 3200MHz. You just need to set the RAM XMP profile to XMP Profile 1 in the BIOS.
PS: Don't listen to anything Eurocom tells you. They don't know what they're doing.