I have a little problem ^^
When I overclock my screen to 75 Hz and then close CS:GO, for example, I briefly see a purple screen
-
Questions about the heatsink:
I've had a P750DM-G for about a month from rjtech with the IC Diamond option, and I think it was running hot. I took off the heatsink to reapply the thermal paste, and it was gooped on there, so much that it was overflowing the sides. It also smelled bad, like the back room of a grocery store.
When I reapplied, I put the heatsink back on and it looked like it wasn't making good contact. I screwed it down by the # order on the heatsink, and when I took it back off there was barely any thermal paste on the heatsink, like it wasn't even touching the CPU. The GPU stays around 60 in 3DMark and games, but the CPU gets up into the 90s while playing Fallout 4, and hit 98 in the XTU benchmark. Did I get a bad heatsink? -
My stock temps on a stress test were in the high 80s, I believe. I would have liked to test for you, but my XTU won't open and I can't restart at the moment. -
I did the undervolt at -100 and ran the benchmark. Temps went from 98 to 83. Then I ran 3DMark Fire Strike and it crashed. I restarted and tried -50, but then it immediately crashed. I'm not overclocking. I don't know why I can't go -50 now without crashing, and I haven't had time to test it again yet.
I think I am going to try reapplying again using more paste and see what happens. If there is a gap, would a thermal pad work for a CPU, or should I look at getting a new heatsink? I've seen them for the P750ZM for $88, which I guess would be the same one?
EDIT: I take that back. I was using the drop-down box and wasn't paying attention: I was increasing by 50. I am now trying decreasing in steps of 5 to see how low I can go. So far -50 is good. -
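The step-down search described above (keep lowering the offset until it stops being stable) can be sketched as follows. This is only a sketch: the `is_stable` check is a stand-in, since in practice each step means applying the offset in XTU and running a benchmark or stress test, and the -50 crash point is just the example from the post.

```python
# Sketch of a step-down undervolt search: start at 0 mV and decrease in
# 5 mV steps until the next step would be unstable. is_stable() is a
# stand-in for "apply offset in XTU and pass a stress test".

def find_lowest_stable_offset(is_stable, step_mv=5, limit_mv=-150):
    """Walk the offset down in steps; return the lowest stable value."""
    offset = 0
    while offset - step_mv >= limit_mv and is_stable(offset - step_mv):
        offset -= step_mv
    return offset

# Example: pretend this chip is only stable down to -50 mV.
lowest = find_lowest_stable_offset(lambda mv: mv >= -50)
print(lowest)  # -50
```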
XTU benchmark? I hit 95 like a champ! Even with a good undervolt, 75 is a rare event.
If anything is warped, it's the CPU portion of the heatsink, because 60 on the GPU under load seems normal to me.
Just make sure the CPU is well fitted, well screwed down and all. -
I crash at -70. I don't think the heatsink would have anything to do with it, though, since undervolting makes it cooler. Did I get a bad CPU? Forgot to mention it's the 6700K.
-
It's difficult to compare voltages based just on the offsets, since every CPU has its own stock voltage. It will make more sense if you guys post your actual voltage in addition to the offset from stock.
-
Where do you get the stock voltage? If it's in XTU, I am missing it. My max is 1.276V in HWMonitor, but that is with the -50 applied. CPU-Z fluctuates in the Core VID box.
-
-
You can use HWiNFO64, for example; it will record minimum, maximum and average voltages over any specified amount of time.
Also, click on the little blue wrench symbol in XTU; it lets you expand the visible parameters to include things such as voltage.
What you'll want to do is change the Windows power profile to High Performance and then check the core voltages at load, e.g. during the XTU stress test. -
Meaker@Sager Company Representative
You can also expand the graph, making it easier to read.
-
I think the P75xDM is going to be my next buy. I need something that's relatively powerful and quiet and will fit in a backpack pretty much every day. The 3.5 kilos of weight sounds like a bit much, but I'm pretty sure my back can handle it.
However, the battery life is pretty abysmal. I'm not expecting 10h or anything, but I was wondering if I can squeeze some extra life out of it by proper configuration or component choice. For instance, would going for the i7-6700 (non-K) and undervolting it improve things at all? How about FHD instead of the UHD everyone seems to be reviewing?
Are there any external battery packs on the market capable of powering this laptop for any substantial extra time?
Lastly, do you think it's worth it to wait for the next gen Intel CPUs (Cannonlake?) and nVidia chips (Pascal)? Would those be able to bring any substantial efficiency improvements? -
You can potentially squeeze out an extra half hour of battery life with proper configuration, so don't expect any wonders.
Don't forget that you can tweak the 6700K to behave like any other Skylake CPU: downclock, undervolt, restrict TDP levels, all through XTU. On the other hand, you'll never reach 6700K performance levels with a non-K 6700, so yeah, the choice should be obvious.
FHD does give you better battery life than UHD, though I don't know by how much. Again, I wouldn't expect wonders here either.
Well, consider that you get anywhere between 1 and 3 hours of battery life from the 82Wh battery, so with each 27Wh battery pack you'd get an additional 20-60 minutes. Not sure how much sense that makes, considering how large those battery packs would have to get to give you a real-life advantage. The best thing to do would probably be to just get a second P870DM battery to double your potential time away from a socket.
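As a sanity check on those numbers, the arithmetic works out. A quick sketch (the 82Wh/27Wh capacities and the 1-3 hour runtimes are the figures quoted above; conversion losses are ignored, so real-world gains would be a bit lower):

```python
# Rough battery-life arithmetic: observed runtime on the internal battery
# implies an average power draw, which tells you what an external pack adds.

INTERNAL_WH = 82.0  # internal battery capacity
PACK_WH = 27.0      # hypothetical external pack capacity

def runtime_hours(capacity_wh, draw_w):
    """Ideal runtime, ignoring conversion losses."""
    return capacity_wh / draw_w

# 1-3 h observed on 82 Wh implies an average draw of roughly 27-82 W.
draw_low = INTERNAL_WH / 3.0   # light use
draw_high = INTERNAL_WH / 1.0  # heavy use

extra_min_best = runtime_hours(PACK_WH, draw_low) * 60   # ~59 min
extra_min_worst = runtime_hours(PACK_WH, draw_high) * 60  # ~20 min
print(round(extra_min_worst), "-", round(extra_min_best), "min extra")
```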
Cannonlake: no idea, but to be realistic, expect the usual 5-7% bump in IPC, not more. As for Pascal, it should be quite a nice bump in performance, anywhere between 30 and 100% over Maxwell. As for efficiency, your guess is as good as anybody's, especially since "efficiency" is a VERY relative and biased term compared to "performance", if you ask me.
The best thing for you to do: check your day-to-day usage patterns, look up benchmarks for the hardware available today, and then decide for yourself how much benefit you would get out of 5-7% more CPU power and 30-100% more GPU power, and, of course, how immediate your need for new hardware is. You'll have to wait until Q2 2016 for Pascal, and Cannonlake won't be showing up before well into 2017. The next stop will be Kaby Lake, the Skylake refresh, expected sometime in 2016. The good thing about this: Kaby Lake will be compatible with current Skylake sockets and chipsets.
-
Efficiency is basically performance-per-watt. It's quite simple; there's no relativity or bias to it. Take a benchmark, measure power draw in a controlled environment, compare systems. You can do this yourself on an overclocking system if you want. Lots of tests have been done on overclocked systems to find the best performance-per-watt. The older 2600K had an interesting curve where it actually came back to parity when you ran at ~4.4GHz, i.e. the same benchmark completed using the same amount of power overall, but ran significantly quicker when overclocked, so literally no downside other than peak thermal output. Other clock speeds used slightly more power to complete the test.
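For what it's worth, that comparison is just a ratio. A minimal sketch (the scores and wattages below are made up for illustration, not measurements from any real run):

```python
# Performance-per-watt comparison: run the same benchmark at two settings,
# log average power draw, and compare score per watt.

def perf_per_watt(score, avg_power_w):
    """Benchmark points produced per watt of average draw."""
    return score / avg_power_w

stock = perf_per_watt(8000, 95)        # hypothetical stock run
overclocked = perf_per_watt(9200, 110)  # hypothetical overclocked run

# A higher ratio means more work done per joule at that setting.
print(f"stock: {stock:.1f} pts/W, OC: {overclocked:.1f} pts/W")
```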
So in reality, increases in efficiency are a far better indicator of how a certain architecture will perform. Everyone cries about the 6700HQ being "only" 10% faster than the old 4700MQ but completely ignore that it uses anywhere from 30% to 50% less power doing so.
Another great example is the Maxwell chips. If you think about how Maxwell was developed and released, it's quite obvious that making things more efficient is a superior method to simply throwing more power at them to make them faster. The first Maxwell chip came out at 60W and proved that the performance-per-watt change was huge. Then they released GM204, a full 85W lower in TDP than the outgoing 780Ti, and it still beat it. The next step was to throw the 780Ti's 250W budget at the GM200 chip, and you get the TitanX/980Ti, which utterly destroyed the older cards in benchmarks. They basically bought themselves 85W of power to put back into the more efficient engine.
You can literally derive the performance of most Maxwell chips from the original 750Ti that came out (keeping in mind TDP is not exactly the same as power draw, but pretty darn close).
-It pulled 4105 graphics points in Firestrike at 60W TDP. ( Guru3D bench)
-Multiply that up to a 250W TDP. 4105 * (250/60) = 17,104.166
-Therefore we'd expect a 250W card on the same architecture to pull ~17,104.166 Firestrike graphics points and....
-TitanX pulls 17396 ( Guru3D bench)
Keeping in mind almost 12 months passed between those two cards (so driver updates etc.), that's accurate to within 2% or so.
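That back-of-the-envelope estimate can be written out as follows (a sketch using the Guru3D figures quoted above; linear scaling of graphics score with TDP is the assumption being tested, not a law):

```python
# Linear TDP-scaling estimate: 750 Ti ~4105 Firestrike graphics points
# at 60 W TDP, Titan X measured at 17396 points at 250 W TDP (figures
# as quoted in the post above, from Guru3D benchmarks).

def scaled_score(base_score, base_tdp_w, target_tdp_w):
    """Naive estimate: assume graphics score scales linearly with TDP."""
    return base_score * (target_tdp_w / base_tdp_w)

estimate = scaled_score(4105, 60, 250)   # ~17104
measured = 17396                         # Titan X result
error = abs(measured - estimate) / measured
print(f"estimate {estimate:.0f} vs measured {measured}, off by {error:.1%}")
```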
I don't know what the Pascal release cycle will look like, but if they chuck out a 750Ti-like card which does the same thing at <50W, then we'll be in for a treat when they throw 250W at it. -
jaybee83, Stooj - thank you very much, this is actually very helpful.
Stooj - judging by your sig, you made the exact jump I'm looking to make. I currently have the W230ST, which is still fine(ish), but its obvious problems (super noisy fan, ****ty touchpad, so-so overall build) have become rather painful over the 2 years I've had it. I'm looking for something a bit more powerful graphics-wise, less noisy (I do audio work), but still portable (I move between home and office most days). How's your Batman been treating you? Why not the 980M, if you don't mind me asking?
And yeah, what I meant by efficiency, especially in the light of my battery concerns, is just what you wrote: per-watt performance. Assuming no wonders in battery life department happen along the way, it would be great if Pascal gave a substantial bump there.
I guess I'll wait and see what happens in Q2 2016. The P750DM-G (or rather its next iteration) seems like the best bet right now. Things might get shaken up a bit if the next Robin line gets the TB3 and TB3 eGPU solutions actually materialize - then it might come out on top for me because of extra portability and better battery life. -
I have spent more time on the bug where Turbo Boost gets disabled. It turns out that a full load on the CPU disables Turbo Boost for some reason.
I have used ThrottleStop to re-enable it, but I hope it gets fixed in a BIOS update.
I'm using the 6700 (non-K), so I assume 6700K users might not have this bug. -
Overall the machine is going well. Pretty solidly built, perhaps other than the panel (a bit bendy, some minor bleed). I'm hoping that if my desktop kicks the bucket then I'll be able to stick the 980Ti into an eGPU enclosure and the P750DM-G will become my primary machine. Alternatively, if cannonlake or a new Skylake revision come out with notable improvements to efficiency then the P750DM can get an upgrade and the outgoing 6700K can go into a new desktop (I'm big on recycling parts and getting the most out of hardware).
Most importantly it's obscenely fast with VMs and such (got myself a 950 Pro as well). I probably would've gone for a P650RG if not for the lack of TB3 port on it. It still boggles my mind that Clevo didn't make that change.
I'm a sysadmin/network engineer by trade, but I found my tablet/phone could do more and more of my on-site stuff easily enough, and the old W230SS became less useful (as powerful as it was, the 860M really didn't cut it anymore). Whenever I was gaming with it I was jacked into power anyway, and the 13" screen, while great on the go for work duties, was a trade-off I didn't have to make anymore. I'm planning on getting a Dragonbox Pyra (a handheld Linux computer roughly the size of a Nintendo DS) to do my daily/on-site stuff on. -
Performance-per-watt doesn't mean jack squat if you artificially restrict clocks and performance just to get higher "efficiency".
Sure, in a perfect world it would mean that we could reach higher performance using the same amount of power, but that's where your argument fails: what good is a 6700HQ with less power consumption but barely the SAME performance as its predecessor?! This is exactly what I don't like about current developments; we're just keeping the status quo or even regressing in performance just so that people can get their "thin & light" crap, with almost NO options left for us enthusiasts anymore.
So nowadays it's all about "efficiency" regardless of any performance gains. That's why, in the end, I always look at the actual performance of hardware instead of its efficiency.
Sent from my Nexus 5 using Tapatalk -
The 6700HQ and every other X700-series CPU have always been aimed at a particular performance target. For 90% of users that target is perfectly fine, and the benefits in power savings are real. For the hardcore users there are options. Using less power and generating less heat is hugely valuable, especially after years of 99-degree Haswell CPUs and generations of super-hot GPUs. Why do you think AMD have so utterly lost the mobile market? Their architecture is not efficient and thus doesn't scale down to mobile well enough to be even remotely competitive.
The Skylake CPUs are indeed faster than the Broadwell and Haswell chips, so you're just wrong there. -
By the way, the 6820HK is ONE SINGLE CPU model, which is
a) soldered
b) TDP restricted, and thus cannot hold its clocks properly even if you give it more legroom to stretch via XTU settings
c) can't overclock worth crap compared to previous-gen mobile MQ CPUs, including the non-Extreme ones
Those were wonderful days indeed: we had Alienware, MSI, Asus and Clevo laptops, all with socketed hardware, overclockable CPUs (even the non-Extreme models), exchangeable GPUs (immensely practical if one craps out on you, no need to throw away the entire machine) and easy maintenance (cleaning, upgrading, repasting, modding). What has become of those wonderful days? A BGA nightmare, where even RAM is starting to be soldered onto the mobos, where battery life is king above all else, especially performance, and where advancements in computing performance are even regressing in some cases.
Definitely not the place I'd like to live in, if you ask me...
Sure, it's not "ice cold", but hey, it's still doing 4.3GHz on all cores on an everyday basis, and I'm able to get up to 4.8GHz stable (that's a desktop CPU in a laptop). The 6700K can reach similar clocks in a similar system, sure, at lower temps, but the advancement there is so incremental it's not even worth calling it a new generation, if you ask me...
By the way, while I can appreciate a good discussion here, let's keep it professional and not resort to personal attacks. The only thing I've been attacking is current hardware developments, but I don't like comments such as:
"You just don't get it do you?"
"What planet are you from and what exactly do you expect here?"
"Get real."
"It really isn't that hard."
If you have good arguments, bring them on. Otherwise, there's no need to lash out here. -
So I went to reapply the thermal paste again, and some got on the video card and the sides of the CPU. I used a Q-tip and rubbing alcohol to get as much off as I could, but the laptop wouldn't boot after I put it back together. The power button and keyboard light came on, but the screen didn't do anything, and there was no hard drive light.
I took it back apart and took the CPU out to clean under it, and it looks like some pins got bent somehow. I don't have the eyesight to try to straighten them. Is there anywhere that sells motherboards? Or any other options besides trying a local repair shop that might not be able to fix it? -
Bottom line is, a Bugatti's engine was built for just that; an F1 engine was built for just that. (Yes, maybe they received influence from the slowest thing that never moved.) But none of these companies took a weed whacker's engine and said, let's scale it up and hope for the best. It was power first; when we're satisfied, then we make it not explode. Now with CPUs it's kind of the reverse. Taking a 6700HQ meant for light-to-mid power use, slapping some extra MHz on with an unlocked multiplier, calling it the 6820HK and hoping it overclocks well is insane. If it's for enthusiasts, build it from the ground up for enthusiasts. Not to even mention that when the car's engine (BGA-ware) blows, you need to replace the entire car?
-
Sent from my Nexus 5 using Tapatalk -
So I finally bought the P751DM. First run, totally stock:
http://www.3dmark.com/3dm11/10735453
CPU temp 70 and GPU temp 57; that is an insanely cool notebook!!! -
Welcome to the future, my friend!
Enjoy your new ice-cold machine.
-
Hey Everyone!
Been lurking here for a while but need a bit of help now. I got my P770DM-G a few months ago and couldn't be happier with it. I've mostly been playing games and it's been really solid. I was overclocking the GTX 980M on the stock vBIOS (+135 core, +300 mem) and it was very stable. Yesterday I installed Prema's vBIOS mod; everything seemed OK, but as soon as I bumped the overclock up a little, I got a black screen when benchmarking and had to restart. I've been tweaking since and get the same result every time, even with no overclock at all. After around 30s of any GPU-heavy task, the screen cuts out and I have to hold the power button.
I got in touch with Prema about it, and today I did a clean install of the Nvidia drivers and went back to the stock vBIOS, and I'm getting the same behaviour when running anything graphics-heavy, even with no overclock. When gaming, I'll get the same black screen 30s to 1m in. Do I need to send it back? Have I ruined the GPU? Everything works correctly for normal tasks. Any help or advice would be amazing. -
First, I don't think you ruined the GPU; to kill a GPU you have to pump in much more voltage. Which drivers are you using right now? I haven't flashed my P771DM-G with the custom vBIOS yet; I lent it to my dad, actually, and he's in Germany right now on vacation. But I think the issues started, like you said, with the newest 361.xx drivers. While those drivers fixed the problems with my desktop rig (I had to underclock to run UE3 titles like Blade and Soul or even Elite Dangerous), I can't comment yet on how they behave on my NB. Try using the 359.12 drivers; I'm using them on my NB with no issues. Hope that fixes your problems. -
Thanks for the quick response!
This was on the 359.06 drivers; it wouldn't let me install the 361 drivers, it would just restart and repair when I tried. I didn't touch the voltage slider in Nvidia Inspector, and I'm back to the same settings which were very stable previously, which is why I'm a little confused as to what I've done.
CPU @ stock too? I would actually try removing the battery and AC adapter, then pressing the power button for about 10-15 seconds to discharge residual electricity, then plugging the AC back in and trying again.
-
My i7-6700 (non-K) is at a -50mV offset; I will put that back to stock, try what you've said, then give it another go. Thanks again.
-
Also keep in mind that the Mod won't throttle, so even if you thought that you were using the same clocks, your old vBIOS was actually under-clocking (throttling) under load, essentially running lower clocks than you set.
You may need more voltage to actually run, under real load, the clocks you thought you were already running before. (Just bench the stock vBIOS vs. the Mod at the same clocks and you will see much higher numbers clock for clock with the Mod. This gap increases the higher you set the core.)
That makes sense. Sorry for the panic, will try re-installing and starting from scratch
-
OK: re-installed Windows, re-installed the Nvidia drivers, stock processor settings, stock vBIOS. No overclock, started up Fallout 4, everything running fine, 1 min in, screen cuts to black.
Any other ideas? I assume if I'd flashed the wrong vBIOS for my GPU then it wouldn't start at all? I'm sure I've done something stupid.
-
Did you reflash the vBIOS on your GPU using a backup created with nvflash or GPU-Z?
-
@Scerate Ah no, I used the stock version from Prema. Maybe I was using the wrong vBIOS for my GPU? Any way I can find out what I should be using? I'd rather use Prema's modded version if I can get it working. I assumed I should be on the DM-G version, but maybe not, if that could explain it.
-
Meaker@Sager Company Representative
Use your original backup vbios for any testing.
-
I'm getting the same behaviour no matter what I do. Not sure what's happened; it was running so smoothly before. I shouldn't play with these things.
Thanks for the help. I might have to just send it back and see if the shop can figure out what I've done wrong. -
Meaker@Sager Company Representative
Make sure the thermal pads are making good contact with the VRMs.
-
I've spent a long time googling, and have skimmed over all 274 pages of this forum, but I still don't have a solid answer.
I'm looking at buying two 16GB RAM modules, but I don't know what speed to get. I know the specs say the motherboard supports 2133, but I was wondering:
1) It could potentially support faster
2) If it can't, will faster RAM still work, just reduced to 2133? If so, will the timings automatically be adjusted as well?
3) if the answers to 1 and 2 are 'no', is there any reason to buy ram faster than 2133?
Also, I was thinking of going with G.Skill Ripjaw, unless there is a better recommendation.
Best,
Matthew -
Luka Stemberger Notebook Enthusiast
However, I'm sure the difference between 2133 and 2400 wouldn't be noticeable in any real-life scenario, and even in benchmarks it would be marginal, so don't worry too much about it.
That being said, do not buy the G.Skill RAM you were linking to: Skylake supports DDR3 and DDR4, but it is up to the motherboard manufacturers to decide which type to go with, and they are not interchangeable. Batman 2.0 uses DDR4 only.
And now to introduce myself! My name is Luka and I got my Batman 2.0 about a month ago. I just finished reading all 274 pages here and I have some things to say about this beast of a laptop, so I plan to make a video with it in the next few days.
-
All of this is probably negligible, but I just don't want to drop $200 on 2133-speed RAM and later down the road regret that I didn't get something faster. -
A question I had about backlight bleed: I opted for the 4K display upgrade, and I notice a lot of backlight bleeding. Most of it I can deal with, but the bleed in the bottom-left corner is much worse than anywhere else. I've used a variety of monitors my whole life, but somehow I never really had an issue with uneven bleed-through before. How do I know whether it's just the way this panel is, or if there is something wrong with it? I tried taking pictures, but my crappy camera isn't good enough to show it adequately.
-
Luka Stemberger Notebook Enthusiast
Higher clocks bring longer latencies, but I'm not sure which brings more benefit to your usage scenario, so I can't advise, unfortunately. But since these machines are already expensive and high-end, you might as well pay a bit extra for the faster option even if you don't experience much real-world advantage. I put two SM951s in RAID just because I can; I don't notice much improvement over using a single drive. But it's 3200 vs 2200 MBps, so I did it just because.
And a similar comparison: why would you care about saving 1-2 watts in such a power-hungry machine? -
Luka Stemberger Notebook Enthusiast
This is actually not backlight bleed. Backlight bleed happens when the LCD layer is of inconsistent quality and shows patches of lesser black values. It is measurable but only visible in extreme cases. Another thing that could be considered backlight bleed is the actual light source showing at the edges of the screen when viewed at an angle, caused by a bad or outdated assembly process.
What is happening with Batman 2.0 is that the frame is easily bent and often not uniform and completely flat. That puts strain on the LCD panel and bends it in places. Because it is not a bendable OLED, this changes the position of the pixels relative to the polarizing layer, letting light come through. The panel itself is not to blame, though; it is probably in perfect condition. If you gently flex the screen in the places where you can see it, you will see it improve as it settles back into place. But do not be rough with it, you might damage it. I'm sure something can be done about it other than replacing the chassis, I just don't know what. -
One last question. I just narrowed down one more problem I was having with it. Many times when I left-click with the button while my finger is still on the touchpad, instead of clicking, the mouse will just start moving vertically up the screen until I take my finger off the touchpad. Have you noticed this issue or heard anything like it? -
Luka Stemberger Notebook Enthusiast
And when I think about it, it probably still is backlight bleed; I just wanted to explain that the panel itself is great and the frame is to blame. I'm sure that opening it up and refitting or re-positioning the panel would help, I just don't want to lose the warranty. Oh, and bending it actually did help a little, but not completely, and I wouldn't advise it, so as not to break it.
That is not an issue, it's a feature! The touchpad has gesture control and is multitouch. Holding your finger at the border makes the pointer continue moving in the direction you started. That way you don't have to lift your fingers and keep swiping to scroll; you can just keep holding at the edge and it will continue scrolling on its own. I'm sure it's adjustable in the Synaptics driver app; I never bothered to look, I like those features.
-
I'm going to contact Sager and see what they say about the display. I'll let you know when I get an answer from them. Thanks for all your help! -
Hello! I am new to this forum; my name is Maxence. I am about to purchase the Eurocom version of the Clevo P770DM with the 1080p G-Sync display, 970M (6GB VRAM) and i5-6600K. Any thoughts on the Eurocom model? Knowing that all those parts will cost about 1600 dollars, is the price worth it? I have not found anything with the same specs for less than 2000, but perhaps some of you know of another brand? Also, how is the build quality and finish of this laptop? Would it be a good machine for future-proofing?
-
Support.1@XOTIC PC Company Representative
Welcome to the forum Maxence!
Yes, Clevo models usually give you good bang for your buck, and it's often hard to find similar specs in the same price range. That laptop is a good one, both for its performance and its generally solid build. It is one of the best for future-proofing, with a more powerful desktop processor, plenty of room for RAM and SSD upgrades, and an upgradable graphics card (which most laptops don't offer). A pretty good model in general, in my opinion. -
After removing the bezel, power your beast up and see whether there's backlight bleeding or not. If not, you can avoid it by not tightening the screws so hard. If yes, you have a point in RMAing it!
This whole procedure won't void your warranty and isn't that difficult to handle!
*** Official Clevo P75xDM and P77xDM/Sager NP9758-G and NP9778-G "Batman 2.0" Owner's Lounge ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by ProFX, May 18, 2015.