Of course the 1080 also comes with a larger PSU than the 1060/1070 package, but the rest of the system's hardware is identical.
-
-
tanzmeister Notebook Evangelist
Yes, of course. I understand that. I am just saying that they have to change the name on paper for certification, which is connected to the PSU power. At least that's how I understood it. -
Yes, European CE certifications only apply to a specific model (name) with an exact power package.
-
Al_Jourgensen Notebook Consultant
I'm sorry, I didn't understand. I bought the P775TM with a GTX 1080; could I have problems with G-Sync? -
Al_Jourgensen Notebook Consultant
How is the noise, temps?
Thanks in advance -
The CPU doesn't rise above ~60C, and the cooling system is very quiet. GPU max in my case is ~75C (with @tanzmeister's mod), and yes, it's quite noisy then. I don't run benchmarks, so my numbers are relevant only for normal use.
-
If you bought a system with a 1080 it's a P7xxTM1, not a P7xxTM. Just check your board name in the BIOS.
-
This was from a 4-minute video encode. Max temp was 89C. I didn't have the fans on the max profile, so I probably could have run it a little cooler...
-
Hi guys, I'm about to buy one with an i7 8700K and GTX 1080. I'm not an expert like you all at working on my PC, so some advice would be welcome, but I can learn fast. My main usage will be virtual reality and web surfing. I wanted something future-proof enough not to be forced to change my PC in a few months, and after long research and some advice I am ready to buy it... just here to hear from you now. Thank you in advance.
-
Overheating could be a problem too; it gets very hot here during the summer season.
-
I am going to choose the 4K display; can anyone share their experience with it?
-
tanzmeister Notebook Evangelist
-
Thanks!
-
Both the 8700K and GTX 1080 will generate a lot of heat, and while the processor can easily be undervolted to reduce temps, the GPU is difficult to tame, especially if ambient temps are warmer.
With the right tools you can easily mod for better cooling and performance. -
Thanks much. Looks like I'm going to be taking mine apart again and redoing it. I get those temps at 4.4GHz, lol.
-
So, assuming I am not so skilled at tinkering and I am worried about high temps, do you suggest going back to a lower processor like the i7 7700 series to avoid trouble? Do you think it's enough for my needs? Thanks again.
-
So what will happen if I use the TM1 BIOS on my TM model with a GTX 1060? Will the system not boot up?
Maybe the fan profiles will be more aggressive? -
I think you will have hotter GPU temps, so the CPU you choose won't matter.
Most recommend the 1070 for this laptop if you are not into opening it up to repaste it. -
Worst case it won't boot. Best case the NVIDIA driver won't install, and G-Sync will be lost either way, even if you mod the driver to install.
-
@Prema, will your magic BIOS help with activating G-Sync on i3 series CPUs?
-
So I'm in a bit of an odd position. My P775DM3 is on the fritz and may need to be replaced, AGAIN.
I've maneuvered myself so that I could exchange it not just for a new replacement DM3 but optionally for a TM1 as well (with a negligible price bump for the 6-core CPU).
My concern is whether the TM1's split heatsink performs worse than the DM3's unified heatsink.
My argument is this: the DM3's unified heatsink lets heat from the most heat-producing component (CPU or GPU) be led to the other side of the heatsink, where there is cooling capacity left. This way my GPU can work harder in games when the CPU is not taxed as much, pumping its heat to the CPU side. Likewise, when I'm converting video the CPU can clock higher as the excess heat is pumped to the mostly idle GPU side.
On the TM1's split design, however, when either the CPU or the GPU sits idle while the other works up a sweat, the idle processor's heatsink does not add to the cooling capacity of the system as a whole.
It seems like solid reasoning, and I believe it's the main reason the unified design was made in the first place. Split is just easier and cheaper, not better performing.
Can anyone chime in on this? Fact based. Should I go for another DM3 or switch to TM1? -
I have the same impression with my DM model. But I ordered a TM1 and I will compare both constructions. The main advantage is that you can easily replace the paste on the CPU or GPU separately without lifting the whole heatsink. So if I mainly use the CPU for work and its thermal paste loses performance faster, I can repaste only the CPU.
-
From what I saw, the new heatsink design is a bit better.
-
That's my point exactly, it's seemingly geared towards ease of use, not performance.
I'd love to have some actual experiences.
Also, would a DM3 heatsink not fit a TM1? I mean, it's just the chipset that differs, right?
I want to appreciate you chiming in, but you'd have to support this opinion a bit more. -
I actually have the same concerns as you do. A larger heatsink is also served by two fans. Provided that you usually don't stress both the GPU and the CPU at the same time, a unified heatsink might take advantage of both fans, not just one.
I'm planning to get a P870TM1 instead of a P775TM1 though; if I keep getting Clevo, this time I want to test their absolute best as well. -
"I want to appreciate you chiming in, but you'd have to support this opinion a bit more."
Well, all I saw is in this thread, so just tune in and do some reading... then report your opinion to us; nothing more to support my opinion is needed... -
The main issue is as follows: a separated fan setup works better if you push both chips to their max temps, but since neither fan ever stops, a combined setup works better if you're maxing out just the CPU or just the GPU.
-
A unified heatsink design will never be a good option. Making it fit perfectly on both the CPU and GPU is more complicated. With a high load on the CPU alone it can be a small advantage, but with a high load on both, nope.
-
Welp, I usually have a higher load on the CPU than on the GPU...
-
NEVER a good option? That's far too extreme a statement, which you contradict a little further on in your own reply... I grant you that making perfect contact on two processor surfaces is harder, but when it fits well I argue it performs better.
I often stress my CPU for video conversion, and both sides of the heatsink, including the GPU side, pump out heat.
That wouldn't be the case with the TM1 setup where the GPU side would stay cool and useless.
For gaming that uses few CPU cycles it's exactly the same but in reverse, enabling my GTX 1080 to clock higher.
I am almost certain that with the split design my CPU and GPU would clock lower and perform worse, unless both are stressed to the max simultaneously, which hardly ever happens.
If I end up getting a TM1 (the 6-core lure...) I will try to get a separate DM3 heatsink and test it myself.
Until then I'm interested in fact-based arguments, or at least coherent, reasoned arguments rather than gut feelings and preferences.
You see, if you just repeat what is in this thread already and add nothing of yourself, then you're not really adding anything to the discussion other than noise. Opinions are fine, great even, but they are based on thoughts and considerations, and I'd be interested in learning from them, perhaps seeing it in a new way. But you're not stating a thought-through opinion, it seems, so I would indeed like to hear some supporting arguments for your position if you have any (other than 'I read it here'). -
Al_Jourgensen Notebook Consultant
can you please tell me what mods and tools are these? is there any tutorials?
thank you -
Very Easy :: Difficulty Level (1/10)
1. Install ThrottleStop and undervolt your CPU. I can undervolt to -200mV stable, but your results may vary.
Little Effort :: Difficulty Level (3/10)
1. Remove your laptop cover
2. Unscrew the CPU and GPU heatsinks without removing them. Make sure you DO NOT move the heatsinks after removing the screws.
3. Do the paperclip mod for both CPU and GPU heatsinks.
4. Install ThrottleStop and undervolt your CPU.
5. Use MSI Afterburner or ASUS GPU Tweak II for GPU
A bit difficult :: Difficulty Level (6/10)
1. Remove your laptop cover
2. Unscrew the CPU and GPU heatsinks and pull them out
3. Repaste both CPU and GPU
4. Do the paperclip mod for both CPU and GPU heatsinks.
5. Install ThrottleStop and undervolt your CPU.
6. Use MSI Afterburner or ASUS GPU Tweak II for GPU
Difficult :: Difficulty Level (10/10)
1. Remove your laptop cover
2. Unscrew the GPU heatsink and pull it out
3. Repaste the GPU and put it back using the paperclip mod
4. Unscrew the CPU heatsink and pull it out
5. Take out the processor and delid it
6. Repaste the CPU and put it back using the paperclip mod
7. Install ThrottleStop and undervolt your CPU.
8. Use MSI Afterburner or ASUS GPU Tweak II for GPU
Guides
ThrottleStop
http://forum.notebookreview.com/threads/the-throttlestop-guide.531329/
Paperclip mod
http://forum.notebookreview.com/thr...ctive-cooling-mod-for-p775dm2-p775dm3.803626/
GPU tweaking using MSI Afterburner
http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,2.html
GPU tweaking using ASUS GPU Tweak II
https://rog.asus.com/forum/showthread.php?87087-Overclocking-the-Strix-GTX-1080
CPU and GPU repaste (video is for a different laptop but same technique)
Processor Delidding
Required Items
Small screw drivers
Small paperclips
Cleaning cloth
Tissues
Alcohol wipes
ArctiClean 60ml Kit
Thermal Grizzly Kryonaut Thermal Grease Paste [for CPU and GPU repasting]
Rockit 88 Delid & Relid for LGA 1150 & 1151 [for CPU delidding]
Thermal Grizzly Conductonaut [delid repasting between CPU and IHS]
Permatex 82180 Ultra Black Silicon Gasket [for putting back CPU and IHS]
A lot of courage
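Whichever difficulty level you pick, it helps to log GPU temperature under the same load before and after the mod so you know it actually worked. A minimal sketch using the standard `nvidia-smi` CLI that ships with the NVIDIA driver; the function names (`read_gpu_temp`, `log_temps`) are mine, not from any tool mentioned above:

```python
import subprocess
import time

def parse_temp(raw):
    """Parse nvidia-smi's one-number-per-line output into an int (Celsius)."""
    return int(raw.strip().splitlines()[0])

def read_gpu_temp():
    """Query the GPU core temperature via the nvidia-smi CLI."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temp(out)

def log_temps(seconds=60, interval=2):
    """Sample the GPU temp for a while and return the maximum seen."""
    samples = []
    for _ in range(seconds // interval):
        samples.append(read_gpu_temp())
        time.sleep(interval)
    return max(samples)
```

Run `log_temps()` during a fixed benchmark loop before the repaste and again after; comparing the two maxima is a fairer test than eyeballing a monitoring overlay.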
-
I'm sure the new single CPU heatsink, e.g. in the P870TM, will handle heat better than the old unified one or the new single heatsink in the P775 series. A gaming laptop should have decent cooling for both the CPU and GPU. A unified heatsink design will often fail with max load on both.
A CPU that must rely on the GPU's cooling is a failed design. -
Now that I've thought about it, the new design is actually better for one reason. If the heatsink is unified, you can get a heat reservoir between the CPU and the GPU while both are under load. Think about it: the spot between them doesn't actually help them cool, but keeps heat trapped there, so it is actually worse if you use both at the same time.
Otherwise, the GPU heatsink gets hotter without being used, which isn't fully ideal either... -
Al_Jourgensen Notebook Consultant
Jesus, thank you very much my friend, now I'm in business with these methods when those fans throttle.
Really, thank you very much. -
Meaker@Sager Company Representative
Thermal energy will always move to an area of lower concentration, so it does not really pool like that. -
Of course, but if there is a point between the GPU and the CPU, and they both keep producing heat, yet there is not enough area to cool the center between them, theoretically that area serves more for heat concentration than dispersion...
-
Will heat from the CPU trigger the GPU fan to run a lot faster to remove the heat if the GPU has minimal load?
-
Meaker@Sager Company Representative
The hottest part will be the core contact point and everything in between will be cooler. The heat will simply stop flowing in a direction and go an alternative way if possible. It won't get trapped between two heat sources.
The chip will be warmed so yes. -
Let's think about it this way:
Heat flow is about a potential difference.
The entropy of a system tends toward a maximum, which just means the heat energy of the system wants to be dispersed evenly throughout it.
If the heat stops flowing into the space between the chips, that area sits at a constant 80-90C, which isn't very good. At least in my mind (?)
This is why some might prefer having non-unified heatsinks.
At the same time, having more conductor also allows for easier heat flow, so it should be easier for heat to spread throughout the system. -
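The "does heat pool between two sources" question can be checked with a toy model. Below is a crude 1-D finite-difference relaxation of a conductor with heat injected at two nodes and convective loss to ambient everywhere; all the numbers are illustrative, not measurements of any actual heatsink:

```python
# Toy 1-D heatpipe: heat injected at two nodes ("CPU" and "GPU"),
# convective loss to ambient at every node. Illustrative numbers only.

def steady_state(n=21, cpu=5, gpu=15, power=3.0, k=1.0, h=0.05,
                 t_amb=25.0, steps=20000, dt=0.01):
    """Relax an explicit finite-difference scheme to its steady profile."""
    temp = [t_amb] * n
    for _ in range(steps):
        new = temp[:]
        for i in range(n):
            left = temp[i - 1] if i > 0 else temp[i]    # insulated ends
            right = temp[i + 1] if i < n - 1 else temp[i]
            conduction = k * (left + right - 2 * temp[i])
            loss = h * (temp[i] - t_amb)                 # loss to ambient
            source = power if i in (cpu, gpu) else 0.0   # heat injection
            new[i] = temp[i] + dt * (conduction - loss + source)
        temp = new
    return temp

profile = steady_state()
print(f"CPU node: {profile[5]:.1f} C  "
      f"midpoint: {profile[10]:.1f} C  "
      f"GPU node: {profile[15]:.1f} C")
```

In this model the midpoint between the two sources never exceeds the source nodes themselves, which matches the point that heat flows down the gradient rather than trapping in the middle; the middle section simply ends up somewhere between the sources and ambient.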
The CPU does not 'rely' on GPU-side cooling at all. There is added cooling available for the CPU if the GPU is not loaded (and vice versa), keeping the CPU either cooler or able to clock higher than under a GPU load. A split heatsink simply does not add that extra cooling capacity to the CPU when GPU loads are low. On my DM3, running only the CPU-side fan gives temps 12 degrees Celsius higher when converting video than running the GPU fan on max as well (77 vs 65).
Incorrect. As stated heat doesn't get 'trapped' or 'pool'. At worst that center will be the temperature of the other heatpipes as when the other heatpipes will cool the heat will simply flow back again. That center potentially being 80/90 degrees (which it won't, because that would mean the processors are even higher than that) is not bad at all as electronics are designed to withstand that heat and that hot center is not even near any electronics but suspended quite far above it.
No, if you add 90 degree heat to a 90 degree spot it won't get any higher than 90 degrees, ever. For that spot to be 90 degrees the processor has to be substantially higher than that.
Yes, tried and tested. With GPU load near zero, when I lower the temperature threshold for the fans the CPU fan goes max and the GPU fan ramps up too. With a 'cold' GPU that fan pumps out considerable heat that was transferred from the CPU side. Basically doubling the heatsink (but less than double cooling performance as there is only a single heatpipe between the GPU and CPU).
When you block the GPU exhaust with your hand the CPU temps go up, much like it would be with a split heatsink.
In my eyes the only reason to go for a split design is that it is easier to produce and install, with less chance of damage or faulty contact from being bent ever so slightly; not because it performs better, because by sheer thermodynamics it simply cannot. -
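The capacity-sharing argument can also be put as a back-of-envelope resistor network: under a CPU-only load, a unified heatsink gives the CPU a second, parallel path to air through the idle GPU side. A sketch with assumed thermal resistances (the C/W values and the 95 W load are made up for illustration, not measured on either machine):

```python
# Steady-state two-resistor model: CPU at full load, GPU idle.
# All resistance and power values below are illustrative assumptions.

T_AMB = 25.0
R_CPU_SINK = 0.35   # CPU die -> CPU-side fins -> air (C/W)
R_GPU_SINK = 0.30   # GPU-side fins -> air (C/W)
R_LINK = 0.50       # shared heatpipe between the sides (unified only)
P_CPU = 95.0        # assumed CPU package power under load (W)

def cpu_temp_split():
    """Split heatsink: all CPU heat exits through the CPU-side fins."""
    return T_AMB + P_CPU * R_CPU_SINK

def cpu_temp_unified():
    """Unified heatsink: the idle GPU side adds a parallel path
    (shared heatpipe in series with the GPU-side fins)."""
    r_alt = R_LINK + R_GPU_SINK
    r_total = (R_CPU_SINK * r_alt) / (R_CPU_SINK + r_alt)
    return T_AMB + P_CPU * r_total

print(f"split:   {cpu_temp_split():.1f} C")
print(f"unified: {cpu_temp_unified():.1f} C")
```

With any positive resistances the parallel path can only lower the CPU temperature under asymmetric load; the open question the thread is debating is how small R_LINK really is in practice, and whether the unified plate's worse per-die contact eats the gain.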
Well, some of the experts posting here recommended a thermal pad touching the plastic casing over a proper copper heatsink, claiming thermal conductivity is irrelevant...
Next, as I said in a previous post, "what I saw in this thread" is actual users' screenshots of out-of-the-box Coffee Lake CPU temps under load going around 80 degrees C.
But on other threads, the previous version with the unified CPU/GPU heatsink and Skylake or Kaby Lake was hitting around 90 degrees C, so my simple conclusion was that this split heatsink is better.
@tijgert please share your initial test and thoughts when your machine ships... -
Meaker@Sager Company Representative
Those heatsinks focused on GPU cooling to maximise GPU clocks, much like the original vapor chamber design.
Also, yes, adding heatsinks to enclosed tight spaces with low airflow can hurt thermal performance. A solid like plastic is better than air, and it will beat metal if the plastic is cooled while the metal is insulated. It can also be better to pad to the PCB, depending on the orientation of the controller. -
This laptop seems to have an MXM 3 module, so can I upgrade to a future GPU (like a GTX 2080, 3080...)?
With the same current heatsink?
Do the 1060/1070/1080 GPUs currently use the same heatsink? And power adapter?
Are Thunderbolt 3, a 120Hz screen, and G-Sync available on this laptop?
Thanks -
At this moment it is hard to say, and it will depend a lot on what the new cards look like. In the past some cards were compatible while some were not; for example, the 1080 has a power connector plug that wasn't there on the 980 (if I remember correctly), so it wasn't compatible then, but now it should be.
Same power adapter for all of them
I think that the heatsink is also the same for all of them
120Hz screen - Yes
Gsync - Yes
TB3 - I think yes, but I am not 100% sure. Should be -
Given that PCIe gen 4 is going to release just before the Volta GPUs, I think the newer GPUs will only support MXM 4.0! So there is little to no chance of using a 2080 in a laptop with MXM 3!
Also, Thunderbolt is nothing but a PCIe x1 port, so expect TB4 to release with newer laptops. -
Thermal conductivity is never 'irrelevant'; it's the most important thing in heaters like high-end laptops. It can at worst be ineffectual if you have copper that isn't thermally connected or is insulated from airflow; then plastic with a pad beats it like fighting a tank with a hammer beats fighting a tank with a worm: both are lost battles, but hammers are better than worms.
The thing you need to be careful of is comparing apples with oranges and drawing conclusions from wrong assumptions. The 8700(K) is not simply a 7700(K) plus two cores, and not all CPUs are created equal either (far from it, in fact). You simply cannot state that because a Kaby Lake ran hotter than a Coffee Lake, the cooling system must be worse. There are SO many factors. The *only* way to be certain about either heatsink design philosophy is to use the exact same hardware and mount the two different heatsinks.
Now I am more than willing to test that and report back in great detail, and even willing to pay for it to finally KNOW, but I need both heatsinks to test on the same machine.
But where do I get those spare parts? Where can I get a set of those split heatsinks? (I assume it is the cheapest option rather than getting a TM1 machine first and then trying to find a unified heatsink).
Any reseller wanna chime in and help out? I'll test it and return it if need be.
I do indeed believe the GPU is the greatest heat producer. I also get crazy high GPU clocks on my GTX1080 and run circles around ALL the standard GTX1080 desktop cards. I fear with great fear that as a gamer my GPU performance will take a nosedive if I would switch to a TM1... -
You must be really careful stating that your 1080 runs circles around other 1080s; any serious professional reviewer would ridicule you...
And you are right that there is an error margin in these tests (including the binning lottery, ambient temp variation, thermal compound settling, nuclear explosions... radiation... gremlins in computers... etc.), but that is ~2%, not over 10%...
GTX1080...ready...set...go...RUN! -
I should've put asterisks in there
I meant running circles with advertised out-of-the-box clock speeds, where 'my' 1080 has higher speeds than vanilla desktop cards (as soon as you lean towards strawberry it goes limp). Obviously desktop cards boost higher... but if I mention that I don't feel so special anymore.
I do agree the temperature difference is substantial, but all joking aside, Coffee Lake and Kaby Lake are simply different chips, hence the different architecture names, and cannot be compared 1:1.
The 50% more cores with only a marginal increase in TDP supports that.
And I claim that gremlins account for far greater variation than a mere 2%, if fed after midnight. -
I'm sure a stock-clocked 7700K will use less power in the same benchmark. And higher power = higher heat. The marginal increase in TDP, as you say, doesn't always show the whole truth. What is your CPU package power in the same benchmark (stock clock and default voltage)?
A stock 6-core 8700K draws 112W in Cinebench R15.
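The power-equals-heat point can be made concrete: at steady state every watt of package power must leave through the heatsink, so the die-to-ambient rise is roughly package power times the cooler's effective thermal resistance. A rough sketch; the 0.45 C/W resistance and the wattages other than the quoted 112 W are assumed, illustrative values:

```python
def die_temp(package_w, r_theta, t_amb=25.0):
    """Steady-state die temperature estimate: T = T_amb + P * R_theta."""
    return t_amb + package_w * r_theta

# Assumed effective die-to-ambient resistance of ~0.45 C/W (illustrative).
# 112 W is the stock 8700K Cinebench R15 figure quoted above; the
# other wattages are made-up comparison points.
for watts in (75, 95, 112):
    print(f"{watts:4d} W -> ~{die_temp(watts, 0.45):.0f} C")
```

The linear relation is why package power in the same benchmark is the fairer comparison between CPUs: with the same heatsink, a chip drawing 30% more watts sits proportionally further above ambient, regardless of what the TDP label says.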
*** Official Sager NP9175 / Clevo P775TM Owner's Lounge! ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Oct 6, 2017.