The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was captured by NBR forum users between January 20 and January 31, 2022, to make sure the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** MSI 16L13 (Eurocom Tornado F5)/EVOC 16L-G-1080 15.6" Owner's Lounge ***

    Discussion in 'MSI Reviews & Owners' Lounges' started by Diversion, Oct 14, 2016.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    There will be more results when more people have this chip in hand (larger sample size)... But these were your words, not mine :D "As their overclocking headroom has not improved AT ALL over the 7700k" Aka too hasty a conclusion!! But I agree with this one... This new 4-core CPU has no business being in the new chipset :cool:
     
  2. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Reading over the first thread you sent me, I did not see where the iGPU caused that thread poster's woes. In fact, in his edited post, he said he was working with you, and said the following: "2.Then in XTU 'down volt' lowering both dynamic and cache voltage offset by -50.78. If you can get lower without wierd CPU respone or just a freeze needing a reboot then great.". In other words, he undervolted the CPU to get better thermals, which allowed him to maintain his boost. Nowhere in that thread did I see the iGPU identified as the cause of his issues, nor did he seem to do anything iGPU-related that could be attributed to his "fix". I specifically asked you for an example of the iGPU impacting CPU overclocking. Surely you have an example of this, in your many years of experience.
    Yes, it was wrong of me to come to a broad conclusion with limited data to go on, sorry for that. This entire X299 launch has made me bitter, given the headache it's caused on the other forums I frequent. Far too many threads answering the same questions of "7700k? or wait for 7740x?". A man's sanity can only last so long, lol.
     
    Papusan likes this.
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Considering that the work spent one on one to discover solutions mitigating the negative effects of the iGPU and Optimus helped hundreds of people directly, and probably thousands more who didn't register and log in to post, I think you'll find it all already checked, double-checked, and verified by happy owners after tuning their laptops to work around the iGPU ruining their experience.

    I still get PMs there from time to time thanking me for helping them solve their Optimus, iGPU, and other problems.

    It's good though to discover something new for certain that you thought you knew. As I said in ending the last post URL I gave you:

    "In the end we are all newbies, when it comes to discovering things we don't know [​IMG]"
     
  4. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,649
    Trophy Points:
    931
    hmscott likes this.
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    As you can clearly read, they can't even run iGPU + dGPU at load on stock settings, let alone add heat by overclocking.

    I think you can use your "reasoning" to figure out that *increasing* multipliers would be counterproductive for their problem - it's already too hot.

    That's the example I explained and dug up the URLs for, and now, after I've shown that what I said were actual events, you still don't want to accept it?
    Thanks man, I'll see if I can dig up my own posts in the GT80 Overclocking thread; I posted a few about trying to OC my iGPU and CPU simultaneously...
     
  6. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Yes, and my "reasoning" tells me that undervolting his CPU is what gave him the ability to run his native turbo boost table. The first URL you linked had nothing to do with disabling one's iGPU, which is why I asked you to clarify it. The second link is simply you explaining why you think it's the case, without providing any real evidence to back up those claims. It's simply hearsay. I understand you dislike me using the scientific method before wholeheartedly believing the words of others, but it's why I ask for others' testing methodologies in the first place. You know, to prevent the very misinformation you dislike?
    I'll give it a look, thanks.
     
    hmscott likes this.
  7. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    What? Wait, what? Disable the iGPU?

    Ok, well, I think we are getting closer to the gap in your knowledge... you can't disable the iGPU unless you have a MUX switch to do it.

    The G750 with the iGPU enabled has Optimus to handle "switching" to the dGPU for applications; there is no iGPU "disable".

    Even if all your applications are using the dGPU, the video output still goes through the iGPU, as does all the rendering for Windows applications - they never use the dGPU. Those G750 owners were screwed by the iGPU and needed to detune their CPUs to stay out of thermal throttling range.

    It may take you a few reads of those 2 threads I gave you, and some additional reasoning, before you understand the scope of that situation well enough to understand what I am saying.

    You may have to read more threads from around that time to get the context that led up to that thread working to solve the problem they all already knew too well - the newly enabled iGPUs were causing their CPUs to overheat, the same CPUs that ran 10°C cooler in the previous generation... I tried to pre-load you with that info, but find it for yourself if you need to.

    The next link is for testing I did on my GT80, which has a MUX that lets you run in dGPU mode or iGPU mode without Optimus. I found that when trying to OC the iGPU, the CPU couldn't maintain its OC because the power drawn by the iGPU was pulled from the CPU:

    http://forum.notebookreview.com/thr...es-test-and-tune.773245/page-32#post-10066681

    "Before I rebooted into SLI land again, I thought I would see why I couldn't OC the CPU, when I ran the iGPU stress test and enabled all the TDP readings, I saw why.

    The CPU Total TDP hit 71w, until it dropped to 47w, and the iGPU TDP went from 45w to 30w. The CPU was only at 3% utilization.

    There is no additional TDP headroom for a iGPU on a 47w CPU package, if you are going to use the iGPU - you won't have much left for CPU, and visa versa."

    There are a couple of images there; read a few posts before and after for additional context.
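The quoted TDP numbers come down to shared-budget arithmetic: the CPU cores and the iGPU draw from one package power limit, so iGPU load directly shrinks what's left for core clocks. The sketch below just restates that arithmetic with the figures from the GT80 test (a 47W package limit, iGPU observed between 30W and 45W); it is illustrative only, not a power model.

```python
def core_budget(package_tdp_w: float, igpu_draw_w: float) -> float:
    """Watts left for the CPU cores after the iGPU takes its share
    of a single shared package TDP limit."""
    return max(package_tdp_w - igpu_draw_w, 0.0)

# Figures from the quoted GT80 observation (47 W package limit):
print(core_budget(47, 30))  # iGPU throttled back: 17 W left for the cores
print(core_budget(47, 45))  # iGPU at full tilt: almost nothing left
```

This is why the same CPU can hold an overclock with the iGPU idle but not with it loaded, even before thermals enter the picture.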
     
  8. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    So... you are confirming exactly what I said? lol... The iGPU being enabled, in and of itself, won't impact CPU overclocking unless you are hitting a power/thermal limit. I've said that several times, and here you are confirming it. Secondly, you specifically used that thread as a means to prove the iGPU was the culprit behind his higher thermals, but the OP of that thread specifically mentioned undervolting the CPU/cache as the solution to the problem, NOT undervolting/underclocking the iGPU. You've yet to provide sufficient sources for your claims. At the very least, you've shown you are capable of petty insults.

    I get it. You can't provide evidence in the form of actual images/testing methodologies, and you've gone on a tangent over what started as you being insulted by my lack of concern over a small HT anomaly. I deal with children on forums all the time, so I won't judge you negatively over it. Just take it easier next time around.
    Boy, that thread is a mess and a half. Some people suggest adding more vcore for higher clocks, while others recommend undervolting for higher clocks. It's amazing anyone was able to figure that thermal situation out. I did see you were trying to remedy their misconceptions, so kudos for that. Sadly, I could only find that one guy at the end claiming that disabling the iGPU helped him out, and he didn't really provide any before/after results in image form. While I am inclined to believe him (it's not exactly something someone would gain anything from lying about), part of me still wishes to see the proof.

    I get the feeling that somewhere along the way, we all got confused over what it was we were setting out to prove. So, I'll re-state my original point. The iGPU being enabled, in and of itself, will not impact CPU overclocking AS LONG AS it's not imposing a power/thermal limitation on the CPU package. If someone can show me a situation in which they are not limited by a power limit, have plenty of thermal overhead, and lost 100-200MHz simply because their iGPU is enabled, I'll concede this point. It's that simple.
     
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Thanks for wasting my time. From now on I'll ignore your posts and responses, as you have proven you aren't worth the time it takes to try to help.

    Do us all a favor and ignore my posts too, so you aren't tempted to respond and waste other people's time reading them.
     
    Last edited: Jun 26, 2017
  10. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    That's a shame. Now you'll never see me admit I was wrong once someone inevitably provides some real information. I am not above admitting when I am wrong, and I certainly appreciate the opportunity to learn from my mistakes. I only wish you were able to provide more information than just your theories. What you've said thus far in those threads is exactly in line with what I've stated: when your iGPU imposes a power/thermal limit on the rest of the package, CPU clocks suffer. If you have enough power (thanks Prema) and enough thermal headroom (yay for delids and real heatsinks), then simply having an iGPU enabled shouldn't interfere with your CPU overclocking at all. This has been sufficiently tested in the desktop environment, and I expect some of you have tested this as well (or at least attempted to, as best you could, given the extreme BIOS/hardware limitations imposed by vendors). I am simply asking for the data so that I can compare, and perhaps even come to the same conclusion. What kind of world do we live in, where asking for evidence for a claim suddenly becomes a big deal?

    Again, I have to reiterate that you had no problem with my mentioning of iGPUs and their impact on overclocking until AFTER you saw me disagree with your HT errata reports. Surely (assuming you've yet to block me) you see my point of view, and why I am hesitant to take your word as gospel, right?
     
  11. leftsenseless

    leftsenseless Notebook Evangelist

    Reputations:
    426
    Messages:
    392
    Likes Received:
    664
    Trophy Points:
    106
    Those games don't require a super high frame rate; limiting the output to 60 fps would work great on the 120 Hz panel without G-Sync.

    There are plenty of other frame-limiting programs that won't introduce input lag. If you care enough about input lag, you probably already know this.

    If you think there is a huge difference between 16 ms and 8 ms, you are fooling yourself; you'll have a harder time overcoming your network lag. Very few people can respond in that amount of time consistently, even in fast twitch games. You'd need to be running LAN servers to enjoy the small benefit of that infinitesimal difference. The smoothness is easier to perceive than the input lag at those frame times (unless the application has suboptimal code).

    120 Hz coupled with a high-spec machine gives you options that can mitigate the need for G-Sync, reducing its value.

    Even with stutter, higher frame rates reduce the perception of it, since stutter is an inconsistency in frame times: if the gaps pass by much more quickly, they become harder to distinguish. So a gap between frames at 8 milliseconds and 16 milliseconds is going to be harder to notice than one happening at slower speeds. Would you agree? It becomes less jarring when it is less impactful.
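The 8 ms vs 16 ms figures above are just the frame times of 120 Hz and 60 Hz panels: frame time in milliseconds is 1000 divided by the refresh rate. A trivial sketch makes the diminishing returns explicit:

```python
def frame_time_ms(hz: float) -> float:
    """Time each frame is on screen, in milliseconds, at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 120, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
```

Going from 60 to 120 Hz saves roughly 8.3 ms per frame, while going from 120 to 240 Hz saves only about 4.2 ms more, which is why the first jump is far easier to perceive than the second.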
     
    MageTank and hmscott like this.
  12. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Limiting the framerate in those titles is essentially what I have to do in order to have a great experience. I can't stomach playing GW2 at 165 Hz and then suddenly dropping to sub-30 fps as soon as the entire server crowds a world boss. This is with a desktop GTX 1070 clocked at 2228 MHz, so I am not exactly starving for GPU power here. Even G-Sync can't save you once you fall below 30 fps.

    While I agree with you that 16 ms and 8 ms are not much different (we are literally talking about a single frame at 120 Hz), the "pros" swear they can feel it. They even invest in those 240 Hz panels and use programs designed to change the hops their ISP takes to the server. The irony is, they are still at the mercy of the server/client tick rate and interpolation/extrapolation (favor the shooter). The biggest issue I have with refresh rates is the ghosting. It's even worse on the super slow laptop IPS panels I've seen at Microcenter. I'd trade away G-Sync and a handful of other features just to enjoy less ghosting. Between jittering and ghosting, I end up with a pretty annoying headache after just a couple of hours of gaming. The sad part is, I swear it wasn't that big of a deal when I was a kid. I just wish I knew what happened somewhere along the way that turned me into somewhat of a panel snob, lol.
     
    leftsenseless likes this.
  13. leftsenseless

    leftsenseless Notebook Evangelist

    Reputations:
    426
    Messages:
    392
    Likes Received:
    664
    Trophy Points:
    106
    Yeah, I'm guilty of being a panel snob as well. I have a Pioneer Elite plasma that I will only trade up from for an OLED. The funny thing about laptop panels is that we are forced to compromise: IPS has better viewing angles and color reproduction, while TN has faster response and less ghosting. When OLED gets to the point of being a viable option in laptops, I'll be pretty excited.
     
    syscrusher likes this.
  14. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Man, I'd love a 13.3-inch 1080p 120Hz OLED panel with G-Sync and an undervolted 1070 in it. While I know physics makes my dream impossible (for now), it would certainly be one amazingly portable, beautiful device.
     
    leftsenseless likes this.
  15. Lunatics

    Lunatics Notebook Evangelist

    Reputations:
    157
    Messages:
    520
    Likes Received:
    348
    Trophy Points:
    76
    It never used to bother me either, until I started using my 144 Hz Asus monitor. I think your issue is that you've experienced better things, and going back bothers you now (at least for me, that's what it is). Now I can't use anything less than 120 Hz. Even standard monitors on desktops at work don't feel smooth and bother me. Trying to game on a 60 Hz laptop screen kills me now. When I had my Razer Blade, I remember trying to play CS on it, and even at 200-300 fps the game felt laggy and jittery, not smooth, and hurt my eyes to play. I would go back to my AW 17 and it was a little better, but even then it didn't feel smooth. Having this 120 Hz display on my laptop now is amazing, and I am so happy I went with this display; I never want anything less again. Maybe I will buy into the whole G-Sync thing when I actually see it and get some hands-on experience with it, but I am not ready to buy into it yet, and high-refresh-rate screens are more than enough for me. I never experience or notice any tearing or issues.
     
  16. leftsenseless

    leftsenseless Notebook Evangelist

    Reputations:
    426
    Messages:
    392
    Likes Received:
    664
    Trophy Points:
    106
    @Lunatics, this is similar to my experience as well. I can live without G-Sync, but I have to have a high refresh rate and a faster response. I hate ghosting as well. That's why I've recommended to people that the trade-off is worth it to go without G-Sync on the 120 Hz panel. With the 4K 60 Hz panel, I would definitely go for G-Sync.
     
    Donald@Paladin44 and Robbo99999 like this.
  17. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    This makes a lot of sense. It even bothers me just moving the mouse cursor around on the desktop. I am curious: do laptops support LightBoost (ultra low motion blur)? I know that with desktop panels, the G-Sync module has LightBoost functionality built in, but with laptop panels lacking the module, I don't know whether they have access to it. If they somehow still have LightBoost, it would be interesting to see if it works. I suppose if a laptop advertises itself as "3D ready", it would have the strobing feature. It would be pretty neat to have that on a laptop.
     
    Donald@Paladin44 likes this.
  18. Lunatics

    Lunatics Notebook Evangelist

    Reputations:
    157
    Messages:
    520
    Likes Received:
    348
    Trophy Points:
    76
    Does anyone know what kind of headphones these laptops are capable of powering? I have a pair of AKG K7XXs and was thinking of getting a pair of the HD6XXs on Massdrop right now. I'm wondering if the laptop would be able to power either pair, or if it would struggle and I would lose sound quality without a DAC and/or amp to carry around with it.
     
  19. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    That December shipping date tho.
    But yeah, the K7XX won't have any issues.
    The HD6XX will definitely benefit from a dedicated amp, though; the system is a pretty "decent" line out/pre-amp.
     
    Donald@Paladin44 likes this.
  20. Lunatics

    Lunatics Notebook Evangelist

    Reputations:
    157
    Messages:
    520
    Likes Received:
    348
    Trophy Points:
    76
    Yes... the 6-month-away ship date and the $50 price increase are what's holding me back from instantly purchasing, but I may do it anyway. I do not necessarily NEED them and am not in a huge rush; plus, I'm shipping my laptop out and won't have it for a couple of weeks anyway, probably, heh. I have another day or 2 to decide, and there's always a chance it will ship sooner? Though with my luck, MD ends up shipping things later rather than sooner, so I'm not hopeful about getting them before December/January. Granted, this would give me time to find a good, portable, affordable DAC/amp combo before I need to worry about the headphones :p.

    I guess I should try my AD700s or K7XX on my laptop when I receive it back, to get a better idea of how it will sound. Unfortunately, it is being shipped out today and is all packed up, if FedEx hasn't already taken the package, so I do not really have the option of testing it with the headphones I currently own before making a decision on these. So far I have only used a couple of pairs of cheap in-ear headphones with my laptop, and I thought I heard some hissing and not the best sound out of them, but it could be the headphones; plus, I was sitting near/in front of a window AC unit, so I had air blowing past me that could have been messing with my perception of the sound. I wish I had thought to just use my current headphones with the laptop and see how everything was before I shipped it back to HID.

    That was the one thing that really interested me in the Clevo line. I know the P650 had the integrated DAC, and I believe the DTR equivalent of this laptop did as well, but the MSI is so much lighter and fits what I need it for much better.
     
    Donald@Paladin44 likes this.
  21. syscrusher

    syscrusher Notebook Evangelist

    Reputations:
    564
    Messages:
    608
    Likes Received:
    1,176
    Trophy Points:
    156
    I'm pretty happy with the 4K IPS on my machine, but if HID Evolution offered a 4K IGZO I'd be seriously tempted.
     
  22. syscrusher

    syscrusher Notebook Evangelist

    Reputations:
    564
    Messages:
    608
    Likes Received:
    1,176
    Trophy Points:
    156
    I have Sennheiser HD25 headphones, which are a very audio-neutral design meant for engineering and content editing (which is what I use them for). This machine has plenty of drive for those.
     
    Donald@Paladin44 likes this.
  23. Lunatics

    Lunatics Notebook Evangelist

    Reputations:
    157
    Messages:
    520
    Likes Received:
    348
    Trophy Points:
    76
    I believe those only have a 70-ohm impedance, similar to the 7XX. However, the 6XX/650s are 300-ohm headphones, and I think that would make a pretty big difference.
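The impedance difference matters because of basic drive math: the power needed for a target loudness follows from the headphone's sensitivity, and the voltage needed to deliver that power grows with impedance (V = sqrt(P x R)). The sketch below illustrates this with ballpark spec-sheet sensitivity figures (dB SPL at 1 mW); the exact numbers are assumptions for illustration, not measurements of these headphones.

```python
import math

def drive_voltage(target_db: float, sens_db_per_mw: float, impedance_ohm: float) -> float:
    """Approximate Vrms needed to reach target_db SPL, given sensitivity
    in dB SPL per mW and nominal impedance in ohms."""
    power_mw = 10 ** ((target_db - sens_db_per_mw) / 10)   # mW needed for target SPL
    return math.sqrt(power_mw / 1000 * impedance_ohm)      # V = sqrt(P * R)

# Ballpark values: a ~62-ohm / 105 dB/mW headphone vs a 300-ohm / 103 dB/mW one.
print(drive_voltage(110, 105, 62))   # low-impedance pair: well under 1 Vrms
print(drive_voltage(110, 103, 300))  # 300-ohm pair: noticeably more voltage needed
```

A laptop's headphone output typically swings around 1 Vrms, which is why the low-impedance pair is fine from the jack while the 300-ohm pair benefits from a dedicated amp.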
     
    syscrusher and Donald@Paladin44 like this.
  24. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    I was playing FF14 with my wife and put both laptops side by side; 4K is just unbelievably better in terms of detail and color. There's no way I can go back to a 1080p display at this point, as long as G-Sync keeps it nice and smooth (which it does, and does well). I've also grown quite fond of the warm color of the panel and found it to be easy on the eyes over extended periods.

    So yeah, I'm with you, brother. 4K rocks and I wouldn't have it any other way!
     
    Last edited: Jun 27, 2017
  25. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    For those who want to perform the bottom cover mod, I've put together a list of the tools needed and general tips on how to do it.

    As @Mr. Fox mentioned before, make sure to measure each side separately. They are not perfectly lined up, and you will end up with inaccurate holes if you do not measure them individually. The 2" drill was the perfect size for the fan intake, so I highly recommend using that one, and make sure to cover the hole with a nice dust filter. I tried to superglue the dust filter onto the panel, but it had a hard time staying on, so I ended up using Kapton tape around the edges, which held nice and tight.

    [five images of the bottom cover mod]

    I'd also recommend using sandpaper to smooth out the edges of the holes, since the cover is a bit thick and the cuts leave rough edges. The measuring can be a bit tricky, but just take your time and do your best to get it as accurate as possible, measuring from the top/side of the laptop to the very center of the fan intake.

    With these tools, it shouldn't take more than 10 minutes to do, and you will immediately notice the temperature improvements. Let me know if you have any questions - I highly recommend it!

    Check out my review for test results.
    http://forum.notebookreview.com/thr...16l-g-1080-15-6-owners-lounge.797128/page-653

    https://www.amazon.com/gp/product/B01N6BFVKO/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1
    https://www.amazon.com/gp/product/B006ZFQNT6/ref=oh_aui_detailpage_o08_s01?ie=UTF8&psc=1
    https://www.amazon.com/Black-Decker...1498585081&sr=8-1&keywords=circular+drill+bit
     
    Last edited: Jun 27, 2017
    UsmanKhan, Skylake_, Huniken and 10 others like this.
  26. leftsenseless

    leftsenseless Notebook Evangelist

    Reputations:
    426
    Messages:
    392
    Likes Received:
    664
    Trophy Points:
    106
    Did you record before and after temps? Would you care to share them? Thanks in advance.
     
    Donald@Paladin44 and MageTank like this.
  27. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
  28. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Have you guys looked into using hexagonal steel mesh for the bottom of these? While I know dust control would be difficult, I imagine thermals might improve a little more as well (assuming any required channeled airflow isn't disrupted). A really long time ago, I did a mod on one of my old desktop side panels, where I cut out a hole and added some hexagonal mesh to it: http://www.themeshcompany.com/products/Hexagonal-Mild-Steel-Perforated-Mesh.html

    You might be able to further mod that entire bottom cover to turn more of it into perforated mesh. Again, it would be a matter of trading dust for performance, but with how easy these things look to disassemble, I assume most people won't mind that tradeoff if it indeed improves performance.
     
    Papusan and Donald@Paladin44 like this.
  29. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    Interesting idea, but the test results show a dramatic drop in temperature as it is, and I refuse to let any dust get into my laptop for any reason. I highly encourage you to perform the mod yourself and report back to us, though!

    EDIT: After looking at the hex mesh you linked, I think it would completely defeat the purpose of having a dust filter, as the holes are so big. Might as well just leave it wide open at that point ^^
     
  30. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Yeah, if done, it would have to be left wide open. That's the real tradeoff: dust control would be non-existent at that point. However, it would make for some great airflow if performance is the be-all, end-all for you. It would require a weekly cleaning at that point, lol.
     
  31. leftsenseless

    leftsenseless Notebook Evangelist

    Reputations:
    426
    Messages:
    392
    Likes Received:
    664
    Trophy Points:
    106
    Oh, yes. I recall reading through that and being very impressed. Thanks for the link and for sharing your tests. You've already put in a lot of work.
     
    Donald@Paladin44 likes this.
  32. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    Thank you. I know @Donald@HIDevolution will make the bottom panel available for sale soon, so I wanted to make sure you guys will be prepared to rock and roll the moment it comes out. Honestly, I'd just perform the mod right now and buy a spare if you really need one, as the temperature improvement is really solid and immediate. I can't say enough positive things about it, and considering how easy it was to do, there's no reason not to go for it, in my opinion.
     
  33. ydaf

    ydaf Notebook Consultant

    Reputations:
    18
    Messages:
    126
    Likes Received:
    11
    Trophy Points:
    31
    Thank you for your advice and work. This must have taken some effort! Even with a laptop air cooler, the CPU temps spike sometimes, while the GPU is quite stable. I may try this modification if I can't get a modified panel. What would a good CPU undervolt be for the 7700K with ThrottleStop (at least to start with)? Just to bring down the temps a bit without having to change the bottom panel. Thank you :)
     
    Donald@Paladin44 likes this.
  34. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    As much as I'd like to help answer your question, I'm afraid it is outside the realm of my expertise. The laptop I received had a stock OC of 4.8GHz (due to my purchasing the 5.2GHz-rated SL CPU), which had an issue with PCH temperatures that would spike up to 90°C+, causing thermal shutdown. To resolve this, @Zoltan@HIDevolution provided me with an XTU profile with a 4.7GHz OC, which has been rock solid, but I simply lack the knowledge to know exactly what this profile is doing. I do not use any programs other than Intel XTU, so no ThrottleStop was used during my testing.

    I'm sure one of the fine gentlemen in this thread can better answer your questions regarding undervolting!
     
    Last edited: Jun 27, 2017
    ydaf likes this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,655
    Trophy Points:
    931
    Without the bottom cover mod, the U3 would not help nearly as much. The U3 helps pick up the slack for the weak internal fans and gives the lappy plenty of cooler air to gulp down. With the bottom cover blocked off, it is very difficult for the machine to breathe like it needs to.

    That depends on what clock speed you are running. For 45x4 you should be able to run very stable with a static voltage somewhere between 1.060 and 1.075V. For 47x4 I use 1.150V, and 1.310V for 50x4. It will vary by CPU, as well as by how hot the environment is where you are using it. As a general rule, a cooler CPU (or GPU) requires less voltage than a warmer one. You will need to test to find the lowest stable voltage. Use Cinebench for testing; if the voltage is too low, Cinebench will crash.
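The per-multiplier voltages quoted above can be kept as a simple lookup table, with linear interpolation between the listed points as a rough first guess for multipliers in between. This is purely an illustrative sketch of that interpolation, not a safe-voltage calculator: as the post says, every chip is different, and any value has to be validated with Cinebench on your own CPU.

```python
# Mr. Fox's quoted data points (multiplier -> static Vcore).
# 45x uses the midpoint of his 1.060-1.075 V range.
KNOWN = {45: 1.0675, 47: 1.150, 50: 1.310}

def starting_vcore(mult: int) -> float:
    """Rough starting-point Vcore for a given multiplier, by linear
    interpolation between the known points (clamped at the ends)."""
    pts = sorted(KNOWN)
    if mult <= pts[0]:
        return KNOWN[pts[0]]
    if mult >= pts[-1]:
        return KNOWN[pts[-1]]
    for lo, hi in zip(pts, pts[1:]):
        if lo <= mult <= hi:
            frac = (mult - lo) / (hi - lo)
            return round(KNOWN[lo] + frac * (KNOWN[hi] - KNOWN[lo]), 3)

print(starting_vcore(48))  # interpolated guess between the 47x and 50x points
```

Start there, run Cinebench, and step the voltage down until it crashes, then back up a notch, exactly as described above.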
     
    Last edited: Jun 27, 2017
  36. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    @Mr. Fox, do you recommend running static or adaptive voltage? It seems that when I first apply the XTU profile, it gets applied as adaptive, but if I ever have to reapply the profile (sometimes it does not stick after a reboot), it becomes static...
     
    Donald@Paladin44 and Mr. Fox like this.
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,655
    Trophy Points:
    931
    I use static voltage (referred to as "Override" voltage in the BIOS). It's much better than adaptive on both of my machines. I do not use XTU; I use the BIOS and ThrottleStop. Are you getting the @Prema BIOS for your EVOC machine from HIDevolution?
     
    syscrusher and Donald@Paladin44 like this.
  38. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    I wouldn't miss the Prema BIOS for the world! I shipped it out yesterday to HIDevolution and am extremely excited to see all the hard work you and Prema put into it. I'll revisit the ThrottleStop and BIOS configuration once I get it back, and move away from XTU as you recommended. Thank you!
     
    Papusan, Mr. Fox and Donald@Paladin44 like this.
  39. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    @Mr. Fox, @Prema , if possible, could you provide us with a brief description of what to expect once we get our laptops back? I'm afraid all the incredible work you guys put in might just fly over our heads (mine at least), and I would be extremely grateful if you could provide some tips on how best to utilize it for regular/power users. @Donald@HIDevolution briefly mentioned it would help with a couple of things, but I'm sure it is capable of much more, and I'd like to know how we can properly leverage it.
     
  40. Lunatics

    Lunatics Notebook Evangelist

    Reputations:
    157
    Messages:
    520
    Likes Received:
    348
    Trophy Points:
    76
    I had tried static and am at 45 on the first core and 44 on the other three (how it came from HID), using ThrottleStop to set it. With adaptive I got down to about -150mV and was fairly stable, but when I tried static at a similar value my system seemed very unstable. I'll keep experimenting once I get it back with Prema, but initially I had better luck with adaptive than static. My adaptive undervolt ran great in PUBG and a couple of other games, but Overwatch crashed my system regularly, so I'll keep using that game to test stability once I can mess around with it again.
     
    Donald@Paladin44 likes this.
  41. ydaf

    ydaf Notebook Consultant

    Reputations:
    18
    Messages:
    126
    Likes Received:
    11
    Trophy Points:
    31
    Thanks for the info.

    I'm going to try playing with the voltages over the next few days and see. I'll try Cinebench too. My room temperature is around 23-24 C (with A/C). With just basic browsing or watching a movie or stream, the CPU spikes at times to 80-90 C. It doesn't last longer than a minute or so at the most. I know it's high because the fans kick in and the CPU fan's RPM runs around 30-40% higher than the GPU's. This is with those average temperatures and a simple USB-powered laptop cooler. The GPU doesn't pass 50 C (99% of the time) with those basic tasks. I'm aware of the 7700K's thermal issue, but Intel should have better quality control for such a high-end chip. The modded bottom panel seems essential for this laptop regardless.
     
    Papusan, Mr. Fox and mizerab1e like this.
  42. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    CPU spiking is definitely normal, but it shouldn't spike that high even for a minute. I see you bought it from Eurocom; did you get it delidded?
     
    Donald@Paladin44 likes this.
  43. ydaf

    ydaf Notebook Consultant

    Reputations:
    18
    Messages:
    126
    Likes Received:
    11
    Trophy Points:
    31
    No, honestly, I had never even heard of that term before, lol. I just read up on it. I never asked for it, nor was it mentioned during the ordering process. I'll see if the software approach provides any significant improvements.
     
  44. mizerab1e

    mizerab1e Notebook Consultant

    Reputations:
    193
    Messages:
    126
    Likes Received:
    337
    Trophy Points:
    76
    It's all good man, I was completely clueless about all this stuff until recently. However, let me tell you something about delidding your 7700K.

    I'd go as far as to say it is MANDATORY to delid your CPU in this laptop if you plan on doing anything with it. The 7700K simply runs way too hot in this chassis, and you will struggle to keep temperatures low enough to do anything productive. The fact that your fans go to 100% while simply browsing and streaming means you have a bigger issue at hand. No software approach alone will keep CPU temps down to a reasonable level, and I HIGHLY RECOMMEND that you delid your 7700K before anything else. We are talking a 15-25 C drop in CPU temperature by doing this.

    I'm actually stunned that Eurocom would sell this laptop without the 7700K delidded. There are many resources on this topic. You can buy tools to do it yourself, or pay someone else to delid it. I implore you to do this, and let me know if you need any help making it happen.

    You can buy all the tools you need to delid it yourself here.
    https://rockitcool.myshopify.com/

    Here is @Mr. Fox 's video on delidding, please take the time to watch it.
     
    Last edited: Jun 27, 2017
    pressing, ydaf, Papusan and 4 others like this.
  45. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,655
    Trophy Points:
    931
    Time permitting, this weekend I will do another video specifically for this machine with @Prema BIOS.

    Adaptive voltage is spastic, and trying to set a similar value could be messing you up. Keeping in mind that stock voltage per Intel spec is about 1.250V (which is way too much, and enough for 5.0GHz on both of my 7700Ks), set the BIOS to Override (or ThrottleStop to Static), choose a multiplier, and start working your way down from 1.250V in 0.050V steps, using Cinebench to test for stability. If you go too low and crash from not enough voltage, try going back up in 0.010V, 0.015V or 0.025V increments to fine-tune it (versus the 0.050V steps going down). As little as a 0.005V change can mean the difference between stable and almost stable.

    As you are doing this, remember that the hotter the CPU gets, the more voltage it needs. You may want to let it cool off a bit between Cinebench runs. When Cinebench crashes, or you get a lock-up or BSOD, that is your clue that the settings are not where they need to be. The most common bugcheck (BSOD/STOP error) codes for voltage being too low are 0x00000124 (WHEA_UNCORRECTABLE_ERROR), 0x00000101 (CLOCK_WATCHDOG_TIMEOUT) and 0x0000001E (KMODE_EXCEPTION_NOT_HANDLED). There is a list of them here [ LINK] that is a bit dated, but still applicable.

    My previous post mentioning values that should work for most is a good place to start as a time saver. You may have to go up a little, and may be able to use less than what I suggested. Be sure your Power Limit 1 and Power Limit 2 are set to at least 130000 (130W) each.
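The coarse-down / fine-up search described above can be sketched in a few lines. This is a hypothetical illustration only: `is_stable` stands in for you actually applying a voltage in the BIOS or ThrottleStop and running Cinebench by hand; the function just captures the search order.

```python
def find_stable_voltage(is_stable, start=1.250, coarse=0.050,
                        fine=0.010, floor=0.900):
    """Step down from `start` in `coarse` (0.050 V) increments until the
    stability test fails, then climb back up in smaller `fine` increments
    until it passes again. `is_stable(v)` stands in for a Cinebench run
    at voltage v (True = completed, False = crash/BSOD)."""
    v = start
    while v > floor and is_stable(v):   # coarse phase: walk down
        v -= coarse
    while not is_stable(v):             # fine phase: tune back up
        v += fine
    return round(v, 3)
```

Note the asymmetry the post calls out: the downward steps are large (0.050V) to save time, while the upward fine-tuning steps are small (0.010-0.025V), because a few millivolts can separate stable from almost stable.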
     
    Last edited: Jun 27, 2017
    Papusan and mizerab1e like this.
  46. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    Bottom Service Panel Ventilation Mod - This is now standard on all EVOC 16L-G-1080 models.
    [IMG]
    As others have found, this mod will lower CPU temps by about 8 C, bringing them down into the 80s under heavy load.

    For customers sending their units back for the Prema BIOS flash, we will do this Mod for $29.

    For customers that just want to buy the modded bottom service panel it is available as an accessory for $59 plus shipping.
     
    Last edited: Jun 27, 2017
  47. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Those images are broken for me. Either way, $59 isn't bad at all. Especially given the results. I've seen people spend far more for far less of an improvement.
     
    Papusan, Donald@Paladin44 and Mr. Fox like this.
  48. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    Edited and corrected.
     
    mizerab1e, Papusan, MageTank and 2 others like this.
  49. MageTank

    MageTank Notebook Consultant

    Reputations:
    92
    Messages:
    123
    Likes Received:
    129
    Trophy Points:
    56
    Wow, that is far cleaner looking than I expected. No offense to the homegrown modders, but I've yet to see them sand theirs to be that clean. It almost looks as if it's protruding just enough to be level with the rest of the under-surface.
     
    Papusan, Donald@Paladin44 and Mr. Fox like this.
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,655
    Trophy Points:
    931
    That's pretty sweet. Let's see now... @Prema BIOS, better optimized vBIOS, CMOS battery extension, bottom cover mod... what else? Quite the attractive list of standard features I'd say.
    They found a grille that fits like a glove. It does make for a very clean factory look.
     
← Previous pageNext page →