Check the bottom and see if it has a Sager serial number.
-
Meaker@Sager Company Representative
-
I don’t have it yet but I will
Can you possibly confirm that they have been on back order for a month? -
Meaker@Sager Company Representative
Xotic will be the best people to speak to; I don't want to step on any toes in that regard.
Papusan likes this. -
I'll preface by saying I'm not an expert but I play competitive shooters so for me the 240-300hz panel was pretty necessary.
1. In all the games I've tried (AC Valhalla, Cyberpunk, RDR2, RE: Village, Overwatch, Valorant, CoD, BF5, Fortnite), the panel has been pretty amazing so far, no microstuttering. The one exception has been Outriders, but I think the microstuttering in that game is due to poor engine optimization or something with map loading/cache usage. The game will stutter a little bit when you enter a new zone and then the stutter disappears.
In the past, with my previous laptop (a 144Hz G-Sync panel on an Asus machine), I had major issues with microstuttering, and after extensive research into a potential fix, the only thing that made a definite difference was disabling Control Flow Guard in the Windows Exploit Protection settings (I disable it per application, not for the whole system). The stuttering was most noticeable in DX12 games. On certain titles this trick completely fixed the problem; AC Valhalla and RDR2 were literally unplayable to me until I disabled CFG, but then again I'm used to having buttery smooth frames in other competitive titles, so I might be picky. Now I just disable it for pretty much any game I play, because once you get a positive effect like that you kind of get addicted, hahah. This could also have been completely system/panel specific though, so don't take this as me endorsing it as a guaranteed fix; I'm just providing the information because it has immensely helped several of my gaming friends using the same hardware, and because I feel confident in saying the panel on this system is pretty good. (There's a rough sketch of the per-application approach after this post.)
2. I've played a decent amount of AC Valhalla on the system in the last few days and I haven't noticed this ghostly image; it really could just be the phone/recording device that originally took the video. (I'm assuming that's a phone video?) Someone with more experience, or who also has the laptop, could weigh in on this though; again, I'm no expert. I can try to upload a better video of AC Valhalla gameplay if that'd help. -
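For anyone curious about the per-application route mentioned above, here is a minimal sketch of doing the same thing from a script rather than through the Windows Security UI (Windows Security > App & browser control > Exploit protection > Program settings). It assumes Windows 10 or later with the built-in ProcessMitigations PowerShell module and an elevated prompt; the executable name is a hypothetical placeholder, and as the poster says, this is a workaround worth testing, not a guaranteed fix.

```python
import subprocess

# Placeholder executable name; substitute the actual game binary.
game_exe = "ACValhalla.exe"

# Show the current per-program mitigation settings (CFG appears in the output).
subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     f"Get-ProcessMitigation -Name {game_exe}"],
    check=True,
)

# Disable Control Flow Guard for this executable only; system-wide CFG stays on.
# Run from an elevated prompt. Revert with -Enable CFG if it makes no difference.
subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     f"Set-ProcessMitigation -Name {game_exe} -Disable CFG"],
    check=True,
)
```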
Meaker@Sager Company Representative
So is the whole market at the moment, it's just got so bad it's impacting the mobile segment too. -
Can confirm now that this was mostly due to the phone or camera used to record it. See the screenshot below, taken from a video recorded with an iPhone SE at normal ambient light levels in my living room; it's pretty accurate in representing what I was seeing on screen:
Papusan likes this. -
Is the screen bright?
-
When we say "bright", what exactly is the point of reference, or how do we gauge this? For me it's bright enough that I haven't needed to increase the in-game brightness in the games I usually play, so it is brighter than my last monitor. I don't think the panel is as bright as some of the higher-end desktop monitors I've played with, though.
-
win32asmguy Moderator
144Hz FHD 300 nit
300Hz FHD 300 nit
60Hz UHD 400 nit -
Thank you @win32asmguy
-
Good to know! I was just worried about it being dimmer than my 17 R4.
-
How's the minimum brightness setting on the panel? Is it nice and comfortable? I'm asking because I always use screens at their lowest brightness setting; anything higher is too bright for me.
-
It's pretty good: everything on the screen is still legible, and I just tried gaming with it and it's definitely doable. I'm more of a max-brightness type of guy though, so the opposite end suits me better. I tried taking photos of max vs. min brightness while in game, but they don't look any different, I think because of the phone's auto brightness/exposure correction :/
Clamibot likes this.
-
On paper both the 144Hz and 300Hz panels are 300 nits, however in real life the
There is some ghosting on those lanterns. Does it look like that during normal usage? Or does it look blurry/smudgy during movement, and only look like that here because it's a still image taken from a video? -
I expect this move from Intel will help Clevo make a thinner successor to the X170, if Clevo continues with LGA machines in 2022. The race to thinner laptops with correspondingly slimmer cooling will continue. This won't affect desktops, but it will quite certainly affect laptops like the X170. This is bad news for the heatsink grills! And bad news for those who want the biggest and baddest cooling.
Intel Socket V (LGA1700) for Alder Lake-S has a lower height and new hole pattern videocardz.de 24th May 2021
![[IMG]](images/storyImages/Intel-SocketV-LGA1700-AlderLake-Cooler-4-768x376.jpg)
Source: Igor’sLAB
Last edited: May 24, 2021
DreDre, raz8020, Clamibot and 1 other person like this. -
I was hoping for a beefier heatsink. Clevo seems to go somewhat against market trends or at least resists them. Otherwise, they wouldn't still have DTRs built like tanks.
What actually happens next is anyone's guess though. If the next revision of the X170 has a MUX switch, I'll be happy, since that's the only thing missing from this laptop that would make it perfect for me.
DreDre likes this. -
All OEMs will go thinner if they can. BGA was the main reason for the race to thinner and skinnier chassis designs. The X170 is thinner than the P870. An LGA socket with a lower Z-height will be turned into thinner chassis designs, you can be sure of that.
-
Meaker@Sager Company Representative
X170 is also a single GPU machine for that comparison. -
Yeah, and the X170 uses a narrow panel frame and isn't much wider than previous-gen 15.6 inch laptops. Less width and thickness normally means less space for cooling. Making it even thinner by using the new LGA socket will reduce the total space in the chassis. Aka the BGA route, the opposite of what you want for implementing better cooling.
You can't fill your nice old 10L wooden barrel with 13L of beer.
A CPU vapor chamber can of course help, but if the thinner and smaller fin stack (grills), the pipes and the heatsink are fully saturated with heat due to their smaller size, it won't help that much.
Last edited: May 24, 2021 -
The image from my phone is blurry/smudgy because it's a still photo I captured from a phone video, and at that precise second the phone captured the frame in motion, so it got blurred that way. I chose that still from the video to make it as close as possible to another image we were comparing from someone else's phone. I didn't notice any ghosting in real life when looking at the screen.
Last edited: May 24, 2021
-
Meaker@Sager Company Representative
-
@Meaker@Sager I found out today that they have started on my laptop and it will ship soon
Last edited: May 26, 2021 -
I am looking at either this machine or the SM.
Which is better?
i9-10900K with 2080 Super, or
i9-11900K with 3080?
I will get it from ztec pc no matter what.
electrosoft and bca009 like this. -
Km
-
Meaker@Sager Company Representative
Excellent news
-
I wonder what HIDevolution's status is on the backordered KMs. I have mine ordered; the original ETA was end of May, then sometime in June, and the website now shows in stock, so I'm hoping it'll ship soon. I ordered it without a CPU or anything, so they should just send me the barebones kit once they receive them in stock. I can't wait until I can nerd out tinkering with the OC settings and playing with the different thermal experiments I have planned.
Last edited: May 27, 2021
Clamibot likes this. -
Schenker XMG Ultra 17 in the test: benchmarks and efficiency (part 2 of 2) igorslab.com | Today
@yrekabakery Yeah, not even 3060 Ti performance. Branding this mobile graphics card a 3080 is a scam! Nvidia robbed laptop gamers. Laptop gamers vs. Nvidia: 0-1 (0-10).
We have to go back to 2013 to find a similar gap between mobile high-end cards and the lower-tier desktop cards.
Last edited: May 27, 2021
raz8020, Clamibot and yrekabakery like this. -
yrekabakery Notebook Virtuoso
Yes, it’s a disgrace. The massive gulf between the desktop and mobile 3080 is like if back in 2015, laptops and desktops had the 980M and 980 Ti as their respective flagships, but they were called the same name.
The desktop 3080 isn’t even in those charts because the mobile variant is so far behind.
electrosoft, raz8020, Clamibot and 1 other person like this. -
Thanks for this!
I wonder what the numbers would have looked like if there were a desktop 3080 in the charts for comparison.
Calling that card a 3080 is arguably a sham by Nvidia. After reading this, I wonder what the future holds for mobile high-end gaming GPUs, and how AMD's upcoming mobile cards will fare in comparison to the mobile 3080. If AMD comes out with a counter to the 3080 mobile without the brutal Max-Q/TDP limitation Nvidia imposes, they could leave the big green in the dust. -
I wonder if there will be a shunt mod or something similar for these cards to boost performance. It would be awesome if we could get them closer to a 200W TDP and see whether there's any improvement.
-
Hopefully. The 3080m in my machine usually doesn't go above 60C even in heavy gaming, which is nice but quite sad at the same time, since in its current iteration the mobile variant of the 3080 only draws half the wattage of its desktop counterpart. I feel like there's huge room for improvement. (A small logging sketch for checking this on your own unit is below.)
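For anyone who wants to sanity-check the power draw and temperature behaviour described above on their own unit, here is a minimal logging sketch. It assumes Python 3 and that nvidia-smi is on the PATH; the one-second interval and the gpu_log.csv filename are arbitrary choices, and the field names are standard nvidia-smi query fields.

```python
import csv
import subprocess
import time

# Standard nvidia-smi query fields: timestamp, board power draw,
# GPU temperature and current graphics clock.
FIELDS = "timestamp,power.draw,temperature.gpu,clocks.gr"

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS.split(","))
    try:
        while True:
            out = subprocess.run(
                ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            writer.writerow([col.strip() for col in out.split(",")])
            f.flush()
            time.sleep(1)  # sample once per second while gaming
    except KeyboardInterrupt:
        pass  # stop logging with Ctrl+C
```

Run it in the background during a gaming session and the resulting CSV shows how close the card actually gets to its configured limit.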
-
electrosoft Perpetualist Matrixist
-
yrekabakery Notebook Virtuoso
True, but only because:
- The mobile 3060 has more CUDA cores than the desktop 3060
- The desktop 3060 is pretty underwhelming as it is
-
The 3080m variants and the benchmarks always confuse me. I understand the TGP. What I don't get is some of the differences between benchmarks of different machines with the same 165W TGP.
Some benchmarks have the 3080 laptop around the level of a 3060 Ti or sometimes below, yet others have it at the level of a desktop 3070. If you look at NotebookCheck and their benchmark of a 165W 3080 paired with a desktop CPU, you'll find it hitting RTX 3070 levels in Cyberpunk, sometimes a little better, and the same in some other games. Perhaps results differ in-game versus in benchmarking software?
With my machine (4K 60Hz G-Sync monitor, RTX 3080 at 165W TGP, i9-10900K, 32GB dual-channel DDR4-3200, 2TB Samsung 970 EVO Plus), I was hitting 2080 Ti performance exactly in Star Wars Jedi: Fallen Order at 4K. Digital Foundry did a review of the XMG Neo 17 with the RTX 3080 Laptop GPU, which I believe is 165W TGP, but their performance in Control isn't the same as mine; it's better for me. I'm not sure how they record their benchmarks, though. If they use ShadowPlay, then I get the FPS drops; if they use a separate capture machine, then I don't get why they are getting lower performance than mine. It could be the CPU, but I doubt it since the game is more GPU bound. Mine isn't overclocked, I don't think.
Even if you look at other people's benchmarks on YouTube or other sites, it seems to land pretty much at the level of an RTX 3070 rather than a 2080 Super or a 3060 Ti. But perhaps I need to do more testing. Luckily my friend has a desktop 3070, so I can do a couple of comparisons. It won't have the same CPU, but at least I'll get something. Anyway, all of this confuses me, between the YouTube videos, other people's testing, and even my own.
One thing is certain though: it isn't a desktop RTX 3080. Not even close. I did hear that the 4000 series is going to be more power efficient; hopefully the 4000 series laptop chips will be closer to their desktop counterparts. But I'm satisfied with the laptop 3080. It's a powerful beast, and I jumped from a 980M to a 3080, so I'm just in awe that a laptop can handle that much power.
bca009 likes this. -
yrekabakery Notebook Virtuoso
Because different games have different power demands, which are also affected by changes to resolution and settings, plus you have Dynamic Boost and operating temperatures also affecting the clock speeds, mobile vs. desktop CPU/RAM differences, people undervolting/overclocking vs. stock, Optimus, etc. There are just a ton of variables.
Clamibot, raz8020, DreDre and 1 other person like this. -
Yeah, you're right. It gets really confusing sometimes. I looked into it more with Digital Foundry and they weren't using a 165W TGP, just 135W with a 15W Dynamic Boost. This explains, at least in part, why I was getting higher FPS and why some other people's benchmarks didn't add up. I don't know how I missed that. It really does get confusing which variant someone is using, and as you said, many variables have to be accounted for. (There's a quick way to check your own unit's configured power limit sketched after this post.)
This gen is probably the most confusing gen. -
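Since the TGP variant keeps turning out to be the hidden variable, one quick way to see what a given unit is actually configured for is to ask the driver directly. Here is a minimal sketch, assuming nvidia-smi is on the PATH; the field names are standard nvidia-smi query fields, and the wattage interpretation in the comment reflects commonly quoted spec ranges rather than measurements from this thread.

```python
import subprocess

# One-shot query of the GPU's name, configured power limit, current draw,
# graphics clock and temperature.
fields = "name,power.limit,power.draw,clocks.gr,temperature.gpu"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)

# Rough interpretation (spec ranges, not measured here): a power.limit around
# 150-165 W suggests a full-TGP mobile 3080 with Dynamic Boost headroom, while
# something closer to 80-135 W points to a lower-TGP / Max-Q configuration.
```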
Glad that you are happy
-
What's the point? Nvidia is ready to release the 3070 Ti, which will make the 3080 Mobile look even more stupid as a 3080 card.
3090
3080Ti
3080
3070Ti
2080Ti
3070
3060Ti
Then the 3080 Mobile sits right above the desktop 3060.
And forget AMD as a viable mobile solution. They will end up just as crippled as the Nvidia Ampere cards.
MSI GeForce RTX 3070 Ti SUPRIM and VENTUS 3X pictured, GDDR6X confirmed
Is there any reason Nvidia shouldn't max out the TGP for the desktop cards as they did with Ampere? They won't fall into the same trap as Intel and let AMD come closer or get in front.
Last edited: May 28, 2021
Terreos, Clamibot, electrosoft and 2 others like this. -
Meaker@Sager Company Representative
Past a point, it depends more on the cooling. -
Yeah, that makes sense, but somehow the dual 150W TDP 1080s were able to be cooled in the DTR in your sig years ago!?
-
So I am really confused: is the KM Thunderbolt 4, but the 10850K only sees TB3 speeds?
If I upgrade to an 11900K next year, will I get TB4?
What's the difference between full-speed TB and half-speed?
Is the KM the full-speed one? -
They were actually dual 190 watt cards so yeah...
The Clevo P870TM has a wickedly good cooling system.
DreDre likes this. -
OMG even crazier! That chassis and cooling solution should be able to cool a desktop 3070 with ease if I'm not mistaken.
-
Indeed. There's way too much of a push for thin and light laptops, which is holding everyone back. Understandably, companies have to make what sells so they can stay in business.
However, gaming laptops are performance laptops. They are supposed to be engineered for performance, not thinness. A 10 pound laptop is an ultraportable in my opinion. It also makes a great workout weight.
The P870TM could cool a desktop RTX 3080 if someone would make the MXM card.
DreDre, raz8020, electrosoft and 1 other person like this. -
Agreed, but what is completely bonkers IMO is that Nvidia decided not to introduce new parts and naming for ultrathins (a 3040, for example), and instead nerfed the entire product line for the (arguably) very specific use case of ultraportables. Now anyone who buys these low-TDP, hardware-gimped 3000 series laptops will have problems with RTX and DLSS due to the limitations Nvidia had to impose, and that will make people angry.
Oh, and where did this "too much of a push for thin and light laptops" preconception come from anyway? It's a market segment, and I would think that "I want a 3080 in my ultrathin" is not a mainstream requirement (nor is a DTR, to be fair).
Clamibot likes this. -
win32asmguy Moderator Moderator
The whole industry is corrupted by Apple's 15-year push for thin and light to be the standard instead of a niche. Yes, portability is important, but an extra couple of pounds is not the end of the world if it means space for better cooling, modular components, and ease of maintenance. Honestly, ultrathin laptops do not need more than a 45W CPU and a 65W GPU; anything beyond that is uncomfortable for any sustained use. A 3070 and above in those is just a cash grab. -
Has anyone gotten word from their resellers on an approximate ETA for the Prema modded BIOS for the KM-G yet? I'm assuming still no update?
-
Larry@LPC-Digital Company Representative
Correct. There is no ETA for this bios yet. -
Meaker@Sager Company Representative
The total cooling can be over 600W in this machine, but that's with two cards. It's hard to justify that for a single card, especially when Nvidia does not let the card use that much.
I'd love a large scale model based around some sort of 3090 mobile but we are not getting that.