Newer!!! Either get a cheap Chromebook, build a mini desktop, or forget computers. You're crazy, man!
-
-
Sent from my SM-G900P using Tapatalk -
-
I don't know that it was done on purpose, and you guys are only missing 1080 SLI. Right now, you don't even have 980 SLI. Bad planning, yes. Malicious, no.
If it allows 1070 SLI, comparable to 980M SLI, you are missing nothing in possible upgrades. Another laptop will simply allow 1080 SLI. It is like my HM being compared to the EM released months later. It will be OK... It is a shame, but life goes on...
-
I'm waiting to see how 1080 SLI stacks up against my 980 SLI benchmarks. If a single 1080 can beat my 980 SLI benchmarks, then it might be worth considering a single 1080. If it costs $1,000 for one GPU (a guess on my part) and it is not extraordinarily more powerful, it wouldn't make good sense to consider it. It will also be interesting to see what kind of gimping NVIDIA is going to do with drivers and vBIOS, and how difficult it will be for @Prema to make Pascal GPUs run at their full potential. NVIDIA isn't getting better at anything except crippling yesterday's GPUs with cancer drivers and interfering with performance using cancer firmware... they've been on a roll with those shenanigans for a few years now. -
At least MSI has a bigger PSU than the Echo models.
On MSI, Battery Boost acts as a backup, so it doesn't kill your PSU.
On the Echo, Battery Boost acts as a second PSU, killing the battery in the process. -
-
Battery Boost isn't even available in the GT72(S) line from what I know; it's pretty irrelevant to enthusiast systems.
-
-
Spartan@HIDevolution Company Representative
-
-
-
Besides Dark Souls 3 and DOOM 2016 with Fermi (which they fixed in a week), and the 780M SLI issue with the Alienware laptops specifically (which they also fixed, albeit taking their SWEET time to do so, and only under extreme pressure from Dell), do you have any proof of this? A LOT of people on this forum like to say this, and as you know I *WILL* bash and criticize any and everything I see that needs it, but I really do not see any. As far as I have ever seen, old cards perform as well or better, give or take 1-3fps in some games (which I can forgive in a new driver if it occurs in a GPU-bottleneck scenario), with new drivers. Benchmark scores have huge point differences for even half an FPS, so it's more glaring there, but then you're looking at benchmarking drivers.
Now this is not to say that I don't feel they do anything underhanded... they do. But the only "gimping" I know of is "forced obsolescence" which, while still awful, is not nearly as bad as performance destruction of older cards. Their "forced obsolescence" involves getting games using their tech to run the levels (of say tessellation) higher than easily doable by their last generation of cards, and it inflates new card performances much higher than would otherwise be seen, so people feel that cards are stronger than they actually are (whether or not the performance benefits are real, they likely only affect a small set of games). But I simply cannot say that this is "gimping" older hardware. I've been using these 780Ms for about 3 years now. I've never seen performance drastically drop in any existing title I play, and I've played quite a lot of them. And on my 280M, I was in the same boat. I never saw direct tanking of performance in any title. I remember some titles updating to give issues (like Dark Souls 3, which after the last actual update has stupid points where my system util tanks for no discernible reason whatsoever for about 5 seconds, and is repeatable if I leave and rejoin the area) but that's just developers being stupid.
Remember, if we want to change people's minds, we have to talk about things that we can prove with decent evidence. We've gotta be able to point and say "here, go search this" or "here, look at this video/article showcasing the problems" or "here, let me break it down for you, look at example A and example B and this is what they do" and so on. I don't wanna be the person people stare at and go "yeah, yeah, let him and his tinfoil hat be". We gotta fight right! -
My observation so far, on the bright side...
I think the P870DM should have been like the DM2/3 in the first place. They might have been thinking about that, but with MSI going ahead given the market/demand time-frame, this desktop 980 with the custom MXM design happened. I'm glad that MSI and Clevo alone have had custom designs much, much better than the BGA filth (AW Echo, Aorus & ASUS). They might have thought "let's give them upgradeability," but they missed how the enthusiasts who purchase those machines think. Also, IMO Clevo made a mistake by releasing the P870DM design unrefined. Still, props to them for releasing a full GM204 beast paired with the best custom sBIOS, partnering with savior Prema, and delivering performance with a welcoming new design.
As for the design change, it's good that the new machine has some real cooling tech, unlike that overpriced ASUS GX800 trash. Looking at the shape of the 1080 SLI cards and the rest, they have departed from the Clevo 980 200W design by adding more length, placing all the VRMs/MOSFETs/VRAM differently, and using a different power connector (I don't get why this part happened). I just hope this is the final standard for all upcoming Clevo machines, since the leaked 775DM2 also has the same GPU, obviously, because it's a 1080 with the same power connector. Clevo are very bad at PR; they should set the bar right there with transparency. Another open question is how Clevo made the 1070 card, given that Prema mentioned the MSI 1070 card working in the 75xDM model. If that's also the same as the 1080 it'd be stupid, since MSI had it at MXM 3.0b; then again, the form factor remains the same for all Clevo GPUs, which is a good thing, right?
Also, the 200W single 1080 behemoth has not yet leaked (that old leak of a gimped 1080 might be from MSI, and we know how many confusing variants MSI already has with the GM204)... So these 1080 SLI cards at 2x300W, what TDP are they targeted at? 180W?
So, as Prema said, you guys should take a chill pill and cool it off.
Last edited: Aug 6, 2016 -
Just my observation XD
Everyone should be much happier with the news. Everyone is getting better tech; even those who can't upgrade right away will be able to buy better things next time. Everything is changing for the better XD
What we need is to convince more companies to use brother @Prema 's BIOSes, so that everyone with a power laptop or workstation will also have a power BIOS.
Everything else will just keep getting better for everyone. Even if innovation can't come to everything on this earth at once, it will come eventually.
Let's all be happy; we get new laptop designs every year, better and better XD
I just hope they won't ask for even more money. A P870 with a 980 and a decent config costs around 2500 GBP right now; if it gets more expensive, it's in the expensive area again. -
pathfindercod Notebook Virtuoso
Newest tech and highest end hardware will demand premium price. Gotta pay to play.
-
What happened to that guy who was testing a 10-series mobile card?
-
This I very much agree with. I mostly wish that Europe would get better access to Prema mods. mySN for the UK/EU and Aftershock PC in Singapore (I think) would be great, and I know Metabox is in the process of partnering with them (or so they say). Aside from that, we have HIDevolution, who is the ONLY manufacturer I've ever seen selling Clevos that advertises a world-wide warranty (even if it's $300 USD extra to apply it over the standard 3-year warranty). -
-
I use a single-board wireless keyboard/trackpad for mobile use. It goes in the bag too. I also have a short wired keyboard/mouse for certain jobs.
It's easier for me to set up the laptop and move the keyboard/mouse to where I want on the cube desk, around whatever else is set up by the client.
4.0GHz works for day-to-day use on the 6820HK/6920HQ, although you can likely reach 4.2GHz for benchmarking.
TDP is only 45W, so it will fold to 3.5GHz sustained on long-term runs; if you start out at 3.6GHz with a high undervolt, it will sustain that indefinitely.
A 45W CPU can't be a 95W CPU no matter how you OC it.
Yes, both can use XTU/TS for OC.
Last edited: Aug 6, 2016 -
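To put rough numbers on the "folds to 3.5GHz, undervolt to hold 3.6GHz" point, here's a back-of-the-envelope sketch using the usual dynamic-power approximation P ≈ k·f·V². All voltages and wattages below are made-up illustrative values, not measurements from any specific CPU:

```python
# Rough sketch of why a 45 W TDP folds boost clocks, and why an undervolt
# helps sustain 3.6 GHz. Uses the common dynamic-power approximation
# P ~ k * f * V^2. The voltages and the 45 W calibration point are
# hypothetical examples, not measured values.

def dyn_power(freq_ghz, volts, k):
    """Estimated package power in watts for a given clock and core voltage."""
    return k * freq_ghz * volts ** 2

# Calibrate k from one assumed operating point: 3.5 GHz sustained at 1.10 V
# just hitting the 45 W limit (the "fold" point described above).
k = 45.0 / (3.5 * 1.10 ** 2)

stock_36 = dyn_power(3.6, 1.10, k)        # 3.6 GHz at the same voltage
undervolted_36 = dyn_power(3.6, 1.02, k)  # 3.6 GHz with a ~-80 mV undervolt

print(f"3.6 GHz stock:     {stock_36:.1f} W")       # slightly over 45 W
print(f"3.6 GHz undervolt: {undervolted_36:.1f} W")  # back under 45 W
```

Since power scales with V squared, shaving voltage buys back clock headroom inside the same 45W envelope, which is exactly why the undervolted 3.6GHz holds while the stock-voltage boost folds.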
Better than watching paint dry.
-
The drag strip is fun, but gaming is of more interest to more people most of the time, and for 99% of the games out there a 45W-47W CPU has been enough.
Once we get 1080 SLI w/45W CPU and 1080 SLI w/95W CPU and do some gaming FPS benchmarks, maybe it will turn out that a >45W CPU is needed to drive the GPUs to 100% utilization.
I am thinking that's why Intel finally made a 65W variant, right in between 45W and 95W.
My gaming didn't have any lag, spikes, or delay with a 47W CPU and 2 x 980M GPUs.
Most of the time the CPU was at 35-55% while the GPUs were over 90% utilization. The GPUs were heavily used in gaming, while the CPU wasn't.
Over-CPU'ing a laptop doesn't improve gaming performance; the CPU just sits more idle.
Last edited: Aug 7, 2016 -
They only fix what is important to them. I think the 880M driver-induced performance issues got fixed (sort of, but not 100%) after a couple of years of pain and suffering for 880M owners. I'd venture a guess that it would probably be accurate to say it was fixed purely by accident... I seriously doubt it was because they care or actually tried to make things right after two years of leaving it broken.
And, look no further than my last round of benching with the P750ZM. They accidentally released a driver that allows 980M to perform better than they wanted it to, and I would not be surprised if they "fix" that and it doesn't overclock so well once they discover their mistake. NVIDIA notebook GPUs always under-perform with a stock vBIOS. Virtually every mobile vBIOS they have released from 580M through 980M gimps their hardware, artificially impeding performance, causing erratic behavior and throttling.
NVIDIA are the masters of manipulation and deception. The only reason I own their products is that the only available alternative is worse by a gigantic order of magnitude. That's not a compliment. I hate NVIDIA, but not as much as I hate AMD, LOL. The Green Goblin is an organization composed of creeps and crooks, and the red team is made up of losers that don't seem to care about their reputation or the lame products they sell.
Last edited: Aug 6, 2016 -
-
My last one did have one CPU step up, but the one I had wasn't utilized enough in games for me to consider upgrading the CPU - it worked great.
I think the 6700K fits this description too, as the Kaby Lake CPU - while it will work in a 100-series chipset motherboard - gets more functionality from a 200-series chipset motherboard, so I'd rather upgrade the motherboard as well as the CPU.
Same on the desktop, at least for me: I start out with the top CPU so there isn't a must-have CPU replacement - like 9950 Black => ??, 980X => 990X (not worth it), 2700K => ?? - so going to the next socket/chipset is the upgrade point for me on desktops too.
Not saying I wouldn't like a desktop CPU in my laptop; it's just not the #1 concern, until it limits utilization of the GPUs in apps/games.
Which might happen this time around with the 2x performance increase of Pascal GPUs.
Last edited: Aug 6, 2016 -
http://en.community.dell.com/owners-club/alienware/f/3746/t/19528315
Looking to try this, but any advice?
Could you shoot me a PM? I think you are local to me. -
*** Windows 10 + NVIDIA WHQL Drivers are Killing Alienware and Clevo LCD Panels ***
Windows 10 + GeForce Drivers are Killing Samsung and LG Notebook LCD Display Panels | GeForce Forum
***EVGA Precision X and Windows 7/8/8.1 and especially 10 bricking systems*** -
-
-
The 880M I just wrote off as a broken-out-of-the-gate card, and I never really consider it in general. I know they're broken and I believe they will always be broken. It's actually news to me if they've been "fixed".
Perform better? Was that the 369.00 driver that boosted Fire Strike scores a good bit? Yeah, that one will be one to watch.
I know the stock vBIOSes are always broken. I had artifacting at 120Hz on my 780M SLI machine without using svl7's vBIOS. I *LITERALLY* had a broken-out-of-the-gate product here. But I'm not talking about their vBIOSes right now. You're definitely preaching to the choir saying that to me; however, I'm SPECIFICALLY talking about new drivers that make cards perform worse in older games without changing anything else (i.e. the cards are running properly at proper speeds etc., but you have 10-20fps less, or something).
This, I 100% agree with. I find it appalling that without competition they seem to have no care for making good products, but my annoyance at AMD's unwillingness to compete, and seeming lack of knowledge of HOW to compete, trumps this. I even have long-time AMD fanboys (and I do mean "FANBOY" in the actual sense of the word, as in "will purchase their products because they are AMD and not nVidia/Intel and just freaking make do") who are saying that if AMD doesn't bring anything to the table by October/mid-November, they're just going to go buy an nVidia/Intel system because they're fed up with waiting. They've gotten that bad. -
-
Stop and think about it for a few minutes, bro. If NVIDIA were truly competent and their drivers were not cancer drivers, would @j95 spend so much time modding them to fix performance impairments? Would the GeForce Community be overrun with unhappy customers if their drivers actually worked correctly? They break things that worked right in older drivers, fix stupid/unimportant/less important bugs, and leave their broken stuff broken for months (or years, as we have observed more than once).
Last edited: Aug 6, 2016 -
*eyes W10 suspiciously* I remember John saying something about suddenly being able to clock high in W10 after a recent update... -
It's all crippled GeForce cancer drivers, not Windows. Both of us are running hard-modded 980M cards, but even with the hard mods my max memory overclock was around 1550 versus 1800 using their cancer drivers. Before the GPU hardware mods my max memory OC with cancer drivers was about 1450. Not sure how high I might have been able to go with good drivers because there were not any good drivers available before they were hard modded.
I suspect NVIDIA had some artificial overclocking and performance limits hidden in their cancer driver code. You do know NVIDIA have blacklisted certain things and have some secret performance caps for benchmarks, correct? That way they can manipulate performance up or down using drivers so that their older flagship GPUs will not be able to perform as well, or almost as well, as their latest flagship GPUs.
Last edited: Aug 7, 2016 -
The Intel Xeon E3-1200 v5 series has a TDP of 80W at base clock, with the E3-1280 v5 (higher frequency) using lower voltage than the E3-1245 v5 (but both run at lower voltage and lower frequency than a 6700K).
If you upped the 45W 2.7GHz CPU to 4.1GHz on all cores, you'd be pulling about 68W if you didn't increase the voltage; 2.7GHz uses less voltage than 3.4-3.7GHz. -
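A quick sanity check of that 68W figure: at constant voltage, dynamic power scales roughly linearly with frequency, so it's just the rated TDP scaled by the clock ratio (ignoring static/leakage power, so treat it as an approximation rather than a measurement):

```python
# Estimate power when raising all-core clock at constant voltage.
# Dynamic power ~ f at fixed V, so scale the rated TDP by the clock ratio.

tdp_w = 45.0      # rated TDP at base clock
base_ghz = 2.7    # base all-core clock
target_ghz = 4.1  # hypothetical all-core overclock, same voltage

est_power = tdp_w * (target_ghz / base_ghz)
print(f"Estimated power at {target_ghz} GHz: {est_power:.0f} W")  # ~68 W
```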
And reducing cache throughput can reduce performance just like reducing the clock speed; you can end up with a higher core clock but get less "work" done. You need to test your OC with the actual app and workload whose runtime you are trying to reduce.
Get a nice solid sustained 3.6GHz bouncing just under 45W by using a high undervolt; you can get a higher undervolt starting at a fixed 3.6GHz on four cores than starting at full boost, and avoid dropping 100-200MHz below 3.6GHz.
With more cores, the power is split between them, which means lower power per core, which means a clock lower than 3.6GHz is likely.
The same technique and rules, shifted up 1GHz, work for 95W TDP CPUs.
Last edited: Aug 7, 2016 -
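The "power splits between cores" point can be sketched like this: under a fixed TDP, each extra active core shrinks the per-core budget, so the sustainable all-core clock drops. Linear frequency-to-power scaling at fixed voltage is assumed, and the 11W-per-core-at-3.6GHz figure is a made-up example, not a measured value:

```python
# Sketch: sustainable all-core clock under a fixed TDP as core count grows.
# Assumes power scales linearly with clock at fixed voltage; ref_core_w is
# a hypothetical per-core draw at the reference clock.

def sustainable_clock(tdp_w, n_cores, ref_clock_ghz=3.6, ref_core_w=11.0):
    """All-core clock (GHz) that fits n_cores inside tdp_w, capped at ref_clock_ghz."""
    per_core_budget = tdp_w / n_cores
    return min(ref_clock_ghz, ref_clock_ghz * per_core_budget / ref_core_w)

for cores in (2, 4, 6, 8):
    print(f"{cores} cores: ~{sustainable_clock(45.0, cores):.2f} GHz sustained")
```

With these example numbers, four cores just fit at 3.6GHz inside 45W, while six or eight cores have to clock down, which matches the "lower clock with more cores" rule above.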
-
See, this is what I'm asking. You said newer drivers perform better and that they probably had crap older drivers. I believe that. But they were crap from the start, and your statements concern overclocks, no? I do remember them "adjusting" general performance with regard to power draw at some point near the clock-block, but aside from that period, I haven't seen drivers directly hurt performance in existing titles. Benchmarks are another realm I can't say I properly know, nor those secret performance caps. I have heard about blacklists in the past, and I've seen people insist there's evidence that nVidia takes information from your PC via the streamer service even if you don't install it, and a whole lot of other stuff. Don't get me wrong; I'm not going to sit here and say they're great or an amazing company or anything even close. ALL I am saying is that I specifically have not seen them gimp previous cards in games to sell newer cards. And you're not showing me that either, whether or not your claims about benchmarks are valid. -
-
OK, so let's sum it up:
MSi
Pros:
1. 2 generation upgrade promise
2. BIOS can be unlocked by Svet
3. Demoed 2x330w Y-adapter
4. Choice of normal keyboard or mechanical (GT73/83 SLi)
5. iGPU / dGPU switch
Cons:
1. BGA CPU
2. Card TDP not as high as Clevo's
3. Pay for svet bios, though not expensive
4. Expensive as hell for what they are
5. Possibly only 1x TB port
6. Nonstandard Windows key placement
Clevo
Pros:
1. Strong cooling system
2. Card TDP is high (biggest pcb)
3. Prema + NBR + T|I support
4. 2x TB port
Cons
1. No iGPU/dGPU switch
2. Uncertain upgrade path
Honestly, if the MSi weren't so expensive, I'd probably go with a GT73 SLI... -
Last edited: Aug 7, 2016
-
The problem is, even if Prema can crack the Pascal BIOS thing, there's not yet a working nvflash that supports modded BIOSes. -
Some things need to be kept secret until it's too late for the powers that be (NVIDIA in this case) to take active steps to screw things up. We may also see some extreme black witchcraft from the Micro$loth Windoze OS X interfering with GPU firmware and driver mods before long. The more the curtain gets pulled back on Micro$haft, the uglier the crime scene looks.
Last edited: Aug 7, 2016 -
I just hope the Clevos that have the 1080 are able to run Windows 7... -
-
*** Official Clevo P870DM2/P870DM3 (Sager NP9873/NP9872) Owner's Lounge! - The Phoenix 2 is here! ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Aug 3, 2016.