But the thing is, most of the stuff people want to buy right now IS expensive. Overly expensive, even. Remember how many people I said want MBPr and Blade laptops? Those things are so overpriced I could legit copy the specs in a Clevo and save $1000 USD. I don't think many people would complain if Clevo suddenly bumped prices by $50-$100 depending on the machine, made sure all the heatsinks were as top-notch as Alienware's used to be, and made sure there were no BIOS issues with SSDs etc. (they've been rare, but have happened). QA is also to blame for not picking up problems that users encounter out-of-the-box with their machines, but still... the MACHINES shouldn't have them. Just like I always wondered why the Alienware machines had so many issues with DPC latency, issues that I rarely (if ever) saw Clevo owners get. But at least with the older AW models, those would be instantly covered if the user didn't want to attempt a manual fix.
If we're gonna pay through our noses, we should seriously have something good, right?
-
-
Right, but we're talking even more expensive to build something the right way. I agree, they're selling trash for a premium.
-
I would guess that depends on the buyer, again.
I bought a Y50.
For under a grand it does everything I want it to do well.
It runs the games I like. No, I don't play GTA V.
It runs Lightroom, Photoshop and Premiere with acceptable performance for me.
It is quiet and not hot at all.
Personally I am happy that it exists and never expected it to perform miracles, just enough for what I need.
I will probably upgrade the RAM and HDD, and three years from now buy the next thing that fits my needs then.
And by then this thread will be long forgotten.
Eric -
Well yes. That's why I said it, because it has a purpose in the price. Like, Clevo cooling is bloody amazing, honestly, when it works. I legit cannot get my slave GPU to overheat. I don't know how. It just doesn't. Obviously my primary GPU cooling system has some problem that needs to be fixed, and I will eventually fix it... but other than that, if things worked they'd be near perfect. I'd LOVE if I had similar temps out of the box for both my CPU and GPU. That'd be beyond amazing.
Anyway, yeah. If only we could talk to Clevo or other manufacturers and make them listen, at least for their higher end machines. -
I'm not convinced that most of the OEMs even know what "high performance" means any more. Their definition doesn't seem to sync up with mine. The direction things are headed, I find it increasingly difficult to be optimistic about the future for high performance computers in general, but especially mobile. It is very disheartening. The bar keeps getting lower and lower... all because they can get away with it. People value form over function and convenience more than quality. Just when we were finally getting closer to seeing a level playing field between mobile and desktop performance, they turn around and head the opposite direction. And, now it is even looking as if the cancer could be spreading to the desktop. The possibility of BGA desktop processors replacing sockets has come up more than once, AMD does nothing worthy of mention and it's hit or miss whether an NVIDIA desktop GPU is actually going to work as advertised.
-
If it was all about specs and if they were overpriced you'd see Clevo copying their build quality and form factor and charging less. They aren't, because they can't.
-
Not because Clevo can't; because it doesn't want to. Is Clevo external build quality actually worse? It's debatable. What about internal build quality, e.g. the design of the cooling subsystem, and ease of access to the major components for self service: upgrades, cleaning, and repasting? Clevo has Razer and Apple beat by a mile. And there are those who contend that mobile workstations from the likes of HP and Dell blow away your unibody Ultrabook in terms of build while still offering better performance and support for the dollar. As for the form factor, moot point, because Clevo makes extremely high performance systems and such powerful components are not suitable for a thin form factor.
-
Well, gee... for under a grand that sounds appropriately priced to me, Eric... not trash for a premium. It's not a beast, but if it fits your budget, does what you want it to and you're happy, I am not going to criticize your decision. If it was $2000 or more, had a CPU that throttled under load, a GPU running on the edge of thermal limits, and could not do what I wanted it to, then it would be trash at a premium from my own personal perspective. If things continue on the path of decline, I'm probably going to spend $300 on a disposable Chromebook or an Acer crapbook from Wal-Mart and totally avoid burning any more cash on broken "high performance" filth that doesn't work well.
-
Oh believe me, at some point his P570WM is going to be unable to update GPUs (or it won't be feasible), and the CPU path is already dead. The good Alienwares already prove that they can't handle Maxwell cards so easily, so a newer card shouldn't be expected to work either. When he needs moar power he might have to go the desktop route and get a throwaway notebook for when he's actually on the move.
-
A hex core Ivy will more than suffice for the next 5 years, especially when combined with DX12. Hell, even a quad Sandy 2600K that's more than 4 years old still keeps up, especially if overclocked. I suppose that's one good thing that came out of Intel's complacency -- no need to upgrade for 5 years on the CPU front due to 5% improvements every year.
I'm sure that had AMD been as competitive as it was in the Athlon64 days, we'd be getting hex core i7's for the price of quad core i7's now, and there would likely be an i9 line with octa or even deca core processors. Although on the CPU side I don't blame AMD, because Intel pulled the biggest cockblock ever and bribed OEMs to not use/stock/sell AMD chips.
Also I don't quite understand the obsession with thinness. What's a 1" difference gonna make? If anything, weight and screen size are what mostly determine portability, not thickness. -
Not so sure about the thin thing and why it is so popular, but if you ask my wife, who is an interim-executive road warrior, thin and durable are a must.
And she loves her MBP.
Thin in Gaming?
I have not been to college in 30 years but maybe that is where these things sell well.
Honestly I did not think about it when I bought my Y50, but I certainly don't mind it being thin.
Eric -
Yeah true but you can make a thick and light laptop, just that it'll have a lot of empty, unused space.
It's just I always see thin and light go hand in hand. OK, I get it for "ultimate portability". But really, would adding back 0.3" and making an "adequate" cooling system that lets the laptop run much better really hurt anyone? Then again, I guess "unhindered performance" isn't exactly on their list if they're looking for a thin and light laptop. -
Except that it is on their list, and they think they're getting it.
-
Right. I like a laptop that is well engineered and designed to optimize size and weight. Just don't join the race to be as thin as you can get. I am still floored by the Clevo P750ZM and how it is not much bigger or heavier than the mobile CPU counterpart and still cools it very well.
-
^that's true, it's a race to be the thinnest, fueled by market obsession, and that's the part that gets me.
MAGIC -
To that effect, the M.2 drives save space, and the lack of an optical drive means it can do more. Besides, the LCD cover is thinner than the old P1xxSM-A's, as you and Luna found out. All that adds to the lack of bulk and weight. But all things considered, it offers "less" than the old SM-A models, though the parts themselves are more high end. Which is where the tradeoff I was talking about comes in. If you want to cut down on thickness or weight, you lose features. It's unavoidable. I'm not saying the P7xxZM models are bad, or that the lack of the ODD etc. is necessarily a bad thing... but it is true that it doesn't break
It's called reviews and reviewers that don't consider "not boosting" or "not turboing" to be "throttling". -
The Linus Syndrome
-
Double standards much. If a desktop CPU Linus was reviewing dropped below its max TB bin for the # of cores loaded, he would definitely call it out as throttling.
Oh well, at least D2 Ultima's incessant yelling at Linus effected some change. Now he reports the actual CPU frequency under load of the notebooks he reviews, although he still doesn't consider dropping to base clock or slightly above "throttling." -
Yeah. People need to understand this. If you go into XTU and check the power limit, thermal limit or current limit flags, it *WILL* say it throttles. That's Intel's official software, too. You can't tell me it's not throttling if Intel says it is.
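For what it's worth, the two competing definitions in this thread (XTU-style limit flags vs. "dropped below the turbo bin for the load") can be sketched as a tiny rule. Everything here, flag names and clock numbers alike, is invented for illustration and is not Intel's actual API:

```python
def throttle_verdict(observed_mhz, base_mhz, turbo_mhz, limit_flags):
    """Combine both throttling definitions argued in this thread.

    limit_flags: a set like {"power", "thermal", "current"} of limits
    that a monitoring tool reported as tripped (names are made up).
    """
    reasons = [f"{flag} limit tripped" for flag in sorted(limit_flags)]
    if observed_mhz < base_mhz:
        reasons.append("below base clock")     # even reviewers call this throttling
    elif observed_mhz < turbo_mhz:
        reasons.append("below max turbo bin")  # the stricter definition
    return reasons or ["not throttling"]

# A hypothetical chip with a 2.4 GHz base and a 3.2 GHz 4-core turbo bin,
# holding only 3.0 GHz while its power limit flag is set:
print(throttle_verdict(3000, 2400, 3200, {"power"}))
# -> ['power limit tripped', 'below max turbo bin']
```

Under the looser definition reviewers tend to use, only the "below base clock" branch counts; XTU flags all three limits regardless.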
-
This 3DMark 11 benchmark was run with driver 350.12, which several people have experienced as crippled with the GTX 780M... P9528 http://www.3dmark.com/3dm11/9761352
It would have been fun to see the most recent Intel BGA trash along with a GTX 780M in 3DMark 11...
-
Ever tried to go higher? I have been able to (for a while, at least) get 1110/6000 stable with +0.87v on a single 780M. Maybe you can bump that a little.
Also, I didn't actually understand what you were trying to say so much O_O. -
Not everyone has a golden chip
I do not think I've seen anyone with 1110/6000 stable with only +0.87v on a 780M... Wish I had one
I have to use 1 volt higher than your golden GTX 780M's 0.87v for my tiny overclock. However, the Physics score is OK... I just showed that the 350.12 Nvidia driver works without throttling (overclocked) on Alienware with the GTX 780M.
-
GPU upgrades could definitely start to be a huge problem for everyone, even as early as the next generation of GPUs. It might just be paranoia and a dread of the lackluster, but I am beginning to think the OEMs and NVIDIA could be intentionally conspiring and doing vile things with firmware and drivers to maliciously and effectively block GPU upgrades across the board. Doing so allows them to force obsolescence and drive new system sales instead of leaving the door open for more frugal component upgrades. This also helps them lower the bar so they are not in a position of competing against their own legacy products that run better.
But, I'm not so convinced the notion of the CPU path being dead is accurate in terms of performance. If the circus we have all witnessed since the release of Hotswell and the pathetic low-TDP BGA throttlemaster comedy of errors that swiftly followed is just a small glimpse into the future, my 2920XM, 3920XM, 4930MX and 4930K are in no danger of taking a back seat against newer CPUs in terms of performance. They all literally sodomize the "efficient" garbage that is being peddled today and nothing that is new performs as well or better than they do except for the fully unlocked enthusiast-grade K and X CPUs with the same or higher core count.
If this sad trend continues, the already old and moldy dual-GPU beasts will continue to be the grim reapers of the high performance notebook niche for at least another 5 years if their motherboards hold up that long, LOL. People are all giddy about die shrinks and low voltage feces and toss around the term efficiency as if it somehow means something magical, but where, oh where, is there any value in that silly nonsense when the newer chips are behaving like a gutless one-shot wonder? The watt-for-watt analysis of marketing liars is totally worthless pie in the sky hype when the end result is those shiny new craptastic power-sipping wastes of silicon get their skulls bashed in by an antique CPU. I mean, really, who cares if a new turd CPU performs 20% better at 2.5GHz when the antique monster can run 4.5GHz, hold it indefinitely under sustained full load and ultimately mop the floor with the fancy new piece of junk that is so wonderfully "efficient" in all of its earth-friendly lead-free-solder-filled tree-hugger glory? -
Ah, ok.
I don't know if I can get 1110/6000 in SLI stable. I just tried it with BF4 a couple times. I think my main GPU gets hot but holds clocks ok, and my slave doesn't like to OC well but is cool as hell haha. -
Well as I already pointed out earlier, your 4930K will more than suffice for the next 5 years, and with DX12 on the horizon I'd wager your 4930K would die from extreme overclocking before it ever becomes obsolete LOL.
And yes I do think MXM will be going away within a generation or two, and we'll have nothing but BGA left in the mobile space.
-
I meant "there's nothing new coming out for the socket", and nothing more.
I'm fairly certain I'll be fine performance-wise (considering the laptop sector) with what I have now, far less if I get a 4910MQ or 4930MX/4940MX, swap the heatsinks, and keep it over 4GHz for the foreseeable future. I'll be happy with a 4.3GHz CPU and a couple 980Ms for quite some time, methinks. Or whatever the upgrade to the 980Ms is. Assuming they're not more gimped. XD -
What I meant about Intel BGA: it would have been fun to see how an i7-4980HQ (BGA) would have worked with an overclocked GTX 780M, with the CPU and GPU sharing heatpipes (AW17 R2). I think the GTX 780M may be a warmer graphics card than a GTX 980M. This would surely affect throttling in a modern thin gaming laptop.
-
There's no way to get a 4980HQ and a 780M in the same machine. The AW17 R2 uses Maxwell and soldered chips. And yes, 780Ms are *FAR* hotter. FAR. I'm talking sometimes 15 degrees hotter.
-
There are certain parts on the 980M that get blistering hot that do not seem to get nearly as hot on the 780M. The core temps on the 780M are higher, but some of the components on the 980M get insanely hot. All you have to do is feel the heat sinks; they are too hot to touch sometimes, even when the system is idling and core temps are low. And the 980M isn't really a power-saving GPU except at stock clocks. Under an overclocked load they can actually draw as much power as a 780M running at its maximum stable overclock.
-
I know this, D2 Ultima. I said it would have been fun to see how the most powerful Intel BGA processor would have worked in a thin laptop with an overclocked GTX 780M and the shared heatpipes that are in newer laptops.
-
Guess Maxwell is nVidia's Haswell. Haswell is more efficient at stock, but start overclocking and efficiency gets defenestrated.
-
Oh. That would have been hilarious. XD Maxwell is the only reason thin/light machines exist as they do now. No wonder they want us running at stock.
Also, off-topic, I actually rendered a video using CUDA tonight. I thought CUDA was a myth. I am amazed.
Haswell isn't efficient at "stock"; it's efficient below a certain clockspeed. ~3.2GHz appears to be the level. Beyond that it gets rapidly worse; below that, it gets better and better. Broadwell is flat out worse, though; it's literally a hotter, more power-hungry Haswell, so even at lower clocks it draws more power. I think there is a reason they're probably not releasing any socketed Broadwell chips for laptops: the laptops probably wouldn't be able to handle them, and Haswell would have been dropped from the market, killing the thin/light movement. -
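Just to put rough numbers on that: dynamic power goes roughly as C·V²·f, and the voltage needed rises with clockspeed, so perf-per-watt keeps falling as clocks climb. A toy sketch; every constant here (the V/f curve especially) is invented for illustration, not measured from any Haswell chip:

```python
def dynamic_power(freq_ghz, c=10.0):
    """Toy CMOS dynamic power model: P ~ C * V^2 * f, arbitrary units.
    The linear V/f curve below is invented, loosely chip-shaped."""
    volts = 0.7 + 0.15 * freq_ghz  # made-up voltage/frequency curve
    return c * volts ** 2 * freq_ghz

# Power and perf-per-watt across a range of clocks:
for f in (1.6, 2.4, 3.2, 4.0, 4.5):
    p = dynamic_power(f)
    print(f"{f:.1f} GHz: power {p:5.1f}, perf/watt {f / p:.3f}")
```

Since perf/watt here is f/P = 1/(C·V²), it only gets worse as frequency (and thus voltage) rises, which is the whole "great at low clocks, terrible at 4.5GHz" pattern in one line.
-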
Examples of 980M "efficiency" *wink*
Single 980M with AW 17 r2 BGA low-TDP turd CPU (got to love those 180W AC adapters)
[parsehtml]<iframe width='640' height="360" src="https://www.youtube.com/embed/s4zXCut58FY?rel=0" frameborder='0' allowfullscreen></iframe>[/parsehtml]
Single 980M with 3920XM CPU
[parsehtml]<iframe width='640' height="360" src="https://www.youtube.com/embed/mpHOWJUZVdQ?rel=0" frameborder='0' allowfullscreen></iframe>[/parsehtml]
980M SLI with 4930K (you'll need to watch this one full screen to see the power draw reading)
[parsehtml]<iframe width='640' height="360" src="https://www.youtube.com/embed/2Lhp7lpABOw?rel=0" frameborder='0' allowfullscreen></iframe>[/parsehtml]
780M SLI with 3920XM and 4930MX (for comparison)
[parsehtml]<iframe width='640' height="360" src="https://www.youtube.com/embed/7CoeOq-lkdY?rel=0" frameborder='0' allowfullscreen></iframe>[/parsehtml] -
Perhaps this new i7-4980 BGA would have throttled down to 800MHz instead of 2.8GHz, as some do today...
-
Does an amazing job, doesn't it? Now you know first hand why @j95's mod is so valued by those that want it. Just another piece of evidence that the folks at NVIDIA have lost their ever-lovin' minds. Still cannot believe they no longer support CUDA encoding... idiots. They definitely do not have our best interests at heart. They are doing what is easiest and best for themselves only. Customers and product quality have become expendable, and both are an afterthought.
-
I actually did it on stock 350.12 drivers. Clicked "check GPU" in Sony Vegas and it said "CUDA is available". It rendered on my slave card by default.
It also used my CPU quite a bit, but it rendered far faster than normal. -
Depends what kind of efficiency you're defining and where. On mobile though Ivy Bridge seems better in every way except idle battery life *yawn*.
-
If I limit my CPU to 1.6GHz, with all 4 cores & threads available, TSBench and XTU stress don't pass 15W (this is with a -50mV undervolt though). Might explain why the ULV chips for Haswell were so decent.
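A back-of-the-envelope for why a mere -50mV matters: dynamic power scales with V², so the fractional saving from an undervolt is 1 - ((V - dV)/V)². The 0.85V stock voltage below is an invented example figure, not a measured one:

```python
def undervolt_saving(v_stock, dv):
    """Fractional dynamic-power saving from dropping core voltage by dv,
    using the P ~ V^2 scaling (frequency held constant)."""
    return 1 - ((v_stock - dv) / v_stock) ** 2

# Hypothetical chip running at 0.85 V, undervolted by 50 mV:
print(f"{undervolt_saving(0.85, 0.050):.1%}")  # about an 11% saving
```

Double-digit power savings for free go a long way toward explaining a quad core staying under 15W at 1.6GHz.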
-
But, is that actually an accomplishment in a powerful beast machine like yours, or just something weird to pass the time if you're bored and have nothing better to do?
-
Well it was something I noticed when we had a blackout one day. My CPU speed got limited to 1.6GHz due to my Windows settings etc., and I ran TSBench for the lulz, and was impressed that I simply was unable to make it draw more than 15W. It's impressive in that scope, I suppose; if Intel wanted to make a 20W 1.6GHz quad core and call it a ULV, they could.
But no, it does not help unless we have a power outage and I've gotta make the battery last. I was just making the point that Haswell can shine somewhere. Not in the enthusiast bracket, but down in the low-power market I guess it's worth something. -
You can run 3DMark 11/Fire Strike with maximum overclock on your graphics and only 4x1.66GHz on the processor... Have a nice run... ☺
-
Ah, ok, that makes sense. Would have been nice to have had the ability to also use the Intel HD Graphics during the blackout. In the rare event I have to run one of my M18xR1/R2 beasts on battery with iGPU, I am glad they get over 5 hours on a full charge, but the poor performance in that configuration certainly does increase my appreciation for having it plugged in and running at maximum performance.
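The 5+ hours on iGPU passes a quick sanity check with the usual capacity-over-draw estimate. Both figures below are invented for illustration (the M18x's real pack size and idle draw may differ):

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Back-of-the-envelope battery runtime: capacity / average draw."""
    return battery_wh / avg_draw_w

# Hypothetical ~90 Wh pack: ~17 W on iGPU gives a bit over 5 hours,
# while a dGPU config pulling ~45 W manages only about 2.
print(round(runtime_hours(90, 17), 1))  # 5.3
print(runtime_hours(90, 45))            # 2.0
```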
-
Well there you go. Ivy Bridge can't be undervolted, so it's not exactly stock-for-stock, is it?
-
True, but I won't lie. I'd *FAR* rather take my ~2 hours of battery life and keep my 120Hz screen XD.
Seriously Mr. Fox, you should see if you have the eDP connector on your motherboard for a 120Hz panel on your P570WM. I know your model can take it, but I am unsure if the P570WM3 variant was the only one that came with the eDP connector soldered to the board. I do not think you will regret the upgrade if it is possible. You might actually never go back to 60Hz again
It can't? Wut? O_O -
Haswell brought back undervolting on mobile for the first time since Penryn (Core 2). Clarksfield, Sandy, and Ivy couldn't be undervolted.
-
Couldn't you simply set a voltage value though? Was the minimum on the slider the default for that chip?
-
You're probably unaware of how much control over voltages Haswell gives you due to FIVR. With 1st to 3rd gen Core i it was barren in comparison.
-
I guess I'm not sure how it offers less than the old SM-A models? The only downfall is the shared heatpipe configuration. They just need to decouple the CPU and GPU heatsink config. Otherwise they offer pretty much all the same upgrade options. Sure the lid is a bit thinner, but you still have multiple decent options for LCDs. Unfortunately if you want a high-gamut ultimate LCD you're kind of out of luck at the moment, but that's more due to LCD makers not making them for this type of profile (yet).
Nvidia clockblock: vBIOS (unblocked in 353.00)
Discussion in 'Gaming (Software and Graphics Cards)' started by octiceps, Feb 23, 2015.