There's the 200W 980 in the P870DM [this one is longer and wider than standard MXM 3.0b; Eurocom calls it MXM 3.0c, but whether that's just a placeholder name or will become an actual standard, we don't know] (no SLI)
There's the 180W 980 in the P775DM1, which should have the same clocks as the 200W 980 but won't overclock as high due to fewer power phases, etc. [this one is almost standard MXM 3.0b but slightly longer] (supports SLI since it has the connectors)
There's the 150W 980 in the GT72S, which has lower clocks and 5 GHz memory I believe, and even fewer power phases than the 180W 980, but should be within 5-10% of the 200/180W models' performance stock vs. stock. [this one has a weird shape and is elongated on one corner] (SLI unknown)
There's a "120W" version apparently that we have not seen yet; whether it is real or not, we will see. This one is rumored to be the one in the GT80 and would most likely be the standard MXM 3.0b format. It should have the same clocks as the 150W version, but no less than 1064 MHz core / 5 GHz memory, so it should still be much faster than a 980M. (supports SLI)
-
Although this throttling algorithm works most of the time without issues, it completely breaks SLI if you have cards with different ASIC quality, or in any game that doesn't push the GPU hard enough. I've had games crash at stock because the GPU was trying to boost to a certain clock state without enough juice due to voltage crossover. Just ask Ethrem, he knows what I'm talking about.
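To illustrate the voltage-crossover problem, here's a toy sketch. This is NOT nVidia's actual boost algorithm; the clock table, voltage curve, ceiling, and ASIC quality scores are all made up for illustration:

```python
# Toy model, NOT nVidia's real boost algorithm: each card shares a clock
# table, but the voltage needed for a given clock depends on ASIC quality.
CLOCK_TABLE_MHZ = [1064, 1126, 1190, 1228]
VMAX = 1.2  # hypothetical voltage ceiling

def required_voltage(clock_mhz, asic_quality):
    """Hypothetical V/F curve: better ASICs (quality nearer 1.0) need less voltage."""
    return 0.9 + (clock_mhz - 1000) * 0.0015 * (1.5 - asic_quality)

def max_stable_clock(asic_quality):
    """Highest table clock this card can sustain within the voltage ceiling."""
    return max(c for c in CLOCK_TABLE_MHZ if required_voltage(c, asic_quality) <= VMAX)

good_card, poor_card = 0.8, 0.6  # made-up ASIC quality scores
print(max_stable_clock(good_card))  # 1228: boosts to the top state
print(max_stable_clock(poor_card))  # 1190: caps out one state earlier
# If SLI drives both cards at the faster card's boost clock, the weaker card
# needs more voltage than it can get -> instability ("voltage crossover"):
print(required_voltage(1228, poor_card) > VMAX)  # True
```

The point: each card's stable ceiling differs, so forcing both to the same boost state pushes the weaker-ASIC card past its voltage limit, which is the kind of crash described above.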
So if nVidia decided to completely eliminate this throttling nonsense for Pascal, then I can see how Pascal has the potential to become Thermi 2.0.
Last edited: Dec 8, 2015
-
LOL, took me a bit to figure out what you meant, HAHAHA! Gonna change that right away.
And no, not autocorrect, rather a "Freudian slip" (German: "Freud'scher Versprecher"), probably thinking about my boss at that moment.
-
More refreshes. GTX 965M Ti?? http://www.notebookcheck.net/Nvidia-Geforce-GTX-965M-Ti-coming-Q1-2016.155408.0.html
-
One benefit of HBM though is its much lower power consumption compared to GDDR5. Not just the memory chips, but the memory controller can be simplified as well, lowering power consumption/TDP further. I suspect the prime motivation for using HBM on Fury X isn't so much for the bandwidth, but rather to keep TDP under control, as I imagine a hypothetical GDDR5 based Fury X would easily go over 300W even with an AIO. That and probably to ease the transition into HBM2, since the board design would be so radically different from GDDR5, and a node shrink, arch update, and memory change all at once might be too daunting. Just FYI the last time nVidia tried to do all 3 things in one go we got Fermi. -
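For a rough sanity check of the HBM-vs-GDDR5 power point, here's a back-of-envelope sketch. The pJ/bit figures are ballpark numbers of the kind quoted in public presentations, not exact specs, so treat them as assumptions:

```python
# Back-of-envelope DRAM interface power from energy-per-bit figures.
GDDR5_PJ_PER_BIT = 20.0  # assumed ballpark (~18-22 pJ/bit is commonly cited)
HBM_PJ_PER_BIT = 7.0     # assumed ballpark (~6-7 pJ/bit is commonly cited)

def interface_watts(bandwidth_gb_s, pj_per_bit):
    """Power (W) to move bandwidth_gb_s gigabytes/second at pj_per_bit pJ/bit."""
    bits_per_second = bandwidth_gb_s * 1e9 * 8
    return bits_per_second * pj_per_bit * 1e-12

# Fury X-class bandwidth: 512 GB/s
print(round(interface_watts(512, GDDR5_PJ_PER_BIT)))  # ~82 W
print(round(interface_watts(512, HBM_PJ_PER_BIT)))    # ~29 W
```

Even with rough numbers, that's tens of watts handed back to the TDP budget, which fits the idea that HBM on Fury X was as much about power as about bandwidth.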
Robbo99999 Notebook Prophet
-
Either way though, you have to admit the P870DM does bear some resemblance to the 2013 AW machines, particularly the lid design. That said, the same could be said of the GT80's lid design as well. But speaking of which, I just realized Clevo may have taken some cues from GT72 as well.
Exhibit A:
Exhibit B:
I'm not trying to crap on Clevo here, but there's just something about those sleeper laptops that I find very appealing; it was also one of the defining characteristics of a Clevo laptop. I actually do like the new, modern Clevo laptops better, but again, I'd much prefer if Clevo came up with their own design instead of imitating what others have done. Plus there's nothing wrong with a black, rectangular laptop; it's a style of its own. I mean, anyone who's thinking about buying a P870DM values performance first and foremost, and probably doesn't care all too much about flash. If the P870DM had kept the P370SM's design, I doubt many would walk away in disgust because the laptop looked too plain. -
-
PrimeTimeAction Notebook Evangelist
-
Meanwhile, AMD has completely withdrawn from the mobile graphics space. No MXM release since the R9 M290X... in 2013. Here's to hoping Pascal will be something great and not half-assed, since there is no competition. -
There is no MXM 3.0c standard though... especially since each and every single 980 mobile variant comes in a different form factor with different dimensions.
-
I think this will be a one-time thing, with the desktop card coming in all of these different sizes. For Pascal, even if a desktop 1080 came out, it should fit a standard MXM board IF they use HBM.
AMD should be back in the game when they launch Arctic Islands.
2016 will be exciting for GPUs AND CPUs. -
-
I don't see any resemblance of the P870DM to any Alienware model, current or past. The lid hinge is very different. The lid cover has a different shape and angles. Granted, the lighting has some resemblance. As for the rear panel, I just see a lot of vents. Did you want Clevo to have squiggle-shaped vents?
-
Compare the lid hinges of the P570WM, P370SM, P770ZM, and P870DM, then take a look at the lid hinge of the 2013 AW machines. Tell me the P870DM doesn't at least bear more resemblance than the other 3 Clevos. As for lid design, this was my basis for comparison:
Alienware 18
P870DM
Yeah it's definitely the lighting as you said.
As far as vents go, why not make them straight lines? Why do they have to be angled, and in the same direction as another laptop no less? -
Yea, it definitely does take a lot of design cues from its competitors; doesn't take much to see it.
-
Who cares about cosmetics....give me good internals and structural integrity!
-
Well, that's just it, isn't it. Up till a few years ago Clevo was never really known for its aesthetics, but as the only viable alternative to Alienware machines if you wanted all-out performance. Heck, Clevo is still the only OEM to make laptops that use desktop chips, and that counts for a lot.
Clevo built its reputation as an OEM that "can and will shove anything into anything and make it work". Sure, some of their early models weren't anything to look at, but as you said, who cares as long as the performance was up to par. Besides, I'd argue anybody looking for a Clevo likely didn't care too much about aesthetics (or more accurately, flash) in the first place, else they would've got an Alienware. -
Shove anything into anything and make it WORK! That's what I'm talking about, baby! *rofl* Loving it.
-
You really gotta hand it to them with the P570WM -- a cooling system that can handle an overclocked 4930K is no joke. And that's why I also like Clevo: they seem to do the opposite of what everyone else does. All-out performance as top priority, not shying away from making big, bulky, heavy laptops, and not giving a flying Peking duck about BGA crap.
-
If only Clevo could offer that 17-inch 4K IPS screen with 980Ms in SLI...
-
Ionising_Radiation ?v = ve*ln(m0/m1)
I was just reading through this thread. Can someone tell me why components are still so inefficient? I mean, 200 W for a piece of silicon to throw around electrons and display 120 million coloured dots every second (FHD 60 FPS)? If someone games for 5 hours every day, and only the GPU is taxed, that's 1 kWh every day. I don't know how much electricity costs elsewhere, but where I live it's ruddy expensive. One could end up paying $75-100 extra every year, besides the initial costs of the extra-powerful laptop. My ceiling fan uses less power than that (it's just 75 W). We can completely forget monstrous 800-1000W desktops.
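The running-cost arithmetic above can be spelled out quickly; the electricity price here is an assumption, so adjust for your local tariff:

```python
# Annual electricity cost of a GPU-only gaming load: 200 W, 5 h/day.
watts = 200
hours_per_day = 5
price_per_kwh = 0.25  # assumed tariff; "ruddy expensive" regions will vary

kwh_per_day = watts * hours_per_day / 1000  # 1.0 kWh/day, as in the post
annual_cost = kwh_per_day * 365 * price_per_kwh
print(kwh_per_day)            # 1.0
print(round(annual_cost, 2))  # 91.25 -> within the $75-100/year estimate
```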
Not to mention the thick, heavy cooling apparatus needed to keep everything at a reasonably low operating temperature. I wonder what data centres and workstations make of the costs. Probably 50% goes to paying for the air-conditioning.
Also, speaking of 4K: I hooked up my laptop to my 4K TV, started up Witcher 3, cranked up the resolution, and attempted to stumble through the game at 10 FPS. It was fun. Oh, the lag.
Last edited: Dec 10, 2015 -
Plus the TDP is the maximum cooling capacity required; the card won't necessarily draw that much power. Especially if you use G-Sync and FPS is limited to 75, which the GPU can handle easily at 1080p. -
Gaming and general computing won't always draw 100%. On the other hand, heaters and aircons are usually going full bore all the time. And yes, come winter/summer I do notice that.
It's worth it to me if it costs an extra $100 a year to give an awesome gaming experience. -
-
Ionising_Radiation ?v = ve*ln(m0/m1)
Well, to each his own. I haven't got independent means (like probably many others here), which means I don't exactly have the right to use as much electricity as I need/want. Also, my smartphone bill is $23 a month. The 5-hours-of-gaming-a-day thing was simply a mathematical estimate - an extreme case. Nevertheless, AAA gaming on a high-end desktop is likely to suck up lots of power at any rate.
-
To me this was mindblowing when I switched.
I can have more performance for less power with less noise in a far smaller package (which also powers the screen).
It can feel slow, yes, but it still moves forward (fast enough for me to still be amazed). -
i_pk_pjers_i Even the ppl who never frown eventually break down
You're concerned about $75 extra a year when you likely spent over $2000 on your laptop? Color me a bit confused. -
Ionising_Radiation ?v = ve*ln(m0/m1)
-
i_pk_pjers_i Even the ppl who never frown eventually break down
That makes sense and that's a bit of a different story, however, with a powerful laptop/computer it should not be too surprising that it will cost a lot for electricity. -
Ionising_Radiation ?v = ve*ln(m0/m1)
Certainly, things have improved a lot from 2010 and even more from 2000, but there's a long, long way to go.
I'm hoping that Pascal makes add-on board sizes smaller, so we can have MXM cards at most 3x the size of a SO-DIMM, what with HBM and stacked memory and all.
Actually, is stacked memory for Volta, i.e. SoC GPUs? -
Would big Pascal or entry-level Pascal be released first? I know the 750 Ti and 860M were the first Maxwell cards released, as a teaser. Is nVidia going to do that crap again, or will they just release the new Titan or 1080 first?
-
i_pk_pjers_i Even the ppl who never frown eventually break down
-
Your guesses are as good as anyone's at this point in time.
It really depends on how high the yield of the new chips is: the lower the yield, the higher the likelihood that they're gonna release the lower-tier GPUs first.
-
King of Interns Simply a laptop enthusiast
Sent from my SM-A500FU using Tapatalk -
If GP100 (big Pascal) gets released first, expect to pay through the nose ($1500+) for it. Plus you don't want to show your entire hand from the get-go, so I strongly suspect nVidia will continue their model of releasing the fake medium-die "flagship" first, then releasing the true big-die flagship 6-12 months down the road for double dipping.
-
Sent from my Nexus 5 using Tapatalk -
King of Interns Simply a laptop enthusiast
Then I will be able to link my CPU and GPU cooling modules together to help cool the stupidly hot 920XM!
We are all allowed to dream... -
"No it is unrealistic"
look whos talking! the guy who managed to clock his "stupidly hot 920xm" up to 4.52 Ghz!!!Last edited: Dec 11, 2015 -
King of Interns Simply a laptop enthusiast
Under load I can "only" run 3.7 GHz on all cores, stock, air-cooled. -
Cheap excuses!
(Over here it's 5.0 GHz "for science", 4.7 under load, 4.3 for everyday.)
Sent from my Nexus 5 using Tapatalk -
King of Interns Simply a laptop enthusiast
I envy you. Your chip ain't a 6-7 year old 45nm chip sucking more than 150W at high load.
If I were in your shoes I would be shooting for 5.5 GHz, the crazy guy I am...
Pascal: What do we know? Discussion, Latest News & Updates: 1000M Series GPU's
Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Oct 11, 2014.