There's the 200W 980 in the P870DM [This one is longer and wider than standard MXM3.0b; Eurocom calls it MXM3.0c, though whether that's just a placeholder or will become an actual standard, we don't know] (No SLI)
There's the 180W 980 in the P775DM1, which should have the same clocks as the 200W 980 but won't overclock as high due to fewer power phases, etc. [This one is almost standard MXM3.0b but is slightly longer] (Supports SLI since it has the connectors)
There's the 150W 980 in the GT72S, which has lower clocks and 5GHz memory I believe, and even fewer power phases than the 180W 980, but should be within 5-10% of the 200/180W models' performance, stock vs stock. [This one has a weird shape and is longer left to right on one corner] (SLI unknown)
There's a "120W" version apparently that we have not seen yet. Whether it is real or not we will see. This one is rumored to be the one in the GT80 and would most likely be the standard MXM3.0b format. Should have same clocks as the 150W version but no less than 1064Mhz core/5Ghz memory. So still should be much faster than 980M. (Supports SLI)
-
The laptops do look more, uh, modern without that 90's black box look. Although I have to say, they most certainly ripped off... er, took design cues from Alienware.
Well, Maxwell is efficient due to aggressive dynamic throttling as well as being completely FP64-gimped. If you follow that Tom's link I posted, this is what they had to say:
There you have it: GM204 is really a 250W card in disguise. This also explains why, compared to GK110, GM200 runs ass-hot, throttles at stock, and basically pushes the reference blower beyond its limits. Hell, with a modded vBIOS I can get my 980 Ti to suck down 400W just playing BF4; that's some insanity.
Although this throttling algorithm works most of the time without issues, it completely breaks SLI if you have cards with different ASIC quality, or any game that doesn't push the GPU hard enough. I've had games crash at stock because the GPU was trying to boost to a certain clock state without enough juice, due to voltage crossover. Just ask Ethrem, he knows what I'm talking about.
So if nVidia decided to completely eliminate this throttling nonsense for Pascal, then I could see Pascal having the potential to become Thermi 2.0.
-
There's no universal answer to that. You add more cores, you need more memory bandwidth to keep up with that, and vice versa. It also depends on the arch and the software situation we're talking about. What I've seen with the 980M is that increasing core clocks brings more of a performance boost than increasing memory clocks, albeit both DO give boosts. With core clocks being the same and the number of cores increased by 33%, I'd bet you the resulting boost is far closer to 33% than to 10%!
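To illustrate that bet with a toy model (the 80/20 split below is made up purely for illustration; real workloads vary wildly):

```python
# Toy bottleneck model: frame time = compute-bound part + bandwidth-bound part.
# Extra cores only shrink the compute part, so scaling is sub-linear but
# still lands much closer to the core increase than to zero.
COMPUTE_SHARE = 0.8  # assumed fraction of frame time that is shader-bound
MEMORY_SHARE = 0.2   # assumed fraction that is bandwidth-bound

def speedup(core_factor: float) -> float:
    """Overall speedup when shader throughput scales by core_factor."""
    new_frame_time = COMPUTE_SHARE / core_factor + MEMORY_SHARE
    return 1.0 / new_frame_time

print(f"+33% cores -> {(speedup(1.33) - 1) * 100:.0f}% faster")  # ~25%
```

With those assumed shares, 33% more cores yields roughly a 25% gain: sub-linear, but far closer to 33% than to 10%.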
-
Nice autocorrect
-
LOL, took me a bit to figure out what you mean, HAHAHA! Gonna change that right away.
And no autocorrect, rather a "Freudscher Versprecher" (a Freudian slip); probably thinking about my boss at that moment.
-
More refreshes. GTX 965M Ti?? http://www.notebookcheck.net/Nvidia-Geforce-GTX-965M-Ti-coming-Q1-2016.155408.0.html
-
28nm cards don't really benefit too much from HBM because the GPUs aren't bandwidth-starved. Hell, overclocking the memory from 1750 to 2000 MHz (7 to 8 GHz effective) gives me a whopping 5-7% performance increase, and this is with a GM200 (980 Ti) overclocked past 1500 on the core.
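To put a number on "aren't bandwidth-starved", here's the quick sanity check using the figures from that overclock (a fully bandwidth-bound card would gain roughly 1% of performance per 1% of bandwidth):

```python
# Figures from the post: memory OC from 1750 to 2000 MHz (7 -> 8 GHz effective).
bw_gain = 2000 / 1750 - 1   # +14.3% memory bandwidth
perf_gain = 0.06            # ~5-7% observed; take the midpoint

# Scaling factor: 1.0 would mean fully bandwidth-bound; this lands well below.
sensitivity = perf_gain / bw_gain
print(f"Bandwidth +{bw_gain:.1%}, performance +{perf_gain:.0%}, "
      f"scaling factor ~{sensitivity:.2f}")  # ~0.42
```

A scaling factor around 0.4 means most of the extra bandwidth goes unused, which is exactly what you'd expect from a GPU that isn't bandwidth-limited.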
One benefit of HBM, though, is its much lower power consumption compared to GDDR5. Not just the memory chips; the memory controller can be simplified as well, lowering power consumption/TDP further. I suspect the prime motivation for using HBM on Fury X isn't so much the bandwidth as keeping TDP under control, since I imagine a hypothetical GDDR5-based Fury X would easily go over 300W even with an AIO. That, and probably to ease the transition into HBM2, since the board design is so radically different from GDDR5, and a node shrink, arch update, and memory change all at once might be too daunting. Just FYI, the last time nVidia tried to do all 3 things in one go, we got Fermi.
-
Robbo99999 Notebook Prophet
Haha, hopefully your boss is not a notebook enthusiast!
-
Weren't the Clevo Alienwares criticized for being nothing more than "reskinned, fancy Clevos" that had aesthetic changes only with no functional improvements? Or am I thinking of something else.
Either way though, you have to admit the P870DM does bear some resemblance to the 2013 AW machines, particularly the lid design. That said, the same could be said of the GT80's lid design as well. But speaking of which, I just realized Clevo may have taken some cues from GT72 as well.
Exhibit A:
Exhibit B:
I'm not trying to crap on Clevo here, but there's just something about those sleeper laptops that I find very appealing; it was also one of the defining characteristics of a Clevo laptop. I actually do like the new, modern Clevo laptops better, but again, I'd much prefer it if Clevo came up with their own design instead of imitating what others have done. Plus, there's nothing wrong with a black, rectangular laptop; it's a style of its own. I mean, anyone who's thinking about buying a P870DM values performance first and foremost, and probably doesn't care all too much about flash. If the P870DM had kept the P370SM's design, I doubt many would walk away in disgust because the laptop looked too plain.
-
I see no resemblance there except that they both have lots of vent holes, which are required to cool the components. /shrug/ It looks pretty different to me. And I can't blame them for adding a little more character to the chassis design to reel in AW customers who are pretty irritated with AW, considering they screwed the pooch and are nothing more than soldered generic laptops now.
-
PrimeTimeAction Notebook Evangelist
What really pisses me off is that this Nvidia GTX 980 for notebooks not only screwed up mobile GPU naming but also MXM naming. There is a "standard MXM3.0b", a "wider than MXM3.0b", an "almost MXM3.0b", and a "God knows what MXM". Good luck trying to find a proper card for replacement/upgrade 5 years down the line.
-
I think the special mobile GTX 980 is MXM 3.0c
Meanwhile, AMD has completely withdrawn from the mobile graphics space. No MXM release since the R9 M290X... in 2013. Here's hoping Pascal will be something great and not half-assed, since there is no competition.
-
There is no MXM 3.0c standard though... especially since each and every single 980 mobile variant comes in a different form factor with different dimensions.
-
I think this will be a one-time thing, with the desktop card coming in all of these different sizes. For Pascal, even if a desktop 1080 came out, it should fit a standard MXM board IF they use HBM.
AMD should be back in the game when they launch Arctic Islands.
2016 will be exciting for GPUs AND CPUs.
-
No, that's what Eurocom calls the 200W version of the GTX 980: MXM3.0c. As for the 150W, 180W, or possibly the 120W, no idea.
-
I don't see any resemblance between the P870DM and any Alienware model, current or past. The lid hinge is very different. The lid cover has a different shape and angles. Granted, the lighting has some resemblance. As for the rear panel, I just see a lot of vents. Did you want Clevo to use squiggle-shaped vents?
-
Compare the lid hinges of the P570WM, P370SM, P770ZM, and P870DM, then take a look at the lid hinge of the 2013 AW machines. Tell me the P870DM doesn't at least bear more resemblance than the other 3 Clevos. As for lid design, this was my basis for comparison:
Alienware 18
P870DM
Yeah it's definitely the lighting as you said.
As far as vents go, why not make them straight lines? Why do they have to be angled, and in the same direction as another laptop's, no less?
-
Yea, it definitely does take a lot of design cues from its competitors; it doesn't take much to see it.
-
Who cares about cosmetics....give me good internals and structural integrity!
-
Well, that's just it, isn't it. Up till a few years ago, Clevo was never really known for its aesthetics, but as the only viable alternative to Alienware machines if you wanted all-out performance. Heck, Clevo is still the only OEM making laptops that use desktop chips, and that counts for a lot.
Clevo built its reputation as an OEM that "can and will shove anything into anything and make it work". Sure, some of their early models weren't anything to look at, but as you said, who cares as long as the performance was up to par. Besides, I'd argue anybody looking for a Clevo likely didn't care too much about aesthetics (or more accurately, flash) in the first place, else they would've got an Alienware.
-
Shove anything into anything and make it WORK! That's what I'm talking about, baby! *rofl* Loving it.
-
You really gotta hand it to them with the P570WM: a cooling system that can handle an overclocked 4930K is no joke. And that's why I also like Clevo, because they seem to do the opposite of what everyone else does: all-out performance as top priority, not shying away from making big laptops bulky and heavy, and not giving a flying Peking duck about BGA crap.
-
If only Clevo could offer that 17-inch 4K IPS screen with 980Ms in SLI...
-
Well, you can get it with 980M SLI. 4K is still out of their reach though; it needs to be 2560x1440. Forget about 4K.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
I was just reading through this thread. Can someone tell me why components are still so inefficient? I mean, 200W for a piece of silicon to throw around electrons and display roughly 124 million coloured dots every second (FHD at 60 FPS)? If someone games for 5 hours every day, and only the GPU is taxed, that's 1 kWh every day. I don't know how much electricity costs elsewhere, but where I live it's ruddy expensive; one could end up paying $75-100 extra every year, besides the initial cost of the extra-powerful laptop. My ceiling fan uses less power than that (it's just 75W). And we can completely forget monstrous 800-1000W desktops.
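For what it's worth, the arithmetic behind that estimate (the tariff range is my assumption; adjust for your local rates):

```python
# Annual electricity cost of a 200W GPU gamed 5 hours/day, every day.
# The $/kWh range is an assumption for an expensive electricity market.
gpu_watts = 200
hours_per_day = 5
tariff_usd_per_kwh = (0.20, 0.27)  # assumed price range

kwh_per_day = gpu_watts * hours_per_day / 1000   # = 1.0 kWh/day
annual_kwh = kwh_per_day * 365                   # = 365 kWh/year
low, high = (annual_kwh * p for p in tariff_usd_per_kwh)
print(f"{annual_kwh:.0f} kWh/year -> ${low:.0f}-${high:.0f} per year")  # ~$73-99
```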
Not to mention the thick, heavy cooling apparatus needed to keep everything at a reasonably low operating temperature. I wonder what data centres and workstations make of the costs. Probably 50% goes to paying for the air-conditioning.
Also, speaking of 4K: I hooked up my laptop to my 4K TV, started up Witcher 3, cranked up the resolution, and attempted to stumble through the game at 10 FPS. It was fun. Oh, the lag.
-
Is $75-100/year really that much? Adding an extra $6-8/mo to your electric bill? I mean, people don't flinch at paying $50/mo or more for a smartphone bill. And gaming 5 hrs/day, that's like a part-time job. Sure, even in my heyday, with no kids and only an apartment, I'd have sessions where I'd play 5-6 hours a day over a weekend, but never *every day*. Anyone with a full-time or even part-time job couldn't manage that. At my peak I was probably only playing 25 hrs/week; 150 hrs/mo of gaming seems a bit excessive. Just compare performance per watt to where we were five years ago; it's staggering how much better power consumption has become.
Plus, TDP is the maximum cooling capacity required; the card won't necessarily draw that much power. Especially if you use G-Sync and FPS is limited to 75, which the GPU can handle easily at 1080p.
-
Honestly, I haven't even noticed any difference in price no matter what system I've used. That's based on exclusively using devices ranging from 100W to 1000W in literally the most expensive place in the world for electricity, Adelaide (Oz), and now #5 most expensive, Japan.
Gaming and general computing won't always draw 100%. Heaters and aircons, on the other hand, are usually going full bore all the time, and yes, come winter/summer I do notice that.
It's worth it to me if it costs an extra $100 a year for an awesome gaming experience.
-
Spend $3k on a gaming laptop and $50 per big-release game... and worry about $100/yr in electricity? Nope. It's a hobby; you suck up / budget for the costs.
-
That's it. Spending big $'s on a system and then begrudging yourself the value of actually using it doesn't make sense.
-
-
Ionising_Radiation Δv = ve*ln(m0/m1)
Well, to each his own. I haven't got independent means (like probably many others here), which means I don't exactly have the right to use as much electricity as I need/want. Also, my smartphone bill is $23 a month. The 5-hours-of-gaming-a-day thing was simply a mathematical estimate, an extreme case. Nevertheless, AAA gaming on a high-end desktop is likely to suck up lots of power at any rate.
-
Yea, but on the other hand, my new gaming laptop displays far more advanced graphics at a max of 230W than my 5-year-old PC with an HD 6700 (<- not sure anymore which model) at 300W.
To me this was mind-blowing when I switched.
I can have more performance for less power, with less noise, in a far smaller package (which also powers the screen).
It can feel slow, yes, but it still moves forward (fast enough for me to still be amazed).
-
i_pk_pjers_i Even the ppl who never frown eventually break down
wat
You're concerned about $75 extra a year when you likely spent over $2000 on your laptop? Color me a bit confused.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
I suppose I ought to clarify: I did not spend $2000 on my laptop; someone else bought it for me. Just because someone spends a lot of money in one go doesn't mean they can afford to slowly leak money...
-
i_pk_pjers_i Even the ppl who never frown eventually break down
Oh, alright. I assumed that if you were concerned about the cost of electricity, you had likely bought the laptop yourself, since worrying about the electric bill usually means you're the one handling the finances.
That makes sense, and that's a bit of a different story. However, with a powerful laptop/computer, it shouldn't be too surprising that the electricity will cost a lot.
Ionising_Radiation Δv = ve*ln(m0/m1)
Fair enough. And that brings us back to where we started - powerful components, or computer components in general, appear inefficient.
Certainly, things have improved a lot from 2010 and even more from 2000, but there's a long, long way to go.
I'm hoping that Pascal makes add-on board sizes smaller, so we can have MXM cards at most 3x the size of a SO-DIMM, what with HBM and stacked memory and all.
Actually, is stacked memory for Volta, i.e. SoC GPUs?
-
Would big Pascal or entry-level Pascal be released first? I know the 750 Ti and 860M were the first Maxwell cards released, as a teaser. Is nVidia going to do that crap again, or will they just release the new Titan or 1080 first?
-
i_pk_pjers_i Even the ppl who never frown eventually break down
I would probably say high-end Pascal will be released first.
-
Your guesses are as good as anyone's at this point in time.
It really depends on how high the yield of the new chips is: the lower the yield, the higher the likelihood that they're gonna release the lower-tier GPUs first.
-
King of Interns Simply a laptop enthusiast
MXM 3.0a 1080M would be tasty!
-
If GP100 (big Pascal) gets released first, expect to pay through the nose ($1500+) for it. Plus, you don't want to show your entire hand from the get-go, so I strongly suspect nVidia will continue their model of releasing the fake medium-die "flagship" first, then releasing the true big-die flagship 6-12 months down the road for double dipping.
-
that would be insaaaaane! *lol*
-
King of Interns Simply a laptop enthusiast
No, it is unrealistic. However, it's not impossible for the 1070M to be MXM 3.0a. If they use HBM AND the card consumes less power, and therefore produces less heat, then the MXM 3.0a form factor is feasible!
Then I will be able to link my CPU and GPU cooling modules together to help cool the stupidly hot 920XM!
We're all allowed to dream...
-
-
"No it is unrealistic"
Look who's talking! The guy who managed to clock his "stupidly hot 920XM" up to 4.52 GHz!!!
King of Interns Simply a laptop enthusiast
Lol, that was all in the name of science, with a +200mV overvolt that was disgustingly unstable. I did that air-cooled though.
Under load I can "only" run 3.7GHz on all cores at stock, air-cooled.
Cheap excuses!
(Over here it's 5.0GHz "for science", 4.7 under load, and 4.3 for everyday use.)
King of Interns Simply a laptop enthusiast
I envy you. Your chip ain't a 6-7 year old 45nm chip sucking more than 150W at high load.
If I were in your shoes I would be shooting for 5.5GHz,
crazy guy that I am...