That's good news, but it's not helping me decide between the P6 or the P7 ;-)
@Kittys I remember you got a P6x (but I can't remember the exact model): what's the noise like on it?
-
That CPU is the kicker: 33% higher clock. The physics score tells all, and there are draw calls galore under DX11. 4.7GHz Turbo, that's insane, though of course at double the TDP. Not to mention the GPUs are nominally overclocked and the RAM is clocked faster. It's just all-around faster. -
Close Encounters with Dellware, 4h50m: 'These aren't 3rd-party systems that we just rebrand & throw our name on ... that ... that's lazy, that's lazy design. If you're a REAL gaming company, you wanna design your own thing. We take a lot of pride in designing these things; these aren't lazy rebrands.'
N3RDFUSION PODCAST
(oh my, potshot at OriginPC???)
4h46m25s: 'the 1080 is rated up to 180 watts ... the previous 17 was rated up to 100 watts ...'
(980M rated up to 100 watts, but the 17 used an 88-watt 980M, right? wrong? ... so ... 'up to 180W' may or may not mean all 180 in AlienWorld & in AlienSpeak)
Jailienware PAX Day1
2h17m Q: Unlocked BIOS?
Ab: There's tons of fea... there are features in there that allow you to unlock, but you get the most unlocked features when you get the K-series processor; otherwise, it's just like overclock mode 0, 1, 2, that kinda thing
(Got PremaMod?)
Ab: The graphics cards & processor are soldered on the motherboard. We don't have MXM on our notebooks; but you know, MXM has a lot of drawbacks, right? With every MXM module, you actually have to make a custom thermal module per card, so think about that: however many graphics cards & notebooks & CPUs we have, we'd have to develop new heatsinks, make sure that those work, & then make sure that all the different thermal modules for that MXM module will work within the Industrial Design; so there's a lot of complexity there, & it's another part that you actually have to build on the notebook at the factory, so that introduces the opportunity for more human error, you know, so by having ...
Q: MXM or BGA GPU?
Aa: that's something that people don't know, which we see on our side; we have the history, we have the numbers on MXM vs BGA, we see the downsides of MXM, the incredible amount of human error that goes on with that
So you know, we wanna, quality is something we take seriously, we wanna make sure that we have the least amount of issues out in the field, & going to that soldered model is definitely awesome since it's allowed us to have even better quality, better results,
& also it's thinner, because for those sockets, you have to add for ...
yes, think of all the mechanical parts that the MXM module has, just to hold & secure that thermal module, you know, you get rid of that stuff
and some of the most important things to people is power performance & size; & yes there's people that like their big-fat laptops, but, we're not the kind of people where just because it's smaller, you're not losing performance; we're making it smaller, we're making smart engineering decisions to not just make it thinner but also to make it perform at peak, optimal performance
all of our new notebooks are thinner than our previous generations; the new 15 & 17?, they're both thinner than the previous gen, but, when we do that we make sure that with each generational change we make & when we do go thinner, that we're not sacrificing any performance, so in the case of the 15, 100watts max, & we're thinner, so we're still gonna get that 1070 performance & it's gonna be working at its full potential, right, so it's not like we put a 1070 or 1060 & under-deliver power, ummm no, it's getting the full-rated power, & in the case of the 17 we got thinner & we're gonna be able to support that 1080 card, which is a 180watt card, more wattage, so across the board we haven't hurt performance at all but we've been able to get thinner ...
_____________________
Hyper-real gaming companies might be defined as those who undertook the challenge of complexity & human error in designing dual-MXM 1080s + 2x 330W bricks: enthusiast-class gaming
'so there's a lot of complexity there'
'REAL' gaming company, but avoids complexity
sounds like equally lazy design to me
Q: What type of gamer will buy the P21x?
A: People who won't want to upgrade for a while
(ya think? what's the definition of a while ... next week or never)
Q: What scope is there for overclocking?
A: Because it's a laptop we have to be a little conservative in this regard, as we don't want it ('it' being the BGA parts) to catch fire
(lol, under the microscope it looks like little to no scope)
-
When you kill an intensive workload, it takes a few minutes for the fans to ramp back down to silent too.
Sent from my ZTE A2017U -
I WANT AN EVERKI TITAN SO BAD.
My birthday is in November, just saying.
No but really, my Alienware Orion backpack was really good, but it started falling apart after a couple of years. I got it repaired, really well done, by this company here for a really low price, but then it started falling apart again, fraying around the straps and other spots. I miss it, but boy do I want that Everki Titan -
Sent from my ZTE A2017U -
-
Sent from my ZTE A2017U -
-
ThePerfectStorm Notebook Deity
@Mr. Fox - are you planning on getting the P870DM3 with SLI 1080s and a Desktop CPU (6700K)?
Sent from my SM-G935F using Tapatalk -
Note how her physics score was 12500? The 6920HQ at 4GHz (I know it can clock up to and hold 4GHz on all 4 cores without issue, as long as the laptop isn't limiting it somehow) only got 11059 for Physics. That's too large a discrepancy; it was throttling pretty hard. My 3.8GHz 4800MQ can get about 10400 with crap RAM; Skylake at 3.8GHz or a bit less might hit 11,000. But that wasn't 4GHz.
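A quick back-of-the-envelope check on that throttling claim. This is a minimal sketch, assuming Fire Strike Physics scales roughly linearly with sustained core clock and that Skylake is worth maybe ~6% more per clock than Haswell; the 10400 at 3.8GHz baseline is just the 4800MQ figure quoted above, nothing here was measured:

```python
# Rough estimate: what should a Skylake quad score in Fire Strike Physics at a given clock?
# Assumptions (not measured): Physics scales ~linearly with clock; Skylake ~6% IPC over Haswell.

haswell_score, haswell_clock = 10400, 3.8   # 4800MQ figure quoted above
skylake_ipc_gain = 1.06                     # assumed uplift, Haswell -> Skylake

def physics_estimate(clock_ghz):
    """Estimated Skylake quad-core Physics score at a sustained clock (GHz)."""
    return haswell_score * skylake_ipc_gain * (clock_ghz / haswell_clock)

print(round(physics_estimate(4.0)))  # ~11600: roughly where a true 4.0GHz run should land
print(round(physics_estimate(3.8)))  # ~11000: right around the 11059 that was actually posted
# Under these assumptions, a 12500 Physics score needs roughly 4.3GHz sustained.
```
-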
1.2V can be had with a vBIOS mod, and no, it does not have to stay fixed in P0 or at the highest boost; that was done to show max and min clocks during the bench, something someone was bellyaching about earlier with Pascal.
Was trying to be helpful but can see it didn't work out that way so I'll give up while I'm behind..
-
Then I asked if it was a laptop. Then I went back and checked and saw that it wasn't. So I edited my post to show I answered my own question.
I totally understand that there are two 1.2V vBIOSes floating around, since I did have desktop SLI 1080s in the top 10 HOF, but those cards did not run the 1.2V mod; they ran the power-limit-modded vBIOS. So I totally get why you showed this.
Was not trying to rain on your parade my friend.
To shed light over here as well. @hmscott should like this part.
Posted 33 minutes ago (edited)
http://www.3dmark.com/fs/9595299
That is a 6700K; where do you get 6920HQ from that?
I think you might want to go back and check first before jumping the gun, my friend...
These are closer to what you should be looking at.
http://www.3dmark.com/compare/fs/10011170/fs/10001580/fs/10059807
And like I have pointed out many times before: a single-GPU physics score will always be higher, unless the person benching does not know what they are doing by a very, very long shot.
(Apparently not the case with 6920HQ and 980N. See below)
http://www.3dmark.com/compare/fs/9269885/fs/7575204/fs/8619427/fs/7602138/fs/9312667/fs/10053831
But this one is the one that would prove your point..
And now I understand why hmscott made the comment about higher physics with dual cards. That would seem to hold true for the 6920HQ.
http://www.3dmark.com/compare/fs/10053831/fs/7808124/fs/8437252/fs/8124431
Edit: My apologies to @D2 Ultima though! I did not mean to offend! -
I did a little benching with moderate overclocks, connected via Team Viewer to Eurocom's campus with an unattended Sky X9E2 this afternoon. I was not able to control the fans and the CPU temps were totally out of control, so this was with whatever thermal paste they have and sitting on a flat surface, with some pretty severe CPU thermal throttling. I could not push the CPU any further without direct access to the machine, but still not too shabby all things considered. With a delid and Liquid Ultra the CPU should be fine, just as it was with the P870DM-G. The GPUs were warm but not overheating. I also had to play it conservative since they are about 1500 miles away and I had to avoid doing anything to cause the machine to freeze or lock up, as they were closed for the day (after hours). All things considered, I am really impressed. I used the Clevo CPU and GPU overclocking tools in Clevo Control Center.
http://www.3dmark.com/3dm11/11556701
http://www.3dmark.com/fs/10072213
http://www.3dmark.com/sd/4289532
-
-
OT: my Everki Titan suffered damage to one of the side-pocket elastic straps; Everki is asking me to either provide a "donation receipt" or destroy the bag
Anyone got a template for a donation receipt that I can print? :| -
I can't seem to find a suitable 18.4" laptop. -
Sent from my ZTE A2017U -
Just thinkin' aloud.... -
-
-
-
PrimeTimeAction Notebook Evangelist
-
-
-
You just need to buy the right game... -
Question for all the Optimus (and Optimus-hating) experts in this thread. When I hook up an external screen to my GS43VR (which has Optimus) and set it to display only on the external screen, and then reboot, the external display will pull from the dGPU only, so clearly the HDMI and DisplayPorts are hooked up directly to the dGPU. But if I then unplug the external display, it appears that the dGPU stays switched on and powers the internal display - there is no dGPU/iGPU switching I can detect until I reboot again. So it seems that after plugging in an external display and rebooting, the computer will essentially stay in a "dGPU-only" mode even after the external display is disconnected, which is pretty exciting.
Is this normal behavior for notebooks that have Optimus but have ports wired directly to the dGPU? Or is this something new to Pascal, or possibly to MSI VR laptops? And does it suggest that there might be another way to disable Optimus and have the internal display run entirely off the dGPU? -
-
Do your thing until you sit at the desktop on the laptop screen alone, when you're supposedly running dGPU only. Open GPU-Z and navigate to the dGPU. Tell me if your dGPU has any vRAM being used.
Launch a fullscreen game without the external being plugged in. Doesn't matter the game as long as it hits true fullscreen (Dead By Daylight; all Micro$haft UWP titles, Binding of Isaac rebirth/Afterbirth, etc do not count) and then exit it immediately after. Check your vRAM utilization again on your dGPU.
Reboot the system so that "optimus" is working, and check if you're getting vRAM utilization on your dGPU again.
If the answers are "yes, yes, no" then you have a built-in MUX switch somehow.
Also test if sleeping and waking it up keep the "dGPU-only" benefits, and if closing the laptop lid when the option in Windows power options is set to "do nothing" (but no external screen is plugged in) and then re-opening the lid (it kills and restarts the display) makes any changes.
It's not normal for Optimus machines. MSI has always had external displays wired to their dGPUs (which is great; it allowed them to use higher resolution, higher refresh external displays with nVidia tech since their GT60/GT70 series at the minimum), however a dGPU powering the internal display on its own REQUIRES a MUX switch present, whether or not you can manually switch it at will. The reason is that the internal display has to be "wired" to something. If it is wired to the iGPU, then wiring to the dGPU does not need to exist, and even if external displays are wired to the dGPU, it doesn't necessitate the dGPU having any internal wiring at all. But if you can get the screen to actually be on the dGPU directly (and not just having the dGPU get stuck in an "on" state) then wiring to both has to exist physically and the system must have some sort of MUX to switch which display is which.
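By the way, if you'd rather script the vRAM check from the steps above than eyeball GPU-Z, here's a minimal sketch using nvidia-smi. Assumptions: nvidia-smi is on the PATH and can see the dGPU (some Optimus setups report memory as "[Not Supported]"), and the 200MB threshold is just a rough rule of thumb for "this GPU is actually drawing the desktop", not an official number:

```python
# Minimal sketch: read the dGPU's memory use via nvidia-smi to tell whether it is
# actually rendering the desktop or just sitting powered on doing nothing.
import subprocess

def dgpu_memory_used():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per NVIDIA GPU, e.g. "GeForce GTX 1060, 412"
    name, used = out.strip().splitlines()[0].rsplit(",", 1)
    used = used.strip()
    return name.strip(), int(used) if used.isdigit() else None

name, used_mib = dgpu_memory_used()
if used_mib is None:
    print(f"{name}: memory.used not reported by this driver/mode")
elif used_mib > 200:  # rough guess for "driving a display right now"
    print(f"{name}: {used_mib} MiB in use, looks like it is rendering the desktop")
else:
    print(f"{name}: {used_mib} MiB in use, mostly idle; the iGPU is probably still driving the panel")
```

Run it in each of the three states above (dGPU-only, right after the fullscreen game, and after rebooting back into Optimus) and you should see the same yes/yes/no pattern.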
Also, check if your nVidia control panel looks like this in "dGPU only mode"
If that's what you see when only the laptop screen is attached and running, then Optimus is also in effect. On the other hand, if what you see is this (note the extra options on the side):
AND the options actually do something, then dGPU mode works. Ignore SLI and Stereoscopic 3D; that's quirks to my laptop and won't exist on even other dGPU-only ones. If you want to test if things work too, try enabling DSR and raising the actual resolution of your laptop screen. When DSR is enabled you get more choices under change resolution for the default laptop panel, like this:
-
mason2smart Notebook Virtuoso
Only issue I am having is that one of the shoulder straps stretched really thin
-
mason2smart Notebook Virtuoso
-
After rebooting into "Optimus" mode, the dGPU is off on start-up according to the MSI Dragon Center, but navigating to the dGPU setting in GPU-Z activates the dGPU and the dedicated memory usage is at 2MB (0MB for dynamic). That's such a low amount of usage I'm not sure if it counts, which is why I said I "believe" yes on #3. But regardless...
So does this mean the dGPU actually isn't powering the internal display after unplugging the external display, but is just sitting there turned on and generating heat without actually being used? -
mason2smart Notebook Virtuoso
New 1080 SLI laptops don't seem to be scoring where they should... it seems like some of the 980 SLI laptops score about the same. And the 1080 SLI are only getting around 5,000 more than my 980Ms on Fire Strike... that, and CPU performance seems to decrease...
-
PrimeTimeAction Notebook Evangelist
-
-
mason2smart Notebook Virtuoso
I haven't hit over 16,000... sure we are talking about the same benchmark?
Highest GTX 1080 I have seen so far:
http://www.3dmark.com/fs/9954310
Most people seem to score around here:
http://www.3dmark.com/fs/10071898 -
-
Too little. vRAM usage in Windows should be a minimum of 256MB. On average, you should be around 400MB on the card, or higher. Your iGPU would show about that amount of usage if you checked it, I am certain.
Then you're still using Optimus.
Yes, you're correct. The dGPU is just remaining on because... well I don't really know why. Sounds like the Optimus tech is bugged in itself, as most laptops won't wire things to the dGPU at all. Windows in itself is not allowed to change the primary graphics adapter without a reboot, but it is allowed to run multiple graphics adapters at the same time (you can use iGPUs on desktops to run secondary monitors, etc etc).
Welcome to another gripe about Optimus! ;D
For example, take a run I did last night:
http://www.3dmark.com/fs/10073378
my "score" is 10588. My "graphics score" is 13813.
1080 SLI get around 40k graphics score. Total score will be much lower. -
Graphics score, not aggregate. Remove confounding variables.
-
Here is the highest recorded Fire Strike score for 980 SLI (notebook), and the graphics score is only 27K with a big overclock and @Prema vBIOS. http://www.3dmark.com/fs/8291254
That's a 52.7% increase in the graphics score over 980 SLI with an un-level playing field tilted in favor of the 980 SLI setup. http://www.3dmark.com/compare/fs/8291254/fs/10072213 -
mason2smart Notebook Virtuoso
-
-
On a more serious note, look at everything. That 1080 SLI example was done over Team Viewer from over 1500 miles away, with an overheating CPU that was thermal throttling. Had it been running cool, like the CPU at 4.7GHz on my own P870DM-G, the gap would be wider.
If you isolate the graphics score, gimped vBIOS and all, with a wimpy overclock the 1080 SLI absolutely obliterated the extremely optimized 980 SLI with @Prema vBIOS, overclocked and overvolted. Putting everything into perspective, that is a truly massive performance increase. This dwarfs any jump in performance that we have seen before, and Kepler to Maxwell was certainly nothing to ignore. -
Edit: How long did I take to write one sentence??
Edit 2: Harmonic mean, according to @Q937: so like how resistors combine in parallel...
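For anyone wondering what that means in practice, here's a minimal sketch of a weighted harmonic mean applied to the numbers quoted above. The 0.75/0.15/0.10 weights are what Fire Strike is generally understood to use for Graphics/Physics/Combined, and the Combined sub-scores below are guesses, so treat this as an illustration rather than 3DMark's actual implementation:

```python
# Weighted harmonic mean: same math as resistors in parallel, with weights.
# Weights are the commonly cited Fire Strike ones (assumption, not official spec).

def weighted_harmonic_mean(scores, weights):
    return sum(weights) / sum(w / s for s, w in zip(scores, weights))

weights = (0.75, 0.15, 0.10)  # Graphics, Physics, Combined

# Roughly the single-card run quoted earlier: Graphics 13813, Physics ~10400,
# Combined ~3900 (Combined is a guess; it wasn't quoted).
print(round(weighted_harmonic_mean((13813, 10400, 3900), weights)))  # ~10600, near the 10588 total

# A hypothetical 1080 SLI run: ~40000 Graphics, but the Physics and Combined terms
# drag the overall score way down, which is why totals understate the GPU gain.
print(round(weighted_harmonic_mean((40000, 12500, 9000), weights)))  # ~24000, nowhere near 40k
```
-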
-
-
1080 SLI vs 980M SLI Graphics ==> +117.7% (46,467 vs 21,345)
http://www.3dmark.com/compare/fs/9954310/fs/5708173
The other score @mason2smart posted was 40,525, which is +90% faster than my 980M SLI.
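Just to show the arithmetic behind those percentages (a minimal sketch using the graphics scores quoted above):

```python
# Percentage gain between two Fire Strike graphics scores.
def pct_gain(new, old):
    return (new / old - 1) * 100

print(f"{pct_gain(46467, 21345):.1f}%")  # 117.7% - the top 1080 SLI run vs the 980M SLI
print(f"{pct_gain(40525, 21345):.1f}%")  # 89.9% - the other run, i.e. the "+90%" figure
```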
It's a bit early to start doing firm comparisons of 1080 SLI against other long-established SLI tests, but I'm pretty happy with the +120% improvement range for starters. -
-
-
-
Friends don't let friends buy Optimus laptops