I don't think, I know: the maximum is Ultra Nightmare.
-
Danielegiangregoreio Notebook Enthusiast
-
mason2smart Notebook Virtuoso
No 4K textures? -
Danielegiangregoreio Notebook Enthusiast
https://steamcommunity.com/app/379720/discussions/0/357286663692816522/?l=german&ctp=2 it seems like there aren't higher-resolution textures because they would take up too much VRAM -
How do you like the new Doom overall? I wasn't that impressed. I thought it looked kinda cartoonish.
BTW, last night I played Witcher 3 totally maxed out on stock speeds and voltages. Perfectly fluid and without hiccups. This machine is a beast!
No need for overclocking, that's the way I like it.
-
I still don't quite understand how the Witcher 3 looks as good as it does AND runs as well as it does. It's witchcraft, I tell you.
-
Had my GT80 about 10 months now. Just took the backplate off and cleaned out the two fans and the exhaust grilles. Not a great deal of dust in there, but it seems to have dropped my temps by 2-4°C across the board. I'd recommend others do the same as part of general housekeeping after a similar amount of time.
-
Hmmm. The site doesn't use the 'M' designation for mobile GPUs like the 960M, 965M, etc., so at least we know there's a GT83 with dual 1080M (or a non-M version) planned.
-
http://www.laptop4pro.vn/msi-gt83vr...2gb-512gb-ssd-1tb-g3-gtx-1080m-8gb-p2043.aspx
Listed with an 'm', though I'm not sure about that site.
-
Cool, that's the 1080 desktop GT83VR-6RF, and here are the GT73 versions:
NB-MGT73V7Q7H GT73VR-6RE i7Qu+1070+H+16
NB-MGT73V7Q8HU GT73VR-6RF i7Qu+1080+H+16+UHD
Those model names are for .ZA, so elsewhere they might be different.
It's odd that the VR extension is used in all the model names, and 6RF seems to indicate a 1080, but surely they can't be putting a full 1080 in the GT73 and GT83??
This one doesn't use 'm' in the naming... -
If you check the image, the GT83VR-6RF is listed with a 1080M, not just a 1080. Also, I thought the GT73 was going to be SLI, but no info on that yet...
-
The GT73S is supposed to be the 'm' SLI version, and the GT83 is the desktop mobile SLI version. I am surprised there is no 'm' in the 1080 GT73 naming in the price list model numbers.
-
coretech.co.za/ lists all GPUs without the 'm'; the 9xx cards have no 'm' on that site either.
Link to the site I got the image from: http://chantracomputer.com/index.html -
Oh, a third-world seller.
No wonder the price is out of this world. -
FTFY.
-
-
LMAO, how this thread goes off topic; I haven't checked back in like 2 days and it happened.
That escalated quickly.
The GX800 review is out on Notebookcheck, but it's disappointing: we don't get to see the inside. Probably part of the review NDA or whatever it is; ASUS won't allow them to open it up. -
The GX800 is listed as a prototype in the Preview, not sure if Asus will release the 2 x 980 GX800, or wait now to release the Pascal version. Tough call to make a purchase of the GX800 Maxwell SLI within sight of the Pascal mobile releases - still not announced.
Nothing disappointing about the GX800 performance, it uses 2 x 330w power supplies and gets much better performance than the GT80S 980 SLI can show for the single 330w power supply + battery boost.
" Power Consumption
Asus ships the GX800 with two 330-Watt power adaptors (1x for the notebook and 1x for the water cooling) to cover the enormous power consumption. That 660 Watts are not overkill is confirmed by our measurements. Up to 590 Watts under maximum load and almost 200 Watts in the first scene of 3DMark06 remind us of a desktop PC.
Another comparison with the GT80S, where the GPUs are not overclocked: The rival from MSI is much more frugal with 411 and 163 Watts, respectively. The GX800 pulls between 58 and 74 Watts from the socket while idling – again more than the GT80S (44-62 Watts)."
Asus ROG GX800 Notebook Preview
Florian Glaser ( translated by Andreas Osthoff), 07/15/2016
Gaming Geforce Notebook Skylake Windows
Prestige object. We are one of the first editorial offices with the chance to have a close look at the upcoming GX800. Compared to the predecessor, Asus not only increases the performance and the dimensions but also equips the device with a 4K display and G-Sync support. Can the cooling system handle the components? Our preview answers initial questions.
http://www.notebookcheck.net/Asus-ROG-GX800-Notebook-Preview.169306.0.html
"There will obviously be a "real" review as soon as we get our hands on the final product."Last edited: Jul 16, 2016stank0 likes this. -
So, about the battery boost thing on the GT80: is it possible to disable it and make the system use the power brick fully?
-
I believe the battery boost works the other way around: it draws additional power from the battery if needed. That would explain some people's comments about battery drain under high load.
-
But that's what I don't like about the GT80 power system.
-
I wonder if MSI will use dual 330 W adapters instead of power + battery for the upcoming GT83.
-
I have
-
I thought the GT80 was limited to 330 W by the EC? -
TBH, the 1080 isn't really Pascal. It's a mere die shrink with some removed Maxwell features put back in; Pascal was meant to be a completely different design.
I'd be safe to assume the 2nd brick connected to the water block can actually feed power into the laptop, right? Otherwise it's meaningless, LMAO. -
@stank0 @hmscott @mason2smart
Here's another CDM result for a single SM961 drive, this time with more performance due to a different driver, under Windows 10.
Look at the 4K result of the single SSD; there's almost no point in having RAID. If this were RAID 0 with a server operating system, I'd expect 4K random read to top out around 70 MB/s and 4K random write to hit 450-500 MB/s. Blazing speed, man.
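To illustrate why the low-queue-depth 4K numbers barely move with RAID 0, here's a toy model (all numbers and the scaling assumption are mine, not benchmark data):

```python
# At QD1 a 4K random read is latency-bound: every request waits on a single drive,
# so striping across two drives can't speed it up.  Only with more requests in
# flight can both drives be busy at once.
def raid0_4k_estimate(single_drive_mb_s: float, drives: int, queue_depth: int) -> float:
    parallelism = min(drives, queue_depth)   # how many drives can actually work in parallel
    return single_drive_mb_s * parallelism

print(raid0_4k_estimate(60, drives=2, queue_depth=1))    # ~60 MB/s: no better than one drive
print(raid0_4k_estimate(60, drives=2, queue_depth=32))   # up to ~120 MB/s in this crude model
```
-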
Yes, the 660 W powers the water cooling too, but there is much more left over for powering the 980s/1080s/CPU OC than a single 330 W air-cooled laptop would have. That's why Asus's and Notebookcheck's OC performance results are so much higher than the power-constrained GT80S 980 SLI's.
-
Well, yeah, Pascal was supposed to have features that the upcoming Volta now gets, but the long 28nm generation forced Nvidia to change its roadmap a few years ago. Pascal became pretty much what we have now: updated Maxwell with a little extra. The next wave of Pascal will probably have HBM2, and that would be the fully intended Pascal, lol.
-
Yes, that is what I said: battery power used to boost performance, not that Nvidia thing.
-
You know what sucks though: all that power, and you can't even overclock the CPU past 4.2 GHz. It's a real shame that a water-cooled laptop is crippled with such underwhelming CPU performance; if it could do 4.5 GHz I would have fewer regrets and complaints and would just buy it in one go.
-
There really isn't a laptop CPU that gives more compute right now without going to a desktop CPU, which is fine and all, but that really adds to the heat load and power draw.
But indications are that more CPU power is needed for the 1080, and for 1080 x 2 for sure.
95 W is too much; we only just got a 3rd cooling fan for a CPU struggling at 47 W dissipation, and IDK if there is room to cool a 95 W CPU in the current design.
We need about a 65 W TDP CPU, and it looks like Intel could deliver the CPU required in 2 versions with Skylake.
With an older released 6700 non-k CPU:
http://ark.intel.com/products/88196/Intel-Core-i7-6700-Processor-8M-Cache-up-to-4_00-GHz
Or with the brand new 6785R embedded CPU, with EDRAM and other goodies:
http://ark.intel.com/products/93339/Intel-Core-i7-6785R-Processor-8M-Cache-up-to-3_90-GHz
http://www.anandtech.com/show/10281/intel-adds-crystal-well-skylake-processors-65w-edram
Or, with some new Kabylake CPU...but the "good ones" with EDRAM and GT4e won't be released until this time next year, Q2 '17, rumored.
http://laptopmedia.com/news/kaby-la...4-of-2016-along-with-apollo-lake-and-broxton/
http://www.channelpro.co.uk/advice/9734/skylake-vs-kaby-lake-7
http://wccftech.com/intel-2016-road...es-10-core-broadwelle-apollo-lake-processors/ -
What happens when you unplug the battery cable from the motherboard and attempt to break the 330 W power limit? Does the system throttle?
Probably bad binning? -
That isn't it, dude. Taking Intel's actual TDP rating numbers seriously is a big joke TBH, as they can easily restrict what we do via binning or power restrictions through BIOS or hardware.
Assuming no power restrictions etc., desktop and laptop chips are the same architecture; mobile just has an overall smaller area, no IHS, and perhaps a different graphics chip, that's about it. The real reason here is poor binning, just like Mobius mentioned, and Intel wants to control that and make it the standard for several generations so we get used to lower clock speeds; on desktop they know enthusiasts won't accept that BS, otherwise their chips won't sell.
4 GHz on 4 cores at 1.25 to 1.3 V is extremely poor TBH; my 8-core chip isn't even considered to be that good a bin and I'm capable of running 8 cores at 1.24 V at 4.2 GHz.
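Rough numbers to put that in perspective, using a crude dynamic-power model P ~ cores · V² · f (equal per-core capacitance is my assumption, not a measurement):

```python
def rel_power(cores: int, volts: float, ghz: float) -> float:
    # relative dynamic power, P ~ cores * V^2 * f (arbitrary units)
    return cores * volts**2 * ghz

mobile_4c  = rel_power(4, 1.30, 4.0)   # the quad at ~1.3 V / 4.0 GHz
desktop_8c = rel_power(8, 1.24, 4.2)   # the 8-core at 1.24 V / 4.2 GHz quoted above

# Per core, the better-binned chip does ~5% more clock at ~4% less power in this toy model.
print(round(mobile_4c / 4, 2), round(desktop_8c / 8, 2))   # 6.76 vs 6.46
```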
The truth doesn't change: Intel is giving us garbage chips, and via BGA, so we have no choice but to take them. Intel gets away with it by calling it "thinner".
MSI adding an extra fan is a good thing. They knew their CPU wasn't getting any room-temperature air, only the leftover hot air from the GPU heatsinks, so yeah, those guys should have known better: if they're going to clock a garbage chip past 4 GHz while two hot GPUs are running, CPU cooling was always going to be extremely bad, it was expected. Also, adding an additional fan might not even help if the chip binning continues to get worse, like 1.35 V or 1.4 V for 4.2 GHz, LMAO; temps will be the same even with the extra fan, or worse.
Yep. -
I used to have a laptop with a desktop CPU about 12 years ago (another Sager). I'm very reluctant about getting ANYTHING desktop crammed into a laptop chassis after that experience.
I know technology has advanced but... it's burned into my brain. LOL
I'm curious about that too. I've got an extra power brick from my dead Sager which looks identical to the GT80S one, but I'm afraid that the heat from that extra power will be something our Titan won't be able to handle.
-
mason2smart Notebook Virtuoso
Lol that's cheap -- maybe someone can price match them lol. -
mason2smart Notebook Virtuoso
Nah, they will stick with GDDR5X for Pascal to milk it more - only the high-end chips will have it in the first release. Second release, all will have it. -
mason2smart Notebook Virtuoso
What happened? -
mason2smart Notebook Virtuoso
Taken down already -
Well, what do you think? LOL
Heat-related crashes, abysmal battery life, everything you can imagine. -
mason2smart Notebook Virtuoso
Why do people swear by them then? -
You are letting "garbage BGA" thinking cloud your reasoning.
At the same lower TDP, the 6700K and my 5960HQ performed the same in CPU benchmarks.
Not garbage: max performance based on the max TDP allowed before power throttling.
At the 47 W limit my CPU would hold 3.6 GHz steady on all 4 cores indefinitely; cooling was great, and temps were in the 70°C range.
Given 2x the TDP power limit, the 6700K has more headroom for performance.
If we get a CPU halfway in between, say 65 W TDP, which is more likely to be powered and cooled in a GT80 as-is, we could better meet the CPU requirement for feeding 2 x 1080 SLI.
It's just good engineering.
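A very rough sketch of what a 65 W cap might buy, assuming power scales roughly with f³ (since voltage has to rise with frequency); the 47 W / 3.6 GHz point is from above, everything else is my assumption:

```python
def est_all_core_ghz(tdp_w: float, ref_tdp_w: float = 47.0, ref_ghz: float = 3.6) -> float:
    # crude cubic scaling: P ~ V^2 * f with V rising roughly with f  =>  P ~ f^3
    return ref_ghz * (tdp_w / ref_tdp_w) ** (1 / 3)

print(round(est_all_core_ghz(65), 2))   # ~4.0 GHz sustained all-core at a 65 W cap
print(round(est_all_core_ghz(95), 2))   # ~4.55 GHz in theory, if you could cool 95 W in a laptop
```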
-
An HK CPU with the additional cooling fan in the middle would prove useful.
-
-
There have been plenty of people with GT80s using a 2nd power supply, with some small gains; mostly the benefit is just sharing the load across 2 power supplies.
The EC/BIOS needs to be modded to allow more power draw in the newer GT80S, as it has fixed limits compared to what GT80 owners experienced.
Some owners of the GT80S 980 SLI tried to get MSI to increase the power draw limit through AC power, instead of drawing from the battery, but MSI had already "maxed" it out within the safety limits of the power adapter and it didn't help enough.
A new GT83S / GT73S would need to be designed to support more power from a larger external PSU, exceeding the single 330 W PSU, to really improve things. -
Of course, 12 years ago pretty much every technology was overheating. That example itself is terrible, LMAO.
I think you brainwashed yourself somehow and then let other people who think similarly reinforce that brainwashing some more, lol.
I guess enthusiasts always look for something better; ultimately it just comes down to paying the same amount of dollars yet getting 10-20% more juice out of it. If you don't want that 20% extra juice then that's fine too, lol. -
@stank0 has it right, and time hasn't changed anything.
Heat is heat: when you put in a 95 W CPU, OC it, and push it to the limit, you are going to be running fans at 100%, or running too hot - or both.
Watching the Clevo Tuning and P870 forum experiences reported over the last year confirms it.
There are people who don't mind living with constant tweaking, re-pasting, replacing heat plates after they warp, and running high fan speeds all the time at the limit to get the highest performance, and then there is everyone else.
We like Xtreme performance; we just don't want to put up with the overhead of the never-ending pursuit of getting and maintaining that Xtreme performance.
I get more than enough performance out of the box with the MSI GT80 SLI-263 for gaming; the OC is measurable but mostly unnecessary, and even gets tuned out when I reduce FPS to meet the refresh rate.
I took performance up a little further than out of the box, mostly to test the stability of my new hardware, but I usually keep a reduced OC for normal use.
Then I tuned the gaming FPS for best usable performance within the refresh rate of my display. Limiting the FPS reduces the heat and power draw, reaching optimal usable performance + low noise + low power. I never see the battery drain others see at max OC, and I get 98% of the gaming performance they do.
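For anyone curious what that FPS cap boils down to, here's a minimal sketch of a frame-limiter loop (illustration only; in practice the in-game limiter, the driver, or RTSS does this, not Python):

```python
import time

TARGET_FPS = 60                 # match the panel's refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame() -> None:
    pass                        # placeholder for the real per-frame work

for _ in range(600):            # ~10 seconds of capped frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # the GPU/CPU sit idle here instead of rendering frames the display can't show,
        # which is where the heat and power savings come from
        time.sleep(FRAME_BUDGET - elapsed)
```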
There is nothing wrong with that. It's not to be ridiculed or derided; in fact most people don't even go that far, yet have 100% satisfaction and fun gaming with their out-of-the-box GT80.
What are you doing here criticizing people for optimizing their GT80s, enjoying their purchase, enjoying their gaming, and living just fine without Xtreme OC'ing and without spending more than 50% of our computer time tweaking the computer instead of enjoying it?
I can do the Xtreme tuning thing, I did it years ago, but I woke up to the fact that it really didn't provide any benefit over stepping down a bit and optimizing my optimizing time.
It's an experience thing, not "brainwashing", and don't worry, if you keep going you may be lucky enough to get there too.
This doesn't mean I am set on an MSI GT83S 1080 SLI as a final choice; if a Clevo, an Asus, or something else performs better, cools better, or runs quieter, taking the price into consideration, I may go with that instead.
It's gonna be fun
-
I guess they don't have a problem justifying the negatives with the increased power, marginal as it is?
ROTFL
I don't think the 12 years plays any role here. My last Sager had 2x GTX 680M and it was... OK. But there were people who bought the desktop GTX 680 version, and you can read their rants right here: http://forum.notebookreview.com/threads/official-clevo-p370em-sager-np9370-owners-lounge.685692/ And that's a machine only about 3-4 years old.
Precisely. -
It does play a role; not many years on from 12 years ago, they tried to use the desktop fan concept in a laptop, LOL, which mounts the fan vertically, and as you can imagine a laptop can't be more than a couple of inches tall, so the fan was PUNY, which is pretty funny.
There's no desktop 680 version for laptops; there's only the 680M for laptops, and the 680MX was only ever released in the Mac all-in-one. I have had two GTX 680Ms in my Alienware and they ran as cool as could be; you're just using a terribly designed machine as an example, LOL. When the cooling and design are adequate you'll see amazing things.
Not fully understanding hardware cooling and CPU TDP leads to wanting less power, and that in itself is a bad, bad excuse, as OEMs will take advantage of it and give you lower-performance junk,
which is a big no-no.