So I must ask: if this had that dock, with Thunderbolt 3 eGPU support and SLI, an MXM internal card for upgrades, a larger power supply, and the usual ports a dock provides, would you then be happy?
-
Seems to take some styling cues from Asus -
Oh, and throw in a desktop CPU while you're at it. -
I believe this is the MSI GT72, and it looks thin (note the left side of the chassis; GS models don't have those buttons, and the display's top edge is straight on thin models):
This is the Asus G752, and it looks thin:
We know both are big, thick, and heavy, with decent cooling.
For comparison, here is that EVGA one (it could still be as thick as those two):
But yes, EVGA is exaggerating either the "thin" or the "very high-end" claim. I just like to think it was "thin" that was overplayed -
So how's that supposed 990M 150W claim going?
I'm wondering whether I'll need a second power brick just to run two of those cards (who wants to bet they won't support SLI?).
Here's hoping a 450W-500W power brick somehow arrives on the market for the P770ZM's successor, that it fits in the same slot as the 330W brick, and that I won't need to worry about multiple power bricks. =D I hope for 500W. -
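Quick napkin math on why two such cards outgrow a single 330W brick; a minimal sketch, assuming the rumored 150W per card and rough guesses for everything else (none of these are confirmed figures):

```python
# Hypothetical power budget for a dual-"990M" notebook.
# Every figure is a rumor from this thread or a rough guess,
# not a confirmed spec.
gpu_w = 150          # rumored per-card TDP
num_gpus = 2
cpu_w = 88           # e.g. a desktop i7-4790K, as in the P770ZM
rest_w = 60          # display, drives, fans, RAM, VRM losses (guess)

total_w = gpu_w * num_gpus + cpu_w + rest_w
print(f"Estimated peak draw: {total_w} W")            # 448 W
print(f"Overshoot on a 330 W brick: {total_w - 330} W")
```

Which is roughly why a 450W-500W brick, or the dual-brick setups the biggest SLI machines already use, keeps coming up.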
I am sorry if your SLI rig is broken right now, but fix it, don't trash it.
The rest of us are running SLI just fine - and getting huge FPS increases. -
Seriously, the only reason I even bother with SLI is games like The Witcher 3, where a single Titan X at 1440p would not be adequate. -
There have always been games that don't take advantage of SLI, and even the recent UE4 fiasco will pass; there is too much demand for SLI games. I won't buy non-SLI games, and I am sure there are many others who feel the same way.
Why buy a non-SLI game when there is another cool game releasing that supports SLI? It will show in their bottom line quickly.
Using the fastest GPU is recommended - don't SLI cards even one step down; get the fastest single GPU first. It's a good rule to follow, but if you can't afford it, adding a second card in SLI later when you can afford it is still fun, and even dedicating a generation-older card to PhysX is nice.
DX12 may flop due to being Windows 10 only; with all the bad juju around Windows 10, what developer is going to jump on board that sinking ship now?
SLI in a laptop is awesome; there is no way around it, it's just cool.
I do understand all the Alienware badness killed off the SLI joy for a long time, but that is only one screwed-up manufacturer, and it seems to be trying to make it right now.
There are now so many other manufacturers shipping SLI solutions that it's actually a great time of renaissance for SLI. -
http://www.pcworld.com/article/2978...iquid-cooling-and-a-secretive-nvidia-gpu.html
Here's another article, but in English.
So where's Clevo/Sager's solution to this? Cheaper prices for the same performance would be nice. Any legit benchmarks? -
Sounds like a win for SLI.
Do you go into the configuration and disable SLI every time you do something else? Or do you let SLI run in everything, but only think you need it in The Witcher 3? Do you enable DSR in other games? -
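Not a rhetorical question, by the way: you can check whether both GPUs are actually loaded instead of guessing. A minimal sketch, assuming `nvidia-smi` (which ships with the NVIDIA driver) is on your PATH; run it while the game is up:

```python
import subprocess

# Poll per-GPU load via nvidia-smi; in a healthy SLI setup, both
# GPUs should show significant utilization while the game renders.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,utilization.gpu,power.draw",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True).stdout
for line in out.strip().splitlines():
    print(line)    # e.g. "0, GeForce GTX 980M, 97 %, 98.31 W"
```

And per-game SLI can be toggled in the NVIDIA Control Panel under Manage 3D Settings > Program Settings, so there's no need to disable it globally.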
"Does ARK: Survival Evolved support SLI?
Currently with DX11, SLI is not supported - however when DX12 rolls around in the Summer, it will be."
http://steamcommunity.com/app/346110/discussions/0/613957600545122958/
DSR comes on via GFE tunings, at least it has for me, or you can set it manually. It does help, and with all the headroom of an SLI GPU configuration you don't notice the FPS hit; it just runs and looks great (quick sketch of the resolutions below).
MGS V optimized on the GT80, DSR 2:1 set.
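For anyone setting it manually: a DSR factor multiplies the total pixel count, and the driver then downsamples back to the panel. A quick sketch of what the standard factors mean on a 1080p panel (the factor list is the driver's usual set; the resolutions are just the arithmetic):

```python
# DSR renders at panel_pixels * factor, then filters back down.
# The factors are the standard multipliers the driver exposes.
panel_w, panel_h = 1920, 1080
for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    s = factor ** 0.5    # factor scales pixel count, so sqrt per axis
    print(f"{factor:.2f}x -> {round(panel_w * s)}x{round(panel_h * s)}")
# 2.00x -> 2715x1527, 2.25x -> 2880x1620, 4.00x -> 3840x2160
```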
Dead Rising 3 – Enable NVidia SLI Support if you don't use GFE profiles.
http://steamcommunity.com/app/265550/discussions/0/613935404125359416/
Nvidia released a GFE SLI profile for Dead Rising 3 on 8/22/2014:
http://steamcommunity.com/app/265550/discussions/0/613939841479465491 -
I don't think Asus would have gone through all that effort unless it needed to be done.
Let us know how it works out if you end up giving it a try. -
There are limits to SLI, yes, of course, but it is better than not having it. It's as simple as that.
You have a choice to run single GPU or SLI; why wouldn't you want that choice when it works, and it does work enough of the time to make it worth getting and having available?
It's not perfect, but it does so much good that it is worth having, if it is available.
Looking at the positives is so much more beneficial than "trashing" everything. -
Can't wait until 2015 is over, so we can close this ridiculous thread, lol.
-
Then you would understand that I never said it would 100% be GM200, but there was certainly a possibility that we would get that. Most importantly, you would understand that a low-clocked GM200 would match a high-clocked GM204, while giving the user bigger overclocking headroom to unlock more performance. You know, for the benefit of gamers on this forum...
You are so lost if you think an air-cooled GTX 990M will only be as fast as a GTX 980M. It will completely destroy a 980M.
You have 33% more cores to play with, and the TDP will not be 180W or 165W or whatever Notebookcheck posted. @Prema can probably brief you on the exact TGP info, but it will not be higher than the 880M's, and my guess is that it could be lower. It really isn't that hard to understand. -
Might want to word it differently if you are talking about something else. -
I still doubt the existence of this GPU, or I believe it might be an Apple exclusive. -
"Undocked" it will smoke GTX 980M. Stock vs stock it will be around 40-50% faster than 980M. Overclocked you have 33% more cores to get more performance out of than 980M. TDP of the chip will be much lower than the catastrophic picture you have been trying to paint about this GPU. If its worth it or not or if one should wait til Pascal, thats entirely subjective. Period
Bye bye -
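For what it's worth, that "40-50% faster" figure is easy to sanity-check with first-order math. A sketch assuming performance scales with cores times clock, using the rumored 2048-core count (a desktop GTX 980's) against the 980M's official specs:

```python
# First-order throughput model: perf ~ cores * clock.
# 980M numbers are official specs; the 990M core count is the
# 2048-core rumor from this thread, not a confirmed spec.
cores_980m, clock_980m = 1536, 1038    # MHz, base clock
cores_990m = 2048                      # rumored

for speedup in (1.40, 1.50):           # the "40-50% faster" claim
    clock_needed = speedup * cores_980m * clock_980m / cores_990m
    print(f"+{speedup - 1:.0%} over 980M needs ~{clock_needed:.0f} MHz")
# +40% -> ~1090 MHz, +50% -> ~1168 MHz: plausible Maxwell clocks
```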
Turns out the people who kept saying it exists were right after all. -
It's like Asus decided they wanted to stay competitive with the companies making eGPUs but didn't want to give their customers too much of an upgrade path, so they went with a blower on the back of the laptop instead. -
Here are the specs for the G752, and the GPUs have multiple VRAM sizes again, like when the G750 was "MXM" - non-standard MXM, but swappable. -
G752 is MXM?
Pretty sure the GPU in "Dat Ass" is soldered... -
Alright, enough goofing around here, kids; time to bring out the big guns.
So Asus hid the 990M pretty well, and all the info people can get is from DX diagnostic tools.
BTW mods, feel free to add spoiler tags for me.
353.97, probably a beta driver, lol, and "graphics device"... c'mon, you guys can do better than that...
Clocked at 1190MHz/5GHz core/mem, so essentially a downclocked GTX 980; VRAM is probably toned down for power purposes.
8GB of VRAM as expected, so let's cut the 16GB BS from now on. On a side note, device ID 10DE? Interesting name...
More info on the GX700 here: http://www.computerbase.de/2015-09/...hltes-laptop-mit-2.048-shader-gpu-von-nvidia/
Seems like this water cooling thing is only for the dock... and Asus claims 80% higher perf with the dock, which sounds more like it's throttled by 80% without the dock...
Edit: for those who actually believed Wu Haijun, that man is known for leaking half-true news, so I hope all of you took whatever he says with a grain of salt... The 990M will not have 980M SLI performance by any means, except maybe with a crazy OC. -
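Plugging those leaked clocks into standard shader math (a sketch; FP32 rate = 2 FLOPs per core per clock, with the core counts from the leak and the 980M's official spec) lines up with the "40-50% faster" talk upthread, and it also puts the dock claim in perspective:

```python
# FP32 throughput = 2 FLOPs/core/cycle * cores * clock.
cores_990m, mhz_990m = 2048, 1190    # leaked figures above
cores_980m, mhz_980m = 1536, 1038    # 980M base (boosts to ~1127)

tf_990m = 2 * cores_990m * mhz_990m * 1e6 / 1e12   # ~4.87 TFLOPS
tf_980m = 2 * cores_980m * mhz_980m * 1e6 / 1e12   # ~3.19 TFLOPS
print(f"990M {tf_990m:.2f} vs 980M {tf_980m:.2f} TFLOPS "
      f"= {tf_990m / tf_980m:.2f}x")   # ~1.53x at base, ~1.4x vs boost

# Asus' "80% higher perf with the dock," read the other way:
# undocked = docked / 1.8, i.e. ~56% of docked speed (~44% slower),
# not "throttled by 80%".
print(f"Undocked fraction of docked perf: {1 / 1.8:.0%}")
```

Either way, that is nowhere near two 980Ms, which backs up the point above.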
Knew it wouldn't have that level of performance from the get-go. Pascal will stomp on it, though.