I completely agree, especially considering that was with turbo fan on. On my GT60, turbo fan drops my GPU temps substantially. Granted, MSI has a very slow non-turbo fan speed. For comparison, my 780M runs in the mid-70s °C all day long with turbo fan on while under heavy use.
This is my concern and reservation about ordering a Razer 14"... if only they had stuck with the 860M Maxwell this generation...
But it makes no sense to me why Clevo does this, because the only difference between the two is obviously the larger screen, and that the 17" offers two 2.5" bays while the 15" has only one. They should both get the 230W PSU IMHO.
Killerinstinct Notebook Evangelist
I wonder if Alienware is gonna change up their lineup. It's highly unlikely they'll go with a completely different design, but maybe slightly thinner.
Sent from my LG-D800 using Tapatalk
ThePerfectStorm Notebook Deity
Sent from my SPH-L720 using Tapatalk
Same, I can't hear a thing.
I edited the video, sorry, my bad.
And that's at 50% fan speed only...
Or for the GE series:
Reminds me of this video I saw a few years back:
Small fan, a few thin heatpipes to the CPU and GPU.
The GT series is the only one that cuts it. The fan is pretty beefy and can move a lot of air. The WHOOOSH sound this beefy fan makes isn't so bad, actually.
I owned the GT70 with the 680M and I never had to use the turbo fan mode. Nor did it ever go to full speed automatically.
Another story with the GT60/70 and the 1GHz 780M (880M), I guess. I don't think the cooling system in the GT series is capable of handling that sort of heat.
Let's hope so, because they are the first to release their 870M/880M-equipped laptops.
Yeah, that's been an issue for as long as I can remember.
https://www.youtube.com/watch?v=PFZ39nQ_k90
GT70 with GTX 870M
86°C for the 870M and 82°C for the CPU
GT70 with GTX 880M
89°C for the GTX 880M and 96°C for the CPU
All running a simple 3DMark Vantage test.
LOL
MSI GT70 Dominator & Dominator Pro with nVidia GTX880M, GTX870M Review Benchmarks Unboxing - YouTube
That's in the US; come see what it will be like in Tunisia (ambient temp in summer is 34°C, and that's not even the max).
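For rough intuition: if you assume the cooler holds a constant temperature rise over ambient at the same load, the review numbers above translate directly to hotter climates. A minimal sketch; the 25°C review ambient is my assumption (not stated in the video), and in practice fans ramp up, so the real rise would be a bit smaller:

```python
# Project GPU temps at higher ambient, assuming the cooler maintains a
# constant delta-T over ambient at the same load (an idealization).
review_ambient = 25   # assumed ambient during the review, in C
summer_ambient = 34   # Tunisian summer ambient mentioned above
temps = {"GTX 870M": 86, "GTX 880M": 89}

for gpu, temp in temps.items():
    delta = temp - review_ambient          # the cooler's temperature rise
    projected = summer_ambient + delta
    print(f"{gpu}: {delta}C over ambient -> ~{projected}C at {summer_ambient}C ambient")
```

That puts both cards in the mid-to-high 90s under the same load, which is exactly the concern.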
Maxwell chips are gonna make the heat go away for a while, but I think Nvidia will push to a 100W TDP to create the refreshed high-end GPU for laptops!
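For what it's worth, here is the crude perf-per-watt arithmetic behind that hope. The 1.7x efficiency factor below is an assumption extrapolated from the first-generation Maxwell desktop parts, not anything confirmed for high-end mobile chips:

```python
# Crude estimate: what a 100W Maxwell mobile GPU could deliver,
# expressed as an equivalent Kepler power budget.
mobile_tdp = 100              # hypothetical Maxwell mobile TDP in watts
maxwell_perf_per_watt = 1.7   # assumed gain over Kepler at the same node

equivalent_kepler_watts = mobile_tdp * maxwell_perf_per_watt
print(f"A {mobile_tdp}W Maxwell ~ a {equivalent_kepler_watts:.0f}W Kepler "
      "in raw performance, if the efficiency gain holds at the high end")
```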
Killerinstinct Notebook Evangelist
....
Sent from my LG-D800 using Tapatalk
Now imagine how powerful a 20nm 250W desktop card would be. THAT would be something truly worth drooling over.
ThePerfectStorm Notebook Deity
Where I live (Chennai, India), ambient temps are 37-38°C; the GE series would replace an induction cooktop.
ThePerfectStorm Notebook Deity
Hi guys!
I'm just one of the thousands who follow this thread for the latest information about Maxwell. You've really got some knowledgeable people here.
Anyway, seeing that we were veering slightly off topic from GPUs to the origin of human life, I just thought I'd contribute my two cents and clear the matter up.
I often work in ambient temperatures of 50-55°C at 100% relative humidity with NO VENTILATION.
Despite sweating like a pig, and despite what "Wikipedia" says, I'm fine.
Hope that settles it.
Cheers!
Back on topic.
The statement below was based on reading old articles... sorry.
I had read a few articles about TSMC and Global Foundries having difficulty getting a 20nm FinFET process to work. nVidia needs that process to get the performance out of Maxwell. If the fabs can't do it, then Maxwell isn't going to double the performance of Kepler.
Thoughts, anyone?
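Retracted or not, the node question keeps coming up in this thread, so here is the ideal-scaling arithmetic behind the "double the performance" expectation. A minimal sketch; real processes fall well short of ideal scaling (and marketing node names no longer map cleanly to gate length), so treat these as upper bounds:

```python
# Ideal area scaling between process nodes: transistor density grows
# with the square of the feature-size ratio, which is why a full node
# shrink is roughly associated with "doubling" what fits on a die.
def density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(f"28nm -> 20nm: ~{density_gain(28, 20):.2f}x transistor density")
print(f"28nm -> 16nm: ~{density_gain(28, 16):.2f}x transistor density")
```

So a clean 28nm-to-20nm shrink is worth about 1.96x density in the ideal case, which is where the "needs the new node to double Kepler" reasoning comes from.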
Great. So now Nvidia will probably have to pay Intel a license fee to use their tech.
Beamed from my G2 Tricorder
I just realized I was reading articles from mid-2012. Here is something a bit more recent, straight from the horse's mouth.
TSMC Tweaks 16nm FinFET to Match Intel - Electronics360
20nm FinFET won't exist, just a normal 20nm process.
They will go straight to 16nm FinFET which is already being produced at TSMC.
Worst case scenario is that they skip 20nm, do Maxwell on 28nm, and do 16nm FinFET GPUs next year. But 20nm isn't a mid-step they'll skip like 22nm was. And 16nm will be even more expensive than 20nm and harder to produce, so if the cost and yield theories about 20nm were right, we would never get any new nodes at all.
Meaker@Sager Company Representative
Now that we have reached the proper conclusion, however (yes, I am a stickler for accuracy, though usually I am correcting tech mistakes, not biological ones):
Intel was the first to launch FinFET, but it did not originally come up with the idea AFAIK, so it's not their technology as such, just something they figured out how to use before anyone else.
The 20nm node should be exciting enough by itself for now, though.
Beamed from my G2 Tricorder
SinOfLiberty Notebook Evangelist
28nm and above are here to stay forever. And you shall be grateful that our lord even blesses us with such a wonderful joy as purchasing a new GPU!!
Who even said it?
As Alternai pointed out, I don't think TSMC had the equipment necessary to produce 22nm FinFET; Intel is ahead here. And maybe it just wasn't worth it, performance- and cost-wise. I don't know.
The good news is that TSMC is ready with their 3D FinFETs and will be using them at 16nm. That should give some benefits. FinFET is pretty interesting since you get higher drive current and less leakage. All in all, a more efficient chip.
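The "higher current, less leakage" point cashes out in the classic dynamic-power relation P ≈ α·C·V²·f: because FinFETs switch well at lower supply voltage and power scales with voltage squared, even a modest voltage drop buys a lot. A sketch with purely illustrative voltages, not figures from any datasheet:

```python
# Dynamic switching power scales with the square of supply voltage:
#   P_dyn ~ alpha * C * V^2 * f
# FinFETs can hold the same switching speed at a lower voltage,
# so a modest voltage reduction yields an outsized power win.
def relative_dynamic_power(v_new: float, v_old: float) -> float:
    return (v_new / v_old) ** 2

v_planar, v_finfet = 1.0, 0.85   # hypothetical supply voltages
print(f"Same frequency at {v_finfet}V instead of {v_planar}V -> "
      f"~{relative_dynamic_power(v_finfet, v_planar):.0%} of the dynamic power")
```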
But we are talking maybe late 2015 or early 2016 here. Intel will probably be at 10nm when it's available. Intel is in a pretty sweet spot: ahead in the semiconductor industry, ahead in CPU designs, and they've now managed to beat AMD at their graphics APU game. No wonder they make a crapload of money each year.
They could probably sell CPUs just as cheap as AMD's and still make a much better profit than AMD, which is fabless and has to pay for wafers, but why bother when you are so far ahead? Sweet position indeed.
SinOfLiberty Notebook Evangelist
Hey guys, nice thread and a lot of good info (though a lot more speculation). Please keep it on topic.
I wonder what "stacked VRAM" can do...
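The headline win would be memory bandwidth: a stacked-DRAM interface trades clock speed for a much wider bus. Rough numbers below; the GDDR5 figures roughly match a 780M-class card, while the stacked figures assume an early HBM-style stack and are illustrative, not anything Nvidia has confirmed:

```python
# Peak memory bandwidth = bus width (in bytes) * data rate per pin.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

gddr5 = bandwidth_gb_s(256, 5.0)          # 256-bit @ 5 Gbps
stacked = 4 * bandwidth_gb_s(1024, 1.0)   # four assumed 1024-bit stacks @ 1 Gbps
print(f"GDDR5 256-bit @ 5 Gbps: {gddr5:.0f} GB/s")
print(f"4 stacks, 1024-bit @ 1 Gbps each: {stacked:.0f} GB/s")
```

Wide-and-slow wins comfortably here, roughly 512 GB/s vs 160 GB/s under these assumptions, and at lower I/O power per bit.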
Tomorrow could be the day when more is revealed regarding Maxwell.
Twitch
Here is some of what Jen-Hsun Huang shared with us last year :thumbsup:
https://www.youtube.com/watch?v=BYJ1-XQzHx4&list=PLZHnYvH1qtOY0ZrWQgnQlj4dwGZ1pVgoj
Actually, I have heard that Intel has excess high-end (22nm? 14nm?) capacity at the moment and is allowing certain companies to use their fabs.
SinOfLiberty Notebook Evangelist
And of course the GPU (yes, I know, no need to tell me) every one of us has been itching to buy: the 790!!
New details about Nvidia's Maxwell
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Feb 12, 2014.