Not trying to defend my brand or anything, but my 780M also stays relatively cool. It never goes above 85°C when gaming; most of the time it sits around 78°C. Still, Alienware has awesome cooling. What about ASUS though?
-
SinOfLiberty Notebook Evangelist
25% more cores than the 650 Ti
Expected performance according to the original source, Sweclockers: 25% over the 650 Ti
Core clock increase: 22% over the 650 Ti
Now, how come it is only 25% faster than the 650 Ti when the core count is 25% higher and the core clock is 22% higher? This suggests a new architecture.
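Just to spell out the mismatch: if performance scaled linearly with core count and clock (a naive assumption on my part, using only the quoted figures), you would expect roughly 50% more, not 25%. A quick sketch:

# Naive expectation if performance scaled with cores x clock (leaked/quoted figures only)
cores_gain = 1.25     # 25% more cores than the 650 Ti
clock_gain = 1.22     # 22% higher core clock than the 650 Ti
expected = cores_gain * clock_gain
print("Expected gain:", round(expected, 2), "x")   # ~1.53x
print("Claimed gain: 1.25x")
# ~53% expected vs ~25% claimed, so either the leak is off or per-core efficiency dropped.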
On the other hand, it might be a Kepler card that has been waiting to be released (with a slight delay) in the mid-range segment. I could definitely see that.
Edit: It has very similar specs to the 675MX Kepler card. In fact, it is a downclocked version of it. -
I didn't mean to bash any brands. It's just my observation that at least the Alienware 18 seems to run cooler than what other brands can offer. 85°C with a GTX 780M is not bad either, 1nstance.
I think the Asus G750 has pretty good cooling as well, but I'm not sure. They will use the same model with the GTX 880M, so they seem to trust their cooling system. 1nstance likes this. -
Robbo99999 Notebook Prophet
Well, if you're basing Maxwell vs Kepler performance efficiency on this leaked information, then I think you would conclude that Maxwell is less efficient than Kepler, given the results I posted. I can't believe NVidia would develop an architecture that is less efficient on a per-core and per-MHz basis, unless the new architecture allows something like 50% more cores per mm² than Kepler when comparing both architectures on the SAME node size (say both on 28nm, for argument's sake). -
-
GTX 650 Ti:
768 cores @ 1046 MHz (768 x 1046 = 803,328 core-MHz)
4995 GPU points in 3DMark11
GTX 750 Ti:
960 cores @ 1176 MHz (960 x 1176 = 1,128,960 core-MHz)
5608 GPU points in 3DMark11
5608/4995 = 1.13
1,128,960/803,328 = 1.40
Blimey, you are indeed correct. Efficiency has indeed gone down. By a lot.
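For anyone who wants to check the math themselves, here is the same comparison as a quick script; the 3DMark11 scores are from the leak, so treat the conclusion accordingly:

# 3DMark11 points per core-MHz, using the leaked numbers above
cards = {
    "GTX 650 Ti": {"cores": 768, "clock": 1046, "score": 4995},
    "GTX 750 Ti": {"cores": 960, "clock": 1176, "score": 5608},
}
for name, c in cards.items():
    core_mhz = c["cores"] * c["clock"]        # crude raw-throughput proxy
    print(name, core_mhz, "core-MHz,", round(c["score"] / core_mhz, 5), "points per core-MHz")
# The 750 Ti has ~40% more core-MHz but only ~13% more points,
# i.e. roughly 0.8x the per-core, per-MHz efficiency of the 650 Ti (if the leak is real).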
Is there any way of reducing power consumption without increasing performance per core with a new architecture? Perhaps that was Nvidia's goal with Maxwell on 28nm: just reduce power consumption and wait with the performance until 20nm is here? -
SinOfLiberty Notebook Evangelist
This is so boring.
Once upon a time everyone praised the mighty Kepler; now the audience is sick of it. I mean, it is time for him to fade away and let the successor take the throne. -
Robbo99999 Notebook Prophet
-
Fewer transistors in a core? If that is true for Maxwell, we are looking at a huge increase in core count if they are going to double the transistor count with 20nm, right? And it means performance per core has again gone down with Maxwell vs Kepler. Kepler cores were already about 1/3 the performance of a Fermi core, if I remember correctly.
But you might be on to something, robbo. Sweclockers revealed today that the GTX 750 is also coming along with the 750 Ti, and it requires no external power connector. It can draw its juice directly from the PCIe slot:
http://wccftech.com/maxwell-gtx-750-nonti-inbound/ -
Robbo99999 Notebook Prophet
There's nothing magical or indivisible about a "core". They're only made up of transistors, right? So if NVidia designs a new architecture, they can choose to lay out their "cores" differently, using either more or fewer transistors (and probably changing a bunch of other stuff too that I don't understand). So, if they've designed Maxwell with fewer transistors per core, then it could well be that we see an improvement in efficiency per core in terms of space saved on the die, and consequently decreased power consumption per "core". If the 960 cores on the 750 Ti in the leaked info are these new "cores", then that's the only way we could say that Maxwell is more efficient than Kepler - based solely on the leaked info you showed us. As I said though, we might be basing all this discussion on a leak of dubious integrity. It's so hard to explain these technical things in words; I'm doing my best, but not sure I can explain it any better.
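As a rough illustration of why a raw core count isn't comparable across architectures, here is a back-of-envelope FLOPs comparison using desktop GTX 580 (Fermi) and GTX 680 (Kepler) numbers; these are my own figures and have nothing to do with the leak:

# Raw shader throughput proxy: cores x shader clock x 2 (FMA = 2 FLOPs per cycle)
# Fermi shaders were "hot clocked" at 2x the core clock; Kepler dropped the hot clock.
gtx580_cores, gtx580_shader_mhz = 512, 1544    # Fermi: shader domain at 2x the 772 MHz core clock
gtx680_cores, gtx680_shader_mhz = 1536, 1006   # Kepler: shaders run at the core clock

gtx580_gflops = gtx580_cores * gtx580_shader_mhz * 2 / 1000   # ~1581 GFLOPS
gtx680_gflops = gtx680_cores * gtx680_shader_mhz * 2 / 1000   # ~3090 GFLOPS

print("GTX 580:", round(gtx580_gflops), "GFLOPS from", gtx580_cores, "cores")
print("GTX 680:", round(gtx680_gflops), "GFLOPS from", gtx680_cores, "cores")
# 3x the cores but only ~2x the raw FLOPs, because each Kepler "core"
# does roughly half the work per clock of a hot-clocked Fermi core.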
EDIT: to answer your Kepler vs Fermi point: 1 Fermi core is about as powerful as 2 Kepler cores. Fermi cores were 'hot clocked' on the shader domain, at twice the frequency of the core clock (that's where the factor of two comes from). So, yes, that's another example of NVidia "redefining what a core was". Cloudfire and deniqueveritas like this. -
-
-
My ambient temperature is quite low and I've got a custom cooling pad as well. That's likely a factor in why. -
ThePerfectStorm Notebook Deity
Then what in the name of god is the Lenovo 860M 4GB? As far as I know, Lenovo does NOT offer MXM. Three versions (4GB soldered, 2GB soldered, 2GB MXM) is straight into the realm of impossibility.
-
Karamazovmm Overthinking? Always!
dosed, mixed and stirred, not shaken, from taptalk -
ThePerfectStorm Notebook Deity
It helps because it gives people a way to determine versions.
Also, depending on the resolution (3K MSIs), the game, and the settings, 4GB can give better performance than 2GB. -
Meaker@Sager Company Representative
-
Learning from this, I repasted the CPU by also caking on the MX-4. With this, my 4900MQ never runs a hair above 78°C during the XTU stress test (avg = 73°C). Prior to repasting, it would spike up to 91°C, and the average would hover around 85°C.
I don't want to sound ridiculous here, but at least for my P370SM, it seems that applying gobs of MX-4 actually does more good than applying "just enough" or too little. Perhaps you might want to try just caking on the thermal paste and see if that helps? I used the line method for the 780M and 4900MQ: two lines, each about the thickness of a rice grain, on each die.
EDIT: I should mention that these temps were obtained with the laptop on a U3 cooler with the fans on the medium setting, and with a -80 mV undervolt on the 4900MQ. -
Robbo99999 Notebook Prophet
-
Right, so you can probably imagine what a proper paste job (one that still involves smacking on the MX-4, but without the air bubbles) could do to the already, IMO, pretty good temps.
And yeah, I picked MX-4 specifically for its ease of application and removal (i.e. n00b-friendly); I've heard IC Diamond is a biatch to apply and will etch dies if not properly removed. Besides, I decided to personally boycott IC Diamond after the whole IC vs TechPowerUp fiasco. -
Robbo99999 Notebook Prophet
SinOfLiberty likes this. -
Robbo99999 Notebook Prophet
-
SinOfLiberty Notebook Evangelist
From Videocardz, in the comment section:
No, GM107 is basically a GK107 refresh. Even the specs are supposedly the same. Just a better-binned, more power-efficient version.
This article indirectly suggests Maxwell details will be revealed in the upcoming weeks.
Edit: It might be called GM107 while still being a Kepler refresh. -
SinOfLiberty Notebook Evangelist
-
Well, this probably means a rebadge for the GTX 860M.
-
Those specs are wrong. 30 ROPs? Look at the GPU-Z screenshot on the previous page. It says 16 ROPs.
That is GPUBoss. Do they even have any credibility?
The card is either Maxwell or it is Kepler. It can't be somewhere in between. GM107 can't be a Kepler refresh; it's called GM for a reason.
It seems that the per-core, per-clock efficiency has gone down a lot with the GTX 750 Ti compared to the GTX 650 Ti, so it looks like an entirely different architecture than Kepler: less performance-efficient, but seemingly concentrated on power reduction instead, meaning they can shove in many more cores when 20nm gets here.
Either the GTX 750 Ti is GM107/117, aka Maxwell, or it is GK10x, aka Kepler. -
SinOfLiberty Notebook Evangelist
GPUBoss revealed the important part.
They got the ROP count wrong, but the leaked GPU-Z screen also has some parts greyed out. You can't reveal everything, so you either grey it out or mess up a bit of the info. The same happened with the 780: its leaked specs differed from what it actually has, but GK110 was confirmed. -
Karamazovmm Overthinking? Always!
dosed, mixed and stirred, not shaken, from taptalk -
Karamazovmm Overthinking? Always!
dosed, mixed and stirred, not shaken, from taptalk -
ThePerfectStorm Notebook Deity
How much vram do you actually need? - Graphics Cards - Linus Tech Tips
Remember I said sometimes, not all the time. reborn2003 and Cloudfire like this. -
GeForce GTX 750 Ti vs 650 Ti Boost
"by hassan ( Oct 2013)"
Which happens to be when Videocardz posted this fake GPU-Z screenshot.
"Oct 3rd, 2013"
http://videocardz.com/46347/nvidia-geforce-gtx-750-ti-pictured-detailed
Exact same specs, exact same fakeness. -
Karamazovmm Overthinking? Always!
Oh, I forgot Crysis is one of the few 64-bit games out there...
dosed, mixed and stirred, not shaken, from taptalk -
And Crysis 3 doesn't have an 'Ultra' setting or 16x MSAA.
And having 3 configurations of the 860M isn't impossible. The 660M came in 1GB MXM, 2GB MXM, and 2GB soldered versions, and the 650M came in 512MB soldered, 1GB soldered, and 2GB soldered versions, iirc. HTWingNut likes this. -
Cloudfire likes this.
-
I think all high-end GPUs should come with 3GB or 4GB VRAM because there are some scenarios in games where usage actually does go over 2GB. Skyrim with several mods running @ 1080p uses 2.2-2.4GB of VRAM, for example, and several other games come dangerously close to 2GB. And you have to pick either 2GB or 4GB for 256-bit GPUs. 2GB would leave me very sceptical while playing with a GTX 880M, for example.
reborn2003 and Robbo99999 like this. -
Robbo99999 Notebook Prophet
-
Totally agree. 4GB should be standard.
Anyhow, what happens if you cross 2GB usage in a game and you only have 2GB? Does the extra VRAM usage spill over to the DDR3? Or does it start clearing cached stuff it doesn't need? -
Truthfully though, even when a game shows 2.5GB of vRAM usage, does it really mean it's taking advantage of it, or are those just textures staged there for use? The way I understand it, if you had 1GB of vRAM and the game wanted 2GB, it would prioritize what it needs, and the other 1GB would just be staged in your system RAM.
-
Robbo99999 Notebook Prophet
@HTWingnut, that's an interesting theory; I don't know how it could be proved or disproved though. Maybe if a 680M 2GB went up against a 680M 4GB running Skyrim at settings that use 2.3GB on the 4GB card, and you compared performance results between the two. Don't know if that's been done before though! reborn2003 and Cloudfire like this. -
edit: Not Skyrim, but here are some 2GB vs 4GB single-card benchmarks:
http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/ reborn2003 likes this. -
Robbo99999 Notebook Prophet
-
Robbo99999 Notebook Prophet
Strange Double Post!
-
There was a test, but I just can't find it. I don't think it was Skyrim, but maybe GTA IV or something where vRAM usage was like 2.5GB, and they tested 2GB and 4GB cards and the result was the same at 2k or lower resolutions.
I dunno. If I had an unlimited budget and that was my life/job, I'd love to test/evaluate stuff like this. But two $800 cards is a bit out of my budget. reborn2003, Cloudfire, Robbo99999 and 1 other person like this. -
I think the GPU puts everything in VRAM. It doesn't make sense to me to put important things like textures in system memory, partly because it's slower, but also because there would be latency from moving data from DDR to VRAM when needed.
The system memory is obviously used for something, since games do use DDR3, but I'm not sure what exactly that is. All I know is the system uses both DDR and VRAM, and the memory usage you see in GPU-Z is only VRAM, not VRAM + system memory.
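If anyone wants to log that themselves rather than eyeballing GPU-Z, something like this should do it. It's just a sketch using NVIDIA's NVML Python bindings (nvidia-ml-py), and like GPU-Z it only reports dedicated VRAM, not anything spilled into system memory:

# Minimal VRAM usage logger via NVML (install nvidia-ml-py first)
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(gpu)   # total/used/free in bytes
        print("VRAM used: %d / %d MB" % (mem.used // 2**20, mem.total // 2**20))
        time.sleep(1)                        # sample once a second while gaming
except KeyboardInterrupt:
    nvmlShutdown()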
It would be interesting to see what happens to performance if a game wants 2.5GB and you only have 2GB of VRAM. Is the performance hit from clearing the VRAM cache to make room for things that need to be written, or is it from reading game data from the slower system memory? I'd like to think it is because DDR3 is slower, but I'm not sure. -
Robbo99999 Notebook Prophet
-
Robbo99999 Notebook Prophet
-
Karamazovmm Overthinking? Always!
dosed, mixed and stirred, not shaken, from taptalk -
Robbo99999 Notebook Prophet
I wonder if the 880M will be able to handle games developed on the Unreal 4 engine!? Here's a video (a boring one) showcasing something created in Unreal 4; the lighting looks good. Anyone know how far away we are from games on Unreal 4?
Unreal Engine 4 Architectural Visualisation - Hind House - YouTube -
ThePerfectStorm Notebook Deity
This might help though - UNREAL ENGINE 4 FAQ. Read this before asking. Robbo99999 likes this. -
Here's Unreal 4 engine Hind House:
And the real Hind House
ThePerfectStorm likes this.