You're talking about cost-effectiveness with a laptop that's equipped with a 12-core hyperthreaded desktop workstation CPU...? Seriously, if you even CARE about the cost by that point, then you're probably giggling to yourself about how much money you have or about how epic this laptop is gonna be...
-
Oh don't mind him, he's known to do that on occasion. When you post as much as he does, a head-scratcher is bound to slip out sooner or later.
-
GTX 580M: N12E-GTX
GTX 680M: N13E-GTX
GTX 780M: N14E-GTX
K5100M (8GB VRAM): N15E-Q5-A2
GTX 880M: N15E-GX-A2
Here comes the confusion:
GTX 680M: N13E
K5000M: N14E
Both cards have identical core counts.
GTX 780M: N14E
K5100M: N15E
Both cards have identical core counts.
GTX 880M: N15E
K5200M: N16E?
Both cards with identical core counts?
-
Well it depends on the game:
For example:
Image: x-plane_2012_03_15_00fekmx.jpg - abload.de (1024 resolution)
Image: cod_ghosts_1_levelzrs61.jpg - abload.de
Image: hma_2013_03_30_19_14_izpsj.jpg - abload.de
But I only know a handful of games where you need 3+ GB of VRAM. So you are right, 8GB VRAM seems a bit off the mark. -
^Dat GTX Titan SLI doe.
-
Just wait until the card is out before concluding anything, will ya?
Nvidia hasn't put out any 8GB mobile cards before, even for their rebrands. So either the specs on the slides are wrong or Nvidia has plans for them. -
Anyway, in terms of bus width I'm again disappointed. In the best case the GTX 860M will be a Maxwell more or less equivalent to 960 Kepler cores with 2GB of GDDR5 @ 128-bit... a GTX 770M @ stock... which is a GTX 765M OC'd +135/1000
-
Meaker@Sager Company Representative
-
EDIT: I see that the GTX 760M is 128-bit. Holy moly, for realz? -
-
Too many GPUs to keep up with, HT. I'm getting old too, you know.
The 660M is a 384-core GPU with a 128-bit bus.
The 760M has double that core count and is still 128-bit.
Isn't your 765M bottlenecked? -
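A rough way to put numbers on the "bottlenecked" question is bandwidth per core. Minimal Python sketch below; the core counts and bus widths are the ones above, while the effective memory data rate is an illustrative assumption, not a confirmed spec:
Code:
# Bandwidth per CUDA core for the 128-bit Kepler mobile parts mentioned above.
# Core counts and bus widths are from this thread; the 4000 MT/s effective
# GDDR5 rate is an illustrative assumption.

def bandwidth_gbs(bus_width_bits, effective_mts):
    return bus_width_bits * effective_mts / 8 / 1000   # -> GB/s

cards = {
    "GTX 660M (384 cores)": 384,
    "GTX 765M (768 cores)": 768,
}

for name, cores in cards.items():
    bw = bandwidth_gbs(128, 4000)                      # both are 128-bit
    print(f"{name}: {bw:.0f} GB/s total, {bw / cores * 1000:.1f} MB/s per core")
If both really sit on the same 128-bit, same-speed memory, the 765M has to feed twice the cores from the same bandwidth, which is where the bottleneck worry comes from.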
-
Meaker@Sager Company Representative
It's the ROP hit that hurts too and in fact mostly.
-
The GTX 780M will not run The Witcher 3 on Ultra, that much is a guarantee. But neither will the GTX 880M, if it's also 28nm. -
Meaker@Sager Company Representative
Memory speed helps, ROP performance helps more. -
-
I play Crysis 2 at 40fps with vsync on and all is well, no artifacts whatsoever and a pretty smooth image. -
NVIDIA Maxwell GeForce 800M Mobility Lineup Launches in February 2014 - GeForce GTX 880M To Lead The Pack
-
- They don't know anything more than us. They are speculating as well.
- Look at this picture
Look at P37xSM-A. It features both 880M SLI and 860M SLI. Why would it have 8GB DDR3 on the 880M SLI model and only 4GB DDR3 on the 860M SLI model?
- The GTX 880M has 8GB VRAM according to a Chinese reseller. That was already revealed way before I found the Clevo roadmap.
Taobao
Same with the GTX 870M. It has 6GB VRAM, as shown in the roadmap.
N15E-GS-A1 A2 GTX870M 6GB
It makes a ton more sense for these to be 8GB and 6GB of VRAM, tied to the memory bandwidth, than for the DDR3 on the machine to change according to the GPU. At least in my head (rough numbers sketched below).
Today they posted results from a GPU blog run by a dude who has absolutely no idea about performance and is basically just pulling numbers out of thin air. So wccftech and the rest are just posting whatever they find on the internet and writing a story about it mixed with their own opinions.
http://wccftech.com/nvidia-maxwell-800m-series-predicted-benchmarks-surface-compared-700m/ -
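For what it's worth, the 8GB/6GB figures also line up with how GDDR5 capacity scales with bus width: each 32-bit channel carries one memory chip (or two in clamshell mode), so a 256-bit card naturally lands on 4GB or 8GB and a 192-bit card on 3GB or 6GB. A small Python sketch of that arithmetic; the 4Gb chip density is an assumption (a common GDDR5 part), not something from the roadmap:
Code:
# Why VRAM capacity tends to track bus width on GDDR5 cards.
# capacity = number of 32-bit channels * chips per channel * chip density.
# The 4 Gb chip density is an assumed common part, not a confirmed spec.

def capacity_gb(bus_width_bits, chip_density_gbit=4, clamshell=False):
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * chip_density_gbit / 8   # gigabits -> gigabytes

for bus in (128, 192, 256):
    print(f"{bus}-bit: {capacity_gb(bus):.0f} GB normal, "
          f"{capacity_gb(bus, clamshell=True):.0f} GB clamshell")
256-bit in clamshell gives exactly 8GB and 192-bit gives 6GB, so the roadmap numbers at least hang together with the rumored bus widths, whatever one thinks of their usefulness.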
Robbo99999 Notebook Prophet
Ah, that's interesting Cloudfire. When I looked at it I initially thought that the link Octiceps posted was a bona fide article rather than just a post. What you say makes sense: it would be silly to say only 880M systems get 8GB of system RAM and 870M systems can only have 6GB system RAM - that wouldn't make sense. I tend to agree with you that it's referring to the amount of VRAM. Still, it was a good find by Octiceps; I wonder where they got the idea that it was just referring to system RAM. (Haha, the content of that whole article does mirror the discussions & conclusions we've come to in parts of this thread - makes you wonder whether the creator of that post really got his information from here!)
-
GTX 670MX: 960 cores @ 600 MHz, 192-bit, 24 ROPs
GTX 675MX: 960 cores @ 600 MHz, 256-bit, 32 ROPs
The 675MX is roughly 27% faster than the GTX 670MX.
Question is: is it because of the ROPs, or is it bandwidth starved, since they tested the cards maxed out at 1080p? -
Notebookcheck listed 800M cards today btw and referred to this thread, so they are indeed searching around for information.
NVIDIA GeForce GTX 880M - Notebookcheck.com Technik/FAQ
That update regarding DDR3, yeah, who knows where they got it. Second thoughts by the author maybe? Can't blame him, since the charts are a little confusing like octiceps says (GTX 880M +8GB) and not (GTX 880M 8GB GDDR5). 8GB VRAM is pretty hardcore compared to the Keplers we have today. Heck, many are even running on 2GB VRAM. And now 8GB?! I think the Chinese Clevo reseller along with this information nudges it toward 8GB VRAM. In my opinion. After all, we already have the K5100M with 8GB VRAM, and to me it seems strange to change the DDR3 according to the GPU. At least when it's the same notebook model featuring it. -
Notebookcheck is not accurate, as usual. Memory and core clock of the 675MX are about 13% higher than those of the 670MX; you need to consider this when comparing the cards.
670MX: 700 MHz memory, 601 MHz core
675MX: 900 MHz memory, 667 MHz core
For anything up to 1080p, 6 or even 8GB is utter nonsense and complete overkill (unless for some very specific workstation applications).
At 3K resolutions and more there are scenarios where the additional RAM might help, but the upcoming GPUs will lack the power for gaming at such resolutions with settings that would result in such high VRAM usage.
Expect another rebadge round, or if Nvidia goes crazy then we might see a cut-down GK110 chip, e.g. with 10 SMX enabled. Though that would be a novelty, a GK110 with a 256-bit bus. Currently I think that's the most likely option. -
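Putting rough numbers on that (and on the 670MX vs 675MX question a few posts up): with the clocks above and the bus widths from the earlier post, the 675MX has about 11% more core clock but roughly 70% more memory bandwidth, yet the quoted figure has it only ~27% faster. Quick Python sketch; the only assumption is that the listed memory clock is the command clock, i.e. the effective GDDR5 data rate is 4x that figure:
Code:
# Back-of-envelope check on the 670MX vs 675MX comparison.
# Clocks are the ones quoted in this thread; bus widths are from the earlier
# post. Assumes effective GDDR5 rate = 4x the listed memory clock.

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    # bits per transfer * transfers per second / 8 -> bytes/s, then to GB/s
    return bus_bits * mem_clock_mhz * 4 / 8 / 1000

gtx_670mx = {"core_mhz": 601, "mem_mhz": 700, "bus_bits": 192}
gtx_675mx = {"core_mhz": 667, "mem_mhz": 900, "bus_bits": 256}

core_gain = gtx_675mx["core_mhz"] / gtx_670mx["core_mhz"] - 1
bw_gain = (bandwidth_gbs(gtx_675mx["bus_bits"], gtx_675mx["mem_mhz"])
           / bandwidth_gbs(gtx_670mx["bus_bits"], gtx_670mx["mem_mhz"]) - 1)

print(f"Core clock advantage: {core_gain:.0%}")   # ~11%
print(f"Bandwidth advantage:  {bw_gain:.0%}")     # ~71%
print("Observed gap: ~27% (Notebookcheck figure quoted above)")
So if the 670MX were purely bandwidth-limited, a ~70% bandwidth advantage should show up as far more than a 27% lead; the gap looks like a mix of clocks, ROPs and bandwidth rather than bandwidth starvation alone.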
I thought about the GK110 option. There are several things that go against it in my eyes.
GK110, die size: 561 mm^2
Compare that with our 780M, which is 294 mm^2. Is there really any room for a die that size on current MXM boards? GK110 is almost double the size of a GK104.
I don't think I have ever seen a die this big in notebooks before.
The memory interface on GK110 is 384-bit. Our current MXM cards have a limit of 256-bit, right? Putting out a GK110 with 256-bit, wouldn't that basically mean having to rebuild the chip?
I heard rumors about a 1920-core GPU releasing soon (10 SMX) which could be our candidate, but it was said to have a 384-bit bus (GK110?) and 3GB VRAM (which conflicts with the 8GB information we have). -
GTX 880M
28nm GK110
10 SMX
1920 CUDA cores, 160 TMUs, 32 ROPs
671 MHz core, up to 820 MHz with GPU Boost 3.0
5400 MT/s memory
8GB 256-bit GDDR5 @ 172.8 GB/s
125 W TDP
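The memory numbers in that rumored spec are at least internally consistent: 5400 MT/s across a 256-bit bus works out to exactly 172.8 GB/s. Quick check:
Code:
# Sanity check on the rumored GTX 880M memory figures quoted above.
effective_mts = 5400      # effective GDDR5 data rate, mega-transfers/s
bus_width_bits = 256

bandwidth_gbs = effective_mts * bus_width_bits / 8 / 1000
print(f"{bandwidth_gbs:.1f} GB/s")   # 172.8 GB/s, matching the rumored spec
For comparison, the 780M's 5000 MT/s on the same 256-bit bus gives 160 GB/s, so this would be a modest bandwidth bump.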
-
Yes, I don't think that the ROPs matter a lot in this scenario, at least not at stock clocks in most games, where the fps aren't too high with these cards anyway.
I consider the die size argument invalid ever since the abomination called the 480M, which was more a cooking plate than a working GPU... but it actually got released and sold:
(thanks @SFVogt for the nice pic)
Die size roughly 530mm^2
The memory bus width can be adjusted with little effort.
I doubt that Nvidia is going to go the monster die way again after the 480M fiasco. Could be fun though. Besides, there was at least one mobile card with an even bigger chip, if I'm not mistaken. -
The embedded 860M shows 2GB VRAM where the GTX version shows 4GB. It's all silliness until someone gets hurt.
-
I don't think it's the size of the die itself that caused the 480M to run so insanely hot. I'm pretty sure the heatsinks and fans are big enough today to efficiently cool a die like GK110, especially when Nvidia disables so many cores. I'm pretty sure the 480M became hot because it was their first high-end Fermi card and they messed up the architecture and the scaling and such.
How can they change the bus width so easily btw? GK110 consists of 6 x 64-bit memory controllers to make up the 384-bit bus. I have never seen them disable memory controllers to make the bus width smaller on an existing chip, only disable SMXs. Here is the GTX 780 and here is the GTX Titan. Both have the memory controllers untouched.
Are you sure they can just disable a few of them without redesigning the core layout and such?
-
GTX 670MX is GK104, which should look like this.
That one has 4 x 64-bit memory controllers for a 256-bit bus, but the GTX 670MX is actually 192-bit.
Well, that leaves me a little more relieved. Maybe there will be a GK110 with 10 SMXs after all? Still hoping for Maxwell though, and since our 8 SMX (780M) is already touching the 100W limit, that GK110 better be efficient if they plan to -
Meaker@Sager Company Representative
Notice that the GTX 485M blasted past the 480M and no amount of tweaking could save the 480M.
The 780M would murder a cut-down GK110 forced into a 100-125W envelope and missing a third of its memory bandwidth and ROP performance.
If the GTX 670MX were so bandwidth starved it would scale linearly with memory clock and hardly at all with core clock, yet performance scales more with the core than with the memory. -
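That scaling test is easy to quantify if anyone wants to try it on their own card: overclock only the core, then only the memory, and compare the fps gain to the clock gain. A small Python sketch of the bookkeeping; the fps numbers are placeholders, not real measurements:
Code:
# Quantifying the core-vs-memory scaling test described above.
# A ratio near 1.0 means fps scaled linearly with that clock; near 0 means
# the clock barely mattered. The fps values below are placeholders.

def scaling_ratio(fps_base, fps_oc, clock_base, clock_oc):
    return (fps_oc / fps_base - 1) / (clock_oc / clock_base - 1)

fps_stock = 40.0
fps_core_oc = 44.0    # hypothetical: core 600 -> 660 MHz (+10%)
fps_mem_oc = 41.0     # hypothetical: memory 700 -> 770 MHz (+10%)

print("Core scaling:  ", round(scaling_ratio(fps_stock, fps_core_oc, 600, 660), 2))
print("Memory scaling:", round(scaling_ratio(fps_stock, fps_mem_oc, 700, 770), 2))
# High core scaling + low memory scaling => core-bound, not bandwidth starved.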
Nothing new here but thought I would share anyways. ^^
NVIDIA GeForce GTX 880M - NotebookCheck.net Tech -
Let's look at it from both historical and business perspectives. Nvidia never cannibalizes its top card; it rebrands a somewhat worse version of the top card and makes that worse version the 2nd best. Here is what I mean:
2011: GTX 580M is the top card
2012: GTX 580M gets overclocked, becoming the GTX 675M (top card goes 2nd)
2012: GTX 680M is released, a new top card...
Since there is a huge gap they re-brand the previous top card (580M) 2 more times as GTX 670MX and 675MX, only in Kepler
2013: GTX 770M re-brands the 2012 top card but worsens it by decreasing the bus from 256-bit to 192-bit and raising clock speeds (about a 7% decrease in performance)
Isn't the conclusion that the GTX 870M = a rebranded GTX 780M, worsened from 256-bit to 192-bit, with raised clock speeds to make up for it?
Why are we expecting much more? The tech perspective offers hundreds of possibilities, but business-wise it is looking grim for the customer... -
The 680M "pwns" the 770M real hard btw. -
The 670MX and 675MX have no relation to the 580M
The 770M has no relation to the 680M
This has been the worst GPU history lesson of the day, one false connection after another. Nothing from the past leads to the conclusion that the 870M will be a 192-bit version of the 780M. Nvidia has never done such a thing. -
-
CES 2014.
That is most likely when there will be official acknowledgement of the 800M series from Nvidia, especially since Clevo is featuring them in notebooks in February.
I remember CES 2012, when Nvidia revealed the GT 630M, their first Kepler card. Note that no other Kepler cards were announced.
The GT 640M through GTX 660M were announced in late March. The GTX 680M, their high-end card, was announced in early June. Just to give you guys a brief history of what happened with the Kepler launch back in 2012.
So keep an eye out for January 7-10th, guys!