Yes, 16GB does seem unnecessary, although I can imagine more than 8GB being used in a couple of years, but perhaps not a full 16GB! I don't think the 990M would be obsolete next year though - if it's close to desktop 980 performance then that's not gonna be obsolete next year - for 1080p gaming at decent settings you'd be good for the next 3 years I reckon.
-
Robbo99999 Notebook Prophet
-
16GB could definitely be a hint that this GPU is targeted toward 4K and Nvidia is trying to make it seem as exclusive as possible. At 2500 cores we are getting closer to 4K gaming. 4K notebook displays are still missing, but at least we got DSR.
Well, obviously that core count wasn't 100% accurate. I said the 980M had 1500 cores when it actually has 1536, and he responded with a similarly rough estimate of 2500 cores instead of 2560 or 2432.
But yeah, I don't see how 16GB VRAM will benefit anyone except Nvidia lol.
2432 cores should be about 60% faster than the 980M. Add some driver optimizations and we get roughly the 980M SLI-level situation the Hasee dude spoke about.
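(Quick sanity check on that figure, as a hedged back-of-envelope sketch: it assumes performance scales linearly with core count at equal clocks, which real games never quite manage.)

```python
# Back-of-envelope only: scale by core count, assuming equal clocks and
# perfect scaling (optimistic for real workloads).
gtx_980m_cores = 1536
rumored_990m_cores = 2432

ratio = rumored_990m_cores / gtx_980m_cores
print(f"{ratio:.2f}x the cores, i.e. ~{(ratio - 1) * 100:.0f}% more")
# -> 1.58x the cores, i.e. ~58% more
```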
Don't expect a Pascal chip of say 300mm2, like the GTX 680M was, to outright crush a 600mm2 chip like the GTX 990M (if that happens).
The last thing I've heard is that Nvidia is aiming to release the biggest Pascal chip first this time, which may not be ideal for notebooks. So mobile Pascal could arrive somewhere in June-October 2016, which may be why Nvidia is releasing the GTX 990M: to have something new to offer between October 2015 and then.
Last edited: Aug 9, 2015 -
Robbo99999 Notebook Prophet
I can't imagine 2500 cores in a mobile platform though, I can imagine the full GM204 chip with 2048 cores just like the desktop GTX 980. Certainly can't imagine 2500 cores at the same frequency as the 980M - would just run too hot & consume too much current I reckon. -
Well, it depends. If you design this chip for SLI laptops it would definitely be possible to keep the heat and power consumption in check, just not as an actual SLI configuration.
Also, don't forget the info we have from notebookcheck about the 990M having a variable TDP ranging from 100-200W. So the "regular" MXM 990M for existing notebooks could be a quite cut-down version of the "real beast" 200W TDP 990M, which would then require a new kind of pseudo-SLI notebook offering only one MXM slot (or half of the mobo being reserved for a gigantic soldered GPU *lol*) -
Ok, maybe not the full 1070MHz but perhaps 1000MHz. A desktop chip at those clocks should be in the 190W area, which is what the GTX 680 was.
We have the GTX 980Ti with 2816 cores, which is a 250W TDP GPU, but it
Yeah, I saw the article from notebookcheck. I'm still not sure if they got that information from their own sources or from this thread, because I don't remember anyone here saying the chip was only for non-MXM.
The strange thing is that I have spoken to some people who said the 990M is indeed MXM and have tested it, and that Clevo has MXM versions of it. It would be very strange to offer the 990M as a full-fledged soldered version and also as a cut-down MXM 990M, wouldn't it? That would be an entirely different GPU and should have a different name.
The rumors about this chip are just all over the place. I've yet to come across any benchmarks of it or any software with support for the GPU, so I have no idea which way to lean. There are many scenarios that can happen here, so I don't know yet -
I forgot to say that the codename "NVIDIA E-GXX", which notebookcheck refers to as the GTX 990M in the article, is probably the Tesla M60. Their sources say it's a GM204, which is probably 100% correct, but I highly doubt it's the GTX 990M.
AIDA64 just added support for the Tesla M60.
"E-GXX" is a codename I have never seen before, and that's because mobile has never had a Tesla chip before.
The codename for the GTX 990M is probably "N17E-GX" like CEG says. Nvidia always uses these standard codenames and goes up a number (the 980M was N16E-GX) -
Robbo99999 Notebook Prophet
Yeah, you're referring to the point you made in the very first post of this thread: that we got a 1536 core 780M from a 190W 1536 core desktop 680 - so from that angle it makes sense. It's just that if you extrapolate the other way, from what we currently know about the current draw and temperatures of the 1536 core 980M, I can't see us reaching a whole 2500 cores, not without SEVERE frequency cuts. A full 2048 core GM204 extrapolates more sensibly up from the 1536 core 980M that we know & love (haha!) now. -
The 990M will be a disaster if it has that many cores. Manufacturers already supply inadequate PSUs for their products. Just imagine laptops with this thing: they will require more than 300W to run. What a waste... It doesn't even seem logical or possible at this point. Computers are getting thinner and lighter.
It's going to be a nightmare... Very few systems will probably have this, if it even exists.
Pascal is not coming in October 2016, lol. A Pascal GPU may be released around this time, yes. But that is not when mobile Pascal is going to be introduced. I would bet my life on that. Intel sets the path for launches because they are the leader in this industry. As Intel has basically skipped Broadwell and gone straight to Skylake, graphics manufacturers must also skip and move on. That means a good estimate of when NVIDIA is going to launch Pascal would be Q1/Q2 2016, after Intel officially launches at CES 2016.
Last edited: Aug 9, 2015 -
At this point in time not even Nvidia really knows when Pascal will actually be available...
Mr Najsman likes this. -
I've predicted several launches based on scheduling. It's definitely plausible.
Companies launch at events like CES, PAX, etc. Find those dates, factor in quarterly earnings reports, and you have a good estimate. -
How much are you willing to bet that a mobile Pascal GPU will absolutely arrive before October 2016? I'm not saying it won't, but you seem extremely sure about it all.
GP100 has taped out and is headed for a Feb-March release. Probably not ideal for mobile. How do you know when a GP104 arrives? I said somewhere in June-October, and I think that sounds perfectly reasonable.
Well, we know that a GTX 980 consumes 12W less than a GTX 680. And TSMC just got a new 28nm process called HPC that gives you 20% less leakage and a 10-15% performance boost.
Perfect for reducing TDP and increasing performance over the 980M, ey?
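Just to put a hedged number on what "20% less leakage" might mean in watts: the sketch below assumes a few different leakage shares of total board power (pure guesses, not TSMC or Nvidia figures) on a roughly 980M-class TDP.

```python
# Hypothetical illustration only. Leakage shares are assumptions;
# the 20% reduction is TSMC's 28HPC marketing claim quoted above.
tdp_watts = 125            # roughly 980M-class board power (assumed)
leakage_reduction = 0.20   # claimed 28HPC leakage improvement

for leakage_share in (0.2, 0.3, 0.4):
    saved = tdp_watts * leakage_share * leakage_reduction
    print(f"leakage at {leakage_share:.0%} of total -> ~{saved:.0f}W saved, "
          f"~{tdp_watts - saved:.0f}W effective")
```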
http://www.edn.com/design/integrate...ct-How-to--Fully-utilize-TSMC-s-28HPC-process
Mr Najsman, Robbo99999 and jaybee83 like this. -
I said I'd bet my life on it. It won't be introduced as late as October. That's absurd.
Q2 2016 is the latest. Q1 2016 is the most likely, considering they've already begun production. -
I think it's still worth addressing this concern. What does everyone else think about power consumption with such an insanely unrealistic chip?
Last edited: Aug 9, 2015
-
GTX 980M launched in October 2014.
GP100, which is probably not suitable for mobile, has begun production, not GP104.
The GTX 980M didn't launch during these big events you talk about. Neither did the GTX Titan X. Nvidia does its own thing now.
You seem extremely sure about something you don't seem to have the full picture of.
-
That's based on last year's launch schedule. It was a wild card. Intel is back on track now.
Look further back. Forget about last year, it was ridiculous. -
We will see, J.Dre. Just don't be so certain about these release cycles. They rarely go the way we plan. TSMC is steering the ship. It's like being on board with an unreliable drunk captain.
TomJGX likes this. -
It's more legitimate than your "chat" with Clevo that I replicated in MS Paint in a few minutes.
Why? Because you say one thing, it must be true, but if someone else says something, it must be wrong? You are an entitled individual. -
Where did I say you were wrong and I was right?
Enough of this chitchat for today I think. Have fun speculating -
Implications all over the place.
It is all speculation and you see it as fact. If someone else suggests something else, you take it personally. -
Robbo99999 Notebook Prophet
Well, in that case, with the improvements in efficiency of the 28nm process you quoted, combined with maybe some aggressive binning, they could perhaps just stretch to 2500 cores. I don't see why they need to though, because AMD's not doing much in the mobile sector, so why bother releasing a 2500 core Maxwell mobile monster when they could keep it simple with a full 2048 core GM204?! -
To get rid of GM200 chips with damaged cores that they can't use for the GTX 980Ti/GTX Titan X.
As of now they have zero products for that.
Perhaps the GTX 990M has 2560 cores/256-bit, disabling one entire GPC with 4 SMMs? Or one additional SMM from another GPC on top of that?
For those who don't know, this is the full GM200, i.e. the GTX Titan X. The GTX 980Ti has 2 SMMs disabled.
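For reference, the numbers being thrown around fall straight out of GM200's block layout (128 cores per SMM, 4 SMMs per GPC, 6 GPCs); a quick sketch:

```python
# GM200 block math: 6 GPCs x 4 SMMs x 128 cores = 3072 (full Titan X).
CORES_PER_SMM = 128
SMMS_PER_GPC = 4
GPCS = 6

full_gm200 = GPCS * SMMS_PER_GPC * CORES_PER_SMM          # 3072 - Titan X
gtx_980_ti = full_gm200 - 2 * CORES_PER_SMM               # 2816 - 2 SMMs off
one_gpc_off = full_gm200 - SMMS_PER_GPC * CORES_PER_SMM   # 2560
one_gpc_and_one_smm_off = one_gpc_off - CORES_PER_SMM     # 2432

print(full_gm200, gtx_980_ti, one_gpc_off, one_gpc_and_one_smm_off)
# -> 3072 2816 2560 2432
```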
Last edited: Aug 9, 2015 -
The 990M is going to be amazing. Unfortunately, my 230W PSU won't be able to support all 2560 of those cores at 200W TDP.
Gonna have some serious power draw issues with this card. Dual-PSU modification required! Forget portability... -
Robbo99999 Notebook Prophet
I see what you mean by using up 980Ti and Titan X 'rejects', but didn't you say this was all possibly going to be based on that new efficient 28nm TSMC process called HPC? If that's the case then they would be newly manufactured chips rather than 980Ti/Titan X rejects. So isn't that a bit of a contradiction?
Mr Najsman likes this. -
Wouldn't it also make more sense to produce something like a desktop 975 instead of a monster GM200 mobile chip? Can't wait for this thing to come out.
Sent from my Nexus 5 using Tapatalk -
I'm not sure why you are so sceptical, J.Dre.
We had 230W notebooks that could run the GTX 880M.
A 990M with 2500 cores shouldn't be any different. -
Skepticism comes when something is difficult to believe.
What I'm unsure of is your excitement about the 990M (or anything Maxwell) with a new architecture, stacked memory, NVLink, 1TB/s+ bandwidth, etc. on the horizon. I've seen enough of 28nm, haven't you? It's time for some real gains and improvements... Let's bring out the big guns.
Forget Maxwell. It's old, dried up, and rotting away. Let the past be past. It's time to rejoice in the birth of Pascal.
Last edited: Aug 9, 2015 -
16GB of VRAM? Geez. The most I've been able to crank out in light video editing is 5.5GB of VRAM, but again, that was only 1080p video with a few effects on a short video. I'm sure 8GB could be hit easily with 4K video and more complex projects. I'd be game to try to push and use up the 16GB of VRAM. Maybe I'd start trying 3D modeling and CGI work lol. I'd definitely get a 4K screen with this GPU.
On a side note, since the power consumption is probably going to be crazy, does that mean 990M SLI is completely out of the question? -
The last time nVidia tried to turn a 384-bit card into a mobile 256-bit card we got the disaster that was the 480M. Yes, I know: process improvements, 28nm is very mature, Fermi bad, Maxwell good. But I'm just sayin'.
TomJGX likes this. -
moviemarketing Milk Drinker
If you build it, they will come. If we make a thread about it, maybe they will build it?
Quick, someone post a new topic about AMD launching a mobile version of the R9 Fury X.
Mr Najsman, TomJGX, Robbo99999 and 1 other person like this. -
"And it will be a disaster again because we never learn. But don't worry, we've got driver support." -NVIDIA
I want NVIDIA to make a sticker of this that falls out of the packaging when we unbox their GPUs.
TomJGX, n=1 and Robbo99999 like this. -
I might be the only one thinking of this, but (assuming this is true, since we have skeptics, which is fine) why wasn't MSI mentioned? Aside from Clevo, they're the only ones selling laptops with MXM slots. Alienware being mentioned is odd, but there's not much to say about it.
(I think ASUS' laptops are soldered only, but I will double check since I don't know for sure.) -
So much speculation here. May as well add my 2c.
I fully expect the 990M to simply be a GM204-400 (2048 shaders) core downclocked to ~1000MHz. Doing this on a desktop GTX 980 already brings it down to that magic 130W region. Also keep in mind that the cooling power budget can be removed from the GPU power requirement, since that is system controlled in a laptop. The TDP (thermal requirement) remains the same, but the power budget for the GPU effectively increases, assuming there is some sort of hard limit of ~130W on MXM.
This should put it in the region of 130W TDP, which Nvidia has definitely done before (i.e. the 880M).
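A crude way to sanity-check that 130W figure, assuming dynamic power scales roughly with frequency x voltage squared; the baseline is the desktop GTX 980 (165W, ~1126MHz base), but the voltages below are guesses, so treat the output as illustration rather than measurement.

```python
# Rough dynamic-power scaling: P ~ f * V^2. Voltages are assumed values,
# not published specs; baseline is the 165W / 1126MHz desktop GTX 980.
base_power_w = 165.0
base_clock_mhz = 1126.0
base_voltage_v = 1.20      # assumption

target_clock_mhz = 1000.0
target_voltage_v = 1.05    # assumption: lower voltage at the lower clock

scale = (target_clock_mhz / base_clock_mhz) * (target_voltage_v / base_voltage_v) ** 2
print(f"Estimated board power: ~{base_power_w * scale:.0f}W")
# -> ~112W with these assumed voltages (leakage etc. ignored)
```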
If TSMC's HPC upgrade actually delivers what it promises, then a fully clocked GM204 core may even be possible.
In more crazy-think, it may be a test of HBM (or HBM-like RAM). HBM could theoretically drop the TDP enough with an underclock, as well as reduce the overall space required (rather significantly, actually). Or even better, maybe it's an early mid/high range Pascal chip, similar to how GM107 was released as the 750Ti and 860M long before the full GM20X chips came about - effectively testing the waters and seeing how efficient the new chip was. But that's crazy talk.
moviemarketing likes this. -
That's what the thread is for, crazy-talk.
Sent from my Nexus 5 using Tapatalk
TomJGX likes this. -
http://forum.notebookreview.com/threads/on-the-subject-of-maxwell-gpus-efficiency.773102/
Take that 130W with a massive grain of salt. Especially when using higher resolutions that ramp the GPU a lot harder, like half of these silly 4K gaming laptops. -
Robbo99999 Notebook Prophet
Haha, tell that to the folks here!:
http://forum.notebookreview.com/threads/windows-10-upgrade-warning-for-alienware-owners.779449/ -
Robbo99999 Notebook Prophet
Yep, so you're saying that the 130W would be an average over time, and due to it being Maxwell it has massive variations in current load, with very short maximum peaks over 200W for example. From a laptop cooling perspective I suspect the average value is the number that determines whether the system can handle the cooling, but how that translates to laptop PSUs being able to cope with massive spikes I don't know. What are the implications of what you're saying for your speculations on the upcoming mobile Maxwell 990M? -
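To make that average-versus-spikes distinction concrete, here's a toy trace with made-up numbers: brief 200W+ peaks can coexist with a ~125W average, which is what the cooling mostly cares about, while the PSU has to survive the peaks.

```python
# Entirely synthetic power trace: mostly ~115W with brief spikes to 215W.
# Shows how average draw (cooling) and peak draw (PSU) diverge.
samples_w = [115] * 90 + [215] * 10   # spiking 10% of the time

average_w = sum(samples_w) / len(samples_w)
peak_w = max(samples_w)
print(f"average ~{average_w:.0f}W, peak {peak_w}W")
# -> average ~125W, peak 215W
```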
Well, it can be a big deal if it spikes TDP for a short while, but long enough to overdraw a power brick, though that's probably speculation on my part.
The REAL issue is that the lower TDP only holds if the voltage keeps clocking down low... higher resolutions and heavier loads can force the voltage to stay higher, or the GPU to work harder. For example: try running a game with unlocked FPS. Make sure your GPU hits 99%. Then use nVidia DSR or in-game supersampling to set the game to 4K resolution and see how much hotter the card gets. The problem is there may come a point where the voltage needs to remain high so often that 130W won't be the limit, but rather something closer to 150W, or 160W, or even higher. Couple that with a 180W PSU (or even a 240W PSU) and some problems might show up. It'll also likely mean that SLI notebooks will be wholly out of the question, unless Clevo somehow gets a 660W single PSU. -
Robbo99999 Notebook Prophet
I think the Maxwell microsecond voltage regulation is independent of GPU load to some extent - I think it happens even during what GPU-Z would describe as, say, 99% GPU load - that's what I remember seeing in some Maxwell hardware reviews on Guru3D and elsewhere. So I don't think gaming at high GPU loads & high resolutions needs to be taken into consideration when talking about the GPU consistently consuming more power than it is normally rated for (e.g. a 130W limit). (Talking gaming loads, not Furmark or something.)
EDIT: gaming load power fluctuation in Watts is seen in this review (still the large microsecond fluctuations in current that allow Maxwell to keep average power consumption low):
http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-12.html
Last edited: Aug 10, 2015 -
Bumping resolutions (and potentially DX12's workings) might force the GPU to work harder, however. Remember: the voltage goes up and down as the GPU is able to relax, like micromanagement. If the GPU cannot relax as often due to higher resolutions or being used in a different way, then its voltage remains higher and it draws more power. And as with the piece I linked, n=1's card has gone well past its 165W (even its 195W) limit when voltage needed to remain high, and it usually stays above that with his vBIOS mod keeping voltage constant.
I think designing a Maxwell card to aim for a 130W average under heavy gaming load can cause problems, especially at higher resolutions. -
Nevermind.
-
Robbo99999 Notebook Prophet
Maybe.
When it comes to DX12, that could perhaps increase efficiency; I'm not that convinced by the increased resolutions argument. I think the bottom line is that if they design a card to be 130W on average then it's actually going to be 130W on average (unless overclocking). I might be concerned about those massive microsecond spikes in current, but again that will be designed into the system - maybe capacitors can cover those load spikes, I'm not an engineer. All that being said, I don't expect more than a 2048 core GM204 anyway - and that's probably only going to be in laptops with decent cooling systems too. -
Just saw this interesting thread while googling and decided to join the forum. I haven't been able to read it all, but since these are speculations: do you think that with the GTX 990M a new mobile Quadro will come to replace the K5100M? I wanted to pick up one of those Dell M6800s but don't want to get caught out and miss the new GPU. Maybe a release this year?
-
You guys are going to be celebs at this rate: http://wccftech.com/nvidia-geforce-mobility-gtx-990m-q4-2015-faster-than-gtx-980/
So much incorrect info in there. I think Usman needs to read the thread properly. -
lol
He didn't even spell NVIDIA properly.
-
moviemarketing Milk Drinker
Certainly hope so! Despite being overpriced, mobile workstation cards from the past few years suck pretty bad compared to gaming cards for most creative applications. -
So does that explain why I have a ton of driver crashes when I overclock my 880M and my 4810MQ at the same time and try to game on a 240W PSU?
-
The rumours hath spreadeth fast:
Sent from my Nexus 5 using Tapatalk
Phase and moviemarketing like this. -
The MSI GT72 Dominator Pro comes with a 300W power supply, so that might be able to handle the spikes of the 990M. Not too sure if MSI is going to be offering upgrades for that card at any reasonable price any time soon.
-
Where? All I ever saw about it showed 230W.
-
Seeing as the ZM series keeps the 980M ICE-COLD even with a 20% OC on both the GPU and vRAM, I see no reason why it shouldn't be able to take it.
Albeit with the CPU at stock, most probably.
Mr Najsman likes this.
nVidia 2015 mobile speculation thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.