I can see something being released around Computex, but not Q4 2015. That would be so close to Pascal. They've already delayed Pascal twice, I think.
-
Releasing this monster will mean they'll have to make a real performance jump with Pascal. -
What if Nvidia knows more than we do about what AMD is working on and will stay ahead?
-
Too small a performance increase over the GTX 980M would be underwhelming, I think. AMD has been lagging so far behind Nvidia that something must happen soon. I believe Nvidia is also aware of this. AMD has promised an increased focus on high performance now, so who knows?
Last edited: May 12, 2015 -
Just why?
-
http://forum.notebookreview.com/thr...deon-m300-series-notebook-video-cards.775636/ -
-
I loved my old mobile HD 5870, that was what, 4-5 years ago? It was relevant for years and kept up well. We just need them to find the magic again and I'll gladly jump ship to keep Nvidia honest.
-
AMD hasn't even responded to NVIDIA Maxwell, yet. That's what they plan to do this June at Computex. Don't expect anything amazing* from AMD this year (in mobile). They fell behind years ago, and have stayed behind NVIDIA, which leads to re-branding on both sides.
*By "amazing," I mean a new architecture or something better than Maxwell. It's highly unlikely. Amazing in regards to AMD (in general) would be something as measly as proper driver support.
In other words, I agree with @Ethrem: it doesn't make sense. Why hurry? It's not like anyone is forcing their hand or competing on their level. -
New cards from a new architecture are pretty much as fast as SLI with the top cards of the previous architecture.
GTX 680M (Kepler) was as fast as GTX 580M SLI (Fermi). Much faster than 480M SLI (Fermi).
GTX 980M (Maxwell) was almost as fast as GTX 780M SLI (Kepler) and slower than 880M SLI because Nvidia lowballed mobile with Maxwell.
A new architecture like Pascal will be vastly more efficient than Maxwell; plus they will introduce stacked VRAM and it will be built on 16nm. Trust me, their first Pascal chip will be much better than 980M SLI. -
-
If I had to guess, assuming the 990M is real and has more cores, it should perform about 30% better than the 980M.
GTX 980 has 2048 Maxwell cores. The Titan X has 3072 Maxwell cores. The Titan X performs about 30% better than a GTX 980.
NVIDIA follows trends when it comes to mobile vs desktop graphics. They can easily push another 30% out of mobile Maxwell to still top AMD's response to Maxwell, and that's exactly what I expect of such a company. Not to mention, 30% actually makes perfect sense, considering Pascal is due soon thereafter. -
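The core-count arithmetic above can be sketched as a quick back-of-the-envelope model. This is not NVIDIA's actual scaling behaviour; the 0.6 efficiency factor is a hypothetical fudge chosen only to illustrate why 50% more cores can land at roughly 30% more performance:

```python
# Rough, hypothetical estimate of performance scaling from core counts.
# Real GPUs scale sub-linearly with shader count (clocks, memory
# bandwidth, and ROPs also matter), so apply an efficiency factor.

def estimated_speedup(cores_new, cores_old, scaling_efficiency=0.6):
    """Naive model: speedup = 1 + efficiency * (core ratio - 1)."""
    return 1 + scaling_efficiency * (cores_new / cores_old - 1)

# GTX 980 (2048 Maxwell cores) vs Titan X (3072 cores): 50% more cores,
# but the observed gain discussed above is closer to ~30%.
print(round(estimated_speedup(3072, 2048), 2))  # -> 1.3
```

The same toy model would put a hypothetical 2560-core "990M" at roughly 15% over a 2048-core part, which is why the exact core count matters so much to the speculation here.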
Meaker@Sager Company Representative
The Titan X also has more ROPs and a 50% wider memory bus. There is not really much more memory bandwidth headroom in the 980M; it would all have to be core horsepower.
-
I guess the crazier thing was the 480M. It happened, it was a disaster, and nVidia never bothered going that route again. -
Saw a video from J. Dre yesterday. The guy didn't even hit 60°C in BF4 at 96% scaling, ultra preset, unlocked FPS. My cards hit over 60°C in SLI getting like 50% usage on each, and I ain't even at ultra AND I limit my FPS to 125. -
Oh how I've missed the Nvidia GPU speculation, love it.
Maybe Nvidia fully realise the impact that VR will have towards the end of the year / start of next, and are releasing a mobile beast to power it. -
Even if some systems can handle it, I wouldn't want a card that hot, just because... Better off waiting for Pascal. It will be a tiny, little monster that runs cooler than anything today. Either that or it will run the same temps as Maxwell but output twice the performance (due to the shrink in the process).
-
Firstly, I think that this '990M' will be called something else (either 1080M or something entirely different), as NVIDIA have never gone over a year without refreshing their mobile lineup with a new branding/naming scheme. Secondly, I think it's going to have 2048 cores and be a GM204 chip, as that just makes more sense to me: such a hypothetical GPU would already offer a sizeable improvement over the 980M and already has a mobile-friendly 256-bit memory bus. I think the story of mobile Kepler will play out again (680M to 780M, both GK104).
I hope I'm wrong, but I'll be very surprised to see a GM200 chip in laptops.
Edit: Either way, I don't think I'll have enough willpower to resist getting a laptop equipped with one of these and Skylake. Too much temptation. Can't ignore. I'll be kicking myself when it gets blown away by Pascal the following year, but maybe by then I'll have a full-time job (one can only hope!).
Last edited: May 13, 2015 -
Meh, worst case it should be a 2048-core GTX 990M anyway.
That's 20-30% better performance. A solid boost, too. -
I just wonder whether a refreshed P65xSG from Clevo will be able to handle these higher powered chips. -
Yeah, say a 25% boost would be decent. Still won't convince me to return to notebooks though. -
-
980MX is more likely, in my opinion. They've already used "MX" once. But the information in the main post suggests 990M.
-
I hear you there. I'm probably not gonna discard my notebook till I can't use it anymore, but I wouldn't jump back into things unless a truly amazing notebook comes out. An SLI version of a P770ZM with a 400W-450W brick, Skylake, and two full GM204 chips WITHOUT low-voltage vRAM and WITH proper amounts of VRMs, dual-PSU compatibility, and even 120Hz screen compatibility? THAT would make me feel enthusiast laptops aren't dead. -
-
Please not an 880M repeat... Just make it a better 980M.
-
Dem Photoshop skillz
-
Why is the vendor ID 10DS? NVidia is 10DE.
-
Why use a 2048-shader cut-down GM200 instead of a full fat GM204?
Why is all the info filled in despite the GPU not being in GPU-Z's database?
Smells fishy, eh?
Can't believe Cloudfire has pulled people's legs for 14 pages and counting. -
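For what it's worth, the vendor ID alone gives the screenshot away: PCI vendor IDs are 16-bit hexadecimal values, so "10DS" can't even be parsed as hex, and NVIDIA's real ID is 0x10DE. A minimal sketch of that check (the helper function is hypothetical; the IDs themselves are real):

```python
# PCI vendor IDs are 16-bit hex values; "10DS" cannot be one, since
# 'S' is not a hexadecimal digit. NVIDIA's real vendor ID is 0x10DE.

NVIDIA_VENDOR_ID = 0x10DE

def is_valid_nvidia_id(vendor_str):
    """Return True only for a well-formed hex ID matching NVIDIA's."""
    try:
        return int(vendor_str, 16) == NVIDIA_VENDOR_ID
    except ValueError:  # non-hex characters like 'S'
        return False

print(is_valid_nvidia_id("10DE"))  # True  - genuine NVIDIA vendor ID
print(is_valid_nvidia_id("10DS"))  # False - 'S' is not a hex digit
```

So the screenshot fails the most basic sanity check before you even get to the die size or the pre-filled GPU-Z fields.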
I wonder if Cloudfire's walk is done... -
Last edited by a moderator: May 14, 2015
-
I... wanted... to... believe... why couldn't you let me be happy for more than a moment!
-
Then I saw the die size and release date and was like "I wonder how many will fall for this."
-
You know, I just went back to the "source", and nowhere does the info in Cloudfire's phone pic (not saying it is his phone), the SLI bit, show up anywhere in the source. Not even the time.
I have to ask: did anybody look back at the original source, or did people just blindly believe this?
Last edited: May 13, 2015 -
nVidia 2015 mobile speculation thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.