The best thing about a hypothetical 990M: lower prices on the 970M and 980M.
-
-
-
But to be honest, I don't know and am afraid to guess. According to rumors, the 990M is supposed to be almost twice the cost of a traditional top-end GPU. So, it may actually have no direct effect on prices. -
-
I don't think anyone is stupid enough to buy an 880M over a 980M, unless you were also joking, lol. -
The only way it's true is if we're looking at some kind of soldered monstrosity, the kind that would force MSI/Clevo/Alienware to build custom machines to handle it.
So far Notebookcheck's "inside sauces" seem the most grounded in reality:
1. Based on GM204
2. Variable TDP so manufacturers can set their own performance targets
3. At least the possibility of soldered models, which would facilitate custom cooling solutions for the high TDP versions.
The variable TDP can make sense if you look at it a different way. It's not that the 990M you get will have a variable TDP; the card you get in, say, a Clevo will be set at a fixed TDP.
It'd be Nvidia saying to the manufacturers, "decide how fast of a card you want to sell in your notebook." Only want a 100W 990M? Sure thing. Want to go balls to the wall with a soldered 185W monster? Okay, build a custom motherboard for it, and it's yours.
That's the closest to realistic we can get. -
Variable TDP would be a marketing nightmare. Say there's a 990M in a Clevo that's 100W. Then there's a 990M in an MSI at 185W. Obviously a massive performance difference.
What on Earth justifies these GPUs being named the same with such a colossal performance difference? Nothing. They must be given different names or else all hell will break loose.
"Why is my 990M performing only half as much as this other one?"
"Why can't I play this game with a 990M, this other person managed it with theirs?"
"Is my 990M broken?"
It's stupid. Plain stupid. They already avoided this trap in the past by selling GPUs such as the 650M and 660M under different names despite the underlying chip being the same, just with a variable TDP.
And how would OEMs market their products?
"Our new premium notebook lineup comes with the top of the line GTX 990M GPU!.. Uhhh, we mean the good one, not the crappy throttled version."
"We're pleased to announce our new notebook lineup, featuring the brand new 990M GPU from NVIDIA. We've throttled this beast down so severely that it performs only half as well as our competitors. Enjoy!"Mr Najsman, jaybee83, Robbo99999 and 1 other person like this. -
Agreed. To have multiple GPUs with differing performance targets called by the same name is as much of a stretch as mobile GM200. Probably more of a stretch, because the former is unprecedented while there actually was a mobile card based on Big Fermi.
-
With such a vast potential performance range, the name 'GTX 990M' would lose all meaning. That's not what you want out of a flagship product. You want to preserve a strong, recognisable brand name.
At least, if NVIDIA has any sanity left. -
Nvidia had no sanity to begin with, if you look at its entire mobile history. I also don't think Nvidia would give a damn about the problems the notebook manufacturers would have marketing the different TDP specs. They just want the money.
To be clear, the variable TDP is the implausible part of what Notebookcheck's sauces had to say; I'm just providing an explanation for how it could be true. I don't actually believe it. -
Honestly, knowing nVidia, if it was truly a variable TDP, they'd just build multiple SKUs out of it and milk it for all it's worth.
-
-
Why would you all upgrade if 780M, 880M, 970M and 980M SLI are already destroying any game on the market right now and will probably continue doing so for the next 3 years? It boggles my mind. Plus, the 990M is just pure speculation and could just as easily end up like the 880M fiasco.
-
780M, 880M, 970M, 980M cannot even deliver a steady 60fps at 1080p in Witcher 2 (a game from 2011) at max settings (ubersampling), so you can probably forget about Witcher 3 maxed.
980M SLI is faster than the desktop 980, but some game engines can't fully utilize SLI.
In my case, however, I'm overdue for an upgrade and I don't need to run everything at max settings. If I already owned a laptop with a 970M or 980M I would probably wait a couple more years before upgrading. -
-
-
^You know 8K is 4x the number of pixels of 4K, right?
-
4K maxed out on GTX 990M SLI? Even Titan X SLI struggles with this; it does around 55fps in The Witcher 3 maxed out. That's $2000 for the cards alone. 4K gaming is currently a joke in my opinion: today's hardware is not up to the task, let alone future-proof for, say, two years from now with more graphically demanding games at that resolution. I'm just no fan of 4K when it comes at that cost, happy with 1080p for quite some time. Maybe Pascal will change that.
-
Guys, I heard that the GTX 990M does NOT have an MXM version and is soldered onto the mainboard of the new laptops. Alienware users like us won't have a chance to upgrade to the GTX 990M.
-
-
I hope it will have an MXM version so Alienware owners can upgrade.
-
I want to smoke whatever you are smoking. -
Ok, I'll play this game one more time and waste more time. But it will be the last time.
GTX Titan: 2688 cores @ 992MHz. 384bit, 12 memory controllers
GTX 780: 2304 cores @ 1006MHz. 384bit, 12 memory controllers
2688 - 2304 = 384 less cores and slightly higher clock = 24W less power consumption
Now imagine a cut down GM200 chip scenario:
GTX Titan X: 3072 cores @ 1215MHz. 384bit, 6 memory controllers @ 1750MHz
Cut down GM200: 2432 cores @ 1100MHz. 256bit, 4 memory controllers @ 1500MHz
640 less cores and 115MHz LOWER clock.
Let's examine that a little closer:
640 less cores instead of 384 less cores is 67% more reduction.
24W * 1.67 = 40W reduction.
Now that is if they are clocked the same. But they are not in the GM200 example. It is running at a 115MHz LOWER clock.
Easily 60W lower. Probably more since we are dealing with 2432 cores here. Plus we have 2 disabled memory controllers and lower GDDR5 clocks.
Let's use your Anandtech slides again.
Look at GTX Titan X:
392W. Remove 60W from our cut down GM200:
332W.
Let's look at GTX 680 power consumption again:
GTX 680:
356W
Cut down GM200:
332W
And it just happens that we got GTX 880M from GTX 680...
Now we are even better off. We are 24W lower.
Are you done now?
I am. If you don't want to believe it, fine. But every single piece of evidence points towards a mobile GPU based on GM200 with 2432 cores running at 1000MHz being perfectly plausible in every single way.
If you want an easier example you don't even have to go further than the GTX 980, which has a 165W TDP. That's a full GM204.
Compare that with a GTX 680 (880M), which had a 195W TDP.
There is 30W of room there for more cores (195W - 165W).
My calculation above between GTX 680 and GM200 is 24W. Yeah, exactly.
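For anyone who wants to sanity-check the arithmetic, here is the same scaling argument as a quick Python sketch. The per-core scaling, the extra allowance for the lower clock and memory changes, and the AnandTech total-system power figures are all taken from the posts above; treat it as a back-of-the-envelope estimate, not a measurement.

```python
# Back-of-the-envelope power estimate for a hypothetical cut-down GM200,
# following the scaling argument above. All wattages are total-system
# AnandTech numbers quoted in this thread; linear scaling is assumed.

# GTX Titan (2688 cores) vs GTX 780 (2304 cores): 384 fewer cores ~ 24 W saved
watts_saved_per_384_cores = 24.0

# Hypothetical cut-down GM200 drops 640 cores relative to the Titan X
core_savings = watts_saved_per_384_cores * (640 / 384)   # ~40 W

# Extra allowance assumed for the 115 MHz lower clock, two disabled
# memory controllers and lower GDDR5 clocks ("easily 60W lower" in total)
clock_and_memory_savings = 20.0

titan_x_system_power = 392.0   # Titan X, Crysis 3
gtx_680_system_power = 356.0   # GTX 680, the chip that became the 880M

cut_down_gm200 = titan_x_system_power - core_savings - clock_and_memory_savings
print(f"Cut-down GM200 estimate: {cut_down_gm200:.0f} W")       # ~332 W
print(f"GTX 680 for comparison:  {gtx_680_system_power:.0f} W")  # 356 W
```

On those assumptions the cut-down GM200 lands below the GTX 680's system draw, which is the whole basis of the "the 880M came from a 680, so a mobile GM200 is plausible" argument.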
I think I made my point pretty clear here, although I'm pretty sure there are some knuckleheads here that still won't listen and understand.
I'm not gonna respond to GM200 vs GM204 anymore. I have better things to do.
It remains to be seen what Nvidia will do. It's up to them. Both GM200 and GM204 are possible.
Ultimately it's up to whatever Nvidia deems most financially sensible for them. I already made some remarks on why I think GM200 could happen and why GM204 could happen.
/discussion -
-
Would love the GTX 990M to be Pascal, though I doubt it very much.
I don't have the energy to do it. Nor am I as invested in this forum as I used to be.
Cheers -
Is that a concept drawing or the actual system? Looks nice!
I see you trying to ignore my posts, @Cloudfire. Little boy is angry. I'm sorry. -
That's how the Asus G752 will look. It's their upcoming Skylake gaming notebook. The whole backside looks like a vent. Asus knows their cooling, that's for sure. -
Looks like the symbol I use sometimes to execute database queries. -
That G752 (real or not) looks gorgeous!
-
-
-
-
@Cloudfire Have I ever seen you on Baidu Tieba? Notebook bar? Graphic card bar and other places?
-
That's how the G752 looks. The guy who posted the picture told me about this last year. I have been waiting for it.
-
That Asus G752 looks gorgeous, though the Aorus X7 looks better.
-
Cloudfire makes excellent points... And with Dell and MSI dropping the ball, expect Clevo to capitalize on that...
FYI, if Clevo made Foxconn actually do their jobs and test for flatness, they would have heatsinks capable of 150W between their fans and open-grille mods. Clevo is always called loud for a reason... their fans move over 20% more air than any of the competition... They're just lazy. -
Don't know if this guy is telling the truth, but apparently the 990M will support MXM and cost around 1600 USD EACH. -
Any leaks on the specs of the Skylake mobile quad cores?
-
It most likely will be soldered. A select few manufacturers have probably expressed interest in MXM, but nVidia may very well pull a first-gen Maxwell (860M) move where they demand nothing but solder... Those machines, which were their test market, flew off the shelves... But Clevo isn't a pushover or a small player, so it should be interesting.
-
I agree with Ethrem. Few will be MXM, like Clevo. The rest, soldered.
Aren't the ASUS ROG series laptops soldered these days? -
That was a sexy Asus laptop. My first 2 laptops were Asus. Loved them. But when I got a gaming laptop, I changed to this Sager. An ROG would have been good, but they jack the prices way up when going from a consumer Asus laptop to an ROG one.
So a 990M in an Asus laptop that looks like that sure is going to be waaaaay out of my price range, lol.
Any ideas or speculation on how well this card will overclock, or if it'll overclock at all? Or will it be a power-hungry heater like the 880M all over again? -
I think it will be a mess like the 880M was in the beginning. NVIDIA has never launched an x90-series mobile GPU, especially not one with 16GB of VRAM.
-
Haha, yes, the tri-SLI 980 Ti was so far off base I couldn't muster a comment. Oh well, never mind!
-
-
In the video description:
"I have an idea what GPU we are looking at, I was playing with the idea earlier in this forum, and I think it is GM200 with 2500 cores"
It's in the OP of this thread.
CEG, who have the 990M up for preorder, quote the guy making those videos, and he quotes me. Which I now quote.
We have come full (quote) circle.
Not sure why CEG chose to quote 2500 cores. Fingers crossed, though.
Would love to see a GM200 chip coupled with the newest 4K display. Those would go nicely together. -
1215MHz is the MAX boost clock on the Titan X, which is never achieved in any of the games tested. In Crysis 3, it ran at 1113 MHz while consuming 392W of power.
Likewise, 1252 MHz is the max boost on the 980, but it actually only boosts to 1227 MHz in Crysis 3 and uses 301W of power. So the Titan X is already running 114 MHz slower than the 980, yet still consumes 91W more power due to having 1024 more cores. That's 0.08886W per core, so cutting out 640 cores while keeping the 114 MHz lower clock, you'd still get a GPU that used 34W more than the 980. The Titan X boosts to a max of 1189 MHz in actual gaming, so let's peg the core clock at 1189-114 = 1075 MHz. Ok, so you cut out memory controllers and make it a 256-bit card. I have no idea how much power that's going to save, but for the sake of argument, call it 24W since you like that number.
Ok, so a cut-down 256-bit GM200, with 2432 cores running at 1075MHz and memory at 1750 MHz, would still use 34W more than a full 980, or 335W. Or in other words, this cut down GM200 is basically a 680 now.
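The same kind of sketch for this counter-estimate, using the per-core figure derived above (again, these are the AnandTech total-system numbers quoted in the thread, and the linear per-core scaling is an assumption, not a measurement):

```python
# Counter-estimate: derive a per-core power cost from the Titan X vs GTX 980
# gap, then remove 640 cores. System-power figures are the Crysis 3 numbers
# quoted above.

titan_x_power, titan_x_cores = 392.0, 3072   # boosting to ~1113 MHz
gtx_980_power, gtx_980_cores = 301.0, 2048   # boosting to ~1227 MHz

watts_per_core = (titan_x_power - gtx_980_power) / (titan_x_cores - gtx_980_cores)
print(f"Implied cost per core: {watts_per_core:.5f} W")   # ~0.08887 W

# Cut 640 cores off the Titan X to get the hypothetical 2432-core GM200
cut_down_gm200 = titan_x_power - 640 * watts_per_core
print(f"Cut-down GM200: {cut_down_gm200:.0f} W "
      f"(+{cut_down_gm200 - gtx_980_power:.0f} W over a full 980)")  # ~335 W, ~34 W over
# The reply above also allows roughly 24 W more in savings for the
# narrower 256-bit memory bus, which it leaves as a for-the-sake-of-argument number.
```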
Now before I go on, I'd like to point out the graph you posted of 680's power consumption is flawed. You're comparing different games (BF3 vs Crysis 3), and the test systems are very different!
680's test system
980's test system
Literally the only component that remained unchanged was the RAM! They even used a different case, LOL. The 680 used a 3960X @ 4.3GHz, while the 980 used a 4960X @ 4.2GHz. That alone could have accounted for 10-20W depending on the exact voltages. Alright, with that out of the way, let's continue.
So now you point to 680 vs 880M. Alright, but let's not ignore facts this time. In order to create the 880M from the 680, nVidia had to:
- downclock core by 117 MHz (going by max boost numbers here, even though we know 880M rarely sustains its 993 MHz full boost, so in reality it's a bigger downclock)
- downclock memory by 250 MHz (from 1500 to 1250 MHz)
- bin the hell out of GK104 and pick only the best of the best dies (80%+ ASIC)
So going along with your numbers and assuming everything worked out as above, your GM200 could fit within 120W if it was downclocked by another 117 MHz on the core and 250 MHz on the memory (from 1750 to 1500 MHz). So in the end your GM200 would look like this:
- 2432 cores
- 256-bit memory bus
- 958 MHz core/1500 MHz memory
My turn to play this game now. Since 680 to 880M was possible just by doing the above, a full mobile GM204 with 2048 cores should also be possible if we downclock the core by 117 MHz and keep memory at 1250MHz. So a full GM204 mobile chip would be something like 2048 cores @ 1135 MHz and memory @ 1250 MHz.
But wait, going from the 680 to the 980 shaved 30W; you used that power budget to add more cores, so I'm going to use that power budget to clock the core and memory higher. Conservatively, I have no trouble believing that this full GM204 could at least run at a desktop 980's core speed while keeping memory at 1500 MHz without breaking 120W, going by your numbers and logic so far. So now we have the following 2 GPUs:
Hypothetical 990M based on full GM204:
2048 cores @ 1227 MHz (highest boost achieved in games)
256-bit memory @ 1500 MHz
Hypothetical 990M based on GM200
2432 cores @ 958 MHz
256-bit memory @ 1500 MHz
The GM200 has 20% more cores, but the GM204 has a 30% higher core clock. Now, I'd like you to tell me if you know with certainty which GPU would perform better. -
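For a rough sense of how those two hypothetical configurations stack up on paper, here is a trivial cores-times-clock comparison. It ignores memory bandwidth, sustained boost behaviour and everything else that actually decides game performance, so it settles nothing:

```python
# Naive FP32 throughput (2 ops per core per clock) for the two hypothetical
# 990M configurations above. Purely illustrative; real-world performance
# also depends on memory bandwidth, boost clocks and the game engine.

configs = {
    "Full GM204 (2048 cores @ 1227 MHz)":    (2048, 1227),
    "Cut-down GM200 (2432 cores @ 958 MHz)": (2432, 958),
}

for name, (cores, clock_mhz) in configs.items():
    gflops = 2 * cores * clock_mhz / 1000   # FMA counts as 2 ops
    print(f"{name}: {gflops:.0f} GFLOPS")
# Full GM204:     ~5026 GFLOPS
# Cut-down GM200: ~4660 GFLOPS
```

On raw FLOPS the full GM204 comes out slightly ahead, but as the post says, that alone doesn't tell you with certainty which GPU would actually perform better.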
-