R9 M295X
Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 750 MHz, Memory Clock(s): 1375 MHz, Graphics API Support: DirectX 11.2, OpenGL 4.3, Max TDP: 125 W
GTX 980M
Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 1038 MHz, Graphics API Support: DirectX 12, OpenGL 4.5, Max TDP: 100 W
GTX 970M
Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 924 MHz, Graphics API Support: DirectX 12, OpenGL 4.5, Max TDP: 75 W
-
Here's more info on R9 380X:
AMD developers confirm Radeon R9 380X with HBM memory | KitGuru -
-
They removed 4GB of the VRAM because Dell thought it was a good idea to use soldered GPUs...
Other than that it should be a full 980M. If you are trying to suggest the 980M runs hot, you are out of your mind. It's one of the coolest-running GPUs in a long time (with great performance).
What should you get from the TDP info? I don't know, maybe an indication of how hot it will run? At 125 W it will undoubtedly do just that.
Here, while you wait for benchmarks from Alienware 15
Poll: M295X GPU heat in Valley benchmark - MacRumors Forums -
I was pulling your leg, hence the "". But seriously, it's like comparing two cars based purely on the spec sheets. Unless they hit the pavement you can't say which one would be better. You can probably guess, but there have been a few surprises now and then.
That's what, 2-3 months old? I can barely care what they get. Sadly it's bad points for AMD and not for Apple with their obviously poor cooling (Apple fanboys will buy whatever Apple throws at them, and if it's a failure they'll just get the new model, God forbid changing the brand). As I said before, it's the very same heatsink that cools a lesser CPU and a GPU of half the wattage. So I at least put a grin above, but are you serious? I have yet to see a non-overheating slim-form-factor Mac (this excludes the Mac Pro, since it's small but not slim).
At the end of the day it won't really matter if Clevo or MSI don't put out an MXM module, since Alienware is obviously losing it. -
78°C/172°F for the GTX 680MX inside the 2012 iMac during Unigine Heaven, which is comparable to the benchmark posted here where the M295X runs around 100°C...
I don't know why you keep trying to twist the fact that the M295X runs hot. The same machine could cool a full GM204 just fine.
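For anyone double-checking the mixed °C/°F figures in this thread, a minimal sketch of the standard conversion (the temperatures themselves come from the benchmarks quoted above, not from measurements of mine):

```python
def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

# 78 C for the 680MX in Unigine Heaven works out to ~172 F,
# and the ~100 C reported for the M295X is ~212 F.
print(round(c_to_f(78)))   # 172
print(round(c_to_f(100)))  # 212
```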
If it looks like a duck, swims like a duck... -
OK, it runs hot. Thank God they didn't put an 880M in there, because then it wouldn't just be owners moaning on the forums, it would be burned houses on the news.
Let's move on to the performance. What's your take on it? As I said before, maybe for the fifth time: put them in the same machine and then we can draw conclusions. A throttling R9 M295X is not something to base your conclusions on, or at least I won't. -
What? No 390X this time?
-
No info on a potential M380X?
-
The jump from my 6990m (40nm) [2011] to the 680m (28nm) [2013] was a huge jump in terms of 3D performance and reduction in heat dissipation.
Yields on 20nm will be hard to achieve, as the features on the die are getting so close together they are nearly touching.
The physics of shrinking the die to produce the chip is not as simple as Moore's law suggests. It requires better research into germanium doping of silicon to improve yields, along with other metalloid combinations.
If the yield can be achieved then this is a bonus, but if I were AMD in 2015 I would focus on a better architecture at the 28nm die size. -
Leaked 3dmark 11 extreme score of supposed engineering sample AMD Radeon R9 300 series GPU:
total: 8,121 points
graphics: 7,788 points
AMD Radeon R9 300...3D MARK 11 - Chiphell -
-
I think AMD might actually get to give Nvidia a long-overdue spanking in a couple of months.
I can't wait. Let us all profit from some healthy competition. -
LOL this is fake. No way a 300W 380X card (per the previous leak from the two AMD employees' LinkedIn profiles) ties a Titan Z and scores this close to a 295X2, potentially even giving GM200 a run for its money. 380X means AMD must have an even faster 390X in the wings waiting to counter Nvidia. Now that would just be ridiculous.
(This is using a 3960X @ 4.6 GHz instead of a 4790K, which impacts overall score.)
Also, this score would instantly put it in the top 25 of the HOF's fastest single GPUs, alongside what I presume to be LN2/dry-ice-cooled 780 Ti and 980 cards. So yeah, too damn high to be believable, especially for a 380X. Even the previously rumored specs (4096 SPs and 4096-bit HBM) don't make sense for this level of performance.
The notion that AMD made this level of efficiency gain in one generation (those world-record-busting Nvidia cards sure as hell draw a lot more than 300W), and on 28nm SHP (?) no less, is ridiculous. -
King of Interns Simply a laptop enthusiast
I think if the data is reliable then we should be VERY VERY happy. Not saying it would be ridiculous. Nvidia would be forced to drop prices and/or release some other monster chips. -
(And I don't think the data is reliable, at all.) -
-
Praying for a worthy successor to the 7970M. I honestly think this may be real, because if it isn't, AMD will be bankrupt soon enough. They probably waited long enough to unleash this.
-
Most likely fake, because both AMD and Nvidia are at 28nm. I don't think AMD is capable of improving that much over Maxwell in efficiency versus the previous gen.
-
Yeah, people are saying the actual run and scores are both real, but the GPU name was photoshopped. The run itself was made using two overclocked 290s in CF. Then there's this post that calls attention to what is suspected to be Photoshop evidence.
And just so people don't get fooled or overly excited again, here's another one of those finely photoshopped "leaked benchmark" from Chiphell.
Chiphell sure has been having a lotta fun with PS lately... -
Some people here got trolled, hard.
BTW here is the real result: AMD Radeon R9 290 video card benchmark result - Intel Core i7-4790K,Gigabyte Technology Co., Ltd. Z97X-SOC Force -
If it was 390X on 20nm I guess it could've been plausible? But 28nm? Not a chance in hell.
-
AMD switching 28nm process to GlobalFoundries in 2015 - Industry - News - HEXUS.net
Also, wasn't the R9 390X supposed to be stock water-cooled? That should give more breathing room for that huge chip at very high frequencies. -
One thing about a new GPU being as powerful as two GPUs working in Crossfire: we have had examples of new-generation top-end GPUs that managed this kind of leap, though they were usually coupled with a smaller manufacturing process as well.
Still, AMD will supposedly be using an improved 28nm process specifically designed for high performance, so it shouldn't be compared to a regular 28nm process.
Nvidia was able to achieve a 50% increase over the previous generation with Maxwell through an architecture jump alone, on the same process node.
AMD will be using HBM, which in itself will provide a massive bandwidth increase, but it may not stop there. Official figures seem to state that the form factor would shrink by about 65% and power efficiency would jump by roughly 68% from using HBM alone (as evident here: AMD 20nm R9 390X Features HBM, 9X Faster Than GDDR5). So if AMD has a reasonably efficient new architecture for the 380X and 390X (and we don't know how much of HBM's benefit will translate to overall GPU performance beyond high-resolution gaming), then I could see them surpassing Nvidia within a 300W envelope, provided the GPUs in question end up more efficient in terms of performance per watt.
In the desktop space, power consumption doesn't really matter that much, and even Maxwell can reach 275W when properly stressed. AMD simply might be reporting the maximum thermal envelope under maxed-out loads (compute, for instance), whereas Nvidia likes to quote only gaming numbers.
Indeed, AMD cards may end up drawing similarly low power while gaming, and up to 300W when doing compute.
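As a back-of-the-envelope check on the "massive bandwidth increase" claim: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch — the GDDR5 figures match the shipping R9 290X, while the HBM bus width and per-pin rate are the rumored first-gen numbers circulating in this thread, not confirmed specs:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# R9 290X: 512-bit GDDR5 at 5 Gbps per pin.
gddr5_290x = bandwidth_gb_s(512, 5.0)    # 320.0 GB/s

# Rumored first-gen HBM: 4096-bit interface at ~1 Gbps per pin (assumption).
hbm_rumored = bandwidth_gb_s(4096, 1.0)  # 512.0 GB/s

print(gddr5_290x, hbm_rumored)  # 320.0 512.0
```

Under those assumed numbers, HBM's win comes entirely from the very wide bus, despite a much lower per-pin rate than GDDR5.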
Still, R9 380X seems to have been 'confirmed' to use HBM:
AMD developers confirm Radeon R9 380X with HBM memory | KitGuru
Things should get interesting.
It would be excellent if AMD decided to use HBM for their K12-Zen.
With all of the supposed form factor reductions and power efficiency as advertised in the first link I posted, the CPU section of the APU could be addressed in a more serious fashion and overall produce a high powered APU which is quite efficient.
It's a shame Carrizo won't be using HBM (apparently), though this benchmark here:
Report: AMD Carrizo APU Benchmarks Show 2x the Performance of Kaveri, 3x Intel Iris Pro | PC Perspective
...seems to show a 2x performance increase over Kaveri (graphical performance most likely) and 3x performance increase over Intel Iris Pro - hm... one has to wonder how AMD managed such a jump.
GPU section makeover perhaps?
Inclusion of HBM (unlikely)? -
-
And considering that the R9 380X is said to draw 50W more than the 280X, the 390X will absolutely need water/hybrid cooling. The 290X in uber mode was basically one big fireball. -
But do you really think it will boost performance that much? Have AMD's chips been that limited by bandwidth lately? It will be very interesting for sure, but I'm sceptical performance-wise about these new stacked VRAM chips.
It seems like AMD is doing a tick-tock like Intel: using roughly the same architecture on the 300 series as on previous cards, with some efficiency improvements, introducing HBM first, and then building massive GPUs to combat Nvidia's Maxwell. Use GloFo's 28nm process to gain some ground. Probably even surpass the GTX 980.
Then when 2016 comes, they come out with a brand new architecture on 16nm that will replace GCN. -
AMD will not be replacing GCN anytime soon. The compute-focused architecture is the backbone of their future CPU/GPU/APU/SoC and HSA developments. They'll keep adding revisions to it, i.e. first GCN cards were 1.0, Bonaire and Hawaii were GCN 1.1, Tonga and 300 Series are GCN 1.2, etc.
-
Nvidia GeForce GTX 980 Review - Page 4 | Maximum PC -
As if Hawaii's advantage at 4K isn't enough already. I guess AMD is really focused on pushing that even further.
-
R9 380X will release in Q2 this year btw, somewhere between April and June
-
Here at CES 2015 there were three 4K screens driven by one R9 295X2.
BTW he also speaks about virtual reality here (among other topics). Interestingly, folks at Oculus are supposedly much more excited about AMD's VR-specific optimizations than Nvidia's. So this could be another marketing point for AMD (VR needs both a lot of bandwidth and super-low latency).
AMD to Share Low-Latency VR GPU Rendering Tricks at GDC 2015 : oculus -
-
King of Interns Simply a laptop enthusiast
Lol, what fanboy would change his 7970M for a 680M?
I never said the data is accurate, I only said I hope it becomes true.
I simply want there to be competition. Funny that you misinterpret this as fanboyism. -
-
While the leak itself is fake, the performance of the 390X may still end up close to that level. With HBM and GCN improvements, there is no reason not to expect a 50%+ improvement over the 290X at 4K.
-
So the new AMD GPUs should finally arrive quite late, not earlier than April 2015, maybe even as late as summer 2015:
-
Poor AMD.
They are bleeding money so badly, and falling so far behind Nvidia in the launch schedule, that they will bleed even more.
They lost a whopping $364M in Q4 2014 (just one quarter).
And without money they can't keep up with Intel and Nvidia in R&D, which is why they are now over half a year behind Nvidia in releasing new graphics cards. -
I look forward to their new GPU, and hopefully it will stir up competition in new ways. -
Lol, you guys were saying the 285 launch was bad, but now there is something that surpasses it: behold the junk that is the 960, for those with no brain cells. Yeah, I thought so, now you guys are quiet. Nvidia has released the ultimate milking product; even the fanboys on sites like WCCFTech are complaining. The only thing Maxwell is good for is mobile, end of story. AMD hasn't even released its products this year and they have already won the GPU wars in 2015.
-
The reason why I said AMD will lose even more money in the future is that there are huge disadvantages to falling behind in the release schedule. Very few of those 1 million+ people who bought Maxwell will bother to buy new cards from AMD in Q2, since they bought their cards just recently. GM200 will release in March this year, getting ahead of AMD there too.
How many buyers are left when AMD finally release their new cards?
I totally agree that the GTX 960 is a crappy card because Nvidia cheaped out on bandwidth. But it's priced at $200 and is about as good as the R9 285, which is priced the same.
The problem for AMD, however, is that the GTX 960 draws 100W while the R9 285 draws 170W. Which do you think people will buy?
That 1 million+ count of Maxwell buyers will increase tremendously now that Nvidia has a $200 card for the masses alongside the GTX 970 and GTX 980. And the Titan X in March/April.
God knows what number we are looking at in, say, May when AMD release their cards, but they will have a hard time selling enough to cover the manufacturing and engineering costs of the new cards now that Nvidia has filled the market with Maxwell.
Which means less revenue (profit) for development of future cards.
R&D is extremely important. It becomes a vicious cycle if they fall behind on it. -
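To make the 960-vs-285 comparison concrete: if two cards deliver roughly equal performance, the ratio of their board powers is the ratio of their performance per watt. A minimal sketch using the wattages quoted above, assuming equal performance (this thread's premise, not a measured result):

```python
# Relative efficiency of two equally performing cards is just the
# inverse ratio of their power draws. 100 W (GTX 960) vs 170 W (R9 285)
# are the TDP figures quoted in the post above.

def perf_per_watt(relative_perf: float, power_w: float) -> float:
    """Relative performance divided by board power in watts."""
    return relative_perf / power_w

gtx_960 = perf_per_watt(1.0, 100)
r9_285 = perf_per_watt(1.0, 170)

# With equal performance, the 960 comes out 1.7x more efficient.
print(round(gtx_960 / r9_285, 2))  # 1.7
```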
AMD will use 28nm for their GPUs in 2015
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Dec 30, 2014.