2012:
Radeon 7870: 1280 shaders, 175W
7970M: 1280 shaders, 100W
---------------------------------------------------------------------------------------------------
2015
Radeon R9 Fury Nano: xxxx shaders, 175W. HBM.
Groundbreaking efficiency over previous AMD cards
Can AMD be the first one to get HBM out to notebooks this year? The efficiency is certainly there to give Nvidia a solid fight. Is this possible? Could the M395X be based on Tonga? Will we see Skylake notebooks with AMD cards with HBM?
I sure hope so!
-
Hard to say whether this can be done in mobile, for obvious reasons. First, I doubt current MXM is compatible with interposers and HBM. Second, this is a massive die, but then again the Nano shouldn't make more heat than Tonga, and a larger surface area makes it easier to cool. The issue with the size is that area restrictions are present in notebooks, but then again the interposer would be smaller than an MXM module, so maybe they can just put it on the motherboard.
-
Sounds good, but if the Nano really has 2x the performance/watt of the 290X, it will have about 70% of the 290X's performance at 100W. The Firestrike GPU score of the 290X is about 11.8k; 70% of that is 8.2k, which is good, but still not enough to beat the 980M with its 9.5k GPU score...
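For what it's worth, that back-of-envelope estimate can be written out explicitly. This is just a sketch of the naive linear-scaling assumption being debated here (real GPUs don't scale linearly with power, and the ~290W board power figure for the 290X is the commonly quoted one, not an official spec):

```python
# Naive estimate: if the Nano doubles the 290X's performance-per-watt,
# what would a 100W mobile version score in Firestrike (GPU score)?
R9_290X_POWER = 290          # watts, commonly quoted board power
R9_290X_FIRESTRIKE = 11800   # GPU score quoted in this thread
GTX_980M_FIRESTRIKE = 9500   # GPU score quoted in this thread

def scaled_score(target_watts, ppw_multiplier=2.0):
    """Assumes score scales linearly with power draw (it doesn't, really)."""
    baseline_ppw = R9_290X_FIRESTRIKE / R9_290X_POWER
    return target_watts * baseline_ppw * ppw_multiplier

estimate = scaled_score(100)
print(round(estimate))                 # 8138 -> the ~8.2k figure above
print(estimate > GTX_980M_FIRESTRIKE)  # False: still short of a 980M
```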
-
I'd like to see this happen, but we cannot know any performance numbers by simply reducing TDP. HBM also has effectively double the bandwidth of GDDR5, among other things, and there's a question of how efficient the architecture is...
Besides, how does one arrive at 70% of the raw performance of the 290X, from an architecture that's supposed to be 2x as efficient and faster, by simply eliminating 75W (42.8%)?
We don't know how the new architecture scales with TDP. And if the desktop version is anything to go by... I would think the performance would go past the 980M by a comfortable margin. And HBM overshadows GDDR5 to begin with... so at this point we have little to no idea, except potential theories -
-
The nano is pure speculation. Other sites claim AMD reported the nano has up to 90% of the gaming performance of Fury X with only 175W. Nothing adds up, so what we could gather is that they can actually make a 290X performance level card with much lower TDP.
But it is all speculation. Considering their dual Fiji GPU is basically two nanos, I don't doubt the actual nano can pack up to a single fury performance.
I seriously doubt AMD will launch any mobile GPU this year with the new architecture, but next year, you can rest assured we will see great performance increases from both Nvidia and AMD, thanks to improvements in lithography and architecture enhancements. -
Surely there must be something coming from AMD on the mobile end...likely around the time Pascal hits the market. Fiji (desktop) is a response to Maxwell (desktop), so things are heading back on track. Maybe it's Mobile Fiji, maybe not. Either way, I'm rooting for AMD this time around.
It's about damn time they show up to the party with something in hand. BYOB.
-
The power/TDP miscommunication, left/right hand not knowing, etc. Even if the performance was exactly the same as a R9 290X the biggest draw to the R9 Nano compared to the R9 290X is that: It's a Fiji card, it uses HBM, it's air cooled on a single fan, and only uses a single 8-pin power connector.
It's a stepping stone to Arctic Isles but it's a nice stepping stone. -
HBM in notebooks? Not in 2015. Maybe in 2016... ~summer.
-
It says up to 2x Performance Density and up to 2x Performance per watt.
So, it doesn't just claim up to 2x Performance per watt.
Besides, even if Nano reproduces 290x level of performance at just 175W, one still cannot directly scale down to 100W and expect the performance to be 70%.
As I said before, you are forgetting that HBM pushes almost 2x data compared to GDDR5, and there's also the compression algorithm that AMD began using with Tonga (which is supposed to be in Fiji - and Carrizo APU's)... so I think that it would be premature to say either way - and besides, we have no idea if the Nano will be made into a 100W mobile GPU as of yet (would be great if it were and released in a couple of months). -
Funny thing is that a mobile GPU with 50% of Fury performance would cost twice as much (1000-1100$) because mobile prices are higher, yet they still don't do anything for the mobile market. Nothing competitive at least, and nothing popular. Freesync would have been great a year ago in mobile. Yet they don't give a damn about us.
AMD deserves the hole they are in right now. Maybe I don't want them to go bankrupt, but no one can say that they didn't deserve it. Stop ignoring us and look ahead, to the future. Starlight5 and Marecki_clf like this. -
AMD is not entirely to blame though.
Intel consistently kept bribing OEM's to not include AMD APU's for instance.
There are many people interested in pure AMD APU laptops (without dedicated gpu's) in a decent form factor and proper cooling.
You can see and pick from thousands of Intel configurations, and yet AMD barely has a handful available.
APU's do have an issue with throttling, but so do Intel systems (they cannot stay at their advertised turbo speeds without throttling) - and this is of course an issue with shoddy cooling and inefficient materials being used in laptops (which conform to cost efficiency as opposed to technical efficiency).
Realistically, the OEM's COULD make a system that would properly cool both AMD and Intel systems... but OEM's seem to focus a lot more on making Intel based laptops than AMD ones.
AMD laptops usually come packed with inadequate APU's such as ATOM equivalents in 15" to 17" solutions, poor screens, slow HDD's, inadequate RAM (APU's love a lot of fast RAM and short timings for IGP performance), or low-end 2 to 4 core APU's that don't come close to their higher end equivalents.
This is something AMD cannot influence directly, and it's one of the huge contributing problems as to why AMD found itself in the situation it is in today.
Granted, its CPU performance is low, but design-wise, they pack a lot more features (including HSA, which literally destroys any Intel CPU with an IGP in software that's optimized for it). Starlight5 likes this. -
It's AMD who manufactured the E-450 APU, Zacate. Zakat in Russian means sunset. It's a dual-core 1.6GHz single-channel slowpoke. It's such a slowpoke that after adding a 2nd stick of RAM it becomes slower. That's a 2011 15" laptop, yet a 2005 Pentium M one would be comparable. It's good for browsing until you install an antivirus, then it's a typewriter at best. Is this piece of a PU something AMD couldn't influence directly?
-
Exactly.
AMD has little to do with how OEM's decide to use their chips.
The E-450 was not meant to be used in a 15" laptop... that was supposed to be done with proper APU's in the 35W - 45W range.
The E-450 was more suitable for netbooks if anything else.
Intel however bribed OEM's not to use AMD cpu's, or to just use the worst possible combinations.
Take Carrizo APU for instance.
It is the latest in APU innovation. While it is using a last iteration of Bulldozer, AMD made some impressive modifications by implementing it on a high performance 28nm process. They improved energy efficiency by about 40%. The CPU portion has been enhanced in the IPC department by up to 15% for the top end 35W models compared to Kaveri (and 50% for 15W models), while the IGP portion was revamped (it now uses the Tonga architecture, the new color compression algorithms, is about 20% faster than Kaveri IGP, or possibly more) and Carrizo as a whole supports HSA 1.0 entirely.
AMD managed to get Carrizo a lot more visible market penetration... and several Carrizo models will already be available for purchase, but again it would seem that some of the OEM's made the same mistakes as in the past:
1. They are using 800p screen resolutions.
2. They are using low end Carrizo APU's in 15" and 17" form factors.
3. Slow HDD's and 8GB RAM as an option (Intel counterparts have an option to be configured with 16GB of RAM and an SSD - no such options for AMD).
This kinda seems to me that Intel is rigging the game again.
I think that if Intel hadn't been paying OEM's to squeeze AMD out of the picture... the overall market share would be a lot more balanced.
I think Intel might still be in the lead, but not by the same margin they are right now.
Alas... things might change yet.
Carrizo did grab a lot of attention, it would seem, and it's up to OEM's to implement proper solutions...
Though AMD could have created a reference laptop themselves and sold it like that...
I think they did so in the past, and those laptops ran great in terms of cooling, form factor, components, etc. transphasic and TomJGX like this. -
When it comes to mobile APUs, the situation created by the OEMs is just a mess, especially in the US. Carrizo is such a huge improvement compared to previous APUs and even got a lot of praise from the press, yet the situation on the OEM side doesn't seem to be any better. Heck, I even read something not too long ago that said some of the laptops that were sitting on display during Computex won't even be on sale in the US, because the manufacturers had no such plans. It's like mobile Kaveri all over again. Some of the OEMs need to get sued before they get their ***t together. With the exception of a few ULV parts, HP is the only OEM to offer decent APU based models in North America, again.
Last edited: Jun 19, 2015transphasic, TomJGX and Starlight5 like this. -
HP seems to be the only OEM for now, and it messed up by releasing low-end Carrizo in too big/unattractive form factors.
Other OEM's like Acer and Dell were mentioned as planning to use Carrizo in their notebooks... so for now, we have yet to see how things pan out.
Who knows... we may end up pleasantly surprised.
I'd really like to see what the FX 8800P Carrizo can do with proper tests (but we need to wait for those benchmarks).Starlight5 likes this. -
An AMD APU can be compared only to a previous AMD one... or an Intel Atom. That kinda gives a hint.
If I needed an 18W CPU, I would have used an i3 of that time, or downclocked a 35W i3/i5. Decent performance per watt? If decent means constant waiting, then yes.
The typical AMD user is like every notebook user until 2010 - he always needs more performance than he has and believes that just a tiny bit more would make it bearable.
I never wrote about AMD users, but I guess you need to open your eyes: you want to forget about craving more speed in everyday tasks/gaming? Buy Intel. I guess even Intel kinda wants AMD to create a really fast APU, because Intel is tired of artificially holding back its processors. Last edited: Jun 19, 2015 TomJGX and Starlight5 like this. -
Last edited by a moderator: Jun 21, 2015
-
Thread cleaned. Reply bans come next.
-
@octiceps, I deleted your post suggesting that three valued members of our community that pull for AMD engage in sexual acts together. Not only is this socially unacceptable and rude, but it is also a violation of forum rules. Please review forum rules (reminder since you seem to forget - Rules) and knock it off with your inappropriate behavior. I normally do not call people out publicly, but the private messages and warnings have been ignored and you even had the audacity to send me a private message that said "love you too". I am confident that I am not the only member of this community that is tired of your ridiculous and rude behavior, so I am asking you very nicely, in public, to knock it off. Or next time, it will be more than a warning.
transphasic, E.D.U. and TomJGX like this. -
"pull for AMD". Lol. (sorry)
Just look at independent benchmarks and make your own decision. A new part will come out rendering your new bit old. Then another, making it very old. Same as it's always been.
No one has to convince everyone in the world they made the right decision by trashing the competition and those who made a different decision.
I never understand nerd emotion, ffs these are massively complex technologies that ultimately just push coloured pixels onto a screen. Nor how human tribalism tendencies get attached to companies that have no moral (only financial) compass.
What rubbish games they must have installed that they'd rather do the feces hurling internet monkey thing anyway!? -
I think it is more of an issue with short-term memory, where people can't remember a gen or two ago and how things panned out, haha. People always spell doom and gloom whenever they can, but some people in particular might take it a bit further.
Which is why we are always watching.
And you don't want us hovering around your every post... looking at your complete history, contemplating your future. -
Review of R9 Fury X is out. We might not get a mobile chip out of it but perhaps we can derive some numbers for the other cards from this:
Sweclockers test R9 Fury X
A very short summary:
Gets beaten by 980 Ti at 1080p.
Beats 980 Ti at 4K.
Bad overclocker. -
It is quite disappointing that overclocking is locked on the memory right now, and the core only did so little. Considering they touted the GPU as overclocking friendly, it's surprising to read that.
Other than that, the Fury X and 980ti are almost clones in performance and price. -
In the majority of games, the Fury X seems to be between the 980ti and Titan X at 4K - they all seem to be right next to each other.
Considering that these are initial benchmarks without improved drivers, I think we could see more performance increase in the coming days/months (but this isn't guaranteed).
Plus the price tag of the Fury X is comparable to 980ti, so I think that for those who are using desktops and want to use newer HBM technology but do not want Nvidia, they might want to opt for AMD.
One thing that people seem to be missing is that AMD Fury X was stated to have 8.6 Teraflops of 'raw computing power' - which is where most of its power consumption usually goes.
I would imagine that most people don't seem to understand that Nvidia usually dumbs down its compute capabilities in gaming GPU's while AMD is not really doing that - this could also explain why AMD isn't coming out on top.
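As a sanity check on that 8.6 TFLOPS figure: peak single-precision throughput follows from the standard formula, shaders × clock × 2 ops per cycle (one fused multiply-add). The 4096-shader / 1050 MHz numbers below are AMD's published Fiji specs, so this is just arithmetic, not new information:

```python
def fp32_tflops(shaders, clock_mhz):
    # Peak single-precision rate: each shader does one FMA (2 ops) per cycle.
    return shaders * clock_mhz * 1e6 * 2 / 1e12

# Fury X: 4096 stream processors at up to 1050 MHz.
print(round(fp32_tflops(4096, 1050), 1))  # 8.6 -> matches the quoted figure
```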
Still, the 1080p results aren't exactly encouraging, but then again, this GPU seems to have been specifically made for 4k.
Either way, I'm looking forward to seeing the R9 Nano.TomJGX likes this. -
-
Another review here too: Fury X Review by IGN. (Not sure of their reasoning behind the 9.5 score but it's an actually decent technical read coming from them).
All in all, both Fury X and 980 Ti are comparable for the most part. The Fury seems to like higher resolutions a little more than lower ones, since it might edge the 980 Ti in those realms. Also that thing stays mighty cool while needing comparable power. HBM is really some sweet stuff, too bad OC seems limited. -
It's cool because it has liquid cooling, nothing more. The price looks fair, but who would buy the Fury X?
I'll tell you:
1. 4K gamers for CrossFire rigs, maybe? But there's supposed to be a Fury X2, isn't there?
2. FullHD gamers? Only with a 144Hz Freesync monitor.
3. 1440p gamers? The 980Ti looks better. Again, only the ability to use a Freesync monitor is an argument here.
It appears that not that many people really need it for gaming performance.
As for the AMD Fury with air cooling... it is typical AMD, aka cheaper, hotter, more power hungry, but it has better frame rendering times thanks to HBM, so it doesn't have the typical microstuttering issue.
So here it is: either buy the cheaper Fury, or buy the more expensive but way faster (due to overclocking) 980Ti. I don't see a niche for the AMD Fury X. -
Rebrands everywhere.
Nvidia effectively alone on the mobile market til 2016.
The hell is this?
RIP AMD -
http://www.amd.com/en-us/products/graphics/notebook/r9-m200 (Quick link)
Okay, so then what's the difference between the m390x and m395x? Because if they're identical in every way, then why have a GPU with two names? -
What the heck... M390x and M395X look identical on paper. This would be absurd if the cards are actually the same.
-
I hope for AMD's sake they can pull a win with Zen. -
-
Did anyone expect something from AMD for real?
-
It's not even impressive on the desktop; I wouldn't want a cut-down version in my laptop when nVidia can walk all over it. I'm so disappointed with the Fury X...
-
I wonder if AMD is making any more MXM cards, period. Haven't seen one past the R9 M290X. The R9 M295X was (pretty much) all BGA, and searches for the 300 series have not turned up any MXM modules. Usually there are ES cards floating around by now, but nothing....
-
Maybe AMD should have invested that R&D money into a more efficient architecture (GCN 1.3) rather than relying on HBM to cut power consumption. Might have proved more effective until HBM 2.0 next year.
I'm no tech expert but Maxwell still beats GCN 1.2 in pure perf/watt.
Sent from my Nexus 5 using Tapatalk -
So, that R9 m395x? Apple GPU. There's also a m390 and m395, but they might be placeholders.
http://www.ibtimes.co.in/21-inch-im...ase-this-fall-everything-you-need-know-637325
http://www.journaldulapin.com/2015/...l-avec-une-radeon-hd-r9-m395x-en-preparation/
(There's more articles like these).
One of those GPUs is probably an M290 rebrand, since it's newer than the other GPUs. Maybe the M390, but who knows. I might be wrong. -
The Fury X is sitting mostly between 980Ti and Titan X at 4k (and managed to exceed the Titan X as well at those resolutions). Those games however seem to show a difference of a few FPS difference at most between the 3 GPU's.
In Nvidia optimized games the FPS difference was higher and in favour of 980Ti and Titan X (for obvious reasons), however the Fury X also managed to close the gap in performance (much more so than 290X/390X).
2. These early benchmarks have been done with early drivers (so that's something to keep in mind as later driver releases could optimize performance further).
3. There's also the little thing known as 'compute performance' which handily exceeds the Titan X by 10%.
Compute performance, on the other hand, will not really show in games if I'm not mistaken, but it will in software that can specifically take advantage of it (which we have not seen being tested) - it is also one of the possible reasons why power draw has been higher on AMD cards compared to Nvidia (who usually gimp their compute capabilities).
Saying one is disappointed with the Fury X seems to stem from an overly limited perspective (if games are your only way of measuring performance - and even there, one doesn't take into account the full picture because this GPU was designed for large resolutions).
Despite AMD using the Tonga architecture, or GCN 1.2 (which is simply modified from 290X), performance per watt has been increased.
Considering the lack of new architecture from AMD at this time, I think the overall improvements as we have seen them on Fury X are satisfactory.
AMD does need to release a new architecture though... but the HBM was a testbed for them, which should give them a full year more compared to Nvidia when it comes to dealing with this technology.
As for R9 Nano... it would be great to see a cut down version of 100W in laptop form (which I think might equal 290X given from what little we have been told).
Is it possible the R9 Nano is featuring a completely new architecture?Raidriar, TomJGX, triturbo and 1 other person like this. -
Bad time to be rebranding. They're essentially giving up any chance at ever again being competitive in the mobile market. -
@Deks
As far as compute performance goes, Fiji XT is actually not that great at FP64, since it's further gimped to 1/16 of FP32, down from the 1/8 of FP32 that Hawaii had. In fact, if you're buying a card for GPGPU/mining purposes, a 290X would still be a better idea. Not sure if games use FP64 at all, and if not, I'd rather have seen AMD axe FP64 to save a bit more transistor real estate, and shave off some watts as well.
Btw this page of the TechReport review is a good read, as it illustrates that not increasing the ROP count could potentially be what's limiting Fury X's full potential. This summary chart is also interesting, because it shows how poor the performance scaling is vs 290X at 1080p. Interestingly Fury X on paper appears to be Tahiti X2, and it does indeed come quite close to being 2x the performance of 280X, at least at 4K and 1440p. -
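To put rough numbers on that FP64 gimping, the peak double-precision rate is just the FP32 rate divided by the hardware ratio. This is a sketch using the commonly cited peak FP32 figures (~8.6 TFLOPS for Fiji XT, ~5.6 TFLOPS for Hawaii), so treat the exact values as approximate:

```python
def fp64_tflops(fp32_peak, divisor):
    # Peak double-precision rate = peak FP32 rate / the hardware ratio.
    return fp32_peak / divisor

fiji = fp64_tflops(8.6, 16)    # Fiji XT: 1/16 rate -> ~0.54 TFLOPS FP64
hawaii = fp64_tflops(5.6, 8)   # Hawaii:  1/8 rate  -> ~0.70 TFLOPS FP64
print(fiji < hawaii)           # True: the older 290X wins at FP64
```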
As always - the price. People would only consider AMD if it mops the floor with nGreedia AND is cheaper at the same time. What we have here is more or less the same performance AND a water cooler, which alone is around 100 bucks. But no, it has to be EVEN cheaper! I wonder why AMD is not throwing in houses, cars or even small countries with their GPUs. Even then, someone would complain that the house or the car is not the one s/he wanted, or that the country is far off and one needs an airplane as well.
-
As I said the AIO is a mixed bag, some see it as a positive, others see it as a negative. If the $649 price is due to the AIO, then I hope AMD releases an air cooled version for cheaper.
The other problem is Fury X trades blows with 980 Ti only at 4K, but at 1440p and 1080p it falls progressively behind 980 Ti. Ignoring that for now (since Fury X is marketed as a 4K card), in most recent games neither of the top cards from either camp can manage a consistent 60 FPS at 4K, which means multi-GPU setups are needed. Yes I know HBM isn't GDDR5 and offers ungodly bandwidth, but I've yet to see convincing evidence either way that 4GB HBM will or will not be an issue at 4K with multi-GPU setups. You might laugh at the thought of 1080p or 1440p, but remember 120FPS gaming at 1080p is still very demanding, and with the introduction of DSR/VSR resolution is less important than before. Not having the performance crown at 1080p and 1440p means losing sales to nVidia.
Then there's also the issue of overclocking, which frankly I don't want to get into because a) Fury X is voltage locked right now so can't make a fair comparison, and b) it just makes things look much worse.
It's not so much Fury X is a disappointment in and of itself, it's more that when stacked up against the 980 Ti, there are more cons than there are pros, at least IMO anyway.Last edited: Jun 28, 2015
Mobile card from AMD coming out that is based on Fiji?
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Jun 17, 2015.