Thank you for letting us know; that is disappointing to hear. I guess it's because they expect they won't be able to sell enough to make it worth producing/designing an MXM board.
-
In all reality, it seems that the GTX 970M, GTX 980M, R9 M290X, and custom designed R9 M295X are the final MXM 3.0B cards to be released, which is quite unfortunate.
JAY8387 likes this. -
Even if they made an MXM-ish board (the new GPUs are odd, with odd mounting-hole placement), what are the chances of it fitting into old machines? None. Terrible news nonetheless. Everyone went out of their way to stuff the desktop 980 into their machines, but they can't do the same for AMD. Whatever, no Clevos for me.
Ashtrix likes this. -
Sent from my LG-H850 using Tapatalk -
King of Interns Simply a laptop enthusiast
What a sorry state of affairs. Looks like I won't be getting another laptop after all. So very sad!
steberg, Ashtrix, TomJGX and 1 other person like this. -
But... why would Clevo put Polaris 11 on MXM board? Or you still have hopes about Polaris 10 Mobile this year?
-
Polaris 10. I'm pretty sure we'll see it in a mobile workstation, but quite likely not by year's end. The WX 7100 gives me hope.
-
The moment they announced the 980 in a laptop, I think I was one of the first people to speak out on how ridiculous it is that they will bend over for Nvidia and even push a full desktop-grade GPU into a laptop, while an AMD GPU with the same TDP as the 980M ends up throttling because the cooling is never sufficient for it.
Same goes for APUs, which end up paired with garbage components.
These OEMs are likely getting more money out of Nvidia, hence more design wins for Nvidia and flimsy excuses every time AMD is mentioned. -
-
-
http://www.notebookcheck.net/AMD-Radeon-RX-460-Notebook-RX-460M.171186.0.html
There is a typo there: it is based on P11, not P10. The good news, however, is that the mobile RX 460 is only marginally slower than its desktop counterpart, so we can expect at least 965M-level performance in a small form factor. -
And look at this:
http://ocaholic.ch/modules/news/article.php?storyid=15173
It would seem that the R9 M480 is the desktop 460.
Not bad.
Now give us the full desktop 480, properly undervolted, in laptop form (undervolting could plausibly bring it down from 150W to 110W or even 104W). -
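To a first approximation, dynamic power scales with the square of core voltage at a fixed clock, which is why "150W down to ~110W" undervolting claims are at least plausible on paper. A minimal sketch, with purely illustrative stock and undervolt voltages (these are not measured RX 480 figures):

```python
# Back-of-envelope: dynamic power ~ C * V^2 * f, so at a fixed clock
# power scales with the square of the voltage ratio.
def scaled_power(p_stock_w, v_stock, v_undervolt):
    """Estimate board power after an undervolt at unchanged clocks."""
    return p_stock_w * (v_undervolt / v_stock) ** 2

# Hypothetical numbers: 150W stock at 1.15V, undervolted to 1.00V.
print(round(scaled_power(150, 1.15, 1.00)))  # 113 (watts)
```

Static (leakage) power and VRAM draw don't follow this square law, so real-world savings are usually smaller than the model suggests.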
-
King of Interns Simply a laptop enthusiast
An RX 480, properly binned, with the VRAM running a little slower and the core a little slower, could easily operate at 100-120W while performing around desktop 970 levels. That is still a decent performance-to-power improvement over the 125W 980M, which was 15% slower than a 970 and became a hot power hog when overclocked.
So yeah, I reckon they could bring about 20% better performance for the same power consumption if they wanted. If it's at a decent price, why not? Then release a 970M-thrashing mobile RX 470 with similar treatment, running at around 80-90W and bringing 980M performance at a cheap price and much lower power consumption. Yes please. Plus, of course, much better DX12 and Vulkan support/performance than a Maxwell card will ever possess, not least because Nvidia themselves will neuter Maxwell performance just like they did with Kepler.
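The "20% better performance for the same power" estimate above can be sanity-checked with simple perf-per-watt arithmetic. A sketch, where the performance indices and wattages are rough assumptions for illustration, not benchmark results:

```python
# Rough perf/watt comparison, with performance indexed to desktop GTX 970 = 100.
# All figures are assumptions for illustration, not measurements.
perf_980m, power_980m_w = 85, 125    # "15% slower than a 970" at 125W
perf_m480, power_m480_w = 100, 110   # hypothetical binned mobile RX 480

ratio = (perf_m480 / power_m480_w) / (perf_980m / power_980m_w)
print(f"{ratio:.2f}x perf/watt")  # 1.34x
```

Under these assumed numbers the hypothetical card comes out roughly a third better per watt, which is in the same ballpark as the poster's claim.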
We don't need to give the industry excuses why it isn't happening. There is only one reason: money. Nvidia pays the correct amount to the correct people. That is all there is to it.
It is time to revolt and say no to this crap. Let's stop being weak-minded and see the wood for the trees. We are talking about the end of standard MXM card sizes and the complete end of AMD offerings for laptop enthusiasts.
Can you imagine if this happened in the desktop world now: suddenly graphics cards come in all sorts of shapes and sizes and prevent you from upgrading. Sound familiar?
This is worse than clockblock. This is the stifling and strangling of competition by Nvidia, and it will only mean one thing: a single option at an insane price. A bad deal for consumers and the environment.
Last edited: Aug 9, 2016
triturbo likes this. -
Yeah, AMD usually clocks them to within an inch of stability in order to stay somewhat relevant for those who just look at the numbers. So it's no surprise that dialing back yields quite the improvement. Remember the R9 Nano?
@D2 Ultima - Pointless to you, not to me. Tell that 980M to play nice with my DreamColor, or that 1070/1080/whatever to fit in there, or that Quadro to cost a third of its price, or the other 780M to not lose performance... It's getting to be quite the list, and I haven't even dug into the good parts.
hmscott likes this. -
http://videocardz.com/62790/geforce-gtx-1070-mobile-and-radeon-r9-m480-benchmarks-leaked
Someone just relabeled the bigger bars as the GTX 970M and R9 M480, when in reality they represent the scores of the GTX 970 and RX 470.
On the topic of mobile Polaris, I wonder why the RX 470 isn't considered the ideal candidate: a P10 GPU that can be modified (binned + lower clocks) into a mobile GPU without losing too much performance. I know the RX 480 is the full P10, but I fear a full P10 mobile GPU would be like the R9 M295X: full Tonga, but the performance of an R9 285 (Tonga Pro). -
You're getting 980M levels of performance if you want such low TDP, and then you can't OC, and the card will most likely run hotter too.
Right now, you're expecting WAY too much. You want something that isn't going to exist right now, because while AMD is making strides in efficiency, the same reason we didn't get Hawaii and Fiji in laptops is the same reason Polaris 10 isn't going to make much sense. On the crappy team green side, they've actually shoved cards within 7 percent of their desktop models' performance into laptops this time. Hence it needs to retain its performance and cost LESS than a mobile 1060, or it's just going to fall by the wayside.
I'm being realistic here with my statements. AMD hasn't figured out how to compete; they're STILL sticking to GCN, and even with all the optimizations and perf/watt bumps they've claimed, it's still not very impressive compared to what the other camp is offering. I HATE that, but that's their own problem to fix. I can't fix it for them.
For all the crap nVidia is doing, AMD has nothing to counter them with. And where is this neutering of Kepler's performance in games? I've had 780Ms from launch until now and they have performed equal to or better in ALL games I've owned for the whole duration of my owning them. I've heard of benchmark issues in the past, though I haven't seen any, and I can see nothing with respect to games. I expect the same to happen to Maxwell. I know exactly what forced obsolescence they'll do, but that's not the same as neutering existing products (though almost as nasty). -
Moving on to a detailed rundown of the "realistic" statements:
Keep drinking your Kool-Aid.
Last edited: Aug 10, 2016 -
Here: http://www.hardware.fr/articles/951-9/consommation-efficacite-energetique.html
Power limits and thermal limits disabled in "Uber" mode. NO OVERCLOCKING APPLIED. 192W drawn in Witcher 3 at 1440p. THAT CARD IS NOWHERE NEAR 150W. If you want to somehow shove that into a laptop and tell me it's going to keep that performance and stay at 100W, you're dreaming. That card would have to go into something like the P750DM2 with a 330W PSU to even begin holding its between-970-and-980 level of performance, which basically matches the mobile 1060 (which, while extremely hot, does not need as large a power brick and is suited to machines like the P650Rx2 and P670Rx2, which use 200W bricks).
As for the Nano, you can see also in that bench that it hit 185W and sat there for both BF4 and Witcher 3's test. That meant it was throttling. How much did the card throttle? I don't know. How much do people normally notice throttling on that card? I don't know. I don't own one. But it isn't a Fury X replacement, and also remember that unlike the RX 480 (which I might add surpassed Fury Nano's power draw in Witcher 3 when its limits were removed) the Fury Nano has HBM, which reduces power consumption on the memory by quite a bit. A benefit the 480 does not have.
I'm the first person who'll tell you about the forced obsolescence that GameWorks uses. Straight tessellation on every single thing that doesn't use PhysX, and absurd amounts of it too, such that only their latest can cope to any decent degree, and when they shove a new card architecture up everyone's meowmix, it runs much better on those GameWorks titles because it simply handles the tessellation better, and they can point and say "look, it's so much faster!" when in every other title it's obviously not as good. But that still doesn't let me tell someone to pay more for a weaker, more power-hungry GPU (it ain't hotter, nothing beats Pascal though, I'll give them that. AMD's the cool arch this time around) just because nVidia sucks.
AMD needs to compete. Drink THAT Kool-Aid. They're not competing. They're not trying. They were using Crossfire (which works in about 20% of new titles at launch, has no nVidia Profile Inspector alternative for messing with bits and forcing profiles, and didn't even scale properly in their own demo) to push their new card as being good... that was awful. They came in swinging with a disastrous card frying people's motherboards. They even shipped a driver update that limits the card's power draw from the PCIe slot (in effect neutering the card's already lackluster, driver-limited performance further), but won't fix it with an 8-pin connector via a small recall. They just want to sweep it under the rug.

They're betting on DX12 and Vulkan to fix their ridiculous driver-overhead issues in DX11. Do you know WHY nVidia cards drop in performance in DX12 versus DX11? Because there's no DX11 driver overhead on nVidia drivers. Unless you hit the limit of DX11 in draw calls or the like, you're not getting any benefit from DX12 with them. This is because NOBODY is coding DX12 optimizations into games; it's too expensive to give console-level optimization to the PC and they get nothing else out of it. So the optimization is driver-side, and nVidia's DX12 drivers are not as mature as their DX11 drivers. Hence... performance loss. Do you see AMD's DX12/Vulkan driver performance? How their cards get basically a 20-30% boost, more if the game was in OpenGL instead of DX11? That's what their cards are SUPPOSED to be doing. Right now. In DX11/OGL. But their driver inefficiency and overhead kill all their cards' power.
And finally, they're hanging HARD onto GCN. Like, extremely hard. GCN is hot, power hungry, and generally cannot overclock worth a meowmix (or maybe they similarly push the cards so far to the limit before launching them, JUST to scrape out more performance, that they're already at their ceiling; as if nVidia made the base clock of a GTX 980 1350MHz and it could only OC to 1450MHz). I will admit wholeheartedly that Polaris is a decent bump in efficiency and temperatures. That's great. Really. But it's not enough to compete in the mobile market. They can run a price war on desktops because nobody cares about heat (aftermarket coolers) or power draw (big PSUs) on desktops, and most don't care about overclocking (it's why superclocked cards sell so well). Notebooks don't have the first two luxuries. Especially the second one, since power draw often dictates which notebook a card can be used in, due to the accompanying power brick. Hence why I said that card needs an allowance of up to 150W, unless you neuter its performance to ALL hell, and that means 300W power bricks. Even a high-quality 240W brick might barely manage it, but I would say 300W+. Even if a P65xRx2 could hold one, those sell with 200W PSUs at best. You're not going to power an RX M480 with that.
You feel I'm some happy-go-lucky nVidia lover? You really. REALLY. Had best think again. I'm more annoyed at AMD for not bringing anything I can even be halfway hopeful about to the table. I had SUCH high hopes for the RX 480, and they had to go and blunder it with that 6-pin connector. That was their bloody trump card for the low end. But NO. They tried to stuff a 165-192W card into a 150W envelope to "target it as a budget card".
Shadow God likes this. -
Well, as I said earlier, it seems that the 1070 would be a 130W GPU, and the 1080 even more. An RX 480 at 130W doesn't seem that bad. It won't be up to the 1070, but at least it would be an option. Considering that it would definitely be cheaper than the 1070, it would have its place under the sun. Money flow is needed by everyone to keep the wheels spinning.
-
King of Interns Simply a laptop enthusiast
I actually wonder how the M295X would perform in a proper MXM-based notebook with decent cooling (i.e. one designed for the 780M/880M/980M).
I reckon it wouldn't run half bad, particularly as the drivers are quite good now. The M295X was never tried in a proper gaming laptop, so I don't see a basis for a fair judgment. Likely its performance is somewhere around a 970M (perhaps even higher), but at 980M power consumption. Not impossibly bad. It just never was given a chance.
triturbo likes this. -
Exactly and now the gap is getting wider.
-
OEMs never gave AMD a shot. Only AW had an MXM machine with AMD back in 2013; they were always being purged by the upper echelons, Intel and nGreedia. Even so, AMD never really gave their best: they lost Adreno to Qualcomm, and now all they do is aim for console contracts and a top-end competitor for Nvidia's top card. No optimizations, nothing. Plain mediocre stuff, with less QC in mobile MXM cards from what I've read so far. They are worthless now; Samsung should buy them and take a shot at nGreedia, Intel and Qualcomm. I hate this monopolistic trash...
OT: I really wish someone would put together a well-rooted petition for MXM 3.0b cards from Clevo; there's a huge ZM/DM user base, and the community-loved custom 1070 MXM 3.0b cards from kingofinterns show the demand. -
This'll be my last attempt.
As for the 1070 and 1080 cards, their power so far outclasses the RX 480 it's not even funny. This is the problem. In the big, heavy notebooks which use large enough power bricks for these cards like the P750DM2, putting in a RX M480 is not going to end well unless it costs less than the 1060 option. And don't even bother offering it in Crossfire with the way multi-GPU is going, that's going to result in 90%+ of all customers buying it getting burned. Which I will talk more about later.
Compared to 980M, it's either going to perform the same and draw more power and run hotter, or perform much better and run much hotter and draw INSANELY more power, similar to a GTX 980. This means it'll only fit in the highest end of notebooks. Cool? Cool.
Compared to the 1060, it's either going to run cooler, perform similarly, and draw almost triple the power, or it's going to be much weaker, much cooler, and draw almost double the power. This huge extra power draw means it again NEEDS to be in the highest end of notebooks.
There is no place I can sit here and think to fit it. It will not work in a P65x or P67x; the power bricks are too small. It will work in the GT72, GT73, GT83, GT80, P7xxDM, P7xxDM2, P870DM2/3 since they can all take 330W bricks or higher, however that's basically buying those top end machines with a single 1060. It's extremely unlikely. You would have to really want AMD for something to do so. It just does not fit. You can't sit there and rationally tell me this is incorrect. I'm being completely impartial and viewing this from the eyes of a prospective buyer. Even if the AMD card is cheaper than the 1060, these machines are extremely expensive and powerful otherwise. You wouldn't much see someone buying a RX 480 in a desktop with a 6700K and 16GB of RAM with two SSDs and a HDD. It's unbalanced to all hell.
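The brick-size argument above boils down to simple budget arithmetic: the sum of CPU, GPU, and platform draw has to fit under the adapter's rating with some headroom. A minimal sketch, where every wattage figure (platform draw, headroom factor, CPU TDPs) is a rough assumption rather than a measured value:

```python
# Rough laptop power-budget check. All numbers are illustrative assumptions.
def fits_in_brick(brick_w, cpu_w, gpu_w, platform_w=40, headroom=0.9):
    """True if CPU + GPU + platform draw stays within ~90% of the brick rating."""
    return cpu_w + gpu_w + platform_w <= brick_w * headroom

# 200W brick (P65x-class) with a 45W CPU and a ~150W desktop-class GPU:
print(fits_in_brick(200, 45, 150))  # False
# 330W brick (P750DM2-class) with a 91W desktop CPU and the same GPU:
print(fits_in_brick(330, 91, 150))  # True
```

Even with generous assumptions, a ~150W GPU only clears the budget in the 330W-brick machines, which is the poster's point about where such a card could live.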
As for nVidrosoft, I never let them get away with their crap. However, I am currently discussing AMD. This isn't the Pascal thread, where I will bash them and then express annoyance at AMD's inability to compete their way out of a children's softball team.
Second, get rid of GCN. Stop pouring R&D into GCN revisions. GCN should have been discarded by the time they were preparing to release Hawaii. Hawaii was 2013. They would have been working on Fiji and Tonga that whole time, and that's fine. But they should have been preparing for 2016, right now, with Polaris and the upcoming Vega, to not be on GCN. They should have seen Hawaii was going to be hot, hungry, and generally inefficient, and realized they could only go so far with GCN. I know this is far easier said than done, as you listed below, but it NEEDS doing. Pascal is a straight example of how refining and shrinking an architecture does not grant a bunch of gains (Pascal is in fact SLOWER than Maxwell clock for clock... it's just clocked higher with more cores crammed in) and it's ridiculously hotter.
Third, fix your drivers. This needs to happen. You asked why focusing on DX12 and Vulkan is a bad thing. Well, first, DX12 requires Windows 10, which is garbage for consumers. Second, waiting for DX12 and Vulkan to fix your own driver inefficiency (which won't fix the hundreds of DX11 titles already on the market) is incredibly stupid. Do you want to discard all the existing titles? Witcher 3, GTA V, Dying Light, Black Ops 3, all these incredibly hard-to-run games that could be getting 30% better performance just from driver fixes? I already explained why nVidia loses performance and AMD gains: AMD's driver efficiency is simply lackluster. When devs start coding optimizations into games directly (i.e. never), you'll see nVidia pull ahead in DX12/Vulkan. Until then? AMD is the only one getting performance boosts, JUST because those APIs automagically kill their driver overhead. It's why their cards do much better when you force a GPU bottleneck... there's so much excess CPU power available that their cards operate closer to their proper potential (and even then, not as much as Vulkan/DX12 brings). I'm NEVER backing down from this point. Using the existence of Vulkan and DX12 as a crutch for things to come is incredibly stupid.
Fourth, fix Crossfire. Produce more profiles, more often, and give people the ability to adjust them. Have you ever tried nVidia Profile Inspector? Did you know I can disable SLI for specific games without closing everything and having my screen flash to toggle SLI system-wide? It's a deeper disable than forcing single GPU. Did you know I can change the actual bits used to form an SLI profile? Did you know I can force other games' profiles onto new titles without waiting for a driver to ship with a profile? Did you know that sometimes, when nVidia's profiles are lackluster, I can find better ones online? Like the one for Black Ops 3 where the Battleforge bits provide about a 25% performance boost over the default BO3 bits? AMD can't do any of that, in addition to shipping profiles less frequently. AND CROSSFIRE STILL DOES NOT FLIPPING WORK IN WINDOWED MODES. I don't care if scaling is better. It's useless to me, and to most people who multitask and want more than one GPU. It's useless to pretty much any streamer worth their salt, because alt-tabbing needs doing a lot, and the good ones won't run fullscreen unless they have to. No. I'm not accepting this. The feature is nigh useless right now, considering the RIDICULOUS drop in multi-GPU support already happening. Two AMD cards might as well be one AMD card and one waste of money right now, and this is coming from someone who still loves multiple GPUs.
Fifth, discard Raptr and any ties to Raptr for AMD tech like recording via VCE. That's just a barrier to usage at this point... enable that functionality in the driver itself.
Sixth, stop focusing on tech that isn't here yet, like H.265 recording efficiency while H.264 is left in the dust (and H.265 isn't even accepted by most sites like YouTube yet). AMD is focusing on tech that will matter YEARS from now, but by the time it makes a difference the cards will be thoroughly obsolete. It doesn't matter RIGHT NOW that they are good for tech coming 3-5 years from now. If they can't fix their drivers and optimizations, it makes no sense filling the card with barely-working tech. You need to focus on selling cards NOW, not on selling current cards in the hope that they'll work better than nVidia cards 3-4 years from now (when someone is likely to buy a new card anyway).
That's the point. It won't be a 1070; it will almost certainly need a notebook with a larger power brick, and it will run much worse. If it were made for the P7xxDM series and the GT73/GT83/GT72 series and priced well below the 1060? Then maybe you'd have something. But even then, it would be extremely niche. I can guarantee you it won't show up in any ASUS or Alienware laptops, far less the super-thin sellers like Razer/Gigabyte/etc. NOBODY wants a thin laptop with a massive power brick. NOBODY. I've even seen people complain about the P65x power brick being large, far less one of the 240W-300W bricks. You're considering nothing but power draw and performance... you need to consider everything.
AMD supposedly turned a profit last quarter, so they're not, as of right now, bleeding. But even if they have enough money to survive a few years, they should go all-in and bring something TRULY amazing. Instead they're basically lifeline-ing themselves, playing way too safe, and generally not trying.
AMD needs to want to fight. When they want to fight, I'll help. This is not them wanting to fight. This is them just... I don't even know what. But it ain't wanting to fight.
When nVidrosoft is the topic, or when Micro$haft is the topic, I will bash away till INFINITY. But this is AMD's topic, and I will criticize them as I see fit. It's their own fault. There is a certain leeway I will give them, because they aren't rolling in the dough like the other two teams, but that leeway is not infinite. -
Me too; it seems we can disagree on some points to infinity.
I'll just correct you on this:
-
D2 Ultima...
Here's something that might interest you:
http://www.forbes.com/sites/jasonev...x-1060-versus-amd-radeon-rx-480/#2b6a339a4166
Look at the power draw of 1060 and 480 from the wall.
Now keep in mind that this is with stock 480.
When undervolted, the 480's power consumption drops by about 30 to 35W... on a core undervolt alone (and this doesn't account for the fact that the VRAM on the 480 can also be undervolted, which apparently drops power consumption even further).
So, yes, it is not far-fetched to think that if they can 'easily' slap a '120W' 1060 into a laptop, then an undervolted 480 would rate at about 110W at most, making it more efficient at base clock speed while also providing superior performance in DX12 and, of course, far superior performance in Vulkan.
Yes, DX11 performance would still be lower or roughly comparable (depending on the game), but the difference would not be noticeable to most players, as both output more than high enough frame rates at 'High' settings in all games for smooth gameplay... and as time goes by, the 480 would probably receive further performance boosts and improvements.
AMD drivers improved quite a lot compared to before. Right now some people even dare say that they offer BETTER software support than Nvidia does - I don't know how factual that is, but AMD certainly stepped up their game in the software department.
Now we have to wait and see if AMD scores any design wins with OEM's and what kind of designs will they be.
Even if they get inferior cooling and ramped-up voltages, we could still undervolt them individually (unless OEMs again sabotage AMD and create a cooling system so inferior that it's impossible to compensate with undervolting... or they bring the voltages down and degrade the cooling at the same time, making it seem like AMD runs hotter).
AMD can do very little, if anything, to directly influence OEMs.
My point is that, given what we know of the 480 and its power draw, it CAN be placed into a laptop if properly undervolted, which would effectively make it slightly better than the 1060 in terms of power draw.
For the 1070 and 1080 in mobile form factors... you're moving into a TDP range where AMD has no new GPUs to begin with.
But yes, I will also call you out: you can be very biased if you think that slapping a 200W Nvidia GPU into a laptop is a piece of cake for OEMs, all the while claiming how hard it is to cram a 120W previous-generation AMD GPU into a laptop and give it proper cooling and a proper power brick.
And if you think that undervolting cannot help AMD... I think you are sorely mistaken.
The differences between the 1060 and 480 are not big in terms of power draw. Nvidia merely optimized its voltages better out of the door (hence the much lower volume of GPUs as a result), and AMD only needs to undervolt to lower levels without sacrificing performance (actually increasing it at the same time) to get into the same, if not better, TDP range.
Also, when properly undervolted, the 480 is superior to the 1060 in performance per watt.
The 1060 also operates at higher clocks, and when the 480 is undervolted, its performance goes UP by as much as 5% in some games (because it can maintain its boost clocks).
So the 480 offers mostly the same performance as the 1060, while smashing it completely in Vulkan, at lower clocks and lower power draw (when undervolted).
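The "performance goes UP when undervolted" effect comes from the card being power-limited: at a fixed power cap, lowering voltage frees budget for clocks. A toy model, where the constant k is fitted so that illustrative numbers line up (a 1266MHz boost clock and a 150W cap); none of these are measured RX 480 values:

```python
# Toy model of a power-capped GPU: sustained clock ~ cap / (k * V^2),
# clamped at the rated boost clock. k is fitted to illustrative numbers.
def sustained_clock_mhz(power_cap_w, v_core, boost_mhz=1266, k=0.0896):
    return min(boost_mhz, power_cap_w / (k * v_core * v_core))

print(round(sustained_clock_mhz(150, 1.175)))  # 1213: throttling below boost
print(round(sustained_clock_mhz(150, 1.10)))   # 1266: undervolt restores boost
```

Under these assumptions the undervolt yields roughly a 4% clock gain, which is in line with the "up to 5% in some games" claim above.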
So yes, it is more than doable to get a full blown 480 into a mobile form, and for now, that would be enough (at least for me).
I wonder just how high OEMs will go in terms of GPU TDP before deciding on a cutoff point.
Last edited: Aug 11, 2016
triturbo likes this. -
Secondly, you're telling me that undervolting the core, undervolting the memory, and keeping the clocks the same will keep that card at 100% desktop rock-solid stable performance at 110W, when it normally draws 192W to hold its boost clocks at stock speeds in a VIDEO GAME (not even Furmark or any such program)? I just want you to understand what you're saying right now, and why it won't ever work. You're expecting a card that pulls so much power that undervolting it makes it boost higher, PURELY because it can stretch its power budget further, to reduce its actual power consumption through simple undervolting alone? And not by a small amount either, but by 43%?
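For reference, that 43% figure is just the gap between the 192W draw cited earlier in the thread and the proposed 110W target:

```python
# The claimed reduction: 192W observed draw down to a 110W target.
reduction = 1 - 110 / 192
print(f"{reduction:.1%}")  # 42.7%
```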
As for this "DX12 & Vulkan" performance, it is irrelevant for the most part. Most games aren't using those APIs, and they don't affect the VAST majority of existing or upcoming titles. Two or three years from now, this may be more relevant. But right now, everybody is jumping on it and it isn't making any sense. By the time Vulkan shows up in many titles, these cards will be badly obsolete. Anybody buying, or trying to push, cards right now as "good in DX12/Vulkan" is grasping at straws. And this goes for anyone, pushing any card, from either the red or green team.
Designs? What are you talking about? The laptops are already designed. They simply need to lay out a heatsink and produce a card spec and price for the OEMs. This "OEMs sabotage AMD and create inferior cooling systems" claim makes no sense here: Pascal runs about twice as hot as Polaris, so anything that can cool a 1060 can more than cool a 480. That is out of the question, and I never said heat would be a problem for AMD (in fact I said the opposite).
As for the mobile 1070, it apparently draws less than a 980M. It needs more cooling, but it won't need a larger power brick than a 980M did, which means it's A-OK for the new-generation P65xRx, say, should they be capable of cooling it. And in the larger laptops with large power bricks, where power draw be damned, AMD simply has nothing to compete with. If nVidia had made M cards, neutered the way Maxwell was, we'd probably see something else happening. But no, they actually did something quarter-way competent this time, so AMD has no moves. The way I see it, they walked RIGHT into this position and have little but themselves to blame. Everybody else simply moved around them, happily.
I think you're absolutely deluded and an AMD fanboy at this point. The differences between the 1060 and 480 in power draw are gigantic once you let both cards draw with unlocked power limits, and without unlocked power limits there is TDP throttle. You said it yourself in AMD's case: undervolting lets the card perform better BECAUSE IT'S TDP-LIMITED. The AMD card pulled 192W at stock once the power limit was removed. The GTX 1080, as tested by Dufus, who checked the card's actual power draw, did not go above 180W. This means that at stock settings on reference-class cards with TDP limits removed, the RX 480 DRAWS MORE POWER THAN A GTX 1080, far less a 1060. Get this into your head. Polaris is nowhere NEAR the efficiency of Pascal.

And this is its problem: the power bricks, and why OEMs and ODMs will not put it in laptops that could otherwise hold it. The P65xRx has a 200W brick maximum. Undervolt a 480 and its memory, and let's say you manage 150W or even 140W. That 200W brick ain't supporting it, but the mobile 1060 at somewhere around 80W will. So you underclock the 480 some. Now it's dropped to 110W, so it fits in the P65xRx, but it's now only as strong as a 980M, and overclocking will exponentially increase power draw, so the card's potential is reduced to a non-overclockable 980M. For about the same price as a mobile 1060, unless AMD takes a huge price cut and prices the card closer to its desktop counterpart, which I am certain they won't do, because they never have.
So take your pick here:
980M-level performance, potentially available in most entry-level gaming laptops or higher. Overclocking will not happen. Must be extremely cheap.
Mobile 1060-level performance, available only in large laptops with large power bricks, which cost more than most other laptops because of their beefier cooling systems. No G-Sync or FreeSync. Limited overclocking, if any (possibly requiring power connectors). Also needs to be rather cheap, though not as much as the above.
Which one do you think is going to happen? Or make AMD much money?
See above.
That statement adds nothing to the argument.
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
I get that you love AMD and want to see them succeed, but you need to face some facts. AMD is not capable of working magic here. They're not going to suddenly pull this one straight out of their meowmix. If this were last gen, an M480 would have done swimmingly. But they launched Polaris too late. They had working samples being demoed for almost a year before launching this year. They could have launched earlier and made a killing, but they took too long, still blundered the launch, and then tainted their good name and the faith people had in them with how they handled the blunder and tried to claim nothing was wrong. I followed the ENTIRE scandal, laughing and facepalming the whole way, from the time reviewers broke the story straight up until the subreddit banned and deleted the megathread about the issues. I had people coming up to me telling me they "fixed" it with drivers, and I had to point out that all they did was neuter the card's performance in their attempt to "fix" a broken design, while going around informing everyone that if they wanted one of those cards, they had best grab one with an 8-pin power connector or not at all.
AMD right now has nothing to compete with. The M295X was, give or take, at the level of a 970M, but again it couldn't overclock, and it cost more. I saw it in Alienware models (the ones with the power bug) and it was rarely chosen, because of that price and because Enduro is MUCH more of a pain than Optimus. It's one thing if your flagship competes with nVidia's flagship and costs less; that gets you sales. Lots of sales. The number of 7970M sales I saw over the 680M back in the day was huge. (And then half of those people groaned a year later when the cards started dying, ironically.) But now they have something that competes not with the top end but the low end, draws much more power, and is not likely to be substantially cheaper. They are in no position to fight the "normal" way, because they have nothing that can. If you want to keep believing that 82W of power draw is going to vanish through undervolting the core and memory without ANY performance downsides, you can keep dreaming. Your example is the 7970M vs the 7870, but that had a 15% performance cut right off the bat. The M480 could not afford to just "lose" 15% or more of its performance and remain competitive; the 1060 generally beats it in most games and benches I can find, and that has ONLY a 7% clockspeed drop.
Shadow God and s19 like this. -
-
Last edited: Aug 12, 2016
-
Speaking of OEM's crippling AMD:
http://www.anandtech.com/show/10000/who-controls-user-experience-amd-carrizo-thoroughly-tested
Also, I'm no fanboy of AMD.
Look at my signature and see the kind of hardware I use.
AMD ramping up voltages, which results in high power consumption (but also a higher volume of GPUs out the door), is well documented, as is the finding that undervolting the core drops power consumption by 30 to 35W on the 480 (which actually brings the GPU in line with the 1060 in terms of efficiency).
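As a rough illustration of the efficiency claim above, here is a perf-per-watt sketch. The board-power figures (~150 W stock RX 480, ~120 W GTX 1060) are assumed typical desktop values, not official numbers, and performance is normalized to 1.0 for both cards; only the 30-35 W undervolt savings comes from the post.

```python
# Hypothetical perf-per-watt comparison under the assumptions stated above.

def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

RX480_POWER = 150.0        # assumed stock board power, W
GTX1060_POWER = 120.0      # assumed board power, W
UNDERVOLT_SAVINGS = 32.5   # midpoint of the 30-35 W cited in the post

rx480_stock = perf_per_watt(1.0, RX480_POWER)
rx480_uv = perf_per_watt(1.0, RX480_POWER - UNDERVOLT_SAVINGS)
gtx1060 = perf_per_watt(1.0, GTX1060_POWER)

# Stock, the 480 trails the 1060 in efficiency; undervolted, it catches up.
print(f"stock vs 1060:       {rx480_stock / gtx1060:.2f}x")
print(f"undervolted vs 1060: {rx480_uv / gtx1060:.2f}x")
```

Under these assumed numbers the stock card sits around 0.8x the 1060's perf/W and the undervolted card lands roughly even with it, which is the shape of the claim being argued here (equal performance at equal power is a separate, stronger claim).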
If you want to ignore that, D2 Ultima, that's your business, but facts are facts, and they don't go away by ignoring them. If anyone might be displaying signs of fanboyism and favoritism, it would be you.
I'm just saying what AMD might be able to pull off if they get their heads together; OEMs, though, are a different story.
Last edited: Aug 12, 2016 -
AMD lost, period.
Shadow God, Kade Storm, hmscott and 2 others like this. -
-
How is Nvidia better again?
They are doing practically the same thing as AMD... and both are consuming less power than previous generations while delivering the same performance.
Plus, are we denying that at 1.28 GHz Polaris produces basically the same performance as a higher-clocked Pascal, and overshoots it in some areas when proper APIs are used?
Shall I also mention Polaris's compute capabilities, or is that yet another 'let's ignore' factor?
Oh yes, I forgot: compute is apparently worthless to everyone (except those who actually use it), since GPUs are only meant to be compared on their gaming capabilities and nothing else.
I'm not doing anything except pointing out conveniently thrown-out facts about AMD.
Yes, Nvidia has done better by putting out GPUs right out the door with better control over their voltages... but they also haven't done any real groundbreaking stuff, because all they did was shrink Maxwell and rebrand it as Pascal (which clock for clock seems slower than Maxwell, in order to hit higher frequencies). -
... y'all are delusional. There's no point saying anything that isn't "we should all buy AMD!" to y'all. I could make an hour-long video with graphs and bullet points compiling information from all over the internet, and you would STILL tell me I was wrong.
You know what, y'all buy AMD. The R9 M460s are out, go grab 'em. I'm not going to make an excuse for AMD playing the lifeline strategy and refusing to move away from GCN when they knew it had serious limits on power draw and heat after Hawaii. I cannot STAND nVidia, but that doesn't mean I'll give AMD 10 miles. I gave them one. Not an inch, but a whole mile. And they want 10. I'm not giving it to them. They aren't even COMPETING. As far as I can see, AMD is targeting a desktop market nVidia is currently ignoring, and a laptop market nVidia already dominates with the 960M. And that does NOTHING for people like myself, or anybody wanting anything other than an entry-level gaming card from either market.
Shadow God, Kade Storm, tgipier and 3 others like this. -
No, but comparing clockspeeds between two different archs built on two different processes is utterly meaningless. Pascal is Maxwell tuned for clocks, and Polaris is an aging architecture ported to Samsung's 14nm FinFET, which really isn't all that good to begin with. It really doesn't matter how high you push the clock until you start hitting the 3-4 GHz mark. I don't see the huge problem in Pascal being Maxwell tuned for clocks. The performance is still there, right?
What is there to compare about Polaris's compute ability? Because 1/16 FP64 is impressive??? People who care very much about DP compute will be on K80/P100/Volta Tesla.
Shadow God, Kade Storm, D2 Ultima and 1 other person like this. -
https://www.techpowerup.com/225081/high-pcie-slot-power-draw-costs-rx-480-pci-sig-integrator-listing
Further case in point about AMD shooting themselves in the foot.
The RX 480 is *OFFICIALLY* not a PCIe card anymore.
hmscott likes this. -
-
Sent from my LG-H850 using Tapatalk -
I *AM* bashing AMD on this, because they need to be bashed, and people here need to stop treating AMD like they can do no wrong, or like any screwup can be spun into something good.
No fanboyism. I praise good achievements and I bash screwups. I am ESPECIALLY HARD on screwups they absolutely should have known would happen, like the 970 and now the RX 480. And if you think I'm ridiculously hard on THAT, I'm even *MORE* EXTREMELY ESPECIALLY HARD on screwups they should have known would happen when, after the fact, they try to deny it or insist it means nothing. You know, like the 970? And now the RX 480? Where damage control said nothing was wrong? Yeah. No.
Like I said above, I am NOT giving AMD 10 miles after already giving them 1 mile, far less 1 inch. The saying is give an inch, take a mile? Yeah. I gave them a mile to begin with and they want to take 10.
Shadow God, Kade Storm, hmscott and 1 other person like this. -
So the notebook RX 460 is now officially out:
http://radeon.com/hp-omen/
http://videocardz.com/63230/amd-compared-radeon-rx-460-desktop-to-rx-460-mobile
For now it's available only in China (the cheapest model seems to start from ~800 EUR):
http://search.jd.com/Search?keyword=Omen15 RX460&enc=utf-8&wq=Omen15 RX460&pvid=oiz63wri.r7nedkv6
triturbo likes this. -
That's the real problem. AMD losing PCI-E Compliance Certification means all the AIB cards are in an uncertain compliance realm.
i_pk_pjers_i likes this. -
King of Interns Simply a laptop enthusiast
Maybe we'll get the RX 460 in MXM form. That would be a nice start. Next an RX 480, please.
Sent from my SM-A500FU using Tapatalk
triturbo and i_pk_pjers_i like this. -
AMD losing PCIe compliance on reference cards means many pre-built vendors will stay away from it. That market isn't large to begin with, though.
D2 Ultima, hmscott and i_pk_pjers_i like this. -
Please note, the $550 USD R9 Fury cards, 20% stronger than a 980 and costing the same as one, were *NEVER* sold in pre-built machines. I simply could NOT find one anywhere in the US from prebuilt PC companies.
The RX 480 will simply not even exist in pre-builts at all. -
Wonder why cards have not caught up to games. My old GeForce Go 6800 ran Half-Life 2 at 130fps. The 6990M was the best offer at the time, but I never saw new games hit 130fps. Now it's 2016, Nvidia has the 1080 out, and it won't max much at 130fps. And has yet to offer -
-
Very funny, right?
TomJGX likes this. -
AMD is asking for their crap. I never said the industry was fair to them, but they are asking for it. If I saw AMD actually fighting and trying, I'd change my criticisms and stop bashing them. But unlike your flaming nVidia-hating self, I refuse to believe that AMD is barely at fault. There were a hundred things they could have done to prevent all the heat I've been giving them here, and I listed some of them already.
Shadow God, hmscott, Ethrem and 1 other person like this. -
I never said that "AMD is barely at fault"; that's your short-view-distance, reality-bending vision that says so. I literally said "AMD is not entirely to blame" and was eager to see your response. You didn't disappoint. You can go ahead and double-check the post above yours. I would never say that AMD is not to blame, because sadly that is not the case, and I have posted a few times why the lack of MSI machines with AMD hardware might be partly on AMD. In the meantime nGREEDIA gets a pass on an awful lot of things; that's what gets me, and I don't give a damn how I look. I'll bash nGREEDIA to hell and back, since I'm among the very few around here who do.
I don't see you go hard on nGREEDIA about numerous things (some pointed out in this very topic and nowhere else in the forum), but you nit-pick EVERY SINGLE AMD mishap. So don't try to convey your neutrality; I know there isn't any. At least I don't pretend: ever since GameSux happened, I've said that I'm AMD-biased and won't even consider an nGREEDIA product, and I stand behind it. Right now there's really nothing nGREEDIA should be praised for. If you had a broader view, you would've seen it as well.
Not fighting? Pffft! How do you fight bribery?! Like the grIntel settlement that came a little too late, with the "coincidental" downhill slide of AMD CPUs. Thank the EU for that; if it wasn't for them, all would've been "good". Even then, that settlement wasn't remotely enough to fix what was done (after all, it was an EU settlement, while the world is a bit broader than that). Now we are getting the same thing, only with GPUs. -
Go search my name and "nVidia" on T|I and here. I'm sure you'll find some things.
Mobile Polaris Discussion
Discussion in 'Gaming (Software and Graphics Cards)' started by moviemarketing, Jan 4, 2016.