The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Mobile Polaris Discussion

    Discussion in 'Gaming (Software and Graphics Cards)' started by moviemarketing, Jan 4, 2016.

  1. JAY8387

    JAY8387 Notebook Consultant

    Reputations:
    36
    Messages:
    231
    Likes Received:
    90
    Trophy Points:
    41

    Thank you for letting us know; that is disappointing to hear. I guess it's because they expect they won't be able to sell enough to make it worth producing/designing an MXM board.
     
  2. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    In all reality, it seems that the GTX 970M, GTX 980M, R9 M290X, and custom designed R9 M295X are the final MXM 3.0B cards to be released, which is quite unfortunate.
     
    JAY8387 likes this.
  3. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Even if they made an MXM-ish board (the new GPUs are odd, with odd hole placement), what are the chances it would fit into old machines? None. Terrible news, nonetheless. Everyone went out of their way to stuff the desktop 980 into their machines, but they can't do the same for AMD. Whatever, no Clevos for me.
     
    Ashtrix likes this.
  4. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    It is sad, but MSI seems to be the only one making standard MXM-sized cards, since their 1070 fit in the P750DM according to Prema. Clevo is pretty dead as a brand as a result of their shenanigans this time. Very disappointed by their custom 1070/1080 crapola this time...

    Sent from my LG-H850 using Tapatalk
     
    steberg, Ashtrix and triturbo like this.
  5. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Very much in agreement here. How on earth could a silly overpriced 980 design take root, while creating MXM boards for AMD is not possible? What a load of rubbish. Clevo was going to be my next machine once my M15x crapped out, but not now.

    What a sorry state of affairs. Looks like I won't be getting another laptop after all. So very sad!
     
  6. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    But... why would Clevo put Polaris 11 on an MXM board? Or do you still have hopes for mobile Polaris 10 this year?
     
  7. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Polaris 10. I'm pretty sure we'll see it in a mobile workstation, but quite likely not by year's end. The WX 7100 gives me hope.
     
  8. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The moment they announced the 980 in a laptop, I think I was one of the first people to speak out on how ridiculous it is that they will bend over for Nvidia and even push a full desktop-grade GPU into a laptop, while an AMD GPU with the same TDP as the 980M ends up throttling because the cooling is never sufficient for it.
    The same goes for APUs that end up paired with garbage components.

    These OEMs are likely getting more money out of Nvidia, hence more design wins, and they come up with flimsy excuses every time AMD is mentioned.
     
  9. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I mean, it's looking like Clevo owners are SOL for Nvidia cards as well. The 1080 is not going to fit in the current-gen P870DM if what people have been saying is true. The 1070 might, but only because MSI has to keep its upgrade promise to GT72 and GT80 users.
     
  10. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The 1080 will fit a P870DM, just not in SLI.
     
  11. Hurik

    Hurik Notebook Consultant

    Reputations:
    10
    Messages:
    145
    Likes Received:
    159
    Trophy Points:
    66
  12. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Sorry to burst your bubble, but RX 480s easily draw 165-200W at stock (reference) speeds. They're rated for 150W, but they can easily pull much more. You're not getting one into a 110W format no matter what you do. Performance WILL need to be cut somewhere. They'll need to go into rather large, thick notebooks and will need large PSUs (larger than 1070s will, give or take), and if their price isn't much less than the mobile 1060's, they'll be somewhat pointless entirely.
     
  14. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Bet they can. Remember the 7970M vs the 7870? They took the core clock down a notch and reduced power consumption from 160W+ to around 100W.

    A properly binned RX 480, with the VRAM probably running a little slower and the core a little slower, could easily operate at a power consumption between 100W and 120W while performing around desktop 970 levels. Still a decent perf-to-power improvement over the 125W 980M, which was 15% slower than a 970 and became a hot power hog when overclocked.

    So yeah, I reckon they could bring about 20% better performance for the same power consumption if they wanted. If it came at a decent price, why not? Then release a 970M-thrashing mobile RX 470 with similar treatment, running at around 80-90W, bringing 980M performance at a cheap price and much lower power consumption. Yes please. Plus, of course, much better DX12 and Vulkan support/performance than a Maxwell card will ever possess, not least because Nvidia themselves will neuter Maxwell performance just like they did with Kepler.

    We don't need to give the industry excuses for why it isn't happening. There is only one reason: money. Nvidia pays the correct amount to the correct people. That is all there is to it.

    It is time to revolt and say no to this crap. Let's stop being weak-minded and see the wood for the trees. We are talking about the end of MXM standard card sizes and the complete end of AMD offerings for laptop enthusiasts.

    Can you imagine if this happened in the desktop world now? Suddenly graphics cards come in all sorts of shapes and sizes and prevent you from upgrading. Sound familiar?

    This is worse than clockblock. This is the stifling and strangling of competition by Nvidia, and it will only mean one thing: a single option at an insane price. A bad deal for consumers and the environment.
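For what it's worth, the downclock-plus-undervolt arithmetic in this post (7970M vs 7870, 160W+ down to ~100W) can be sketched as a back-of-envelope estimate. This assumes dynamic power scales roughly with frequency × voltage²; the clock and voltage figures below are illustrative guesses, not measured numbers from any real card.

```python
# Back-of-envelope: how a modest downclock plus undervolt shrinks GPU
# board power. Assumes dynamic power scales ~ frequency * voltage^2;
# all numbers below are illustrative, not measured data.

def scaled_power(p_stock, f_stock, v_stock, f_new, v_new):
    """Estimate new board power assuming P is proportional to f * V^2."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

# Hypothetical 160 W desktop-class card at 1000 MHz / 1.15 V,
# with ~15% taken off the core clock and an undervolt to 1.00 V:
p_mobile = scaled_power(160, 1000, 1.15, 850, 1.00)
print(round(p_mobile))  # ~103 W, the kind of drop described above
```

The same relation run in reverse is why overclocking blows the power budget up so quickly: the voltage term is squared.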
     
    Last edited: Aug 9, 2016
    triturbo likes this.
  15. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Yeah, AMD usually clocks them to within an inch of stability in order to stay somewhat relevant for those who just look at the numbers. So it's no surprise that a dial-back yields quite the improvement. Remember the R9 Nano?

    @D2 Ultima - Pointless to you, not to me. Tell that 980M to play nice with my DreamColor, or that 1070/1080/whatever to fit in there, or that Quadro to cost a third of its price, or the other 780M not to lose performance... It's getting to be quite a list, and I haven't even dug into the good parts.
     
    hmscott likes this.
  16. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Oh, those graphs were confirmed to be misleading.
    http://videocardz.com/62790/geforce-gtx-1070-mobile-and-radeon-r9-m480-benchmarks-leaked
    Someone just relabeled the bigger bars as GTX 970M and R9 M480, when in reality they represent the scores of the GTX 970 and RX 470.
    On the topic of mobile Polaris, I question why the RX 470 isn't the ideal example of a P10 GPU that could be modified (binned + lower clocks) into a mobile GPU without losing too much performance. I know the RX 480 is the full P10, but I fear a full P10 mobile GPU would be like the R9 M295X: full Tonga, but with the performance of an R9 285 (Tonga Pro).
     
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You dropped 150MHz and undervolted, and you went from 160W to 100W. You're now attempting to take a card that fluctuates in performance between the 970 and the 980, drawing a maximum of 200W at STOCK settings just from playing certain demanding games (not even at 4K or any ridiculous resolution, which usually demands more power from GPUs), and fit it into a 100W package? Might as well clock it under 1000MHz, undervolt it, and have it perform like a 980M.

    You're getting 980M levels of performance if you want such a low TDP, and then you can't OC, and the card will most likely run hotter too.

    Right now, you're expecting WAY too much. You want something that isn't going to exist right now, because while AMD is making strides in efficiency, the same reason we didn't get Hawaii and Fiji in laptops is the same reason Polaris 10 isn't going to make much sense. On the crappy team green side, they've actually shoved within-7-percent-of-the-desktop-models'-performance cards into laptops this time. Hence it needs to retain its performance and cost LESS than a mobile 1060, or it's just going to fall by the wayside.

    I'm being realistic here with my statements. AMD hasn't figured out how to compete, and they're STILL sticking to GCN; even with all the optimizations and perf/watt bumps they've claimed, it's still not very impressive compared to what the other camp is offering. I HATE that about it, but that's their own problem to fix. I can't fix it for them.

    For all the crap nVidia is doing, AMD has nothing to counter them with. And where is this neutering of Kepler's performance in games? I've had 780Ms from launch until now, and they have only performed equal to or better in ALL the games I've owned for the whole time I've had them. I've heard of benchmark issues in the past, though I haven't seen any, but I can see nothing with respect to games. And I can expect the same to happen to Maxwell. I know exactly what forced obsolescence they'll pull, but that's not the same as neutering existing products (though almost as nasty).
     
  18. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Where's the realistic part? Because I might have missed it. The R9 M295X was disregarded as pointless. Now Polaris is AGAIN disregarded as pointless, while at the same time we see GPUs in excess of 200W in laptops!?! WhatTheActualF!!! You know, some of us care about different things; yeah, how odd that people have different opinions and desires, right? So no AMD option has been offered for a while now, which equals ZERO cash income and ZERO market share. It's "a bit" of a vicious circle, isn't it? Why should they invest in laptops in the first place, where the playing field is NOT AT ALL equal?

    Moving on to a detailed rundown of the "realistic" statements:

    The worst I've seen was well below 200W @ 4K. So, show me.

    How much performance was lost going from the Fury X to the R9 Nano?

    ALL of your titles speak volumes. I can bet that my list is different from yours. So, anything more specific, like any GameSux titles? The GameSux degradation is a fact whether or not you want to admit it. The funny thing about it is that initially I thought it targeted AMD; now I know it targets EVERYTHING that's NOT nGREEDIA's latest.

    Keep drinking your Kool-Aid.
     
    Last edited: Aug 10, 2016
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    How much was the M295X? I seem to remember it was priced between a 970M and a 980M, but performed worse than a 970M and couldn't even overclock as well as those neutered recent Maxwell cards that barely hit 1200MHz. AND it drew more power and ran hotter. Why, at all, would you wish for me to look some other consumer in the eye and say, "You should spend more money on a hotter, more power-hungry GPU with extremely inefficient drivers, paired with power-limited, relatively slow mobile CPUs (this was before Skylake, when the chips couldn't hold TDP beyond base), because nVidia is a crappy company"? Could you do that? Because I can't. I would happily explain to someone why nVidia is crap and why AMD needs market share, and if they asked whether there were an available AMD notebook GPU, I would happily point it out to them. But I would never do it the way I just described.

    Here: http://www.hardware.fr/articles/951-9/consommation-efficacite-energetique.html
    Power limits and thermal limits disabled in "Uber" mode. NO OVERCLOCKING APPLIED. 192W drawn in Witcher 3 at 1440p. THAT CARD IS NOWHERE NEAR 150W. If you want to somehow shove that into a laptop and tell me it's going to keep that performance while staying at 100W, you're dreaming. That card would have to go into something like the P750DM2 with a 330W PSU to even begin holding its between-970-and-980 level of performance, which basically matches the mobile 1060 (which, while extremely hot, does not need as large a power brick and suits machines like the P650Rx2 and P670Rx2, which use 200W bricks).

    As for the Nano, you can also see in that bench that it hit 185W and sat there for both BF4's and Witcher 3's tests. That means it was throttling. How much did the card throttle? I don't know. How much do people normally notice throttling on that card? I don't know; I don't own one. But it isn't a Fury X replacement. Also remember that, unlike the RX 480 (which, I might add, surpassed the Fury Nano's power draw in Witcher 3 when its limits were removed), the Fury Nano has HBM, which reduces memory power consumption by quite a bit. A benefit the 480 does not have.

    I'm the first person who'll tell you about the forced obsolescence that GameWorks enables. Straight tessellation on every single thing that doesn't use PhysX, and in absurd amounts too, such that only their latest cards cope to any decent degree; and when they shove a new card architecture up everyone's meowmix, it runs much better in those GameWorks titles because it simply handles the tessellation better, and they can point and say "look, it's so much faster!" when in every other title it's obviously not as far ahead. But that still doesn't let me tell someone to pay more for a weaker, more power-hungry GPU (it isn't hotter; nothing beats Pascal there, I'll give them that. AMD's the cool arch this time around) just because nVidia sucks.

    AMD needs to compete. Drink THAT Kool-Aid. They're not competing. They're not trying. They were using Crossfire (which works in about 20% of new titles at launch, has no nVidia Profile Inspector alternative to mess with bits and force profiles, and didn't even scale properly in their own demo when pushing their new card) to argue their new card is good... that was awful. They came in swinging with a disastrous card frying people's motherboards. They even shipped a driver update that limits the card's power draw from the PCIe slot (in effect further neutering the card's already driver-limited performance), but won't fix it with an 8-pin connector via a small recall. They just want to sweep it under the rug.

    They're betting on DX12 and Vulkan to fix their ridiculous driver-overhead issues in DX11. Do you know WHY nVidia cards drop in performance in DX12 versus DX11? Because there's no DX11 driver overhead in nVidia's drivers. Unless you hit DX11's limit on draw calls or the like, you're not getting any benefit from DX12 with them. This is because NOBODY is coding DX12 optimizations into games; it's too expensive to give console-level optimization to the PC, and they get nothing else out of it. So the optimization is driver-side, and nVidia's DX12 drivers are not as mature as their DX11 drivers. So... performance loss. Do you see AMD's DX12/Vulkan driver performance? How their cards basically get a 20-30% boost, more if the game was in OpenGL instead of DX11? That's what their cards are SUPPOSED to be doing. Right now. In DX11/OGL. But their driver inefficiency and overhead kill all their cards' power.

    And finally, they're hanging onto GCN HARD. Like, extremely hard. GCN is hot, power hungry, and generally can't overclock worth a meowmix (or maybe they simply push the cards so far to the limit before launch, JUST to scrape out more performance, that they're already at their limits; as if nVidia had made the base clock of a GTX 980 1350MHz and it could only OC to 1450MHz). I will admit wholeheartedly that Polaris is a decent bump in efficiency and temperatures. That's great. Really. That's great. But it's not enough to compete in the mobile market. They can run the price war on desktops because nobody cares about heat (aftermarket coolers) or power draw (big PSUs) on desktops, and most don't care about overclocking (it's why superclocked cards sell so well). Notebooks don't have the first two luxuries, especially the second one, since power draw often dictates which notebook something can be used in, due to the accompanying power connector. Hence why I said that card needs an allowance of up to 150W, unless you neuter its performance to ALL hell, and that means 300W power bricks. Even a high-quality 240W power brick might barely manage it, but I would say 300W+. Even if a P65xRx2 could hold one, they sell with 200W PSUs at best. You're not going to power an RX M480 with that.
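The brick-sizing argument in this post boils down to a simple budget check: GPU draw plus CPU draw plus the rest of the system has to fit under the adapter's rating with some headroom. A minimal sketch of that check, where every wattage figure is an illustrative assumption rather than a real platform spec:

```python
# Rough notebook power-budget check, in the spirit of the argument above:
# does a GPU of a given draw fit under a power brick's rating once the
# CPU and the rest of the system are accounted for? All wattages here
# are illustrative assumptions, not measured platform specs.

def fits_brick(brick_w, gpu_w, cpu_w=90, rest_w=40, headroom=0.9):
    """True if total draw stays within ~90% of the brick's rating."""
    return gpu_w + cpu_w + rest_w <= brick_w * headroom

print(fits_brick(200, 165))  # False: a ~165 W GPU overruns a 200 W brick
print(fits_brick(330, 165))  # True: a 330 W-class brick has the headroom
```

The headroom factor stands in for conversion losses and transient spikes; with it, a ~165 W GPU lands in 300 W+ brick territory, which is the post's point.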

    You think I'm some happy-go-lucky nVidia lover? You really, REALLY had best think again. I'm more annoyed at AMD for not bringing anything I can even be halfway hopeful about to the table. I had SUCH high hopes for the RX 480, and they had to go and blunder it with that 6-pin connector. That was their bloody trump card for the low end. But NO. Trying to stuff a 165-192W card into a 150W envelope to "target it as a budget card".
     
    Shadow God likes this.
  20. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    That's why I said "to stay somewhat relevant for those that just look at the numbers". You know, the primary function of a GPU is to output a picture, and I demand that picture be eye candy (10-bit). As I already said, the Maxwells couldn't deliver that (severe grayscale issues); I'm not sure how it's going with Pascal.

    OK, here it goes the other way around - do you recommend nGREEDIA GPUs with a disclaimer up front? Because if you don't, you are doing exactly what you said you wouldn't do, just in favor of nGREEDIA. And don't get me started on drivers, or on other **** for that matter. The fact is that you can't simply recommend either without a disclaimer, yet nGREEDIA gets away without one!

    Well, this "Uber" mode is not exactly "stock" as you said, but let's go with it - fine, it goes up to 192W, but it's lower than that when not in "Uber" mode, right? It is known that the stock voltage is high; I can only imagine how high it is in this "Uber" mode. It was proven that these cards do better (not a lot better, but better) when undervolted. Can it behave @ 100W? I think yes. But scratch that - why should it be soooo crippled? I can see that they are shoveling in much more power-hungry nGREEDIA hardware. The 1070 would be a ~130W GPU; I can't even begin to imagine what the 1080 would be.

    This just goes to prove that it can keep to a set target. Yeah, it would throttle, but how are you certain that Pascal won't throttle as well - both power and thermal? Have you seen the heatsinks?! Boy, those are some massive things right there. That's the whole point I've been trying to make for a while now (some years, but who's counting). Why do people just assume AMD = bad and leave it at that, while nGREEDIA gets away with a lot (ALL) of their $hit?

    That's the thing about proprietary ****: if the programmers did their work right with abstraction, they can change the whole inner workings and the devs won't know a thing about it. They'll use the same APIs they always did, but internally it would be entirely different. It was PhysX first, now GameSux; next it could still be GameSux, just with something else terrible inside, and nGREEDIA won't even have to "promote" it this time around. And by "coincidence" their newest cards would run fine. I know how it sounds, but I really can't think of anything good about them. They ditched MXM, the only good thing they had, and are now forcibly making laptops proprietary again.

    How should they compete? With shrinking market share, constant bad-mouthing, and trivial things like that working against them? As I already said, I haven't seen an AMD GPU in any system but a Dell, and the odd HP (which most of the time is crippled one way or another), for quite a while now. How come they are not trying, when, as you said yourself, they offered an alternative, and CF usually scales better than SLI? Yes, that demo was a disaster. How many boards have actually been burned? I don't see a massive drop in performance either; it's barely noticeable, averaging a couple of FPS. Why should focusing on DX12 and Vulkan be a bad thing? It's the same as saying they should focus on GDDR when HBM is obviously better. How is lower overhead for ALL makers a bad thing? If we shouldn't progress, why stick with DX11? Why not DX9c, which seemingly lasted forever? I mean, where exactly do you draw the line? They had a problem - huge overhead - and they came up with Mantle, which translated directly into Vulkan and, to an extent, DX12. It's good that nGREEDIA has great DX11 drivers, but how much money and how many man-hours are thrown behind that? Can AMD afford it? The same goes for the CF and SLI optimizations, by the way.

    I can see that. Maybe they can't come up with anything better. If I could've made a kick-a$$ new architecture, you can bet I would've mailed them already. I think the overclocking situation has more to do with "AMD usually clock them to within an inch of stability in order to stay somewhat relevant for those that just look at the numbers" - which I said earlier, and you just did as well - than with anything else. Really. So I guess they can only do that much. So why buy them then? You'd be shooting yourself in the foot. True, but everyone else shot themselves point blank in the head. Then again, this is performance, and not everything is benchmarks. I heard that monochrome sucks :D Well, as I said earlier, it seems the 1070 will be a 130W GPU, and the 1080 even more so. An RX 480 @ 130W doesn't seem that bad. It won't be up to the 1070, but at least it would be an option. Considering that it would definitely be cheaper than the 1070, it would have its place under the sun. Money flow is needed by everyone to keep the wheels spinning.

    No, I don't, but don't swing your axe too hard at AMD. I'm usually trying to balance things out; there are more than enough people bashing AMD, and very few pointing at nGREEDIA. Even then, people still buy nGREEDIA and are done with it. I had high hopes for the RX 480 as well, but things are only going to get worse if we let this one go as we did with the R9 M295X.
     
  21. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I actually wonder how the M295X would perform and run in a proper MXM-based notebook with decent cooling (i.e. one designed for a 780M/880M/980M), and what the in-game performance would be.

    I reckon it wouldn't run half bad, particularly as the drivers are quite good now. The M295X was never tried in a proper gaming laptop, so I don't see a basis for a fair judgement. Performance is likely somewhere around the 970M (perhaps even higher), but at 980M power consumption. Not impossibly bad. It just was never given a chance.
     
    triturbo likes this.
  22. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Exactly, and now the gap is only getting wider.
     
  23. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,376
    Messages:
    2,081
    Likes Received:
    3,281
    Trophy Points:
    281
    OEMs never gave AMD a chance to have a shot; only AW had an MXM machine with AMD back in 2013. They were always being purged by those upper echelons, Intel and nGreedia. Even so, AMD never really tried to give their best: they lost Adreno to Qualcomm, and now all they do is aim for the console crap and a top-end competitor to Nvidia's top card. That's it - no optimizations, nothing. Plain mediocre stuff, with less QC in mobile MXM cards from what I've read so far. They are worthless now; Samsung should buy them and take a shot at nGreedia, Intel & Qualcomm. I hate this monopolistic trash...

    OT: I really wish someone would start a strongly rooted petition for MXM 3.0b cards from Clevo, given that huge ZM/DM user base - and looking forward to those community-driven custom 1070 MXM 3.0b cards from kingofinterns :)
     
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This'll be my last attempt.
    This, this right here, is something specific to you. You need 10-bit colour. Laptops come with 6-bit panels. I've seen ONE 8-bit panel in the current rotation for most laptops, and it's 4K, and thus not even very useful for people on Windows or people who want to game, especially single-GPU owners. I will *NEVER* deny someone what they are looking for, but you need to understand that it isn't for everyone. You must have noticed that around the forums I don't jump on the BGA hate train as much as others... once Skylake came out and I confirmed those chips can use power if need be, I left them alone, because I know not everyone wants or needs one of those heavier, thicker desktop chips. You cannot say I only project my ideas onto people. So understand that right now I'm looking at the average Joe going to buy a laptop.

    No, no I don't. I will tell people flat out that I more than recommend a custom vBIOS should they get any nVidia mobile GPU, or it won't work properly; I heavily avoid recommending models that are anal about this, like ASUS; and I would even inform people that mobile cards use absolutely crappy parts, and that I wouldn't guarantee recent 900M cards to last very long. Beyond this, my desktop and laptop practices are the same.

    This "Uber" mode is simply going into CCC or whatever and setting the "power limits" and "thermal limits" to their maximum. It's like my opening nVidia Inspector and setting my power limit from 100% to 146% and my temp limit from 92C to 93C. There is NO other change. The fact that doing this allows it to pull that much power means the card is incapable of holding its clocks stably otherwise. You and some others may find the bit of downclocking "not much" or "not very noticeable", but when you transpose this into a laptop, you get something like the throttling mess that was the 980M at launch, or dare I say the 880M branch of cards. ESPECIALLY if you want to shove it into a 100W format. You want that card to behave with an undervolt? It NEEDS more than 100W of power allotted to it. Also, as far as 1060s and 1070s are concerned, Prema said they have a lower TGP than their Maxwell counterparts (1060 vs 970M, 1070 vs 980M, 1080 vs 980), so all in all I believe they're going to be drawing less power (the heat they'll be outputting, though, is another story). As for mobile Pascal, see those 100MHz downclocks and voltage drops? Those go a long way, because Pascal is Maxwell die-shrunk. Just as overclocking Maxwell exponentially increased its power draw, underclocking it exponentially reduces that draw. AMD, as I said, can do something similar, but you're NOT getting the already-fluctuating performance that the RX 480 shows in benches, and further to this point (which I'll explain below) their drivers make things worse.

    As for the 1070 and 1080 cards, their power so far outclasses the RX 480's that it's not even funny. This is the problem. In the big, heavy notebooks that use power bricks large enough for these cards, like the P750DM2, putting in an RX M480 is not going to end well unless it costs less than the 1060 option. And don't even bother offering it in Crossfire; with the way multi-GPU is going, that would result in 90%+ of the customers buying it getting burned. Which I will talk more about later.

    I never said Pascal won't throttle. But that's why we have people like Prema, who honestly does so much work for the community that I wish I could just give the dude a mansion and a room full of PCs to his heart's content. And lots of beer too. We have some (not all) machines that can handle the heat. The P7xxDM2 and P870DM2/P870DM3 lines are going to handle Pascal's heat. The P65x and P67x lines are being revamped with extra cooling, and their current cooling was so much overkill for Maxwell in the first place that I believe they just expected Pascal to run this hot and designed the machines ahead of time in preparation. MSI has shown that it wants to go balls-to-the-wall with cooling for Pascal too, so we'll find working, non-overheating laptops (note that I never mentioned heat for the Polaris chip? Because it's cooler than Pascal by a mile). Then we just need to worry about power draw. This is where we can get Prema (and maybe even Svet, who knows) to help out. The power bricks themselves aren't the problem here. The problem I have with the Polaris chips for mobile is that their specs are so bad for mobile. Look at it this way:

    Compared to 980M, it's either going to perform the same and draw more power and run hotter, or perform much better and run much hotter and draw INSANELY more power, similar to a GTX 980. This means it'll only fit in the highest end of notebooks. Cool? Cool.

    Compared to 1060, it's either going to run cooler, perform similar, and draw almost triple the power, or it's going to be much weaker, much cooler, and draw almost double the power. This huge extra power draw means it NEEDS to be again in the highest end of notebooks.

    There is no place I can think of to fit it. It will not work in a P65x or P67x; the power bricks are too small. It will work in the GT72, GT73, GT83, GT80, P7xxDM, P7xxDM2, and P870DM2/3, since they can all take 330W bricks or higher; however, that's basically buying those top-end machines with a single 1060-class card. It's extremely unlikely. You would have to really want AMD to do so. It just does not fit. You can't sit there and rationally tell me this is incorrect. I'm being completely impartial and viewing this through the eyes of a prospective buyer. Even if the AMD card is cheaper than the 1060, these machines are extremely expensive and powerful otherwise. You wouldn't often see someone putting an RX 480 in a desktop with a 6700K, 16GB of RAM, two SSDs, and an HDD. It's unbalanced to all hell.

    As for nVidrosoft, I never let them get away with their crap. However, I am currently discussing AMD here. This isn't the Pascal thread, where I will bash them and then express annoyance at AMD's seeming inability to compete their way out of a children's softball team.

    Not arguing at all here. But this isn't the focus of this discussion. Also, I'm not going to sit here and say "well, nVidia has the better value for money even though all their cards are vastly overpriced, and they have by and large the strongest cards, the better DX11 drivers, and the better driver profile editing program (nVidia Profile Inspector), but you should buy AMD and sacrifice your performance and other things because nVidrosoft sucks". This isn't a place where we bash nVidrosoft because they're a company competing with Micro$haft for most anti-consumer company on the market and as a result exalt AMD. No. AMD has to work for their exaltation. When people are looking for low end I recommend Sapphire Nitro+ 480s with the 8-pin connector, because that's good stuff, but what else am I supposed to do? If someone comes to me and says they want to game at 144Hz in Battlefield or something, I am GOING to send them to nVidrosoft, because the DX11 inefficiency is some serious cyacacow behaviour, and hitting 144fps constantly is hard enough in that game CPU-wise; I'm not going to tell someone to have a worse experience.

    How should they compete? Let's think. First off, actually properly QC your blasted cards. That 6-pin RX 480 disaster? FORGET. IT. They had Polaris WORKING for months. I think it was close to a year, even. This should never have happened. They screwed up so badly that if it were a woman and not a video card, they'd be the absolute best adult movie star in the universe for all eternity. ALL BECAUSE THEY WANTED TO MARKET IT AS A BUDGET CARD.

    Second, let's get rid of GCN. Stop pouring R&D into GCN revisions. GCN should've been discarded by the time they were preparing to release Hawaii. Hawaii was 2013. They would have been working on Fiji and Tonga that whole time, and that's fine. But they should have been preparing for 2016, right now, with Polaris and the upcoming Vega, to not be on GCN. They should have seen Hawaii was going to be hot, hungry, and generally inefficient, and realized they could only go so far with GCN. I know this is far easier said than done, as you listed below, but it NEEDS doing. Pascal is a straight example of how refining and shrinking an architecture does not grant a bunch of gains (Pascal is in fact SLOWER than Maxwell clock for clock... it's just clocked higher with more cores crammed in) and it runs ridiculously hotter.

    Third, fix your drivers. This needs happening. You asked why focusing on DX12 and Vulkan is a bad thing? Well first, DX12 requires Windows 10, which is garbage for consumers. Second, waiting for DX12 and Vulkan to fix your own driver inefficiency (which won't fix the hundreds of DX11 titles already on the market) is incredibly stupid. You want to discard all the existing titles? Witcher 3, GTA V, Dying Light, Black Ops 3, all these rather incredibly hard-to-run games that could be getting 30% better performance just from driver fixes? I already explained why nVidia loses performance and AMD gains. Their driver efficiency is simply lackluster. When devs start coding optimizations into games directly (i.e. never), then you're gonna see nVidia pull ahead in DX12/Vulkan. Until then? AMD is the only one getting performance boosts JUST because it automagically kills their driver overhead. It's why their cards do much better when you force a GPU bottleneck... there's so much excess CPU power available that their cards essentially operate closer to their proper potential (and even then, not as much as Vulkan/DX12 brings). I'm NEVER backing down from this point. Using the existence of Vulkan and DX12 as a crutch for things to come is incredibly stupid.

    Fourth, fix Crossfire. Produce more profiles more often, and give people the ability to adjust the profiles. Have you ever tried nVidia Profile Inspector? Did you know I can do things like disable SLI for specific games without needing to close everything and have my screen flash to enable/disable SLI system-wide? It's a deeper disable than forcing single GPU. Did you know I can change the actual bits used to form a SLI profile? Did you know I can force other games' specific profiles onto new titles without needing a driver to specifically launch with a profile? Did you know that sometimes when nVidia's profiles are lackluster, I can find better ones online? Like this one with Black Ops 3, where the Battleforge bits provide about a 25% boost to performance over the default BO3 bits? AMD can't do that, in addition to having less frequent profiles. AND CROSSFIRE STILL DOES NOT FLIPPING WORK IN WINDOWED MODES. I don't care if scaling is better. It's useless to me, and to most people who multitask and want more than one GPU. It's useless to pretty much any streamer worth their salt, because alt-tabbing is something that needs doing a lot, and the good ones won't run fullscreen unless they have to. No. I'm not accepting this. The feature is nigh useless right now, considering the RIDICULOUS drop in multi-GPU support already happening. Two AMD cards might as well be one AMD card and one waste of money right now, and this is coming from someone who still loves multiple GPUs.

    Fifth, discard Raptr and any ties to Raptr for AMD tech like recording via VCE. That's just barriers to usage at this point... enable some functionality in your driver.

    Sixth, stop focusing on tech that isn't here yet or even nearly here, like H.265 recording efficiency when H.264 is left in the dust (and H.265 isn't even accepted by most sites like YouTube as of yet). AMD is focusing on tech that'll matter YEARS from now, but by the time the tech makes a difference the cards will be extremely obsolete. It doesn't matter RIGHT NOW that they are good for tech coming 3-5 years from now. If they can't fix their drivers and optimizations, then it makes no sense filling the card with barely-working tech. You need to focus on selling cards NOW. Not selling current cards expecting people to buy them so they work better than nVidia cards 3-4 years from now (when someone is likely to buy a new card anyway).

    I wasn't talking about people that just look at numbers. I was talking about people who have some know-how and are willing to buy a somewhat weakish GPU and overclock it until the cows come home. Why do you think the 750Ti was so popular, and why do you think nVidia started artificially limiting the card's overclock potential to +300MHz, so that a custom vBIOS was needed to unlock it once again? Because 760 sales plummeted when people figured out the Maxwell card was half the price and could match it in performance. Even here, people bring up the point that an 860M OC'd matched 780M performance in Firestrike for literally much, much less $$ and ran cooler. AMD's inability to OC even 5% without issues is a real problem, especially for their lower end, "budget" stuff.

    That's the point. It won't be a 1070, but it will almost certainly need a notebook with a larger power brick and will run much worse. If it was made for the P7xxDM series and the GT73/GT83/GT72 series and was priced well cheaper than the 1060? Then maybe you have something. But even then, it'll be extremely niche. I can guarantee you it won't be showing up in any ASUS or Alienware laptops, far less the superthin sellers like Razer/Gigabyte/etc. NOBODY wants a thin laptop with a massive power brick. NOBODY. I've even seen people complain about the P65x power brick being large. Far less one of the 240W-300W bricks. You're considering nothing but power draw and performance... you need to consider everything.

    AMD supposedly turned a profit last quarter. So they're not, as of right now, bleeding. But even if they have enough money to survive some years, they should go all-in and bring something TRULY amazing; instead they're basically lifeline-ing themselves, playing way too safe, and generally not trying.

    No, I'm going to swing my axe as much as I want. I find nVidrosoft FAR worse as a company, but they deliver performance. AMD just proved that they're willing to go to the SAME lengths to uphold a front when they know they have badly designed hardware. They're lucky that unlike the 970's vRAM issue, the RX 480 issue can be fixed by AIB partners... but it's the same principle, and them saying drivers fixed it is just straight-up bull. You don't driver-optimize 15W-42W away. It doesn't happen.

    AMD needs to want to fight. When they want to fight, I'll help. This is not them wanting to fight. This is them just... I don't even know what. But it ain't wanting to fight.

    When nVidrosoft is the topic, or when Micro$haft is the topic, I will bash away till INFINITY. But here is AMD's topic. I will criticize them as I see fit. It's their own fault. There is a certain leeway I will give to them because their situation is not rolling in the dough like the other two teams, but that leeway is not infinite.
     
  25. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Me too; it seems that we can disagree on some points to infinity :) I'll just correct you on this:

    I need it now, but HDR will need it as well, and HDR will be a thing in a year or two tops. In this year's TV set line-up, HDR is all the rage. It is based on Rec.2020 (10-bit minimum, 12-bit recommended) and requires at least 1000 nits. So again, it is for me right now (the 10-bit part; my display is 210 nits, so not HDR spec), but also for everyone else in the near future. Actually, right about now if you already have an HDR TV set and want to hook up your laptop to it.
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    D2 Ultima...
    Here's something that might interest you:
    http://www.forbes.com/sites/jasonev...x-1060-versus-amd-radeon-rx-480/#2b6a339a4166

    Look at the power draw of 1060 and 480 from the wall.

    Now keep in mind that this is with stock 480.
    When undervolted, the 480's power consumption drops by about 30 to 35W... on core undervolt alone (this also doesn't take into account that the VRAM on the 480 can be undervolted too, which apparently drops power consumption even further).

    So, yes, it is not far-fetched to think that if they can 'easily' slap a '120W' 1060 into a laptop, then an undervolted 480 would actually rate at about 110W at most - making it more efficient at base clock speed, while also providing superior performance in DX12 and of course far superior performance in Vulkan.
    Yes, DX11 performance would still be lower or relatively close (depending on the game), but the differences would not be noticeable to most players, as both output more than high enough frame rates at 'High' settings in all games for smooth gameplay... and as time goes by, the 480 would probably receive further performance boosts and improvements.

    AMD drivers improved quite a lot compared to before. Right now some people even dare say that they offer BETTER software support than Nvidia does - I don't know how factual that is, but AMD certainly stepped up their game in the software department.

    Now we have to wait and see if AMD scores any design wins with OEM's and what kind of designs will they be.
    Even if they get inferior cooling and ramped up voltages, we could still individually undervolt them (unless OEM's again sabotage AMD and create such an inferior cooling system that will make it impossible to compensate with undervolting... or they will just bring the voltages down and degrade the cooling at the same time, making it seem like AMD is heating up more).

    AMD can do very little if anything to directly influence OEM's.
    My point is that given what we know of 480 and its power draw, it CAN be placed into a laptop if properly undervolted - which would effectively make it slightly better than 1060 in terms of power draw.

    For 1070 and 1080 in mobile factors... you're moving into the TDP range that AMD has no new GPU's in to begin with.

    But yes, I will also call you out by saying you can be very biased if you think that slapping 200W Nvidia GPU into a laptop is a piece of cake for OEM's all the while claiming how hard it is to cram 120W AMD GPU of previous generation into a laptop and give it proper cooling and power brick.

    And even if you think that undervolting cannot help AMD... I think you are sorely mistaken.
    The differences between the 1060 and 480 are not big in terms of power draw. Nvidia merely optimized its voltages better out of the door (hence the much lower volume of GPUs as a result), and AMD only needs to undervolt to lower levels without sacrificing performance (actually, increasing it at the same time) to get into the same if not better TDP range.

    Also, when properly undervolted, the 480 is superior to the 1060 in performance per watt.
    The 1060 also operates at higher clocks, and when the 480 is undervolted, its performance goes UP by as much as 5% in some games (due to it being able to maintain boost clocks).

    So, 480 offers mostly the same performance as 1060 while smashing it completely in Vulkan at lower clocks and lower power draw (when undervolted).
    So yes, it is more than doable to get a full blown 480 into a mobile form, and for now, that would be enough (at least for me).

    I wonder just how high OEMs will go in terms of GPU TDP before deciding what the cutoff point will be.
     
    Last edited: Aug 11, 2016
    triturbo likes this.
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Nope, didn't interest me in the slightest. Their test isn't comparing a 1060 vs a 480. Their test is entry-level gaming vs very high-end gaming, and has other bad practices.

    Let's ignore wall draw and consider card draw. Wall draw is inefficient. It has to be converted. The amount of power those things are actually drawing (the real thing we should be concerned about for laptops) is the actual draw, which you either check on the cards themselves or read with certain software (like how ThrottleStop correctly reports total CPU TDP for Intel chips). I know more than one program should be capable of reading actual card power draw these days. Also, at a GPU bottleneck the CPU won't max itself and will pull less power, possibly meaning that the card is making up more of the drawn power.
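    The wall-draw vs card-draw distinction above comes down to PSU conversion loss; a minimal sketch, where the ~90% supply efficiency is an illustrative assumption rather than a measured figure:

```python
# Convert AC wall draw to the DC power the components actually pull.
# The 0.90 efficiency figure is an assumption for illustration only.

def dc_draw_from_wall(wall_watts: float, psu_efficiency: float = 0.90) -> float:
    """DC power delivered = AC wall draw x PSU efficiency."""
    return wall_watts * psu_efficiency

print(dc_draw_from_wall(300))  # 300W at the wall -> 270.0W of actual draw
```

    So a rig reading 300W at the wall is really pulling roughly 270W of DC power, which is why wall figures overstate what the components themselves draw.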

    Secondly, you're telling me that undervolting the core, undervolting the memory, and keeping the clocks the same will keep that card 100% rock-solid stable at full desktop performance at 110W, when it normally draws 192W to hold its boost clocks at stock speeds in a VIDEO GAME (not even Furmark or any such program)? I just want you to understand what you're saying right now, and why it won't ever work. You're expecting a card that pulls so much power that undervolting it makes it boost more, PURELY because it can make the power it has stretch further, to reduce its actual power consumption by simple undervolting? And not by a small amount, either, but by 43%?
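    That 43% figure is just arithmetic on the wattages already quoted in this exchange; a quick sanity check:

```python
# Sanity-check the size of the claimed undervolt: ~192W observed board draw
# at stock boost clocks (figure quoted in the thread) down to a claimed ~110W.

stock_draw = 192    # W, quoted draw while holding boost clocks at stock
claimed_draw = 110  # W, the undervolt-only target under discussion

cut = stock_draw - claimed_draw           # 82 W
cut_pct = 100 * cut / stock_draw          # ~42.7%
print(f"{cut} W cut = {cut_pct:.1f}% of stock draw")
```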

    Yeah, no. Unlike what you expect from the 480, the 1060 is downclocked some. It's only a small bit, and it kicks it out of the power bracket of the 980, but it plus an assumed undervolt is enough to drop its power draw some. The key there is the slight speed drop... which the 480 cannot afford. As for that link you posted: it checks a bunch of DX12 games and almost no DX11 games, and benchmarks don't really count. Shadow of Mordor isn't new nor a particularly demanding title, and Far Cry Primal was almost completely CPU-limited, I am certain. GTA V is the only other tested title. I've seen many more tests, though, not from that article, and it generally is quite a bit worse, so I don't like that small subsection of games they tested. They should've made a much larger, non-DX12 list than that.

    As for this "DX12 & Vulkan" performance, this is irrelevant for the most part. Most games aren't using it, and it doesn't affect the VAST majority of existing or upcoming titles. 2 or 3 years from now, this may be more relevant. But right now, I see everybody jumping on it and it isn't making any sense. By the time Vulkan shows up for many titles, these cards will be badly obsolete. Anybody buying or trying to push cards right now as being "good in DX12/Vulkan" is grasping at straws to sell a card with. And this goes for anyone, pushing any card, from either red or green teams.

    Oh they definitely have. But they're still extremely inefficient. And I don't know what this "better software support" is. Their VCE is garbage unless using H.265 which isn't supported by any major website, they have no equivalent to nVidia Profile Inspector for tweaking games, and as far as I know access to some of their tech is STILL tied to Raptr. As much as nVidia has been churning out garbage for drivers for the last year and a half, I can't sit here and say any more than "AMD is probably more bug-free", which doesn't mean too much since there are a few golden drivers accepted as being mostly bug free from nVidia that people can use.

    Designs? What are you talking about? The laptops are already designed. They simply need to lay out a heatsink and produce a card spec and price for the OEMs. Whatever this "OEMs sabotage AMD and create inferior cooling systems" you're talking about is, it makes no sense. Pascal is about twice as hot as Polaris. Anything cooling a 1060 can more than cool a 480. This is out of the question, and I never said heat would be a problem for AMD (in fact I said the opposite).

    No, your point is invalid. You didn't understand a single thing I said. You want the full performance of the 480. That isn't arriving in a 110W package. You don't cut 82W with a core and memory undervolt alone; you'd need an underclock to accompany it, pushing it well below the competing mobile 1060's performance range and into 980M performance range. And if it hits 980M performance range, it loses. It won't fit in the super small machines that can house 980Ms but can't house Pascal, because it is too hot. Sure, it's cooler than Pascal, but it's nowhere near as cool as Maxwell. And if you keep its performance, its power draw is going to be too large to fit machines the size of a new-gen P65xRx. Why? Because the power brick needs to be larger.

    The heat ain't the problem. The performance won't be the problem. The power brick size WILL be the problem. Almost nobody is going to buy a laptop of the size that holds a good 240W or larger brick (Alienware's new bricks do not count) to get a GPU that ballparks a mobile 1060 they can get in another, smaller machine with a smaller power brick. Because the market in general wants things as thin and as light as possible, and this INCLUDES the relative size of laptop power bricks to said laptops. Why buy a P750DM2 with an RX M480 (which costs more, because it's a P750DM2) over a P65xRx which holds a 1060? The latter is smaller, lighter, has a MUX switch, and a smaller power brick. Tell me, why would someone buy it? If you can't find a reason (and I can't find one) then it won't work.

    If Polaris had launched in Maxwell's era, things would be different. They could shove an almost-980 at 150W TDP into something like a GT72 and make BANK, because it'd be about $400 less than the actual 980. Driver? Performance optimizations? All that be DAMNED, that's $400 they're saving. Now they have to compete with the entry-level gaming card, with a card that would draw twice as much power, forget heat/performance. It doesn't WORK. It doesn't make SENSE.

    As for the mobile 1070, it apparently draws less than a 980M. It needs more cooling, but it's not going to need a larger power brick than anyone with a 980M had. This means it's A-OK for the new-generation P65xRx, let's say, should they be capable of cooling it. And if you're considering the larger laptops with large power bricks where power draw be damned, AMD simply has nothing to compete with. If nVidia made M cards, and neutered them like Maxwell was neutered, then we'd probably see something else happening. But no, they actually did something quarter-way competent this time, so AMD has no moves. But the way I see it, they walked RIGHT into their position and have little but themselves to blame for it. Everybody else simply moved around them, happily.

    I already discounted this sentence in this post. I'm not biased in the slightest.

    I think you're absolutely deluded and an AMD fanboy at this point. The differences between the 1060 and 480 in terms of power draw are gigantic, once you let both cards draw unlocked power limits. And without drawing unlocked power limits, there is TDP throttle. And you said it yourself in AMD's case: undervolting allows the card to perform better BECAUSE IT'S TDP LIMITED.

    The AMD card pulled 192W at stock once the power limit was removed. The GTX 1080, as tested by Dufus, who checked the card's actual power draw, did not go above 180W. This means that at stock settings on reference-class cards with TDP limits removed, the RX 480 DRAWS MORE POWER THAN A GTX 1080, far less a 1060. Get this into your head. Polaris is nowhere NEAR the efficiency of Pascal. And this is its problem: the power bricks, and why OEMs and ODMs will not shove them in laptops that could otherwise hold them.

    The P65xRx has a 200W brick maximum. Undervolt a 480 and undervolt its memory and let's say you manage 150W, or even 140W. That 200W brick ain't supporting it. But the mobile 1060 at somewhere around 80W will. So you underclock the 480 some. Now it's dropped to 110W, so it fits in the P65xRx, but it's now only as strong as a 980M, and overclocking will exponentially increase power draw, so the card's potential is reduced to a non-overclockable 980M. For about the same price as a mobile 1060, unless AMD takes a huge price cut on the card and prices it closer to its desktop counterpart. Which I am certain they won't do, because they never have.
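    The brick-budget argument above can be put as back-of-envelope arithmetic; only the 200W brick and the ~80W mobile 1060 figures come from the post, while the CPU and platform draws are illustrative assumptions:

```python
# Rough power budget for a P65x-class chassis with a 200W brick.
# CPU and platform draws are rough assumptions for illustration only.

BRICK_W = 200      # W, maximum brick the chassis is said to take
CPU_W = 45         # W, mobile quad-core under load (assumption)
PLATFORM_W = 30    # W, display/drives/fans/conversion losses (assumption)

gpu_budget = BRICK_W - CPU_W - PLATFORM_W
print(f"GPU budget: {gpu_budget} W")  # GPU budget: 125 W

for name, draw in [("mobile 1060 (~80W)", 80), ("undervolted 480 (~140W)", 140)]:
    print(f"{name}: {'fits' if draw <= gpu_budget else 'does not fit'}")
```

    Under these assumptions an ~80W GPU fits with headroom while a ~140W one overruns the brick, which is the shape of the objection being made.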

    So take your pick here:
    980M-level performance, potentially put in most entry-level gaming or higher laptops. Overclocking will not happen. Must be extremely cheap.
    Mobile 1060-level performance, put in only large laptops that have large power bricks, which cost more than most other laptops because of their beefier cooling system and whatnot. No gsync or freesync tech. Limited overclocking, if at all (possibly requires power connectors). Also needs to be rather cheap, but not as much as above.

    Which one do you think is going to happen? Or make AMD much money?

    See above.

    See above.

    See above.

    Statement makes no sense to the argument.

    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    I get that you love AMD, and you want to see AMD succeed, but you need to face some facts. AMD is not capable of working magic here. They're not going to suddenly pull this one straight out of their meowmix. If this were last gen, an M480 would have done swimmingly well. But they launched Polaris too late. They had working samples being demoed for almost a year before launching them this year. They could have launched them and made a killing, but they took too long, and still blundered their launch, and then tainted their good name and the faith people had in them with how they handled their blunder and attempted to claim nothing was wrong. I followed the ENTIRE scandal, laughing and facepalming the whole way, from the time reviewers broke the story straight up until the subreddit banned and deleted the megathread about all the issues. I had people coming up to me telling me they "fixed" it with drivers, and I had to point out that all they did was neuter the card's performance in their attempt to "fix" a broken design; I went around informing everyone that if they wanted one of those cards, they had best grab one with an 8-pin power connector or not at all.

    AMD right now has nothing to compete with. The M295X was, give or take, the level of a 970M, but it again couldn't overclock and cost more. I saw it in Alienware models (the ones that had the power bug) and I never saw them get chosen much, because of that price and the fact that Enduro is MUCH more of a pain than Optimus is. It's one thing if their flagship competes with nVidia's flagship and costs less. That gets them sales. Lots of sales. The amount of 7970M sales I saw over 680M back in the day was huge. And then half of those people groaned a year later when the cards started dying, ironically. But now they have something that doesn't compete with the top end, but the low end, and draws much more power, and is not likely to be exceedingly cheaper. They are in no position to fight the "normal" way because they have nothing that can. If you want to keep believing that 82W of TDP for full boost and performance from the card is going to go away by undervolting core and memory without ANY performance downsides, you can keep dreaming. Your example is the 7970M vs 7870, but that had a 15% performance cut right off the bat. The M480 could not afford to just "lose" 15% performance or more like that and remain competitive. The 1060 generally beats it in most games and benches I can find, and that ONLY has a 7% clockspeed drop.
     
    Shadow God and s19 like this.
  28. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    Is it... is it a Babylon of text? :rolleyes:
    Because "wall" doesn't feel like it can represent all of it.
     
    Last edited: Aug 12, 2016
    D2 Ultima, Ashtrix and TomJGX like this.
  29. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    He does make sense, but you probably don't pay attention to mid and low-end machines. Also single-RAM-slot (yeah, a single SO-DIMM and nothing soldered) APU machines, while it is well known that APUs greatly benefit from dual channel. Also, something for your unwilling-to-admit list: when I started to talk some sense about GameSux, you said that this is not a discussion about nGREEDIA. It's not, but it is about gaming GPUs, and some games are obviously affected by them and, as a "side effect", the GPUs in question as well.
     
    Last edited: Aug 12, 2016
  30. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Speaking of OEMs crippling AMD:
    http://www.anandtech.com/show/10000/who-controls-user-experience-amd-carrizo-thoroughly-tested

    Also, I'm no fanboy of AMD.
    Look at my signature and see the kind of hardware I use.

    AMD ramping up voltages, which results in high power consumption (but also a higher volume of GPUs out the door), is well documented, as is the premise that undervolting the core drops power consumption by 30 to 35W on the 480 (which actually brings the GPU in line with the 1060 in terms of efficiency).

    If you want to ignore that, D2 Ultima, that's your business, but facts are facts, and they don't go away by ignoring them - if anyone might be displaying signs of fanboyism and favoritism, it would be you.
    I'm just saying what AMD might be able to pull off if they get their heads together. OEMs, though, are a different story.
     
    Last edited: Aug 12, 2016
  31. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Because bringing yesterday's performance with yesterday's power consumption is interesting.......

    AMD lost, period.
     
  32. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Well then, so did the consumers. Have fun.
     
    TomJGX and ghegde like this.
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    How is Nvidia better again?
    They are doing practically the same thing as AMD... and both are consuming less power than previous generations while delivering the same performance.

    Plus, are we denying that at 1.28 GHz Polaris produces basically the same performance as a higher-clocked Pascal, and overshoots it in some areas when proper APIs are used?
    Shall I also mention Polaris' compute capabilities, or is that yet another 'let's ignore' factor?
    Oh yes, I forgot, compute is worthless to everyone apparently (except for those who actually use it), as GPUs are only meant to be compared on their gaming capabilities and nothing else.

    I'm not doing anything except pointing out conveniently thrown-out facts about AMD.
    Yes, Nvidia has done better by putting out GPUs right out of the door with better control over their voltages... but they also haven't done any real groundbreaking stuff, because all they did was shrink Maxwell and rebrand it as Pascal (which clock for clock seems slower than Maxwell, in order to hit higher frequencies).
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    ... y'all are delusional. There's no point saying anything that isn't "we should all buy AMD!" to y'all. I could make an hour-long video with graphs and bullet points compiling information from all over the internet, and you would STILL tell me I was wrong.

    You know what, y'all buy AMD. The R9 M460s are out, go grab 'em. I'm not going to make an excuse for AMD playing the lifeline strategy and refusing to push away from GCN when they knew it had serious limits for power draw and heat after Hawaii. I cannot STAND nVidia, but that doesn't mean I'll give AMD 10 miles. I gave them one. Not an inch, but a whole mile. And they want 10. I'm not giving it to them. They aren't even COMPETING. As far as I see it, AMD is targeting a desktop market nVidia is currently ignoring. And they're targeting a laptop market nVidia already dominates with the 960M. And that does NOTHING for people like myself, or anybody wanting anything other than an entry-level gaming card from either market.
     
  35. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    And doing a far better job than AMD while targeting every segment of the market.

    No, but comparing clockspeed between 2 different archs built on 2 different processes is utterly meaningless. Pascal is Maxwell tuned for clocks, and Polaris is an aging architecture ported to Samsung's 14nm FinFET, which really isn't all that good to begin with. It really doesn't matter how high you push the clock until you start hitting the 3-4GHz mark. I don't see the huge problem in Pascal being Maxwell tuned for clocks. The performance is still there, right?

    What is there to compare about Polaris's compute ability? Because 1/16 FP64 is impressive??? People who care very much about DP compute will be on K80/P100/Volta Teslas.
     
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    hmscott likes this.
  37. CaerCadarn

    CaerCadarn Notebook Deity

    Reputations:
    320
    Messages:
    1,169
    Likes Received:
    1,124
    Trophy Points:
    181
    hmscott and D2 Ultima like this.
  38. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    hmscott and triturbo like this.
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah, and if you read anything I said in the last few pages, I said that the RX 480 needs 8-pin power connectors (you know, as was said in that same thread?) to work correctly.

    I *AM* bashing AMD on this. Because they need to be bashed, and people here need to stop treating AMD like they can do no wrong, or like any screwup can be spun into something good.

    No fanboyism. I praise good achievements and I bash screwups. I am ESPECIALLY HARD on screwups that they should have absolutely known would happen, like the 970, and now the RX 480. And if you think I'm ridiculously hard on THAT, I'm even *MORE* EXTREMELY ESPECIALLY HARD on screwups where, after the fact, they try to deny it or insist it means nothing. You know, like the 970? And now the RX 480? Where damage control said nothing was wrong? Yeah. No.

    Like I said above: I am NOT giving AMD 10 miles after already giving them 1 mile, far less 1 inch. The saying is "give an inch, take a mile"? Yeah. I gave them a mile to begin with and they want to take 10.
     
  40. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
  41. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Actually, you can't assume *all* AIB RX 480s are compliant; you need to verify it on a card-by-card, vendor/model basis.

    That's the real problem. AMD losing PCI-E compliance certification means all the AIB cards are in an uncertain compliance realm.
     
    i_pk_pjers_i likes this.
  42. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Maybe we'll get an RX 460 in MXM form. That would be a nice start. Next an RX 480, please ;)

    Sent from my SM-A500FU using Tapatalk
     
    triturbo and i_pk_pjers_i like this.
  43. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    AMD losing PCIe compliance on reference cards means many pre-builds will stay away from it. That market isn't large to begin with though.
     
    D2 Ultima, hmscott and i_pk_pjers_i like this.
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Pre-builds already stay away from them. I could only find R9 380/380X cards in many pre-builds around October last year, with some more offering 390/390X cards (at $100 more than 970s/980s, mind, somehow) and VERY RARELY R9 Fury X (at $100 more than 980 Tis, once again somehow) and Fury Nano cards...

    Please note, the $550 USD R9 Fury cards, 20% stronger than a 980 and costing the same as a 980, are *NEVER* sold in pre-built machines. I simply could NOT find them, anywhere, in the US from prebuilt PC companies.

    RX 480 will simply not even exist in pre-builts at all.
     
  45. ryanev84

    ryanev84 Notebook Consultant

    Reputations:
    10
    Messages:
    252
    Likes Received:
    65
    Trophy Points:
    41
    I wonder why cards haven't caught up to games. My old GeForce 6800 Go ran Half-Life 2 at 130 FPS. The 6990M was the best on offer at the time, but I never saw new games hit 130 FPS on it. Now it's 2016 and Nvidia has the 1080 out, and it won't max much at 130 FPS. And has yet to offer
     
  46. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    So losing compliance doesn't really matter in the end when you don't have sales in the first place. :p
     
  47. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Very funny, right?

    Cheaper GPUs costing more, and even better propositions non-existent. Now go ahead and tell us again that we are delusional and that the industry is not sucking nGREEDIA's @$$. AMD is not entirely to blame, unlike what you love to say.
     
    TomJGX likes this.
  48. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You know what, I'm convinced you don't actually read my posts. I've done more than my fair share of bashing nVidrosoft in this thread, even though I've said this is NOT the place for it, since the topic is AMD.

    AMD is asking for the crap they get. I never said the industry was fair to them, but they are asking for it. If I saw AMD actually fighting and trying, I'd change my criticisms and stop bashing them. But unlike your flaming nVidia-hating self, I refuse to believe that AMD is barely at fault. There were a hundred things they could have done to prevent all the heat I've been giving them here, and I listed some of them already.
     
  49. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I can say the same, so it's a two-way street; that's why I stopped earlier. You read what you like and ignore the rest.

    I never said that "AMD is barely at fault"; that's your short-view-distance, reality-bending vision that says so. I literally said "AMD is not entirely to blame" and was eager to see your response. You didn't disappoint. You can go ahead and double-check the post above yours. I would never say that AMD is not to blame, because sadly that is not the case, and I have posted a few times why the lack of MSI machines with AMD hardware might be partly on AMD. In the meantime, nGREEDIA gets a pass on an awful lot of things; that's what gets me, and I don't give a damn how I look, I'll bash nGREEDIA to hell and back, since I'm among the very few around here who do.

    I don't see you go hard on nGREEDIA about numerous things (some pointed out in this very topic and nowhere else in the forum), but you nit-pick EVERY SINGLE AMD mishap. So don't try to convey your neutrality; I know there isn't one. At least I don't pretend: ever since GameSux happened, I've said that I'm AMD biased and won't even consider an nGREEDIA product, and I stand behind it. Right now, there's nothing nGREEDIA has to be praised about, really. If you had a broader view, you would've seen it as well.

    Not fighting? Pffft! How do you fight bribing?! Like the grIntel settlement that came a little too late, with the "coincidental" downhill slide of AMD CPUs. Thank the EU for that; if it wasn't for them, all would've been "good". Even then, that settlement wasn't even remotely enough to fix what was done (after all, it was an EU settlement, while the world is a bit broader than that). Now we are getting the same thing, only with GPUs.
     
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Go search my name and "nVidia" on T|I and here. I'm sure you'll find some things.
     
← Previous pageNext page →