The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Mobile Polaris Discussion

    Discussion in 'Gaming (Software and Graphics Cards)' started by moviemarketing, Jan 4, 2016.

  1. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    What are your thoughts on AMD's upcoming mobile graphics cards? I was a bit surprised to learn from their recent presentation that they haven't given up completely on mobile gaming hardware.

    [slides from AMD's Polaris presentation]

    Apparently 'Polaris' is not quite comparable to the traditional family of GPUs we have seen in the past - instead it is an umbrella term that covers disparate hardware, including both GDDR5 and HBM parts:

    As for the mobile gaming cards, it seems their goal is 'console caliber performance in a thin and light notebook':

     
    hmscott, triturbo and Cloudfire like this.
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    "Console-caliber performance in a thin and light notebook"
    OK, I already dislike your statement that "console-caliber" is acceptable. Not that it isn't already achievable with 960M cards.

    That being said, none of this means anything if the specs themselves suck, or if it runs hot like all other GCN cards do.

    I sincerely hope great things come from this.
     
  3. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Let's wait and see :)
     
  4. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    I'm definitely exceeding console-caliber graphics with this relatively weak 960m.
     
  5. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    AMD's statement, not mine! :p

    There have been some rumors, however, that they will update the Xbox with a Polaris card.
     
    Ashtrix and i_pk_pjers_i like this.
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The GPU they tested against the GTX 950 is def perfect for mobile.

    Load voltage is 0.83V, which is even lower than the idle voltage of today's GPUs. The system consumed about 85W, so I'm guessing that's a 30W GPU performing like a 90W Maxwell, which is very impressive.

    It also seems like low/midrange Polaris will be 14nm, since that process has a target voltage of 0.8V. Beefier GPUs need more voltage, so like Anandtech says, AMD will use both 16nm and 14nm this time. Perhaps laptops will only get 14nm. It's also confirmed that AMD will use both GDDR5 and HBM for their Polaris GPUs. So like I thought earlier, no HBM for mobile (it's tricky with current MXM specifications).

    AMD is ready for notebooks once again, so yay. The one-way Nvidia street of the last several years has been booooring :)
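
    A quick back-of-the-envelope sketch of that guess in Python (the ~55 W rest-of-system figure and the 90 W Maxwell board power are assumptions for illustration, not numbers from AMD's demo):

        # Rough check of the "30 W GPU performing like a 90 W Maxwell" estimate.
        SYSTEM_DRAW_W = 85        # wall draw of the Polaris demo system (from the post)
        REST_OF_SYSTEM_W = 55     # assumed CPU/board/drive overhead (hypothetical)
        MAXWELL_GTX950_W = 90     # assumed board power of a desktop GTX 950

        polaris_gpu_w = SYSTEM_DRAW_W - REST_OF_SYSTEM_W        # ~30 W
        perf_per_watt_gain = MAXWELL_GTX950_W / polaris_gpu_w   # ~3x at equal fps
        print(f"Estimated Polaris GPU draw: {polaris_gpu_w} W")
        print(f"Implied perf/W advantage: ~{perf_per_watt_gain:.1f}x")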
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know it's theirs.

    That does nothing; the performance will still equate to that of its current piss-poor card.
     
    i_pk_pjers_i likes this.
  8. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Will it MXM-B? That's all I care about.
     
  9. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Just looking at notebookcheck's list of GPUs, that puts this specific Polaris GPU between the GTX 965M and GTX 880M.
    For a mid-range mobile GPU, that looks pretty good. Better than their R9 M385X, that's for sure, but I hope AMD doesn't call this test GPU the R9 M480X.
     
    James D, CaerCadarn, TomJGX and 2 others like this.
  10. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Seems we will see the first Polaris laptops before the end of Q3:

    (Google translation) from a Lisa Su interview at http://www.computerbase.de/2016-01/amd-ceo-lisa-su-interview/

     
    TomJGX likes this.
  11. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    What makes you so sure HBM is tricky for MXM?
    GDDR5 might be reserved for lower/mid-range GPUs, whereas HBM2 could be reserved for the higher end.
    I don't see how MXM would be incompatible with HBM2 specifically. The main thing that would likely be affected is the cooling design that would have to accommodate HBM2 GPUs.

    And besides, it's possible that the MXM interface could undergo changes as well.

    Also, AMD was the one that introduced HBM cards in the desktop space first... it's possible it might do the same again for HBM2 in mobile (alas, nothing has been stated on this front, and you could quite likely be right that mobile parts might not use HBM at all).

    Performance-wise, laptops wouldn't necessarily need HBM, but given the size constraints in laptops, HBM would reduce space requirements a lot, and this could be an incentive for OEMs to modify cooling.
     
    TomJGX likes this.
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I believe @Meaker has said on multiple occasions that if you rotate the HBM memory a bit, it could fit on MXM without issue, so I won't count it out just yet. However, I doubt anything but their highest-end model would really use it, and we still have no use for the extra memory bandwidth in games.
     
  13. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    We may see near-Fury-level performance (if not more, considering the architectural improvements in Polaris meant to remove the bottlenecks in Fiji) with mobile chips this year, at which point bandwidth will become an issue.

    As long as AMD having 3 different GPUs this year is true, the middle one will most likely be the top mobile offering (exactly what Pitcairn was when GCN launched), which means they don't have to do anything crazy and should just use two 4-Hi HBM2 stacks with 512 GB/s bandwidth and 8 GB total memory.

    Let's say a ~250mm^2 GPU + 2 HBM2 stacks (each should have surface area <= HBM1) on an interposer. That would be much smaller than Fiji, due to the GPU size and the HBM modules sitting on only one side, which also means a less complex interposer (fewer stacks) and fewer pins, so maybe this would actually work on an MXM module. The module itself could be smaller too, because the only other things necessary are voltage regulation and other power-delivery components.

    Not sure what will happen with mobile Pascal, considering that nVidia is only supposed to use HBM on their highest-end model (GP100? or does that arrive next year?), plus the issues with the Drive PX2 module and it only having 8 TFlops of SP compute performance at 250W TDP (that's just R9 Nano level), so 4 TFlops per GPU in the best case (if they didn't count the Tegras). None of this speaks confidence; it looks like the efficiency hit from putting back all the compute hardware they had stripped out of Maxwell is hurting them.
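
    The HBM2 figures in that post line up with the JEDEC spec (1024-bit interface per stack at up to 2 Gbps/pin, 8 Gb dies); a minimal sanity check:

        # Bandwidth and capacity of 2 x 4-Hi HBM2 stacks, per the JEDEC HBM2 spec.
        PINS_PER_STACK = 1024     # bus width per stack (bits)
        GBPS_PER_PIN = 2.0        # max data rate per pin
        DIES_PER_STACK = 4        # "4-Hi" stack
        GBIT_PER_DIE = 8          # 8 Gb dies
        STACKS = 2

        bandwidth_gbs = STACKS * PINS_PER_STACK * GBPS_PER_PIN / 8   # 512 GB/s
        capacity_gb = STACKS * DIES_PER_STACK * GBIT_PER_DIE / 8     # 8 GB
        print(f"{bandwidth_gbs:.0f} GB/s, {capacity_gb:.0f} GB")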
     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I just hate all this speculation at this point. I just want to see PRODUCT from either Nvidia or AMD.
     
  15. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Dude, you've got a 980ti, just chillax...
     
  16. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Because MXM defines pretty much everything, including the cooling interface (the contact between the GPU module and the heat-sink; how the heat is transferred and radiated is up to the manufacturer).

    It won't. If you get your hands on an MXM module and physically measure it, you'll see that rotating only buys you room for a chip a couple of mm larger than if it were placed straight, touching each screw hole. This was the case with my Quadro 5000M (GTX 480M), which I used for the measurements, and as far as I know, that is the largest square chip put on an MXM module to date... and the Fury chip is larger than that. That's the current generation. Now, as @Link4 said, if we take into account the new process and hope that they'll use HBM2 for the top-tier chip (490) and (this is where the real hoping is) HBM1 for the runner-up (480), we might get an HBM MXM module after all. Of course, if the 480 ends up being a rebranded Fury, there go our (or at least my) hopes.
     
    i_pk_pjers_i likes this.
  17. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,705
    Trophy Points:
    431
    If Apple doesn't completely abandon discrete GPUs, I'd be willing to bet we will see some version of these in subsequent generations of the MacBook Pro, and probably the iMac as well.
     
  18. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    I read something about Apple switching to AMD cards soon; I don't recall whether it was a rumor or an AMD release.
     
    Last edited: Jan 13, 2016
  19. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,705
    Trophy Points:
    431
    They already have. The current top-line 15-inch MacBook Pro Retina, which was launched last summer, has an R9 M370X.
     
  20. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Sure, everyone wants to see a product, but since there are still 2 quarters before anything launches, speculation is where all the fun is.
     
    hmscott likes this.
  21. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    And speaking of HBM2, interesting stuff just popped up on Videocardz.
    http://videocardz.com/58127/jedec-updates-hbm2-specifications

    Sadly no indication of die size.

    Edit: Straight from the SK Hynix website: [SK Hynix slide showing HBM2 target applications]

    So apparently HBM is coming to consoles and maybe even notebooks (I know it only says PC, but there is an image of a laptop at least!).
     
    Last edited: Jan 13, 2016
    triturbo, moviemarketing and hmscott like this.
  22. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    If we do get a top MXM card from AMD, I will be the FIRST to test it! Can't wait!!!!! Fingers crossed for a top-end mobile MXM-B card with HBM.
     
  23. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    It seems a huge step forward; it reminds me of 3dfx's Rampage chip back in 2000. I hope AMD manages to come out with it and stays in business.

    Besides, which mobile GPU would be the equivalent of a GeForce 950?
     
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    At stock, a 965M give or take.

    Hence why I'm not interested in such a low end card. It might be great for the ultrabook-type machines, but compared to the rest of Pascal and Arctic Islands it isn't even gaming-class. That's like saying a GTX 550 equivalent from Kepler drawing minimal amounts of power and being put into a notebook was great. No, it's pointless. Nobody cares about the power of a GTX 550 using less TDP and running cool when you have a 780Ti on the market... that's the stuff that goes into cards like a GT1040M or something. Not even a gaming card.

    No, AMD wants to impress me? Let them shove a 980Ti into a laptop at 100W TDP, have it be overclockable and run cool like a 980M. Then we'll talk.
     
  25. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    980 Ti performance inside of 100W might be possible at 14nm. AMD just needs to show some interest for once. I get the feeling they simply couldn't fund mobile, or were too understaffed to properly commit to it.
     
  26. Hurik

    Hurik Notebook Consultant

    Reputations:
    10
    Messages:
    145
    Likes Received:
    159
    Trophy Points:
    66
    The desktop GPU (not yet fully optimized) consuming only ~30W at load, even if it's an entry-level GTX 950 equivalent, is an incredible improvement for the red team. Moreover, this colossal performance-per-watt improvement comes only from the node shrink and does not include HBM memory, so I agree with sniffin that making a 100W card with at least desktop 980 performance (ideally 980 Ti) shouldn't be too difficult. It would be very VERY disappointing if AMD doesn't make use of this efficiency for the mobile segment.

    Meanwhile, notebookcheck has also posted an article about it:
    http://www.notebookcheck.net/Energy...s-set-for-mid-2016-launch-Video.158182.0.html
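
    A naive way to frame that claim, assuming performance scales linearly with power (real GPUs don't, so treat this as an optimistic bound; the 2.9x figure for the 980 Ti over the GTX 950 is a rough assumption, not a measured number):

        # If ~30 W of Polaris equals a GTX 950, what would 100 W buy, linearly?
        DEMO_GPU_W = 30           # estimated GPU-only draw of the demo part
        DEMO_PERF = 1.0           # normalize: GTX 950-class performance = 1.0
        GTX980TI_PERF = 2.9       # assumed multiple of a GTX 950 (ballpark)

        perf_at_100w = DEMO_PERF * (100 / DEMO_GPU_W)   # ~3.3x, if linear
        print(f"Linear-scaled perf at 100 W: ~{perf_at_100w:.1f}x a GTX 950")
        print(f"Needed for 980 Ti parity:    ~{GTX980TI_PERF:.1f}x")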
     
    triturbo likes this.
  27. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Well, looking at these figures comparing the AMD R9 Nano and the GTX 980, I think AMD could accomplish said task and deliver 980 Ti performance in 100W:
    http://www.legitreviews.com/amd-radeon-r9-nano-versus-nvidia-geforce-gtx-980_177681

    The Nano is slated at 175W.
    Polaris is indicated to have 2.5 times more performance per watt than Fiji.
    So, at 175W, a Polaris Nano would easily surpass the 980 Ti (even an overclocked one)... plus we don't know what kind of additional benefits HBM2 would bring in terms of performance and power savings compared to HBM1.

    However, if you drop the TDP from 175W to 100W, what kind of performance difference would there be?
    I mean, there wasn't too much of a performance drop for the Nano compared to the Fury X (about a 10% drop in performance for a 100W lower TDP).
    So, would that mean a 75W Nano part would be 20% slower than the Fury X, or would the difference be greater?

    OK... if we theorize the drop would equate to a 50% performance reduction for a 75W Nano part... that's about 15-20% below the 980 Ti (if my math is accurate), but a 100W part could be as fast as, if not faster than, the 980 Ti.

    Of course, this is working off the R9 Nano (Fiji), which is more or less a special-case scenario and has HBM1.
    We have no clue how this will translate to mobile parts, or what AMD might do with Polaris to improve efficiency and performance before it's actually released.
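
    One way to put numbers on that question: fit a crude power law through the two Fiji data points quoted above (Fury X at 275 W = 100%, Nano at 175 W = ~90%) and extrapolate. This is only a sketch; Fiji's throttling behavior won't map cleanly onto Polaris:

        import math

        # perf(tdp) = (tdp / 275) ** k, fitted so that perf(175) = 0.90
        k = math.log(0.90) / math.log(175 / 275)

        def rel_perf(tdp_w: float) -> float:
            """Relative performance vs Fury X under the fitted power law."""
            return (tdp_w / 275) ** k

        for tdp in (175, 100, 75):
            print(f"{tdp:>3} W -> ~{rel_perf(tdp):.0%} of Fury X")

    Interestingly, this crude fit suggests a much gentler falloff (roughly 79% of Fury X at 100 W) than the 50% drop theorized above.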
     
  28. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    So, once more AMD will finally meet and exceed Nvidia's *last* generation GPU performance, just as Nvidia releases their *next* generation.

    Wait 3 months after AMD releases, and buy Nvidia ;)
     
    Last edited: Jan 15, 2016
    Ethrem likes this.
  29. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I wouldn't be so sure. AMD just wanted to demonstrate efficiency. They don't want to show their hand just yet.
     
  30. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    As I recall, historically, AMD makes a big splash...

    then Nvidia comes in and announces their new releases...

    Nvidia releases just enough product at a low price to put the kibosh on AMD sales...

    then Nvidia waits a little while longer before releasing enough product to fulfill demand, so prices rise to dizzying heights...

    AMD desperately drops prices... but no one buys because they're caught up in the Nvidia hype...

    AMD bundles a bunch of games with their cards, people start buying...

    Nvidia then introduces better bundles, higher prices, and then the vendors start spinning out custom boards.

    AMD sits down and wonders how it all happened just the same way, all over again.
     
    Last edited: Jan 15, 2016
    jaybee83, TomJGX and Ethrem like this.
  31. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    AMD has made a lot of mistakes in the past that pushed a lot of us into the nVidia camp (myself included), but nVidia has really been making a lot of us mad lately, so it's really anyone's guess what's going to happen at this point.
     
    TomJGX, TBoneSan and hmscott like this.
  32. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    You're not wrong there. All AMD need to do is release a winner that can overclock and I'll jump on board. Nvidia have grown far too arrogant for my liking.
    In fact, I'd love the opportunity to help Huang wipe the hubris off his face.
     
    triturbo, Ethrem and hmscott like this.
  33. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I'm seriously debating my next desktop card... although I still have to RMA my 780 Ti, and apparently EVGA has been upgrading those to 980s, and I could live with a 980 for a while.
     
    hmscott likes this.
  34. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
  35. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Nvidia are on notice as far as I'm concerned. Is Maxwell performance going to go the same way as Kepler's once Pascal is out (i.e. fall behind the AMD products it was supposed to compete with)?

    If this becomes a trend, I won't buy NV ever again. Maxwell was just a lot better than any of AMD's mobile counterparts, so I feel like I had no choice, but if 12 months down the line the M295X ends up better than my 970M because Nvidia completely forgot Maxwell exists, then that will be the nail in the coffin and I just won't ever consider them again. I'll take slightly less performance at release if it means my GPU will be supported properly in the long term.

    On the desktop the choice is easy, as there is a proper alternative to NV's lineup. The 290X ended up having incredible legs; great card, IMO.
     
    triturbo, TBoneSan and Ethrem like this.
  36. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Unfortunately, the writing is on the wall for Maxwell once DX12 takes off. Maxwell is terrible at asynchronous compute, and nVidia isn't going to be able to convince developers not to use it. We will have to wait for more DX12 titles to be released, but seeing as DX12 is pretty much tailor-made for GCN, AMD could very well wipe the floor with nVidia.
     
    TomJGX, TBoneSan and hmscott like this.
  37. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Uhm... no.
    These performance gains on AMD's end (2.5 times performance per watt over Fiji) would effectively be consistent with what Pascal is supposed to bring over Maxwell (about twice the performance per watt)... and that's just a very baseline extrapolation from the entry-level Polaris desktop card, which was indicated to be open to further optimization.

    So, the upcoming AMD and Nvidia GPUs should be more or less comparable (whoever gains the upper hand on DX12 is an open game... however, we do have indications for now that AMD may be ahead of Nvidia in this area).

    Laptops with AMD GPUs (high-end MXM ones) were usually cheaper than Nvidia's counterparts while offering similar performance.

    At any rate, we won't know anything conclusive until either company comes out with more information or releases their upcoming GPUs.
    But I wouldn't discount AMD. They may have made grandiose claims in the past, but they haven't done so for years now, and their predictions on power savings and performance increases have been fairly consistent.
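
    The two perf-per-watt claims use different baselines (Fiji vs Maxwell), so comparing them requires assuming how far ahead Maxwell was; with a hypothetical 1.25x Maxwell-over-Fiji efficiency edge, the claims do land in the same ballpark, as the post argues:

        # Normalize both vendor claims to Fiji perf/W = 1.0.
        FIJI = 1.0
        MAXWELL = 1.25 * FIJI     # assumed Maxwell efficiency edge (hypothetical)

        polaris = 2.5 * FIJI      # AMD's claim: 2.5x Fiji
        pascal = 2.0 * MAXWELL    # Nvidia's claim: ~2x Maxwell
        print(f"Polaris perf/W (Fiji=1): {polaris:.2f}")   # 2.50
        print(f"Pascal  perf/W (Fiji=1): {pascal:.2f}")    # 2.50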
     
    Last edited: Jan 15, 2016
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That would be awesome, I am always hopeful :)
     
  39. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    With how Nvidia doesn't even have a working Pascal MXM chip to show, this may turn into a Fermi situation again. I think we are all tired of this monopoly in the mobile market. As a consumer, I am only interested in the best product. Whoever makes the best product gets my money.
     
  40. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I think AMD coming out with Polaris before Nvidia does with Pascal would be good, since it would help them capture some market share back and possibly go unopposed for a while.

    This is a major shift for both companies, since there's a manufacturing-process differential (AMD is using 14nm, and Nvidia will apparently be using 16nm) along with the claimed 2.5 times performance per watt for AMD and 2 times for Nvidia.

    Oh, and here's more news about Polaris using both GDDR5 and HBM2:
    http://wccftech.com/amd-confirms-polaris-will-feature-hbm2-and-gddr5/

    It would seem that GDDR5 will be reserved for mid-range and enthusiast products, while the top end will get HBM2.

    Still no word on what mobile parts will be using, but I'm hoping top-end mobile Polaris gets HBM2.
    The only confirmation is that laptop Polaris parts will be available before the back-to-school shopping season (or whatever the heck they call it) - which means this summer, or roughly 5 to 6 months from now.

    Oh, and the article does confirm that the power reduction on the demoed Polaris desktop part was achieved with GDDR5, not HBM2.
    So, top-end solutions might end up with more performance than estimated (and maybe a Polaris Nano is even in the works?).

    Tidbits of info, but I'm liking what I'm reading... it's encouraging.
     
    Last edited: Jan 15, 2016
    TomJGX and triturbo like this.
  41. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    The thing is, AMD probably spends more time on R&Ding how to absolutely f*ck up a release than on actual R&D. Seriously: for the 290X they had a great product, except they never released any information people wanted during their press conferences, and had an absolute paper launch while Nvidia had ample supplies of 780 Tis ready. Bad heat/power consumption issues from cheaping out on a cooler; no one would have minded paying an extra 20 dollars for a reference cooler that doesn't wake up your entire block. Announcing Mantle too early, before any game was even close to ready to adopt it. A knee-jerk response with FreeSync. Etc., etc.

    For Fury, not allowing AIB designs for the Fury X, and having a weird Fury Nano that I still have no idea who it targets. First of all, Nvidia can afford to do that with Titans because they had the performance and the market share to be arrogant, and Nvidia also releases an equivalent card with better non-reference designs soon after. AMD doing it just kinda hurts them. There is no point in buying a water-cooled card without a proper block on a reference PCB; in my eyes, the Fury X was a total failure. The Fury Nano is weird because 2-slot cards work perfectly fine with most mATX/mITX builds, so what's the point of a Fury Nano? I have no idea. It would have been nice to push it to the mobile market, but they didn't. The Fury non-X is actually a decent card, with the ability to unlock shaders on certain cards, except it's overshadowed by Nvidia's offerings and provides no significant improvement over its Nvidia counterparts outside of 4K/DX12. The 390X/390 aren't exactly bad rebrands, but they are still rebrands.

    The Fury series wasn't exactly a bad generation; it's just that Nvidia has far more impressive offerings. You could complain all you want about Nvidia's tricks with the 970 and Maxwell clock switching, but the fact remains that Nvidia absolutely dominates the mobile market and more or less holds the crown on the desktop market with cards like the 980 Ti KINGPIN.
    /rant
     
  42. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76

    Mobile will probably get GP104, which I doubt will use HBM2. Probably the new GDDR5X standard.

    Mantle set in motion the events that led to DX12. It was about reducing the pointless overhead that existed in PC APIs. What we have now is a better DirectX, and Mantle itself is the basis of Vulkan. It was one of the best things to happen in a long time. FreeSync was not knee-jerk; it is in many ways a better technology than G-Sync from a cost perspective, and there is nothing wrong or bad about providing a free, open alternative to another one of Nvidia's proprietary technologies. My notebook has a G-Sync-compatible panel and GPU, but I can't use it because my model didn't come with a G-Sync "license" - basically I bought it a few months too early. Nvidia's ******** is tiresome.

    AMD's desktop lineup has far better performance/$ in basically every segment. The fact that the 390X is a rebrand doesn't change the fact that it often BEATS the GTX 980 yet is available for $100 less. Seriously, explain to me how the GTX 980 is a better buy than the 390X. What does being a rebrand have to do with anything if it performs the same, if not better, and costs less? It's weird logic like this that leads to Nvidia owning 80% of the market. It makes no sense at all, but there it is. Hawaii > GM204, but you wouldn't know it by looking at sales figures.

    The mobile market, though, I agree is a mess for AMD. They need to follow Nvidia's lead and throw TDP out the window. Release the Fury Nano in MXM form and piss all over NV's parade.
     
    Last edited: Jan 16, 2016
    TomJGX and triturbo like this.
  43. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    I am looking at it in terms of business decisions. As long as FreeSync seemed like a knee-jerk move, most people will see it as that, and I think there is still a stigma against FreeSync. Same thing with Mantle: great product, bad marketing. No one uses the name Mantle now, and that's the thing.

    The GTX 980 has technically lower power consumption and a lot more OC headroom. From what I saw, the 390X is around the same as the 980.

    Being a rebrand means your used cards heavily cannibalize your own sales. If someone can walk out and buy a 290X for way cheaper, why would he bother with a 390X? You have to give your consumers an incentive to buy new products.

    Again, purely business decisions.
     
    hmscott likes this.
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Mantle did what it needed to do, and with a strong high-end option the price of FreeSync will be very attractive.
     
  45. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I'd like to see one of these graphs for laptops.

    It'd just be all green by now, wouldn't it?


    Sent from my E5823 using Tapatalk
     
    hmscott likes this.
  46. Hurik

    Hurik Notebook Consultant

    Reputations:
    10
    Messages:
    145
    Likes Received:
    159
    Trophy Points:
    66
    Seems like the news on Polaris just keeps coming!

    http://wccftech.com/amd-unveils-polaris-11-10-gpu/

    The article mentions two GPUs: one enthusiast-class for desktops to replace the Fury X, and the other for thin-and-light gaming laptops, i.e. so-called "console-class gaming". Well, I am really concerned about this wording, since console-class gaming on laptops has already been available since 2010! My 7970M is already overkill by such comparisons. I hope we're not getting a 970M equivalent in a smaller form factor...
     
    moviemarketing likes this.
  47. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Yeah, but even a 970M beats anything the PS4 can offer. You have to go down to a 965M to reach parity with the PS4.

    Sent from my E5823 using Tapatalk
     
  48. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    I don't understand how they were able to generate hype at CES by demoing a slim laptop featuring GTX 950-level performance. There have been numerous slim laptops far exceeding that on the market for the past 2 years: comparison link
     
    hmscott likes this.
  49. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    Console-class gaming is what the ears of the majority of people want to hear. I don't get this fifth page of concerns about the term. Or does anybody really believe that AMD is going to bring out a 965M-class mobile GPU? That would be too AMD-ish... I mean the old AMD, not the nowadays one.
    P.S. I bet all it means is minimum 60fps gaming on any 2016 game at medium settings, nothing else (and nothing less).
     
  50. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    In the CES demo, they were running the Polaris 10 card alongside a GTX 950 specifically to show that performance was about equal while the system drew only 86W (compared to 140W for the GTX 950 system).

    The GTX 950 is only slightly faster than the 965M (70W) and is handily beaten by the 970M (100W) from 2014.
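
    Since the demo compared whole-system draw, the GPU-only saving can be inferred from the delta, assuming the two rigs were otherwise identical (PSU efficiency curves make this approximate, and the 90 W GTX 950 board power is an assumption):

        # Infer the Polaris GPU's draw from the 140 W vs 86 W system comparison.
        POLARIS_SYSTEM_W = 86
        GTX950_SYSTEM_W = 140
        GTX950_BOARD_W = 90       # typical desktop GTX 950 board power (assumed)

        delta_w = GTX950_SYSTEM_W - POLARIS_SYSTEM_W    # 54 W saved at equal fps
        polaris_gpu_w = GTX950_BOARD_W - delta_w        # ~36 W implied GPU draw
        print(f"Implied Polaris GPU draw: ~{polaris_gpu_w} W (system delta {delta_w} W)")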
     
    Last edited: Jan 16, 2016