The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    AMD will use 28nm for their GPUs in 2015

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Dec 30, 2014.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    R9 M295X
    Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 750 MHz, Memory Clock(s): 1375 MHz, Graphics API Support: DirectX 11.2 OpenGL 4.3 Max TDP: 125 W

    GTX 980M
    Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 1038 MHz, Graphics API Support: DirectX 12 OpenGL 4.5 Max TDP: 100 W

    GTX 970M
    Description: Bus Interface: PCIe 3.0 x16, Max Memory Size: MB, Core Clock(s): 924 MHz, Graphics API Support: DirectX 12 OpenGL 4.5 Max TDP: 75 W
     
  2. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
  3. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    LULZ. What precisely should I get out of this? Tests, tests, tests, and throw in some temps for good measure. Also there's a rumor that the 980M in the AW15 would be cut down... Gee, I wonder why :D
     
  4. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    They removed 4GB of the VRAM because Dell thought it was a good idea to use soldered GPUs...
    Other than that it should be a full 980M. If you are trying to suggest the 980M runs hot, you are out of your mind. It's one of the coolest-running GPUs in a long time (with great performance).

    What should you get from the TDP info? I don't know, maybe an indication of how hot it will run? At 125W it will undoubtedly run hot.
    Here, while you wait for benchmarks from the Alienware 15 :p
    Poll: M295X GPU heat in Valley benchmark - MacRumors Forums
     
  5. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I was pulling your leg, hence the " :D ". But seriously, it's like comparing two cars based purely on the spec sheets. Unless they hit the pavement you can't say which one would be better. You can probably guess, but there have been a few surprises now and then.

    That's what, 2-3 months old? I can barely care what they get. Sadly it's bad points for AMD and not for Apple with their obviously poor cooling (Apple fanboys would buy whatever Apple throws at them, and if it's a fail, they'll just get the new model, God forbid changing the brand). As I said before, it's the very same heatsink that cools a lesser CPU and a half-the-wattage GPU. So I at least put a grin above, but are you serious? I'm yet to see a non-overheating slim form factor Mac (this excludes the Mac Pro, since it's small, but not slim).

    At the end of the day it won't really matter if Clevo or MSI don't put out an MXM module, since Alienware is obviously losing it.
     
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Try the iMac with GTX 680MX or 780M ;)
    78C/172F for the GTX 680MX inside the 2012 iMac during Unigine Heaven, in the same benchmark posted here where the M295X runs around 100C....

    I don't know why you keep trying to twist the fact that the M295X runs hot. The same machine could cool a full GM204 just fine.

    If it looks like a duck, swims like a duck...
     
    Last edited: Jan 15, 2015
  7. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    OK, it runs hot. Thank God they haven't put an 880M in there, otherwise it wouldn't be owners moaning on the forums, but burned houses on the news :D

    Let's move on to the performance. What's your take on it? As I said before, and maybe for the fifth time: put them in the same machine and then we can draw conclusions. A throttling R9 M295X is not something to base your conclusions on, or at least I won't.
     
  8. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    What? No 390X this time?
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Around 10% slower than GTX 970M is my guess
     
  10. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    No info on a potential M380X?
     
  11. TR2N

    TR2N Notebook Deity

    Reputations:
    301
    Messages:
    1,347
    Likes Received:
    255
    Trophy Points:
    101
    The jump from my 6990M (40nm) [2011] to the 680M (28nm) [2013] was a huge jump in terms of 3D performance and reduction in heat dissipation.
    Yields on 20nm will be hard to achieve; the features on the die are getting so extremely close together they are nearly touching.

    The physics of shrinking the die layers used to produce the chip is not as easy as Moore's law suggests. It requires better research into germanium-doped silicon yields along with other metalloid combinations.
    If the yields can be achieved, that's a bonus, but if I were AMD in 2015 I would focus on better architecture at the 28nm die size.
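    The ideal gain from a node shrink mentioned above can be sketched with first-order arithmetic (a rough approximation; `ideal_density_gain` is an illustrative helper, and real processes scale worse than this, which is part of the yield problem):

```python
# First-order density scaling between process nodes: transistor density
# goes roughly with the inverse square of the feature size.
# Real nodes deliver less than this ideal figure.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density multiplier when moving between nodes."""
    return (old_nm / new_nm) ** 2

print(f"40nm -> 28nm: {ideal_density_gain(40, 28):.2f}x density")
print(f"28nm -> 20nm: {ideal_density_gain(28, 20):.2f}x density")
```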
     
  12. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    TBoneSan likes this.
  13. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Awww yeah already 50% faster than 980. :thumbsup:
     
    triturbo and TBoneSan like this.
  14. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    Uh, I wouldn't trust that too much... a lot of people are already saying that's shopped... but then again
     
  15. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I think AMD might actually get to give Nvidia a long overdue spanking in a couple of months.
    I can't wait. Let us all profit from some healthy competition.
     
    triturbo and DataShell like this.
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    LOL this is fake. No way a 300W 380X card (per the previous leak from the two AMD employees' LinkedIn profiles) ties a Titan Z and scores this close to a 295X2, potentially even giving GM200 a run for its money. A 380X this fast would mean AMD must have an even faster 390X waiting in the wings to counter Nvidia. Now that would just be ridiculous.

    image.jpg
    (This is using a 3960X @ 4.6 GHz instead of a 4790K, which impacts overall score.)

    Also, this score would instantly put it in the top 25 of the HOF fastest single GPUs, alongside what I presume to be LN2/dry-ice-cooled 780 Ti and 980 cards. So yeah, too damn high to be believable, especially for a 380X. Even the previously rumored specs (4096 SPs and 4096-bit HBM) don't make sense for this level of performance.

    The notion that AMD made this level of efficiency gain in one generation (those world-record-busting Nvidia cards sure as hell draw a lot more than 300W), and on 28nm SHP (?) no less, is ridiculous.
     
  17. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    You wouldn't want AMD to compete with Nvidia?

    I think if the data is reliable then we should be VERY VERY happy. Not saying it would be ridiculous. Nvidia would be forced to drop prices and/or release some other monster chips.
     
  18. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No, I'm saying it's too good to be true. This is especially apt when discussing GPUs.

    (And I don't think the data is reliable, at all.)
     
  19. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    You and I both. I'm practically giddy with excitement. :D
     
  20. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    Praying for a worthy successor to the 7970M. I honestly think this may be real, because if it isn't, AMD will be bankrupt soon enough. They probably waited long enough to unleash this.
     
    reborn2003 likes this.
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Most likely fake, because both AMD and Nvidia are at 28nm. I don't think AMD is capable of going that far beyond Maxwell in efficiency vs the previous gen.
     
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah, people are saying the actual run and scores are both real, but the GPU name was photoshopped. The run itself was made using two overclocked 290's in CF. Then there's this post that calls attention to what is suspected to be photoshop evidence.

    And just so people don't get fooled or overly excited again, here's another one of those finely photoshopped "leaked benchmark" from Chiphell.

    Chiphell sure has been having a lotta fun with PS lately...
     
    Last edited: Jan 19, 2015
    Cloudfire and octiceps like this.
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  24. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    If it was a 390X on 20nm, I guess it could've been plausible? But 28nm? Not a chance in hell.
     
  25. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    Oh well. Sorry for getting hopes up :rolleyes:

    AMD is supposed to use GlobalFoundries' 28 nm SHP (Super-High Performance) process, which is allegedly better than TSMC's 28 nm one for high-power chips.

    AMD to Switch to GlobalFoundries' 28 nm SHP Node in 2015 | techPowerUp
    AMD switching 28nm process to GlobalFoundries in 2015 - Industry - News - HEXUS.net

    Also, wasn't the R9 390X supposed to be water cooled at stock? That should give more breathing room for that huge chip at very high frequencies.
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    One thing about a new GPU being as powerful as two GPUs working in Crossfire: we have had examples of new-generation top-end GPUs that managed this kind of leap... though they were usually coupled with a smaller manufacturing process as well.
    Still, AMD will supposedly be using an improved 28nm process specifically designed for high performance, so it shouldn't be compared to a regular 28nm process.

    Nvidia was able to achieve a 50% increase over the previous generation with Maxwell through an architecture jump alone, on the same process node.

    AMD will be using HBM... which in itself will provide a massive bandwidth increase, but it may not stop there... official figures seem to state that the form factor would shrink by about 65% and power efficiency would jump by roughly 68% from using HBM alone (as evident here: AMD 20nm R9 390X Features HBM, 9X Faster Than GDDR5)... so if AMD has a new architecture for the 380X and 390X which is reasonably efficient (and we don't know how much of HBM's benefit will translate to the rest of the GPU's performance apart from high-resolution gaming), then I could see them surpassing Nvidia within a 300W envelope, provided that the GPUs in question end up more efficient in terms of performance per watt.

    In the desktop space, power consumption doesn't really matter that much, and even Maxwell can reach 275W when properly stressed. AMD might simply be reporting the maximum thermal envelope under maxed-out loads (compute, for instance), whereas Nvidia likes to use only gaming numbers.
    Indeed, AMD cards may end up drawing less power while gaming, and up to 300W when doing compute.

    Still, R9 380X seems to have been 'confirmed' to use HBM:
    AMD developers confirm Radeon R9 380X with HBM memory | KitGuru

    Things should get interesting.

    It would be excellent if AMD decided to use HBM for their K12/Zen.
    With all of the supposed form factor reductions and power efficiency advertised in the first link I posted, the CPU section of the APU could be addressed more seriously, producing overall a high-powered APU that is quite efficient.
    It's a shame Carrizo won't be using HBM (apparently), though this benchmark here:
    Report: AMD Carrizo APU Benchmarks Show 2x the Performance of Kaveri, 3x Intel Iris Pro | PC Perspective

    ...seems to show a 2x performance increase over Kaveri (graphical performance, most likely) and a 3x performance increase over Intel Iris Pro. Hm... one has to wonder how AMD managed such a jump.
    A GPU-section makeover, perhaps?
    Inclusion of HBM (unlikely)?
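    A rough sanity check of the HBM bandwidth claims above (an illustrative sketch: the HBM figures assume first-generation HBM with a 1024-bit bus per stack at 1 Gbps per pin, the GDDR5 figures match a 290X-class card, and both helper functions are hypothetical):

```python
# Rough peak-bandwidth comparison: GDDR5 (R9 290X-style) vs first-gen HBM.
# Figures are illustrative, based on publicly known specs.

def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a GDDR5 interface."""
    return bus_width_bits * data_rate_gbps / 8

def hbm1_bandwidth_gbs(stacks: int, bus_width_per_stack: int = 1024,
                       data_rate_gbps: float = 1.0) -> float:
    """Peak bandwidth in GB/s for first-gen HBM (1024-bit bus per stack)."""
    return stacks * bus_width_per_stack * data_rate_gbps / 8

r9_290x = gddr5_bandwidth_gbs(512, 5.0)   # 512-bit bus, 5 Gbps GDDR5
hbm_4stack = hbm1_bandwidth_gbs(4)        # 4 stacks of HBM1

print(f"R9 290X GDDR5: {r9_290x:.0f} GB/s")
print(f"4-stack HBM1:  {hbm_4stack:.0f} GB/s ({hbm_4stack / r9_290x:.1f}x)")
```

    That is roughly a 1.6x peak-bandwidth jump, before any power savings are counted.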
     
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    We had one example: the 8800 GTX. And multi-GPU support and scaling were much worse in 2006 (60% typical, 80% max), which made it look better than it actually was.
     
  28. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    They are still bound to the same space limitations as TSMC's 28nm process.

    And considering that the R9 380X is said to draw 50W more than the 280X, the 390X will absolutely need water/hybrid cooling. The 290X in uber mode was basically one big fireball.
     
  29. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That HBM brings greater power reduction, more bandwidth, and smaller space requirements is all great.
    But do you really think it will boost performance that much? Have AMD's chips been that limited by bandwidth lately? It will be very interesting for sure, but I'm sceptical, performance-wise, about these new stacked VRAM chips.

    I think AMD is doing a tick-tock like Intel: use roughly the same architecture on the 300 series as on previous cards, with some efficiency improvements, start by introducing HBM, and then build massive GPUs to combat Nvidia's Maxwell. Use GloFo's 28nm process to gain some ground. Probably even surpass the GTX 980.
    Then, when 2016 comes, they come out with a brand new architecture on 16nm that will replace GCN.
     
  30. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    AMD will not be replacing GCN anytime soon. The compute-focused architecture is the backbone of their future CPU/GPU/APU/SoC and HSA developments. They'll keep adding revisions to it: the first GCN cards were 1.0, Bonaire and Hawaii were GCN 1.1, Tonga and the 300 Series are GCN 1.2, etc.
     
  31. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    I would expect AMD's marketing of the 390X to emphasize 4K gaming a lot. This is where HBM should shine. Resolution is a big bandwidth eater. The GTX 980 can still struggle at 4K:

    Nvidia GeForce GTX 980 Review - Page 4 | Maximum PC
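    The "resolution is a bandwidth eater" point can be illustrated with a toy calculation (a deliberate simplification that counts only raw framebuffer traffic and ignores texture and geometry reads; the helper is hypothetical, and only the scaling ratio matters):

```python
# Raw framebuffer traffic scales linearly with pixel count. Real GPUs
# move far more data per pixel, but the ratio between resolutions holds.

def framebuffer_gbs(width: int, height: int, fps: int,
                    bytes_per_pixel: int = 4, passes: int = 2) -> float:
    """Rough GB/s for writing + reading a 32-bit framebuffer every frame."""
    return width * height * bytes_per_pixel * passes * fps / 1e9

p1080 = framebuffer_gbs(1920, 1080, 60)
p4k = framebuffer_gbs(3840, 2160, 60)
print(f"1080p60: {p1080:.2f} GB/s, 4K60: {p4k:.2f} GB/s ({p4k / p1080:.0f}x)")
```

    Four times the pixels means four times the traffic, which is exactly where a wide HBM interface should help.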
     
    Cloudfire likes this.
  32. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    As if Hawaii's advantage at 4K isn't enough already. I guess AMD is really focused on pushing that even further.
     
  33. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    R9 380X will release in Q2 this year btw, somewhere between April-June
    AMD lanserar Radeon R9 380X till våren - Grafikkort - SweClockers.com
     
  34. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    Indeed. AMD does like to show off multiple 4K monitors connected to a single PC :)



    Here at CES 2015 there were three 4K screens driven by one R9 295X2.

    BTW, he also speaks about virtual reality here (among other topics). Interestingly, folks at Oculus are supposedly much more excited about AMD than Nvidia because of their VR-specific optimizations. So this could be another marketing point for AMD (VR needs both a lot of bandwidth and super low latency).

    AMD to Share Low-Latency VR GPU Rendering Tricks at GDC 2015 : oculus
     
    Last edited by a moderator: May 12, 2015
  35. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    Aside from fanboying all the way, I think you are missing a very important little detail: if this is real, why can't we find it in the 3DMark benchmark listings? People on Tieba have already mentioned at least five times that they couldn't find the result.
     
  36. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    FAKE FAKE FAKE. Read the thread. :)
     
  37. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Lol, what fanboy would change his 7970M for a 680M?

    I never said the data is accurate; I only said I hope it becomes true.

    I simply want there to be competition. Funny that you misinterpret this as fanboyism.
     
    TBoneSan likes this.
  38. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    Dude, I was the one who said it was fake LOL
     
  39. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    While the leak itself is fake, the performance of the 390X may still end up close to that level. With HBM and GCN improvements there is no reason not to expect a 50%+ improvement over the 290X at 4K.
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I know, good call.

    Dude, the leak showed almost double the performance of the 290X, which is why people started doubting it immediately.
     
  41. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    So the new AMD GPUs should finally arrive quite late: not earlier than April 2015, maybe even as late as summer 2015:

    AMD: We will release new APU, GPU products starting in the Q2 2015 | KitGuru
     
  42. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Poor AMD.

    They are bleeding money so badly, and falling so far behind Nvidia in the launch schedule, that they will bleed even more money.
    They lost a whopping $364M in Q4 2014 (just one quarter).

    And without money they can't keep up with Intel and Nvidia in R&D. Which is why they are now over half a year behind Nvidia in releasing new graphics cards.
     
    King of Interns likes this.
  43. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    It's more of an issue that they are considerably smaller. Intel and Nvidia, Intel particularly, spend much more money in other areas heh, but they make up for it as a whole.

    I look forward to their new GPU and hopefully it will stir competition in new ways :D
     
  44. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Lol, you guys were saying the 285 launch was bad, but now there is something that surpasses it: behold the junk that is the 960, for those with no brain cells. Yeah, I thought so, now you guys are quiet. Nvidia has released the ultimate milking product; even the fanboys on sites like WCCFTech are complaining. The only thing Maxwell is good for is mobile, end of story. AMD hasn't even released its products yet this year and they have already won the GPU wars in 2015.
     
  45. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    What world do you live in Link4? :p

    http://www.legitreviews.com/asus-strix-gtx-960-video-card-review_157721
     
  46. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    The only reason desktop Maxwell sold so well initially is that it was released earlier. Now that newer, true next-gen cards are coming, nobody will be buying a 980 (the 970 still has decent performance/$, so that will still sell, at least for now) besides people who don't know anything about graphics cards.
     
  47. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    mobile my a$$, the new driver still has no OC support for maxwell
     
    Last edited by a moderator: Jan 27, 2015
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I can assure you that people are still buying Maxwell like there is no tomorrow.
    The reason I said AMD will lose even more money in the future is that there are huge disadvantages to falling behind in the release schedule. Very few of those 1 million+ people who bought Maxwell will bother to buy new cards from AMD in Q2 when they bought their cards just recently. GM200 will release in March this year, getting ahead of AMD there too.
    How many buyers will be left when AMD finally release their new cards?

    I totally agree that the GTX 960 is a crappy card because Nvidia cheaped out on bandwidth. But it's priced at $200 and is about as good as the R9 285, which is priced the same.
    The problem for AMD, however, is that the GTX 960 draws 100W while the R9 285 draws 170W. Which do you think people will buy?

    That 1 million+ people who bought Maxwell cards will grow tremendously now that Nvidia has a $200 card for the masses, plus the GTX 970 and GTX 980. And Titan X in March/April.
    God knows what number we are looking at in, say, May when AMD release their cards, but they will have a hard time selling enough cards to make up for the manufacturing and engineering costs of the new cards now that Nvidia has filled the market with Maxwell.
    Which means less revenue (profit) for development of future cards.

    R&D is extremely important. It becomes a vicious cycle if they fall behind on it.
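    The 960-vs-285 comparison above boils down to performance per watt at the same price; a quick sketch using only the figures stated in the post (performance is normalized to 1.0, since the post treats the two cards as roughly equal):

```python
# Comparing the two $200 cards using the figures stated in the post:
# roughly equal performance, 100 W vs 170 W board power.

cards = {
    "GTX 960": {"price_usd": 200, "tdp_w": 100, "perf": 1.0},
    "R9 285":  {"price_usd": 200, "tdp_w": 170, "perf": 1.0},
}

for name, c in cards.items():
    perf_per_watt = c["perf"] / c["tdp_w"]
    print(f"{name}: {perf_per_watt:.4f} perf/W at ${c['price_usd']}")

ratio = cards["R9 285"]["tdp_w"] / cards["GTX 960"]["tdp_w"]
print(f"R9 285 draws {ratio:.1f}x the power for the same performance")
```

    At the same price and performance, the 1.7x power gap is the whole argument.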
     
  49. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    It's a slippery slope :(
     
    Cloudfire likes this.
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yup, I think their best hope (maybe the only one) is if the 300 series is vastly better than what Nvidia has on the market when AMD releases it. Then I think current Maxwell owners will have enough reason to buy a new card again. But it needs to be really good.
     