The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Radeon R9-M295X

    Discussion in 'Gaming (Software and Graphics Cards)' started by Tsubasa, Mar 15, 2014.

  1. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,983
    Trophy Points:
    431
  2. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    We need to see this chip in a Windows laptop before drawing any conclusions. The data here tells me something different each time.
     
  3. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    When you look at the Cinebench and 3DMark benchmarks, it kind of reinforces my earlier speculation that the R9 M295X is more or less a GTX 880M in performance.

    Still don't think this chip is ever going inside a notebook due to its high power requirements. That's hurdle #1.
    #2 is convincing OEMs to use the chip when Maxwell is superior in both performance and TDP.
     
  4. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    You are forgetting one thing: they put the 650M in the MacBooks (after having the 67xxM series from AMD) and chose ARM over Intel in the smaller iSomethings ;) So I don't think they'd choose such an obviously inferior product. It might be inferior, but I don't think it would be by the amount you state. Time will tell. Sadly, Apple hoarding parts is most likely the reason we'll not see an MXM R9-M295X, not heat etc. as you suggest. The 7970M, 780M and 880M are all well over 100W (especially the latter), yet we do see them in notebooks.
     
  5. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Where exactly is the evidence that the R9 M295X has high power requirements?

    Also... the power requirements may be 'nothing out of the ordinary' for a high-end mobile GPU (in line with previous high-end GPUs), but given Apple's low-brow cooling execution, the iMac wouldn't be able to handle the power requirements of those GPUs either way. There's also a possibility that the M295X was clocked below its 'optimum', in which case we aren't seeing its full potential (although at this point, I don't think we can know for sure).
     
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The M295X is the exact same chip as the R9 285. Both are Tonga. Check out the GPU-Z screenshot posted on the previous pages.
    The R9 285 consumes 189W during gaming; the 7870 consumes 111W. The R9 M290X is based on the 7870, so that one didn't need to come down much in wattage.
    But now we are talking over 189W (the M295X has more cores than the R9 285), and they need to cut far down to reach sustainable levels for notebooks. Not an easy task, and I think it's pretty much impossible. The M295X you find in the iMac runs at 850MHz. Maybe, just maybe, a 600MHz M295X is possible inside a notebook, but what performance are you looking at then?

    The iMac doesn't use MXM and has a bigger power supply than notebooks, too.
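
    A very rough sketch of that downclock in Python, just to put a number on it. The ~190W baseline is the estimate discussed above, dynamic power is assumed to scale roughly with clock times voltage squared, and the voltage figures are guesses for illustration, not measured values:

# Rough scaling sketch: dynamic GPU power ~ frequency * voltage^2.
# Baseline is the ~190W estimate above for the 850MHz iMac part;
# the voltages are guessed, purely for illustration.
baseline_w = 190.0            # approximate M295X power at 850MHz (estimate above)
f_base, f_cut = 850.0, 600.0  # MHz: iMac clock vs hypothetical notebook clock
v_base, v_cut = 1.15, 1.00    # volts (guessed values)

scaled_w = baseline_w * (f_cut / f_base) * (v_cut / v_base) ** 2
print(round(scaled_w), "W")   # ~101 W with these guessed numbers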
     
  7. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    The lack of cooling clearly shows, and I don't think they have improved the cooling compared to last year's model. The 780M that so many people on this forum call "superior" is 20% slower in the 2013 iMac than in properly cooled gaming laptops (or, looked at the other way, the Windows machines are 25% faster than the iMac). That same cooling issue is holding the M295X back as well, yet it still manages to outperform the 780M under the same thermal limitations by almost 50%. The disastrous 880M would probably have done even worse than the 780M, yet people claim the M295X isn't much better than an 880M. Given proper cooling (no, I am not talking about those thin gaming laptops the 970M is going into), the M295X shouldn't have any problems in laptops that can handle a 780M.

    What you said is mostly true, except that the M295X isn't underclocked by Apple: 850MHz is the boost clock (just like desktop Hawaii and Tonga, although desktop Tonga never throttles because it is efficient and cooled properly, unlike Hawaii), and the severe lack of cooling in the iMac causes it to throttle when the temperature limits are reached.
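
    The 20% and 25% figures above are the same gap stated from two different baselines. A quick check, with made-up normalized numbers:

# Same performance gap expressed from either side (illustrative values only).
imac, windows = 0.8, 1.0              # normalized 780M performance in each machine
print(f"{1 - imac / windows:.0%}")    # 20% -> iMac is 20% slower than the Windows laptop
print(f"{windows / imac - 1:.0%}")    # 25% -> the Windows laptop is 25% faster than the iMac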
     
  8. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The desktop card you linked to is a factory overclock.
    A few things to address here: we know GPU manufacturers like to set relatively high voltages on their GPUs to ensure proper functioning across the board, and this probably extends to factory overclocks.
    So perhaps the GPU in question can run at lower power than you would expect.

    Regarding core counts and increased thermals: we don't know what kind of power optimizations the mobile GPU will have.
    There are differences here to think about.
    Mobile GPUs undergo various power reductions so they can actually function inside a laptop.

    Right now, all we have is speculation based on what we have seen from the desktop model.
    We don't know for certain whether all of this carries over to the laptop model, so if you don't mind, I'd wait for actual data on the M295X's TDP and how it performs inside a Windows laptop with far better cooling.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    There is actually a lot we know if you dig for information.
    iMac 5K - i5 4670K + R9 M290X: total power consumption: 176W
    iMac 5K - i7 4790K + R9 M295X: total power consumption: 288W

    The i7 4790K consumes about 30W more than the i5 4670K.
    176W + 30W = 206W

    How much more power does the R9 M295X consume than the R9 M290X?
    288W - 206W = 82W
    Knock off another 10W because the M290X iMac has a little more RAM, to make it a bit more realistic.
    72W

    Did you look at the chart I posted earlier?
    The R9 285 consumes 78W more than the 7870. Both of those desktop GPUs have their own respective mobile GPU, the R9 M290X and the R9 M295X.

    Coincidence? Nope, the speculation is spot on using information we already know.

    Apple has also posted heat output figures:
    iMac 5K - i5 4670K + R9 M290X: 601 BTU/h
    iMac 5K - i7 4790K + R9 M295X: 983 BTU/h

    That's 64% more heat for the R9 M295X model.

    Conclusion:
    The R9 M295X consumes a hell of a lot more power than the R9 M290X and outputs substantially more heat.
    Since the R9 M295X is a "mobile" GPU, they have already reduced voltage and binned it to bring heat down from the desktop version. I'm still sticking to my projection that the M295X is Apple exclusive and unfit for notebooks unless a 600MHz-or-so version comes out
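
    The same arithmetic in one place, if anyone wants to tweak the inputs. Every number is the estimate used above, so treat the result as a ballpark, not a measured TDP:

# All inputs are the estimates from the post above (ballpark only).
total_m290x = 176   # W, iMac 5K with i5-4670K + R9 M290X
total_m295x = 288   # W, iMac 5K with i7-4790K + R9 M295X
cpu_delta   = 30    # W, rough i7-4790K vs i5-4670K difference
ram_delta   = 10    # W, allowance for the RAM difference between the two configs

extra_gpu_power = total_m295x - (total_m290x + cpu_delta) - ram_delta
print(extra_gpu_power, "W more than the M290X")        # 72 W

# Heat output difference from Apple's BTU/h figures
print(round((983 / 601 - 1) * 100), "% more heat")     # ~64 %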
     
  10. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Another thing I'd like to mention is that the R9 M295X isn't really that much faster than the R9 M290X.
    It's certainly nowhere close to Maxwell, considering the GTX 970M is 40-50% faster than the M290X.

    Any way you twist it, AMD is too far behind Nvidia. They need a new architecture as well to catch up. That's absolutely crucial for a constrained environment like notebooks. Fingers crossed for a Q1 release of a 20nm R9 M390X with HBM that beats the GTX 980M solidly.


    Performance deltas vs the M290X from the benchmark screenshots (link below): 17% faster, 22% faster, 17% faster, 9% faster, 1% faster, and identical.

    http://www.barefeats.com/imac5k6.html
     
  11. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Wow those are some sad benchmarks.

    AMD lost.
     
  12. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    ^^ I really believe AMD will drop some serious s*** on Nvidia around December / January. Just hang in there AMD, I know you can do it!
     
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    For everyone's sake I hope so too. On the desktop side I'd like to see 390X completely decimate 980, so nVidia will be forced to release GM200 early and at a reasonable price point (I can dream right?)

    And if AMD's HBM implementation pans out very well, then nVidia will have no answer to it until Pascal's launch in 2016, which will really put the hurt on.
     
    D2 Ultima likes this.
  14. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    Yeah, and I can dream that it will come to mobile platforms with full HBM implementation.
     
  15. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    I hope that AIO shroud leak online has nothing to do with the standard 390X. I do like the direction NVIDIA is headed with the efficiency-focused Maxwell design (though it is a bit much), and I'm praying AMD is following that trend too. Hot, power-hungry GPUs don't showcase engineering excellence; if anything, they're a brute-force approach to a problem.
     
    triturbo likes this.
  16. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    First post at Notebookreview. Had a fun time reading all 112 pages of this thread. Started reading sometime in Oct after the 5K iMac was announced. Kudos to everyone for the sheer enthusiasm and CSI work needed to dig up info about the cards.

    Recently replaced my late-2011 iMac (which had a 6770M) with this new 5K. Mine is the top-end model with the 295X. I am a light gamer... mostly Dota2, CS:GO and Heroes of Newerth. Someone had asked for Shadow of Mordor benchmarks earlier... so here they are. Benchmarks aside, I've been playing on high instead of ultra. Coming from a 4-year-old card... this one looks loads better.
    ShadowOfMordor 2014-10-28 21-37-38-28.jpg ShadowOfMordor 2014-10-28 21-39-38-82.jpg ShadowOfMordor 2014-10-28 21-41-20-95.jpg
     
    Cloudfire and triturbo like this.
  17. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Kudos to you! How are the temps? Any stuttering?
     
  18. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    My ambient room temp is around 24C. I use the aircon set to 22C for about 15-20 mins before switching it off once the room cools down a bit. I play with headphones so I don't notice the fans. Even with my room temp, GPU-Z shows it rising to ~100C in sections. Attached files are temp readings after I ran the benchmark just now...
     


  19. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    100C is not good!

    Apple's form over functionality = fail!

    You might want to salvage that 6770M if it is MXM and put it into the new iMac you purchased. Games will heat the card up more than benchmarks do, so your new card will be throttling like crazy!
     
  20. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    Oh, I sold it immediately after ordering this. Decent value... bought it for $1500 (refurbished) and sold it for $950 after 3 years of use.

    Regarding throttling... the last week I've been playing only two games regularly, neither of them an FPS, so if there's any throttling... I don't notice it. ME:SoM and Dota2, both played at 2560x1440 with all settings on high.

    Both games are also played for hours on end.
     
  21. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    The CPU is definitely throttling (bouncing off the 100ºC mark where the thermal throttle kicks in), and the GPU graph showing silly temperature numbers means it's over 100ºC. I can't believe what they have done, but in all honesty it's not their first time; it has always been form over function for them. I hope you got extended AppleCare, since you'll definitely need it.
     
  22. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    haha yes... I do have AppleCare. $170 to extend the warranty on a machine that, as configured, costs about $3800 is alright I think. Whether I'll be needing it because of this GPU remains to be seen. The 6770M in the older iMac wasn't exactly cool-running either.

    And if you ask me, my old 17" Dell XPS Gen 2 (which failed after 13 months, sob) and the Dell XPS 1710 with the 7950GTX that replaced it were both louder and hotter. Both of those laptops cost me upwards of $2,500 each, and when I sold them a year after using them, I got less than $300 for them. Those were my last Windows machines. Been using Macs ever since... I also didn't game as much as before, so that's another reason.
     
  23. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Holy hell.

    I think Apple got in a little over their heads thinking they could pack the M295X (102C, lol) and that i7 processor inside the iMac.

    Can you repaste the GPU and CPU? Perhaps you can get better temps than that with a decent paste :)
     
  24. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Some early leaks pointed to 380X launching in early February and beating 980, but these might just be rumors, although beating 980 isn't much of a challenge. 390X is the answer to GM200 so it will destroy 980 but that's not what it is meant to compete with.
     
  25. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    AMD are currently making 20nm APUs for PS4 and Xbox One, so it seems that AMD have successfully moved on from 28nm for the bigger GPUs (PS4 GPU is only slightly smaller than R9 M290X)
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I was under the impression that the PS4 and Xbox One APUs were made on a 28nm process.
    Are these new APUs, or the same ones on a 20nm manufacturing process?

    It would be interesting if consoles got the Carrizo architecture for the CPU part (which would certainly improve performance there), and the GPU could also be updated, maybe to a Tonga equivalent.
     
  27. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    According to iFixit, unlike the previous gen 27" iMac which used magnets to hold the glass panel... this one uses glue. So upgrades are a pain (except for the RAM which has an access door).

    But yes, this isn't a gaming machine at all. Well I am a light gamer... only have time for about 3 hrs in a whole week and not often for stretches longer than an hour in a sitting... so I think its ok.
     
  28. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    Speaking of the PS4/X1... I saw ME:SoM running on a friend's PS4 and the textures looked softer than what loads on my iMac at even the High/Ultra settings. So I'm feeling chuffed.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yes, both APUs are 28nm. There were rumors this week of a 20nm die shrink of the current XB1 APU with the same performance but better efficiency, which could potentially lead to a slimmer console. Nothing about the PS4 yet.
     
  30. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Slimmer console?! OK now this thin and light fad is starting to get ridiculous.
     
    triturbo likes this.
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Wut this is nothing new. There have been different sizes of every console since at least the PS1. Smaller/more efficient silicon makes them cheaper to manufacture all-around so win-win for the console makers. 360 and PS3 were die shrunk several times during their lifespans.
     
  32. ins1dious

    ins1dious Newbie

    Reputations:
    22
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    So I was thinking about the heat and the GPU of this iMac under load, and came across this on the Apple support page:

    iMac (27-inch, Late 2014)
    27-inch display, 4 GHz Intel Core i7, 32GB 1600MHz DDR3 SDRAM, 3TB Fusion Drive, AMD Radeon R9 M295X 4096 MB

    Power Consumption
    Idle (70 W) CPU Max (288 W)
    Thermal Output
    Idle (239 BTU/h) CPU Max (983 BTU/h)

    Contrast that with the previous model with the 775M:

    iMac (27-inch, Late 2013)
    27-inch display, 3.4GHz Intel Core i7, 32GB 1600MHz DDR3 SDRAM, 3TB Fusion Drive, NVIDIA GeForce GTX 775M

    Power Consumption
    Idle (78 W) CPU Max (229 W)
    Thermal Output
    Idle (266 BTU/h) CPU Max (782 BTU/h)

    Does the thermal output on their support page include the GPU under load as well, or only the CPU? Anyone know?
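
    For what it's worth, the BTU/h figures look like the wattage figures converted at 1 W ≈ 3.412 BTU/h, so whatever the power number covers, the thermal number should cover the same thing. A quick check against the spec numbers quoted above:

# Apple's thermal figures appear to be the power figures in different units:
# 1 W of electrical draw ends up as ~3.412 BTU/h of heat.
for watts, listed_btu in [(70, 239), (288, 983), (78, 266), (229, 782)]:
    print(watts, "W ->", round(watts * 3.412), "BTU/h (Apple lists", listed_btu, "BTU/h)")
# Matches to within a BTU/h of rounding in every case.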
     
  33. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    That's for the whole thing including panel I believe.
     
    ins1dious likes this.
  34. 777light777

    777light777 Notebook Geek

    Reputations:
    0
    Messages:
    98
    Likes Received:
    12
    Trophy Points:
    16
    My thoughts... ;)

    R9-M295X

    1) The R9-M295X is 125W-150W; this could go into a well-designed 17"-18" notebook.
    2) The R9-M295X is throttling in the iMac because of temps (lower performance).
    3) If it were not throttling, it would beat the GTX 980M.
    4) It most likely will not go into a notebook, because the M390X will be out next year and will wipe the floor with it.

    380X 390X

    1) The 380X will compete with and beat a GTX 980 (assuming AMD uses liquid cooling).
    2) The 390X will compete with the GTX 980 Ti/TITAN 2, at a lower price point ($550-650 vs $700-1000).
    3) Both will be 20nm + HBM + improved power consumption.
    4) 3-6GB (380X), 4-8GB (390X), plus a liquid cooler.

    Also, it seems Nvidia will not have HBM until 2016, so AMD could take the lead next year.
    (AMD was a co-developer/investor in HBM.)
     
    octiceps likes this.
  35. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    The M295X is a fast card, especially at higher resolutions, but don't expect it to beat the 980M. Even when properly cooled it will probably only come close to the 980M. 3DMark11 scores should be similar if the M295X doesn't throttle, and the only times it might beat the 980M are in Gaming Evolved titles or when Mantle is used.
     
  36. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Realistically, it can only match the 970M. That is as far as it could go. The 980M is near GTX 780 territory, while the 970M is more in GTX 770 and R9 280X territory. This mobile R9 M295X can, at best, be an R9 280X in performance.
     
    Cloudfire, Cakefish and heibk201 like this.
  37. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    Did you already see this thread at MacRumors?

    M295XThrottle.jpg

    AMD Radeon R9 M295X Core Clock Throttling, Heat, and Performance - MacRumors Forums
     
    Cloudfire and triturbo like this.
  38. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yep, I was spot on with my warnings that this chip is Apple exclusive and will be too hot to put in a notebook. That was clear when you saw the power and thermals from its younger brother, the R9 285.

    There is a guy in that thread who says his iMac with the GTX 680MX ran much cooler, in the 70s with a high overclock. So the M295X indeed has a much higher TDP than that one, since it runs in the 100s (lol) within minutes of gaming at stock clocks.

    It makes you wonder how much money was involved in this deal. Apple would be waaaaaay better off with a GTX 980M. Not just because of superior performance (also in OpenCL this time), but it would even run cooler than the GTX 680MX they used before in the iMac.
    But of course, it won't be as cheap as that M295X. Let's see if it was a wise decision anyway, because all that negative talk will put a damper on sales. They may lose potential money on it.
     
  39. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The funny thing is that a GTX 980M more or less matches an R9 290X. A 100W mobile GPU matching a 290W desktop GPU. It's a huge gap between these two companies at the moment, which is why we haven't seen a mobile AMD GPU take on the 980M yet.
     
    heibk201 likes this.
  40. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    The problem is that Apple probably wouldn't settle for the price our boy Huang was asking. Apple probably intended to buy up all the 980Ms or something like that, but NV simply can't dedicate a significant portion of its GM204 yield to Apple at the price Apple wants.
    Money aside, only AMD worked with Apple to get its GPU to support 5K.
     
    Cloudfire likes this.
  41. kothletino

    kothletino Notebook Evangelist

    Reputations:
    65
    Messages:
    311
    Likes Received:
    110
    Trophy Points:
    56
    Exactly, it is an R9 275X. :p
     
  42. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Sorry but this isn't true. A Gigabyte 970 trades blows with a reference 290X stock for stock. The 980M is a gimped desktop 970 with about 81% of its performance. If AnandTech used a reference 970, then the difference is even greater since Gigabyte's 970 is 5% faster than one with stock specs. If I had to pin a number down, I'd say stock for stock 980M is about 80% that of a 290X at best. (75% might be closer)
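
    A quick pass through that arithmetic, treating all the ratios as the rough estimates they are:

# All ratios below are rough estimates quoted in the post above.
m980_vs_gigabyte_970 = 0.81   # 980M ~81% of a Gigabyte GTX 970
gigabyte_vs_ref_970  = 1.05   # Gigabyte 970 ~5% faster than a reference 970

# If the 81% was measured against the Gigabyte card (which ~matches a reference 290X):
print(round(m980_vs_gigabyte_970 * 100), "% of a 290X")                        # ~81%
# If the 81% was against a reference 970, the gap vs the 290X widens further:
print(round(m980_vs_gigabyte_970 / gigabyte_vs_ref_970 * 100), "% of a 290X")  # ~77%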

    Not saying the performance/watt ratio isn't impressive or that the 290X isn't a space heater, but a 20-25% performance gap is still quite significant.

    Ok I'll stop being pedantic, carry on. :)
     
    triturbo, D2 Ultima and octiceps like this.
  43. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    ^Whatdya expect, this is the AMD bashing thread after all. :D
     
    triturbo likes this.
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    R9 290X is a space heater, LOL
     
  45. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Fine, a single 290X is no problem, but put two 290Xs in XFire and you're looking at 600W of heat. QuadFire them and you basically get a 1200W space heater for free. Who cares about Maxwell's power savings when you never have to crank up the heat ever again during winter? On that note, I propose we call the 4x 290X config HellFire.
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    And then you bring out Maxwell in summer so your room doesn't become hotter than mine?
     
    TomJGX likes this.
  47. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well in summer you put them under water so you don't accidentally unleash HellFire™ upon your room. And then in winter you go back to air cooling.
     
    D2 Ultima likes this.
  48. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    If you have a water loop you can connect it to your boiler so you can take a shower after an intense session :p

    I still can't see solid evidence that the R9-M295X is the sole reason for those temps. After all, Apple gear is notorious for overheating, not to that extent, but I guess they wanted to break some records, or cook your breakfast for example (eggs and bacon... lovely).
     
    D2 Ultima likes this.
  49. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Absolutely. Look at this teardown. The cooling system is downright pathetic, beyond abysmal really. A single fan for an 84W i5-4590 and a 100W M290X, and look at the size of that heatsink. And holy farting batman, do I count ONE SINGLE HEATPIPE?! Dafuq!

    I mean I knew Macs were terrible when it came to cooling, but wow I did not realize they were at this level of garbageness.
     
  50. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    The M295X is not bad; it's Apple who makes it look like a crappy card.
     