The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Radeon R9-M295X

    Discussion in 'Gaming (Software and Graphics Cards)' started by Tsubasa, Mar 15, 2014.

  1. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That doesn't make much sense to me; you'll need to explain more. Why does a game which is not running into a vRAM bottleneck (i.e. well within the vRAM buffer limits) perform worse on a card which has better memory bandwidth? If the 780 Ti was running into a vRAM bottleneck in a game like Watch Dogs or Titanfall, I'd get it... but if it's a game like CoD: Black Ops 2 where it doesn't, what then? And even so, SLI and Crossfire double bandwidth with two cards, so that should negate the point; it's not like 192GB/s of bandwidth is any kind of limiter to any card these days, far less the higher bandwidth the 970 and 980 have. Also, 980s are sufficiently stronger than 970s that their prowess should show even with worse SLI scaling... i.e. if two 980s are 85% better than one, two 970s are 95% better than one, and the 980 is 20% stronger at base, two 980s will still be stronger than two 970s.
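
    A quick back-of-the-envelope check of that arithmetic; the 20%, 85% and 95% figures are just the example numbers above, not measured data:

        # Relative performance sketch; all numbers are the hypothetical ones from the post above.
        single_970 = 1.00            # baseline
        single_980 = 1.20            # assume a 980 is 20% faster than a 970
        sli_970 = single_970 * 1.95  # 95% scaling -> 1.95x a single 970
        sli_980 = single_980 * 1.85  # 85% scaling -> 2.22x a single 970
        print(sli_970, sli_980)      # 1.95 vs 2.22 -> two 980s still come out ahead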

    I think AMD just has better architecture for 4K right now. The R9 290(X) cards far outstrip the previous gen's cards, no?
     
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Even though Hawaii has lower intra-GPU bandwidth between its core and memory than GK110, it has much greater inter-GPU bandwidth due to XDMA, which gets rid of the CrossFire bridge and does GPU-to-GPU communication strictly over the PCIe bus. PCIe 3.0 x16 has 16GB/s (32GB/s bi-directional), while the CrossFire bridge has .9GB/s and the SLI connector has 1GB/s. XDMA is probably why Hawaii CrossFire scales better than GK110/GM204 SLI at 4K. And the wider-but-slower 512-bit bus, despite slightly lower bandwidth than GK110, is probably why Hawaii punches above its weight at 2.5K and 4K in single GPU compared to GK110 and GM204.
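
    For a rough sense of scale, here's the link-bandwidth comparison from those figures (a minimal sketch; the numbers are the ones quoted above):

        # Inter-GPU link bandwidth, GB/s (per direction), from the figures above.
        pcie_3_x16   = 16.0   # PCIe 3.0 x16 (~32 GB/s bi-directional)
        xfire_bridge = 0.9    # legacy CrossFire bridge
        sli_link     = 1.0    # SLI connector
        print(pcie_3_x16 / sli_link)  # XDMA over PCIe offers roughly 16x the one-way bandwidth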
     
    heibk201 likes this.
  3. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Not sure about CoD games because I haven't seen any 4K Crossfire/SLI benchmarks of those, but it doesn't have to be a Watch Dogs or Shadow of Mordor to use a lot of memory; any modern game with proper texture scaling and higher AA options will use a lot of memory at 4K.
    I don't know how AMD might have a better architecture for 4K; maybe Hawaii in the 290X has a better (more efficient) memory controller compared to older cards.
     
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It depends entirely on the game, you're correct... but resolution increases don't do very much for vRAM. Usually it's not even a doubling from 1080p; it's more like a +600MB increase AT MOST. If you then try to multisample with MSAA or CSAA at 4K afterward, then sure, that's fine. Newer games like Watch Dogs and such I mentioned because they use 3GB+ at 1080p, and at 4K they would far outstrip the vRAM limits of a 3GB card.
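
    A minimal sketch of why the raw render-target size grows so little with resolution (the bytes-per-pixel figure is an assumption for illustration; real engines keep many more buffers, and textures dominate vRAM):

        # Rough framebuffer-size estimate: pixels * assumed 8 bytes/pixel (32-bit colour + 32-bit depth).
        def buffer_mb(width, height, bytes_per_pixel=8):
            return width * height * bytes_per_pixel / (1024 ** 2)

        print(buffer_mb(1920, 1080))  # ~16 MB at 1080p
        print(buffer_mb(3840, 2160))  # ~63 MB at 4K - tens of MB, not GB, before MSAA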

    Next... I also just realized something. Games at 1080p that use a lot of vRAM should technically have a similar effect to 4K games if you're going by memory bandwidth requirements... Evolve uses 4.5GB of vRAM if you have 8GB. On a 4GB card, you should have it JUST at the vRAM limit, which would let memory bandwidth speak for itself. If memory bandwidth is the limit, then a game which does a similar thing should be fine for showing the differences. The question is finding a game that JUST crosses 4GB and isn't limited to 60fps.

    Also, if only Crossfire benefitted at higher resolutions that'd be fine, but don't single-GPU 290X cards do better at high resolutions as well? Or is it only multi-GPU where they perform better? XDMA shouldn't apply in single GPU, and I always thought they did better even in single GPU. Do correct me if I'm wrong on this though.
     
  5. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yes it does. I already answered this question in my previous post:

    Looking at AnandTech's 980 review, you can see that single 290X beats 780 Ti at 4K most of the time, gets close to 980 and even beats it in Crysis Warhead and in BF4 due to Mantle.

    Nope, Linus said over 4GB VRAM in Shadow of Mordor at around the 7:30 mark in the video. He was using 3 of Sapphire's 8GB 290X cards in CrossFire.
     
    Last edited: Nov 30, 2014
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't get why a wider bus with lower total memory bandwidth causes it to win out. This is what I keep asking people to explain to me whenever they say the 512-bit memory bus is the reason the R9 290(X) cards do better at higher resolutions. Why does a wider bus that STILL effectively transfers data more slowly, going by the final bandwidth calculation, make it work better? At the end of the day, more data should be flowing through the 384-bit bus at higher memory clocks per second.
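
    For reference, the usual peak-bandwidth arithmetic with the two cards' stock desktop memory clocks:

        # Peak memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate in GT/s.
        def bandwidth_gbs(bus_bits, effective_gtps):
            return bus_bits / 8 * effective_gtps

        print(bandwidth_gbs(512, 5.0))  # R9 290X:    320 GB/s (512-bit, 5 Gbps GDDR5)
        print(bandwidth_gbs(384, 7.0))  # GTX 780 Ti: 336 GB/s (384-bit, 7 Gbps) - the narrower bus wins on paper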
     
  8. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Yes, I mean something like that. Considering that at 1080p it still needs 4.5GB VRAM (from D2's experiment), I guess it doesn't increase that much with resolution. So going from 1920x1080 to 10k x 1440 doesn't increase the RAM requirement from, say, 4.5GB to more than, I don't know, 8GB at most :D (doesn't he say the max amount at some point? or does he just say "it used more than 4GB"?)

     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Who knows what he meant. All he said was that SoM used over 4GB and it was the only game they tested which did this and which required a >4GB card. "Over 4GB" sounds like somewhere between 4GB and 5GB to me. If it was more than 5GB or 6GB, wouldn't he have said "over 5GB" or "over 6GB"?
     
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh no... I'm fairly certain a single 1080p screen to 3 x 21:9 1440p screens is a really big increase. That'd be 14,515,200 pixels overall, versus 4K's 8,294,400 pixels. It won't be a simple 600MB extra, but I'd guess well over 1GB extra (not counting high multisample AA). And in newer games which are pretty unoptimized, I can see that 8GB buffer being well used with the setup he has. At least more than 8GB on 980Ms will get used at 1080p, haha. Resolution makes a small difference, but such a huge resolution will eventually add up, especially with heavy multisample AA formats such as MSAA, CSAA, SMAA M2GPU & 4x (not 1x, T1x, 2x or T2x), TXAA, etc.
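
    The pixel math, as a sketch; the panel width here is inferred from the total quoted above rather than confirmed:

        # Total pixels for a few setups.
        p_1080   = 1920 * 1080       #  2,073,600
        p_4k     = 3840 * 2160       #  8,294,400
        p_triple = 3 * 3360 * 1440   # 14,515,200 - about 7x 1080p and 1.75x 4K
        print(p_triple / p_1080, p_triple / p_4k)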
     
  11. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Responding to the bit in your post in bold: I don't think a game is necessarily memory bandwidth hungry just because it's filling up lots of VRAM. For example, Titanfall can use all 3GB of my VRAM, but Memory Controller Load in GPU-Z rarely shows more than about 35% load (at 2x MSAA), so I think there are other factors that dictate how much memory bandwidth is consumed rather than just the amount of VRAM used. Conversely, I've seen many games really hammer Memory Controller Load (in GPU-Z) while only using small amounts of VRAM - I fired up Crysis 1 yesterday and at 4x MSAA & Very High settings it was using about 70% Memory Controller Load pretty consistently while only using about 900MB of VRAM. This could go some way to answering your question: perhaps the games where you thought memory bandwidth was important (because you correlated the amount of VRAM consumed with the memory bandwidth required) are actually not memory bandwidth hungry, and perhaps their performance depends more on the core and other aspects.
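
    A minimal illustration of that distinction (all numbers made up): bandwidth demand tracks how much data is actually read or written per frame times the frame rate, not how much vRAM merely sits allocated:

        # Illustrative only - bandwidth demand ~= GB touched per frame * fps.
        def bandwidth_demand_gbs(gb_touched_per_frame, fps):
            return gb_touched_per_frame * fps

        print(bandwidth_demand_gbs(0.8, 60))  # 48 GB/s  - light per-frame traffic, even with a full vRAM buffer
        print(bandwidth_demand_gbs(2.5, 90))  # 225 GB/s - heavy per-frame traffic from a much smaller working set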
     
    Last edited: Dec 1, 2014
    triturbo and D2 Ultima like this.
  12. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    That's what I thought, but I'm not good at explanations. Basically, it comes down to the way a game is coded. When Crysis was released, only top-of-the-line GPUs had 1GB of vRAM, let alone 2GB. In order to deal with this limitation, massive data transfer ensues. Yet again I saw a post about Maxwell's efficiency, this time the bandwidth one. Let's not forget that Tonga has a 40% edge over the previous gen, so it owns Maxwell :D
     
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is true; I never looked directly at the controller load. I used to look at it back when I used Playclaw 2, but Playclaw 3, 4 and 5 left that part out. I might run GPU-Z instead and do some testing.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    The only way to really determine if 4GB is a hard limit is doing a direct comparison to 8GB and seeing where the 4GB hits the wall performance wise.
     
    Link4 likes this.
  15. fatboyslimerr

    fatboyslimerr Alienware M15x Fanatic

    Reputations:
    241
    Messages:
    1,319
    Likes Received:
    123
    Trophy Points:
    81
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    LOL sounds like AMD is continuing the legacy of the old ATi Omega unofficial modded drivers. R.I.P. 2004-2006. I remember using them on my Radeon 9700 Pro waayyy back in the day. Sure, they improved performance, but at the cost of noticeably reducing image quality, for example anisotropic/trilinear texture filtering optimizations. You could do the same thing by tweaking hidden driver settings in the registry with something like ATi Tray Tools. Omega drivers also artificially boosted synthetic benchmark scores back when this was a somewhat common practice even among official driver releases from ATi and Nvidia.
     
  17. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    To me, it looks like they're trying to do this downscaling thing NVIDIA did with Maxwell.. Let's see how it goes before judging it...
     
  18. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
  19. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Well, the new Alienware 15 will have the R9-M295X and GTX 980M as options, so let's see how it goes :D

    Of course it means nothing if Clevo or MSi don't come up with an MXM module, since the AW15 has soldered parts.
     
  20. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    I checked their pricing and the R9-M295X is priced between the 970M and 980M so that's probably how it performs.
    Alienware is such a ripoff though, making it only $150 cheaper than the 980M. I'm quite sure AMD priced it in line with the 970M, yet Dell is making a ton of extra profit on it.
     
  21. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Alienware has always had strange markups, but this time around - charging a premium for a non-premium machine... they went too far! Before someone says "but ohh teh feels", go buy a MacBook, it has the same premium feel and more or less the same BGA crap. I can only hope that Clevo or MSi can get hold of a few boxes with those chips inside and make MXM modules.
     
    TBoneSan likes this.
  22. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    First we need benchmarks to see how a non-throttled M295X performs...
     
  23. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Precisely why I'm looking eagerly at AW15 comparisons.
     
  24. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    I'm curious if the M295X for AW is the regular chip from the Icrap or an M295X 2.0 built on GF's 28nm FD-SOI process...
     
  25. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101


    the long-awaited 980M vs M295X o_O
     
    triturbo and Cloudfire like this.
  26. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Nice. I'm still not sure we will see this card in anything except Alienware and Apple. Wonder why AW decided to use it while Clevo doesn't. :vbconfused:

    +38% in Crysis 3 and +44% in Tomb Raider.
    And the iMac was tested with a better desktop CPU.
     
    Last edited: Feb 6, 2015
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    R9 M295X in the iMac 5K throttles like a mofo. Worse results than a "slower" 285.

    Anybody got non-throttled M295X results from an Alienware?
     
  28. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Ehh... sort of. It's not official or backed by Alienware, but it's something.
     
    triturbo likes this.
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Any performance data?
     
  30. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Not yet. Maybe next week or the week after that. It sucks since not many of those who have an AW15 have the M295X, and none of those, from what I have seen, is a member of AWA or NBR.
     
  31. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    This sounds like another throttling card though. With the new AW 15's cooling there's no way they could cool a 125W GPU to that extent. Clevo needs to get their hands on this so that we can get some proper data.
     
  32. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Let's wait for some more results from the person instead of jumping to conclusions. All this person did was run Furmark.
    I doubt Clevo will be getting an M295X GPU soon.
    Also, I'm confused by what you mean about the AW15's cooling. You probably mean something else with the AW15, yes?
     
    Last edited: Feb 6, 2015
  33. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Well, having the owner's lounge put the R9-M295X down there, while it costs more, doesn't help much in an already nVIDIA-biased (note biased, not nVIDIA-only) community. Of course he gives the notebookcheck result, but as we all know the only test that has been run there is the one we can see a few posts above, on a massively throttling iMac. So tons of kudos to this brave soul! I'm impatiently waiting for some more tests :) Still need Clevo to give it a proper treatment, but that's more unlikely with each passing day.
     
  34. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I would be very interested to see the GPU-z from the AW15 with M295x
     
    Robbo99999 likes this.
  35. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I wouldn't bet on it! Only cretins run Furmark on any GPU, especially mobile GPUs. It's the single most pointless stress test/benchmark program.

    Anyway, hopefully he will get some sense, run 3DMark 11 and Fire Strike, and give us a GPU-Z too. With 2048 SPs it certainly should have quite a punch! Especially if there is room to push the clocks to at least 1GHz.
     
  36. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Based on the temps reported I'm starting to wonder if it has 1792 shaders and not 2048. And even then it's a stretch.

    The R9 285 and M295X are both Tonga; the R9 285 has a TDP of 190W with 1792 shaders.
    Compare that with the GTX 680 (GK104) with 1536 cores and a 190W TDP, and the GTX 880M (GK104) with the same 1536 cores and a 125W TDP.

    How on earth can AMD make an M295X with more shaders than the R9 285 while getting lower temps than the 880M (less than 125W)? We are talking vastly better efficiency than Kepler (680 and 880M) if this is true.

    It makes very little sense in my head at least
     
    Last edited: Feb 7, 2015
  37. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    Cloudfire, I personally think that the AW chips might be produced on GF's FD-SOI 28nm process, which was rumoured to have 30% better efficiency than standard 28nm...

    The 285 was made on the standard GF process, I suppose.

    Damn, we really have to find an AW-15 review...
     
    Cloudfire likes this.
  38. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Hmmm, that's a possibility I guess?! Pretty darn impressive if GlobalFoundries' 28nm is that much more efficient than TSMC's 28nm HP process.

    I just can't see it happening if they use the same stuff they do on the R9 285.
    Would be extremely interesting to see a review of the AW15 with the M295X
     
    Legion343 likes this.
  39. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    Robbo99999 likes this.
  40. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    I sort of said this already a while ago: Tonga is very voltage and clock rate sensitive, and lower clocks (hence lower voltage) would save a lot more power than the few extra shaders the M295X gains over the 285 (only a tiny fraction of the entire die) would add.
    But if the M295X in the AW15 is using a different process than the desktop 285, that would also make sense.
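
    The first-order reasoning behind that, as a rough sketch: dynamic power scales roughly linearly with active units and clock but with the square of voltage, so dropping clock and voltage buys back far more than a few extra shaders cost. The voltage and clock figures below are illustrative assumptions, not the chips' real operating points:

        # First-order CMOS dynamic power model: P ~ units * V^2 * f (constants omitted).
        def rel_power(units, voltage, clock_mhz):
            return units * voltage**2 * clock_mhz

        p_285   = rel_power(1792, 1.20, 918)  # desktop R9 285-like point (illustrative voltage and clock)
        p_m295x = rel_power(2048, 1.00, 800)  # more shaders, but lower clock and voltage
        print(p_m295x / p_285)                # ~0.69 - roughly 30% less dynamic power despite more shaders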
     
  41. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Someone has the AW 15 + m295x.
    I'll be honest, these temps are slightly better than what I predicted (around 92 C), but define this as you will. Not sure why he thinks it is too hot. Most thin and light gaming laptops reach 90 C typically, right?
     
    Cloudfire likes this.
  42. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    And we (I at least) don't know the fan profiles. The AW18 was notorious for kicking in really late and even then for holding off from switching to full blast (keeping it in the high 80s). I won't be surprised if that's the case here. It seems that lately there are tons of very sensitive people - ones that can't carry more than a kilo or two and would get an ear bleed from anything louder than a whisper. Oh well, the industry is going to hell.
     
    Robbo99999 likes this.
  43. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Well, if you mean thin and lights like the GS60 Ghost Pro, then yes, those things do get quite hot. I recommended that my brother get one and I am impressed at how thin and light it is; it really is different seeing it in person rather than just in pictures. While everything else about it is great, running demanding games like Tomb Raider does make the laptop hot, so he uses the long, thin box as a sort of mobile desk; on the bright side it does leave enough space for an MSI gaming mousepad + mouse.
    As for the AW 15, it's even heavier (and maybe even thicker) than my N56DP, so at 2.7kg/6lb I wouldn't even consider it a thin and light. But then again the 970M and M295X are two different beasts, and we not only need to see temperature comparisons between the 980M, 970M and M295X in the same chassis, but performance comparisons as well, and maybe even against older GPUs in systems with similar cooling.

    Also, new rumors about AMD's Trinidad GPU core: it is supposed to replace Pitcairn and launch in March as the R7 360/X, apparently on the same architecture as the 390X (minus HBM). That would make a great mobile GPU with a lower TDP while performing between the M290X and M295X, possibly launching as the M380X to replace the extremely rare M280X (which I have only seen in one laptop so far).
    http://videocardz.com/54858/amd-rad...tion-395x2-bermuda-390x-fiji-and-380x-grenada
     
  44. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
  45. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Still slower than M295X/970M, to say nothing of 980M.

    Hopefully AMD will become competitive in high-end notebook GPUs again. I don't want to see them completely abandoning this market and giving Nvidia free rein to milk cut-down mid-range desktop GPUs over several generations at exorbitant prices.
     
    Last edited: Feb 8, 2015
  46. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    I think the Strato XT's goal is to beat the 965M (nice for FHD, but only in last-gen games... new ones (console ports) kill its pathetic 128-bit bandwidth: AC Unity 30% slower than the M290X, Far Cry 4 30% slower, Civilization BE 30%, and Shadow of Mordor 30% worse too).
    Don't forget that this benchmark was from the M295X in the underperforming Icrap
     
    Last edited: Feb 8, 2015
  47. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Around 90°C is more what I thought this GPU would run at, yes.

    We have a bit of conflicting information here though, so we need to see more.
    Have him post GPU-Z and 3DM11 :)
     
    Last edited: Feb 8, 2015
  48. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    For that information, you might want to head over to the Alienware 15 Owner's Lounge. I did make a comparison with a Gigabyte 15" laptop with a GTX 880M, as I think that's the closest outside comparison I could make (I linked that laptop in the AWA thread).
    I think paupsan beat you to that request, but I'll relay the request again if I remember wrong. I'm hoping he does fulfill it.
     
    Cloudfire likes this.
  49. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Oh, I see, it's from the other Alienware forum
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Ai ai ai, the guy wrote some more: 91C for the M295X during Watch Dogs. Hot stuff. This definitely is the same Tonga-based GPU.

    Too bad the guy is too inept to even post a freaking GPU-Z screenshot. Grrr.

    Tell him to upload the screenshot to imgur ffs. Or, if he can't do that, describe how many cores etc. there are and what he sees in the program.
     