The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Radeon R9-M295X

    Discussion in 'Gaming (Software and Graphics Cards)' started by Tsubasa, Mar 15, 2014.

  1. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Moral of the story: 3DMark is junk. What else is new? :p
     
  3. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    You're being glib! ;-) The 3DMark GPU score is a very good indication of general GPU performance, and in my experience it's been a very useful tool for quickly working out max stable overclocks and the effects of core & memory overclocks on performance, with the caveat that memory bandwidth isn't particularly challenged on 3DMark runs (it's challenged more in Fire Strike, though!). It's also handy for comparing GPU performance from one model to the next. 3DMark is awesome! :p
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  5. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    But the 4820K vs 950 result I posted was run with the same Fire Strike version, SystemInfo, driver, and even OS, so that should be a fair comparison, no?
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Sure, and we see the 4820K system score substantially higher as expected given the overclocks.
     
  7. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    NVidia driver versions account for some of that - I've seen some small but perceptible increases over the year. But that doesn't account for the 'huge' difference you see there - looks screwy to me, and I've not seen any increases of that size. So for me it's been a very reliable benchmark in all the ways I described - in fact you seem to be an outlier; I've never seen such wild variations in 3DMark (when comparing on notebookcheck.net, etc.) - bizarre! (I run 3DMark with each driver version.)
     
  8. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I'm very curious to see whether the M295X launches and what role it will fill, after watching the GTX 970M and GTX 980M benches.
     
  9. marcos669

    marcos669 Notebook Evangelist

    Reputations:
    359
    Messages:
    300
    Likes Received:
    71
    Trophy Points:
    41
    AMD is going to have to cut the M295X's price down a LOT
     
  10. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    What's this? The Maxwell topic was overspammed and you transferred some of it here :D I want to be optimistic, but 970M level won't be so bad either. Still hoping for the best :hi2:
     
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    OK here's what's up. I was browsing this forum about a year ago when I noticed in someone's sig that his 765M was outscoring my 650M SLI in 3DMark 11. I thought "no way that's right" since I knew for a fact that my GPU is considerably faster. So I did a little investigating. Turns out, his score was obtained on a newer version of the program. I hadn't run 3DMark 11 in a number of months at that point, so I fired it up, was prompted to update so I did, ran the benchmark and BAM instant 700+ point increase.

    So out of that 1000+ point increase, 700+ alone was from the update to 3DMark 11 and/or SystemInfo (can't recall which), and the rest was from +30 MHz on core (~100 points) and driver optimizations between 320.18 and 337.88.

    There's nothing screwy at all about my results. I've run 3DMark 11 many times both before and after the score got inflated and it was consistent. Comparing my post-update 3DMark 11 score at stock with the 650M SLI score at NotebookCheck reveals the same 700 point delta. And I'm not the only one to have reported this.

    If anything is screwy, it's the application itself, which is why I said not to trust 3DMark scores since they don't account for the fact that the scoring scale changes with app updates. They still keep the outdated scores in their database and mark them as "valid," which is extremely misleading. Not to mention it isn't a very good test of overclock stability in general (yes, even Fire Strike 1080p) compared to something like Unigine Heaven maxed out.
     
  12. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yes, I'm not saying there's been anything wrong with your system, just that it was unusual (maybe you're not an outlier like I said, though!). Perhaps SLI systems have been sensitive to 3DMark updates. For my system over the last year I have kept drivers & 3DMark11 constantly up to date, and it hasn't changed since I started measuring my current overclock on 331.65 in October last year (so basically a whole year) - 6600 to 6700 GPU score.

    I've found 3DMark11 very good for initial overclock stability testing - specifically the first graphics test - it will fail in that one for me if unstable. For permanent gaming stability I just have to lower overclock by one or two notches below my max stable 3DMark11 run. (1124Mhz gaming stable @1.05V - 1124Mhz 3DMark stable @1.037V). So a quick way of determining stable overclocks for me.
     
  13. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Now it makes sense. The score-inflating update was before you began measuring.
     
    Robbo99999 likes this.
  14. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    It will probably outperform the 970M, since we had some early leaks where it performed close to a 7970 (newer drivers should also make it better). And besides, these are just synthetic scores that bloat Maxwell performance, and Fire Strike doesn't seem to be as solid as 3DMark11. Just wait for real tests; there isn't much time left anyway.
    But that 970M, though - if that thing is based on GM206 (12 SMM, 192-bit bus) and is more efficient than GM204 (which it should be, since it's smaller), it will make all the thin-and-light laptop gamers happy.

    Edit: Also on the topic of Firestrike being weird and the score database just BAD, I found something really weird or that was the work of someone on drivers and bioses which is nothing short of magic, anyways I will try to find it again.

    Well here it is, http://www.3dmark.com/fs/1153329 and even the VRAM amount is wrong.
    Don't know how in the world they got that score.
     
  15. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Hardly - if it outperforms the 970M at the same price it should be fine (and if earlier leaks are accurate, it should). The 980M will be overpriced as always, so they don't have to worry about that.
     
  16. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Since AMD's GCN is not as power friendly, they won't compete with the 980M, but they will still sell a solid performer that most likely rivals the 970M or is around that level of performance.
     
  17. marcos669

    marcos669 Notebook Evangelist

    Reputations:
    359
    Messages:
    300
    Likes Received:
    71
    Trophy Points:
    41
    After the desktop 285, you still think that is possible? I think we will be lucky if it manages to get on par with an 880M.
     
    Cloudfire likes this.
  18. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Different fabs, so yeah, I'm expecting at least 970m level.
     
  19. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    And AMD will probably WAY outclass Nvidia in terms of OpenCL.
    I would be willing to sacrifice a bit of efficiency and gaming performance for that... because AMD will still provide a solid gaming experience at high settings and high fps while also being usable for GPGPU
     
    triturbo likes this.
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I hate to be a downer, but after the R9 285 was announced, what I expect from the M295X is GTX 880M performance and same heat if not more.

    R9 M295X - GTX 880M: 100%, 125W
    GTX 970M: 125%, 75W
    GTX 980M: 160%, 100W
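    A quick performance-per-watt check on the figures above (a rough sketch in Python; the percentages and TDPs are the post's estimates, not measured numbers):

```python
# Performance-per-watt implied by the estimates in the post above.
# (relative performance %, estimated TDP in watts) -- the post's figures.
cards = {
    "R9 M295X / GTX 880M": (100, 125),
    "GTX 970M": (125, 75),
    "GTX 980M": (160, 100),
}

for name, (perf, tdp) in cards.items():
    print(f"{name}: {perf / tdp:.2f} perf/W")
# 880M-class: 0.80, 970M: 1.67, 980M: 1.60 -- on these figures the
# 970M roughly doubles the 880M-class part's efficiency.
```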
     
  21. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    It will never be the junk that the 880M is; they wouldn't bother releasing it otherwise. And besides, Tonga isn't bad - even the unoptimized 285 draws an average of 175W.
     
  22. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Time will tell, and while it would be rather premature to claim anything about a mobile Tonga chip - considering it hasn't even been released yet and we have no benchmarks for it either - I am rather hopeful that it wouldn't just match the 880M; instead, I would hope it would indeed exceed it. Maybe not be comparable to the 980M, but hey, 25% slower than the 980M would be acceptable.
    Plus, if history is any indication, initial drivers will have to be refined to accommodate the new architecture.
     
  23. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    A great review of the R9 285 by AnandTech, with many games benchmarked at 1080p and 1440p, but the most interesting tests are the Synthetics and Compute tests, which show just how different Tonga is from Tahiti. At stock clocks it uses as much power as a GTX 760 in Crysis 3, 13W less than the Sapphire 280 and only 25W more than the 270X. You can also see just how much power consumption increases from the factory overclock vs stock (31W on the same card for only a 5% increase in clocks, which is crazy), indicating that Tonga may be much more efficient at lower clocks. It's very interesting and informative, so I recommend reading the entire review.

    AnandTech Portal | AMD Radeon R9 285 Review: Feat. Sapphire R9 285 Dual-X OC
     
  24. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Definitely a snore-fest compared to Maxwell though! ;-)
     
    octiceps likes this.
  25. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah, I'm starting to think that this GPU isn't happening...

    Change of plans due to the GTX 970M and GTX 980M, most likely. It would make zero sense, and they would become a laughing stock, if they released an M295X matching a GTX 880M over half a year later than Nvidia, right as Nvidia releases the new 900M cards

    Question is, do they even have something else ready?
     
    Mr Najsman likes this.
  26. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    Beating the 880M won't be hard at all; if they can get the clocks to a decent level it would even beat the 970M. And if they release it, it will most likely be at the same time the 285X launches later this month.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Technically, a working 880M is quite a good card. That being said, a card that's 10-15% stronger than a WORKING 880M is a problem, as the 970M and 980M seem to rip them to shreds. Remember: AMD's R9 290 and 290X (and now their 285) are their "new" tech that still trades blows with Kepler's top cards. It's cheaper, sure, but AMD needs something seriously stronger to compete with Maxwell. If they don't have it and they're releasing the R9 285/290/290X as their new, stronger stuff... then they've already lost.

    I'm going to assume that the 285, 290 and 290X cards are simply kind of a side project to keep things running against Kepler's current stuff. But it makes no sense for a 285 to come out now at its current spec. It's not impressive at all. But it's actually "new". AMD's got the spotlight on them.
     
    Cloudfire and Mr Najsman like this.
  28. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    AMD needs a new architecture to compete with GTX 970M and GTX 980M.
    There is no chance in hell they can beat even a GTX 970M or come close to it with Tonga. The information is out there. It's the exact same architecture as the M290X, and they would need to sacrifice a ton of performance to even get a working 28CU mobile Tonga based on the R9 285 out on the market.

    I had hopes because I thought Tonga would be their first "Maxwell" but nope.
     
    D2 Ultima likes this.
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Oh yeah, it's pretty obvious what AMD is doing to make their GPUs faster: simply make them bigger, hotter, and more power-hungry instead of relying on architectural improvements for increased efficiency and performance. Basically the antithesis of Maxwell.

    Just look at 390X--only a single-GPU card and it already needs a hybrid air/liquid cooler a la 295X2. The Fiji die will be a monster in terms of size, no doubt as big as anything Nvidia has ever put out. Is AMD's small/efficient die strategy dead? Would be a shame because it won them a lot of success back in the day (think Radeon 5000 Series).
     
    Cloudfire and D2 Ultima like this.
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    AMD is hot, loud and power-chugging. And I forgot something else that I remembered a short while ago which caused me great hilarity.

    Anybody remember that AMD's mobile GPUs have had a VERY high failure rate after 1-1.5 years? The same chips that (though cut down) are in the new consoles? I want to see how many recalls start showing up in the next 8 months :laugh:

    Seriously though, AMD needs to stop reiterating their stuff without any meaningful advancements. Their Bulldozer/Piledriver architectures came out during Sandy Bridge. Intel is about ready to bring out Broadwell (in fact it's been DELAYED)... and still no high-end advancements from AMD. And their APUs are so thermally/power limited that their usefulness ends at midrange. Their GPU architecture came out in 2011, and it's gonna be 2015 (MAYBE) before we see a true advancement in AMD chips. nVidia brought out Kepler second and is bringing out Maxwell first (after delays). I want to know what they're actually trying to do over there, because it does not seem to be advancing tech. It feels like they're trying to milk their current tech, slightly revise it, and re-sell it.
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    390X looks like a beastly card though, both the GPU itself and the cooler.

    Any wagers on how many PCIe power connectors it'll need? Two 8 pin? :p And another 6 pin?! :eek:
     
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't really count it as being beastly if they just shoved 1.5x of a R9 290X onto one chip and it has equally large power drain and heat output. There's kind of a reason people don't already do that isn't there? XD

    And nah, you'll need a second PSU. They'll be putting forth "new generation" case designs that are meant to hold two PSUs, one for the GPU and one for the rest of the system :laugh:
     
  33. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    You do know Maxwell is a revision of Kepler, with a rearrangement of assets delivering more efficient performance? Isn't nvidia releasing two new GPUs which give an almost negligible performance boost but bring a new level of power efficiency to their desktop portfolio? AMD did just that to their current inefficient products. Until both AMD and nVidia release products that beat their current offerings by a considerable performance margin, we can't bash one for doing what the other just did.

    Both are trying to bring better performance. AMD's CPU division is another beast altogether, and Intel is a much fiercer competitor, so they are pretty much pinned down to fighting midrange. As long as they can still fight, there will be improvement, but for now we have to wait. Delays are not bad if what they result in is a better product.

    As for mobile GPUs... the only reason we are currently praising nvidia is that, while neither had anything to offer mobile for a while now, nVidia's 900M series brings performance improvements. Something that is hardly relevant in the desktop world until more products emerge. AMD's new GPU does not need to beat the 980M - hell, not even the 970M - as long as it brings better performance/price. If their new GPU can barely beat the 880M by 10% but costs half of a 970M? That is serious bang for buck, even if the 970M/980M are still quite a bit faster. As long as we get competition in any relevant segment of the market, we are set.

    I think it is best to remain put until they release something :)
     
    fatboyslimerr likes this.
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Oh, we've already had dual-PSU case designs for many years now. Now we can finally put them to good use with a few of these bad boys in tri- or quad-Fire. The wall socket would run out of juice and you'd have to plug it into different circuits. :D
     
    D2 Ultima likes this.
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    And Kepler is a distillation of Fermi with castrated GPU compute, no? ;)
     
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Traditionally, AMD's flagship has been $100 more expensive than nVidia's 2nd-best mobile card. So if they made their new flagship half the price of a 970M, they'd be pricing it somewhere in between a 950M and a 960M, which would be bloody amazing. As far as traditional pricing goes (that I remember since the Fermi cards), AMD's flagship has been $300 less than nVidia's flagship, and nVidia's 2nd best has been about $400 less than its flagship or so. So they MUST beat a 970M (and handily, I might add) if they want to be anywhere NEAR competitive with nVidia in the slightest.
     
  37. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    But there is no such thing as "traditionally", because AMD does not have enough of a history in the mobile GPU arena to draw such a conclusion. At best, they release a handful of powerful GPUs every now and then at great prices. Seldom do they pull an HD7000 series. The HD7970M came out first, knocking out the originally rumored 680M, so nvidia delayed to be king. In other gens, like the HD5000 series, AMD released the GPU that kicked the then-current GTX 200M series' butt, forcing nvidia to try to pull off one of the worst high-end mobile GPUs. But the HD4000 series? Man, those were almost ghosts.

    In fact, what we can actually see is that AMD releases a minuscule number of GPUs compared to nvidia. And they always price them at competitive value. You will be hard pressed to find an AMD GPU that is more expensive for the performance it nets. Since AMD does not have anything to compete with, they can't release something high in price. Their best bet is a GPU that will most likely not match the 970M, so they will have to price it lower than that to begin with.

    You are thinking with a performance mindset, not about the whole market. The high-end GPUs are not the bulk of sales or of the market itself. They don't need to compete in the high end to sell. They just need to offer great bang for buck. In the desktop arena, for several gens, AMD didn't offer the most powerful GPU. They offered alternative, affordable performance (that's what you get for getting your butt kicked with your HD2000 series, AMD!). Their HD4000, 5000 and 6000 series were introduced with the mindset of bringing affordable performance while nvidia was king of the hill. Hell, the performance difference was even greater back then, since you needed dual HD4000 GPUs to compete with nvidia's GTX 280 when they launched.

    When the HD7000 series launched, I never even expected them to be competitive in the high end. We all expected the same story. GK110 was supposed to be the king of the hill while Tahiti was the affordable solution. But well, it all played out differently that gen.

    I just hope they do come out with a new GPU. They can move their 7970M back to midrange and offer 880M-level performance as their cheap high-end GPU. The 980M will be king.
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    When I said "traditionally", I was considering the M4000, M5000, M6000 and M7000 series (plus the M8970 & R9 M290X pointless rebrands) from AMD versus the GTX 200M, GTX 400M, GTX 500M, GTX 600M, GTX 700M and GTX 800M series from nVidia. It's just my memory of how the prices worked, in Clevo machines (which are usually among the cheapest, if not the cheapest) for reference. I always kept looking because I always kept meaning to get a machine between my D900F and my P370SM3, but it just didn't happen.

    So if AMD is again going to release a card costing more than a 970M yet weaker than a 970M... they've sealed their fate. They already have a very bad reputation in the high-end mobile market due to their cards failing after 1 year since the M6000 series (which carried on into the M7000 series).

    As for the desktop market, AMD has had slightly weaker cards, sure, but they were viable. I could buy two AMD $300 cards and beat out the single $500 nVidia flagship. Even now, an R9 290X will perform similarly to a 780 Ti for $150+ less money. That's power and value. Their mobile lineup has been that way too; the 7970M was close enough power-wise to a 680M and it was significantly cheaper. If, however, they release a flagship for the mobile market (and please note, their flagships are power hungry and HOT, two things mobile users hate and that desktop users can mitigate more easily) that can't even compete with nVidia's 2nd-best GPU, it'll either have to be super, super cheap or it'll just fail.

    And even if it IS super cheap and is meant for midrange, they'll need to cut its power consumption and heat output greatly while making sure it keeps up with 960M/860M-type cards in performance, or nobody would buy it for midrange either, as the thinner/lighter laptops that would sooner use midrange GPUs would simply pass AMD by without a thought.

    Edit: I too would like to see AMD bring out a new, good card. They have not, however, shown any signs of doing so at all. Their R9 285, 290 and 290X cards ARE their new architectures. If they created the 290/290X cards as a placeholder to contend with GK110, then that's fine and dandy. But they'd better be bringing out something quite soon, or they're in trouble. It is what it is.
     
    Mr Najsman likes this.
  39. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    For all of you claiming AMD can't beat the 970M: all they have to do is clock an R9 295X with 32 CUs at 800MHz. There were already leaks of early R9 295X GPUs with 32 CUs and 800MHz clocks; maybe they are delaying it to increase clocks further to perform closer to the 980M. At 850MHz the 980M would only be 10% faster. These are only based on Fire Strike scores, since that's all we have for GM204 GPUs; we have no idea how well GM204 will perform in real-world tests, so instead of basing your opinions on inflated synthetic scores, wait for more useful data. Just look at the Kepler and Maxwell 860M, for example. In Fire Strike Graphics score the Maxwell variant beats Kepler by 30%, yet there are many cases where Maxwell loses to Kepler in games at 1080p. Suddenly those Fire Strike scores don't look as good anymore.
     
  40. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This may be all well and good, yes, but please remember that AMD needs that card to:
    1 - Draw little enough power to be used in mobile form factors (AMD's cards are MUCH more power-hungry than nV's, and Maxwell is exceedingly efficient).
    2 - Produce little enough heat to be viable at its selected clocks. Since the 290-type cards definitely run hotter than the older revisions, a 295X would need a serious efficiency jump to do this in mobile form factors.
    3 - Still have a modicum of overclockability. It does not make sense if the AMD card is just a bit more expensive than the 970M and beats it by 10%, but the 970M can OC by 25% like it's a joke while the R9 M295X can only OC by 5% (realistically, considering power draw and heat output as well as stability).

    I'm not saying AMD will fail. I am not saying they cannot beat the 970M. I am saying that the R9 285 chip in a mobile form factor will get pummeled by the 970M and 980M, while being fairly hot and power-hungry. They cannot use that chip to compete in the high-end mobile market; and if they put their flagship to compete at midrange, they would be incapable of going into the smaller form factor notebooks where 960M-type chips (which the R9 M295X would be competing with) can be used, due to power draw and cooling needs.

    In other words, AMD needs to bring it. What they've shown us so far is not "bringing it". I *want* them to "bring it", so nVidia has to "bring it" back, and we get some proper advancements. I stand by my original statement: the GTX 880M 8GB fiasco WOULD NEVER HAVE HAPPENED if AMD had had a card that was "bringing it" to the 780M, so that nVidia needed to actually step it up. They did not, and nVidia willy-nilly rebranded for extra $$ when it really was not necessary. To be honest, we DID get good cards out of the 800M lineup. The 860M Maxwell and 870M Kepler are both AMAZING cards and superb bang-for-buck, do not have AMD's last-two-gens track record of failing after 1 year, and are still well within competitive price ranges. The 880M, though, is an abomination and should never have been created, just like the 670M and 675M chips (while not abominations) should never have been created. And I don't mean the 670MX or 675MX, just to clarify.
     
    Mr Najsman and heibk201 like this.
  41. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Can't fully agree here. The 7970M could run at stock clocks at just 0.975V. If it had used the same low-voltage VRAM that NVidia used on the 680M, it would have been more power efficient. Yes, I have used both. Maxwell no doubt is hugely efficient. Also, not all 7970Ms failed after a year - Crossfire often seemed to be the cause. My own ran fine for over 2 years and then I sold it in perfect working condition.
     
  42. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    If the 28CU R9 285 is drawing 180W, how on earth are they gonna add 4 more CUs (256 more shaders) and still get it down within mobile cards' restrictions?
    The only option I see is using 28 CUs like the R9 285 has and reducing the clock from 918MHz down to, say, 700MHz. Even at that point it should be around the GTX 880M's 125W, I think.
    And it should be 25%-ish slower than the R9 285, so you are looking at around 5500 in Fire Strike. Which happens to be what a GTX 880M scores.

    The GTX 970M is at 7400 in Fire Strike while the GTX 980M is at 9500.

    I was hopeful too earlier, but sometimes you just have to face reality. They need a new architecture to have any hope of keeping up with Maxwell.
    You just can't come near a new architecture like Maxwell with a vastly less efficient one. It can't be done. Especially when you are dealing with the confined space and thermal restrictions we have with notebooks
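    The back-of-envelope estimate above can be sketched out (a rough model assuming, as the post does, that the Fire Strike graphics score scales linearly with core clock at a fixed 28 CUs; the ~7300 desktop score is an assumption inferred from the post's "25% slower ≈ 5500" figure):

```python
# Linear clock-scaling estimate for a hypothetical 28CU mobile Tonga.
R9_285_CLOCK_MHZ = 918        # desktop R9 285 core clock
MOBILE_CLOCK_MHZ = 700        # mobile clock suggested in the post
R9_285_FIRESTRIKE = 7300      # assumed desktop graphics score implied by the post

scale = MOBILE_CLOCK_MHZ / R9_285_CLOCK_MHZ   # ~0.76, i.e. ~24% slower
estimate = R9_285_FIRESTRIKE * scale          # ~5570, right at 880M level
print(f"scale {scale:.2f} -> estimated score {estimate:.0f}")
```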
     
  43. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Quite gloomy! I hope AMD prove you wrong. Nothing concrete whatsoever has been released for mobile Tonga. AMD are clearly up to something and were waiting to see what Maxwell 970M/980M brought to the table. Let's see the response. When the 7970M released, people were surprised. It isn't impossible for it to happen again.

    If you are right, then it isn't a good thing for any of us consumers... Maxwell will be hugely overpriced and then milked to Kepler/GCN 1.0 levels and beyond...
     
    triturbo likes this.
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I hope AMD prove me wrong as well, but I just can't see it happening.
    The fact that there have been zero leaks for several months - when we should be seeing more leaks than ever, since we are in the month it should have released - makes me anxious.
     
  45. londez

    londez Notebook Evangelist

    Reputations:
    114
    Messages:
    602
    Likes Received:
    8
    Trophy Points:
    31
    Awww man, the series/model numbers are becoming even more confusing.
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You are definitely right that not all of them fail, however a LARGE number of them do indeed fail, and if CrossFireX is the cause, then that's a manufacturer defect, and even WORSE than simply having faulty silicon yields.

    Also, you may be right about undervolting it, however please remember: the 680M's low-voltage vRAM resulted in VERY low memory bandwidth at stock. And the 7970M was indeed a 100W card; 0.975V versus 1.0V is not going to drop TDP by a whole lot. For example: adding -80mV to my CPU's voltage offset (0.08V if I'm thinking correctly) only dropped the max draw by something like 6-8W. I mean, temps can drop by ~4 degrees or so using that -80mV, sure, but it's not going to make a huge world of difference. Unless it's VERY different for GPUs and I am talking out of my butt.
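The "0.975V versus 1.0V won't drop TDP by a whole lot" point checks out under the usual first-order model that dynamic power scales with voltage squared. The voltages and the ~100W figure are from the post; the quadratic model is an approximation, not a measurement.

```python
# Quick check: a small undervolt only shaves a few percent off power,
# assuming dynamic power scales roughly with V^2.

base_power_w = 100.0   # 7970M's ~100 W TDP, per the post
v_stock = 1.0
v_undervolt = 0.975

scaled = base_power_w * (v_undervolt / v_stock) ** 2
print(f"Power at 0.975 V: {scaled:.1f} W")  # ~95.1 W, only ~5 W saved
```

A ~5W saving on a 100W card is right in line with the 6-8W the poster saw from a CPU undervolt of similar magnitude.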

    That being said, the thing is that the 7970M was the best card they had brought to the mobile market in years. The 8970M and R9 M290X do not count; they are literally just rebrands of the same card, with nothing more than a 50MHz OC from 7970M to 8970M and nothing but a name change for the M290X (to my knowledge; correct me if I'm wrong about that last bit). And their beast was beaten by the 680M, which I find to be a very crippled card: it had low-voltage memory that couldn't pass 147.2GB/s memory bandwidth even at BEST (not average, but odd-example best), while basically being OC-able well into the 1000MHz range without much in the way of heating or power limiting issues.
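For context on where a figure like 147.2GB/s comes from: GDDR5 bandwidth is the effective memory transfer rate times the bus width in bytes. The 680M's 256-bit bus is its published spec; the 4.6 GT/s effective rate is inferred here from the post's 147.2GB/s figure (a heavy memory overclock), so treat it as an assumption.

```python
# Standard GDDR5 bandwidth formula:
# bandwidth (GB/s) = effective transfer rate (GT/s) * bus width (bits) / 8

bus_width_bits = 256          # GTX 680M memory bus width
effective_rate_gtps = 4.6     # GT/s, inferred from the post's 147.2 GB/s

bandwidth_gbps = effective_rate_gtps * bus_width_bits / 8
print(f"{bandwidth_gbps:.1f} GB/s")  # 147.2
```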

    For AMD's chip to utilize low-voltage memory in a bid to run cooler like that? I think it would do more to harm the card than benefit it. There is also still the problem of them mainly re-vamping their existing architectures without making huge advancements. The R9 290 series was strong, but it was still hot and drank power like mad. Even though synthetic benchmarks are all we have, a 980M with a good vBIOS and a solid overclock (which we know it'll do, since it is apparently cool enough to work in Gigabyte's thin-model laptops) could probably trade blows with an R9 290X, or maybe even beat it sometimes.

    In short, I WANT AMD to bring something good for mobile (and for desktops too). I don't like this largely non-competitive situation that's going on. If their mobile flagship doesn't at least get within 5% of a 980M while being priced near the 970M, AND being overclockable enough to stay ahead of the 970M without fail, then people will just buy a 970M.

    I know I'm sounding really pessimistic, but they've shown us absolutely nothing that's even close to real progress that could go into mobile chips... at least not that I have seen. They need to optimize their architectures. Hot and power-hungry but cheap works for desktops, but not for laptops.
     
  47. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    What reality are you talking about? Half of what you said is based on vapor and random calculations. You are basing your numbers on the cheap, unoptimised, binning-failed Tonga Pro GPUs, which are a bad example even compared to average chips. That is about as unrealistic as it gets. I can't say how much power the M295X will use, but it won't be similar in performance/watt to the joke that was the 880M.

    Tonga has great ROP performance compared to Tahiti, so there is no reason to cut down to 28 CUs; they can just reduce clocks a bit. And they don't have to reduce them by much either, since overclocking results already show much better efficiency at lower clocks: even the unoptimized 285 sheds over double the percentage of watts per 1% drop in clock rate. The desktop 285 only uses as much power as a 760, yet it destroys it in benchmarks. How can a properly binned Tonga even compare to the disaster that was the 880M?

    A score close to 8000 in Fire Strike is still possible, and the fact that Fire Strike inflates Maxwell scores makes it even easier for the M295X to outperform the 970M. The 970M will have a lower TDP, will probably be more efficient by a decent margin, and will get into thinner designs, but that is not my point. Don't forget that the M295X will have half the memory of the 980M, which will use less power. And again, don't forget that Fire Strike makes Maxwell look better than it really is.
     
  48. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Hey guys... here are some seemingly leaked specs for the R9 M295X
    :)

    AMD Radeon R9 M295X with Tonga GPU has 32 Compute Units | VideoCardz.com

    Seems encouraging.
    Dunno how good it will be performance-wise when it actually comes out... but the main reason I'm eyeing a high-end AMD GPU in a new laptop is its compute capabilities, which are vastly superior to Nvidia's, while still offering decent gaming capabilities.

    It's possible AMD was waiting for Nvidia to release their new GPUs, but it wouldn't surprise me if they already had some expectations and delayed their release due to technical difficulties.
    Anyway... what do you make of those?

    EDIT:
    Sorry about that... didn't see the date there... it's dated 3rd June apparently.
    So... old news it would seem.
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    But then this is the whole point. The M295X needs to not only beat the 880M, but far outstrip it and come close to the 980M. And it needs to do so with a LOT of thermal, power and overclocking headroom, or it simply won't be competitive for higher-end users. If an overclocked 970M beats an overclocked M295X, and the M295X runs hotter, draws more power and is more expensive than the 970M, then it is literally a pointless chip unless someone specifically needs an AMD GPU.

    You may be 100% right. Maybe the benchmarks ARE inflating Maxwell. In fact, I don't doubt it in the slightest. It has happened with both games and synthetics in the past, favouring one vendor over the other depending on the game.

    That being said, I am still saying that I have yet to see anything from AMD that can fit in the mobile format and be ENOUGH of an upgrade over a 970M's potential to truly be considered a flagship. IF THEY HAVE SOMETHING UNDER WRAPS, THEN THAT IS AWESOME. But we have not seen it, and if they don't (to my knowledge, AMD has been bringing out their new tech first since the 4000 series? I do not remember beyond that) and their "new tech" is the R9 285 and R9 295X, then so far they've sincerely lost to Maxwell and have no competing mobile lineup.

    If AMD can prove me wrong, then I hope they do. I want them to. I cannot stress enough how much I want some good competition in the mobile market. And with CPUs too; I want some good AMD mobile CPUs to contend in the high-end market. Competition is healthy, and I want some good to come out of it. I am just saying they have to bring something VERY good.
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You just went full fanboy. But I'll keep my cool and reply anyway.

    Binning is only part of the equation. Voltage reduction and VRAM type also count. All of those reduce heat output. I haven't said anything other than that.

    - AMD Pitcairn, aka 7870, aka 1280 shaders, aka 20 CUs. And it has a TDP of 175W.
    The 7970M was made out of that.
    How did they do that? Through binning, voltage reduction and clock reduction.
    They got a fairly cool 7970M, although slightly hotter than the GTX 680M.

    - AMD Tonga, aka R9 285, aka 1792 shaders, aka 28 CUs. And it has a TDP of 190W.
    Now you tell me how on earth AMD are going to cut even MORE TDP, and still add 4 CUs and 256 shaders on top of that?

    How?

    It is not realistic.

    The best they can do is take the R9 285 with its 28 CUs and cut the TDP by the same proportion as they did with the 7970M. But since the R9 285 is 190W this time instead of 175W, you are not looking at a 100W R9 M295X, you are looking at 115W+.
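The proportional-scaling argument above works out like this. The TDP figures are the post's (HD 7870 at 175W, 7970M at ~100W, R9 285 at 190W); the assumption, also the post's, is that the same binning/voltage/clock reductions would cut Tonga by the same fraction, with the 4 extra CUs pushing it higher still.

```python
# Sketch of the Pitcairn-to-Tonga scaling argument from the post.

pitcairn_desktop_w = 175.0   # HD 7870 TDP
pitcairn_mobile_w = 100.0    # 7970M TDP
tonga_desktop_w = 190.0      # R9 285 TDP

mobile_ratio = pitcairn_mobile_w / pitcairn_desktop_w   # ~0.57
tonga_mobile_est = tonga_desktop_w * mobile_ratio

print(f"Scaled Tonga mobile TDP: {tonga_mobile_est:.0f} W")  # ~109 W
# Adding 4 more CUs (256 shaders) on top of that pushes the estimate
# past 110 W, which is how the post arrives at 115W+.
```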
     
    D2 Ultima and heibk201 like this.