The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Radeon R9-M295X

    Discussion in 'Gaming (Software and Graphics Cards)' started by Tsubasa, Mar 15, 2014.

  1. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Poor AMD! :-(
     
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Can't go 3 years without anything new while two new architectures CPU-side AND two new architectures GPU-side have come out since your last anything and expect sympathy though.

    It's gotten so bad I actually blame AMD for Intel & nVidia's complacency right now. Usually one brings out something new and whips the other into shape and vice versa... this is just one-sided and the other isn't even putting up the dukes.
     
  3. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Maybe they'll hit us with something big - apart from the names of course (BULLDOZER!! - PILEDRIVER!!)! Actually, being really rude, they just need to replace the second word after bull and pile to make it more apt, links them both together nicely!
     
    triturbo and D2 Ultima like this.
  4. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
Like balls? Yeah, they have them. Staying in business with half the market share of Intel and nVIDIA and selling at half (or close to) the price... If that's not titanium balls, I don't know what is.

Excuse my French :)
     
    octiceps likes this.
  5. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    lol, wasn't the word I had in mind, but yours does work in some ways!
     
  6. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I hope so too! I periodically check this thread hoping for new news. Nothing!
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    R9 290X and GTX 980M should be fairly close I think.

    Reference GTX 970 is 10% faster than R9 290X in 1080p.
    techPowerUp! - The latest in hardware and gaming

Second, if you compare the specs of the GTX 970 and GTX 980M:
GTX 970: 1664 shaders @1253MHz
GTX 980M: 1536 shaders @1175MHz
The GTX 980M should run about 13% slower if both used the same CPU, which Anandtech didn't.

So if you do 88*0.87 you get 77, which is really close to an R9 290X if you look at the graph posted above.

    There is one recent review where GTX 980M is pitted against Sapphire R9 290 Tri-X. From all the games tested, the Tri-X was only 12% faster than GTX 980M.
    http://nb.zol.com.cn/488/4882818.html

    R9 290 Tri-X is actually faster than R9 290X
    http://www.techpowerup.com/mobile/reviews/Sapphire/R9_290X_Tri-X_OC/24.html

    We are comparing a GTX 980M with a weak 47W mobile CPU against a R9 290X with 85W++ CPUs. With a proper CPU and 980M, I still think it would gain a decent amount of performance so that 980M and 290X would be close.
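For anyone who wants to sanity-check the arithmetic above, here's a quick sketch. All numbers are the post's own estimates, and treating throughput as shaders × clock is a rough first-order assumption, not a benchmark:

```python
# Back-of-envelope GPU scaling check, using only the numbers quoted above.
# Throughput is assumed proportional to shader_count * core_clock (a crude
# first-order model; real performance also depends on memory bandwidth etc.)

def relative_throughput(shaders, clock_mhz):
    return shaders * clock_mhz

gtx_970 = relative_throughput(1664, 1253)   # desktop GTX 970
gtx_980m = relative_throughput(1536, 1175)  # mobile GTX 980M

ratio = gtx_980m / gtx_970
print(f"980M vs 970: {ratio:.2%}")  # ~86.6%, i.e. roughly 13% slower

# The post pegs the reference GTX 970 at 88 on TechPowerUp's index,
# so the 980M lands near 88 * ratio ~ 76 (the post rounds to 77),
# which is right next to the R9 290X on the same chart.
print(f"980M on the same index: {88 * ratio:.0f}")
```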
     
    Robbo99999 likes this.
  8. darkydark

    darkydark Notebook Evangelist

    Reputations:
    143
    Messages:
    671
    Likes Received:
    93
    Trophy Points:
    41
The cooling of the new iMac is just terrible. The Precision M3800 has better cooling for less wattage.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The previous iMac had no problem cooling the GTX 680MX and GTX 780M. Looks like they are using the same fan on this model.
     
  10. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
Eh, I prefer an actual gameplay comparison:
    Gigabyte 970 vs R290 and R290X
    Gigabyte 970 vs reference 970

Also keep in mind the 980M's memory is clocked significantly lower, at 5010 MHz effective instead of 7010 MHz on the desktop 970. It won't have as much of an effect as the core clock does (at least not at 1080p), but it will cause some differences.
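As a rough sketch of what those memory clocks mean in bandwidth terms (the 256-bit bus width is the cards' nominal public spec, assumed here since the post only quotes clocks):

```python
# Rough memory-bandwidth comparison from the effective clocks quoted above.
# A 256-bit bus is assumed for both cards (nominal spec, not from the post).

def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    # effective clock (MHz) * bus width (bytes) -> GB/s
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

desktop_970 = bandwidth_gbs(7010, 256)  # ~224.3 GB/s
mobile_980m = bandwidth_gbs(5010, 256)  # ~160.3 GB/s

print(f"GTX 970:  {desktop_970:.1f} GB/s")
print(f"GTX 980M: {mobile_980m:.1f} GB/s")
print(f"980M has {mobile_980m / desktop_970:.0%} of the 970's bandwidth")
```

So the 980M carries roughly 71% of the desktop 970's memory bandwidth, which is why it matters less at 1080p than it would at higher resolutions.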

    And Cloud, that "R290 Tri-X" is actually an R290X Tri-X. A reference 290 is about 5% behind a reference 290X, and from everything I've seen the R290 Tri-X is basically a reference 290X. So going by that link you gave, a reference 290X should be about 12% faster on average.

    As for CPU, outside of CPU-bound games, those that can't multithread to save their life, and a select few like Crysis 3, even an i5-4670K without HT is going to perform the same as the 4770K. CPUs really don't make that big of a difference these days.
     
    octiceps likes this.
  11. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I know, but I didn't like it, so I came up with something better sounding and more inspiring :D
     
  12. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    755m

    M290X

    They look the same both the fan and the heat-sink, and I think my eyes are OK. Don't know what they were using for the higher-end 7xxm.

    And this is probably the 680mx cooling. Same as above.

    Where are your numbers showing that they were indeed cooled properly?
     
  13. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Haha, yes, I knew you knew!
     
  14. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    I already said in one of my earlier posts that the 780m in the iMac was 20% slower than the ones in gaming laptops. This shows that it was either underclocked, locked at a lower TDP, or limited by thermals either from the cooling or temperature cap.
     
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    You know you might be onto something, considering after Piledriver they released Steamroller. So bull____, pile (of) ____, and then steam(ing) pile of bull____.

    I'm not sure who's more genius, you or AMD.
     
  16. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Didn't notice that there was a Tri-X of both the R9 290 and 290X. Found a review on Anandtech showing that the 290X and Tri-X 290 are pretty much equal.

    That aside, the Chinese review shows the GTX 980M and R9 290X being 12% apart. I'm still certain that a 47W HQ processor at 3.5GHz will have worse results in games than a 3.8GHz 84W processor. One thing is that the 4710HQ already bottlenecks the 980M in some scenarios, as shown by Anandtech and HTWingnut; the other is that a faster CPU will give you more FPS in a good deal of games. And since that Chinese site tested so many games, the average percentage difference will go down if both the 980M and R9 290X were tested with the same CPU. 12% might go down to 5-8%, which means more or less the same performance.
     
  17. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The people who have owned several iMacs say so. From the same thread talking about high M295X temps:
    "I always overclock my Nvidia 680MX at +250/+375, and temps over 85C are very rare - no throttling. I think I'll wait for the next riMac model with Nvidia graphics."

    "I'm really surprised at Apple. The GTX 680MX, to my mind, was revolutionary in the All-In-One space. It ran cool, was easily overclocked, and NEVER throttled. The M295X is a massive step back. Hitting 106C after several minutes of gameplay, as it does easily in my iMac is just."


    Or are you still in denial that the M295X is 100W? Despite official data from Apple with power and heat output from many generations of iMacs, the TDP of the R9 285, heat concerns from owners, etc.
    Stop smoking whatever you and Link are smoking and come back to reality.
     
    heibk201 likes this.
  18. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    There was no mention of the 780M's performance and temps, or did I miss it?

    I never said it's a 100W GPU. Browse around and quote me if you find such a thing. I said, what I'm going to say for, what, the third time: it is most likely a 120W GPU, and it's not like we haven't seen such GPUs make their way into notebooks - 7970M, 780M and 880M - AND I want to see it in a properly cooled machine before I call it CrossFire, without the X.
     
  19. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Wow, yes, there's a link between all three! Too much coincidence here!
     
  20. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Cloudfire, triturbo and Robbo99999 like this.
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The 300 series from AMD is hopefully here soon. There were some specs for the R9 380X posted earlier. I think an M390X will be based on that one. The 390X is probably the big boy ready to take the performance crown away from the 980M :)
     
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The CPU really doesn't make all that much of a difference these days, unless we're talking about games that are very optimized for multithreading, and the CPU lacks HT. Otherwise it all boils down to pure clockspeeds. 4710HQ is 400MHz behind the 4770K, but both have HT. You may see 5% improvement tops in the very best case scenarios but that's about it. And in a lot of games it won't matter.

    Obviously there's no direct comparison available so the best I could pull up is this. Yes they're using a relatively weak 770, but given it's only about 10% behind a 780, and a 980M is just about an inch behind a 780, the results should tell you something. If you look at the i5-4570 @ 3.6GHz without HT and compare it to the 4770K @ 3.9GHz with HT, pretty much the only case where there was a significant difference was Arma 3.

    Anyway I think we can all agree the 980M has about 90% of a 290X's performance on average, and that's pretty impressive in its own right.
     
    octiceps likes this.
  23. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    It is possible that AMD wants to 'dominate' the market in 2015 with early HBM release (Seeing how Nvidia doesn't have stacked memory planned for release until Pascal comes out).
    I am definitely holding on to my Acer (in my signature) until AMD releases this mobile puppy with HBM.

    I am also hoping that by the time Pascal is released, AMD will make sufficient modifications/upgrades/improvements to their HBM cards so that they could be 'in the lead' again even when Pascal comes out.
    AMD definitely needs more 'wins' on its side.

    Finally... we still don't know if higher-end Carrizo models might have HBM. Carrizo-L seems not to have it... but then again, it was slated for release 3 months before HBM's official debut in desktop GPUs... so higher-end Carrizo might end up with it as a 'surprise' - though I would imagine that HBM will definitely find its way into Zen (if not into Carrizo).
     
  24. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    If AMD indeed releases HBM cards in Q1 next year, holding out until then will be really worth it... Although will HBM work on mobile, i.e. will the memory run cool enough, etc.?
     
  25. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Good find, I think HBM technology will be my next upgrade, probably in 2016 - either Pascal's version or AMD's depending on how it stacks up (pun intended!).
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Look at this:
    AMD 20nm R9 390X Features HBM, 9X Faster Than GDDR5

    It's an article that was released at an earlier date, but it provides certain specs on what we can expect from HBM alone.

    Now, keep in mind that we don't know if AMD will actually put out a completely new architecture with HBM and 20nm... though I think a new architecture is more or less a 'given'.
    Couple that with 20 nm and HBM, and you probably end up with pretty good efficiency in terms of power... though, I would imagine that even if AMD decides to simply pair Tonga with HBM on 20nm that it will be enough to lower the power usage a lot while radically increasing performance.

    HBM seems to allow reduction in form factor on its own without architectural changes (if I'm interpreting the article accurately - although I could be mistaken).

    Either way... I doubt heating will be an issue for these new cards.
     
  27. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, according to that article HBM is more power efficient than GDDR5, so that looks like a good thing for the mobile form factor. It seems like an interesting technology, and it makes me more convinced to give Maxwell a miss and do that upgrade I talked about in 2016 - it will be good to get on the new memory technology bandwagon rather than live with Maxwell for a few years once HBM is already out.
     
  28. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    If the release schedule is on track, then AMD should release desktop GPUs with HBM in February 2015.
    Mobile cards will probably follow soon after, or might be released with the desktop cards (I doubt they will release them sooner, because we had no indications of this happening at all).
    With that in mind, if you plan on missing out on Maxwell, then I would suggest you upgrade when AMD releases mobile HBM in 2015 instead of waiting for 2016 - this is effectively why I'm holding out, because HBM offers a real 'push forward' compared to hardware in 'standard' GPUs.

    Then again, if you plan to wait until 2016, might as well wait and see what will come of Zen architecture from AMD (I'm intensely curious about it, especially with the likelihood of HBM in the mix).
    :D
     
  29. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Hm... Interesting... Thanks for the article... Might have seen it before, but last time I had only skimmed through it... If AMD can get this right, it will have a massive lead over NVIDIA in 2015 easily... The thing is, though, all this new tech is gonna cost a lot, which might prove to be a problem...
     
  30. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    That's why I pay the basic necessities first (rent, food, bills, transport), and then everything else (45 to 60%, depending on the month) goes into the bank, with a very minor % of that going to an occasional dinner out or something else - plenty of things one can do without paying for them. Even on a minimum wage (which is what I'm on - and fortunately, Edinburgh in Scotland doesn't have obscenely high rent).
     
  31. cryptodamus

    cryptodamus Notebook Consultant

    Reputations:
    44
    Messages:
    209
    Likes Received:
    55
    Trophy Points:
    41
    You should put some of that bank money into precious metals or blue-chip stocks, any good long-term investment. Keeping savings all in fiat stored in a bank is a losing game during inflation.
     
  32. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    You should actually buy Apple and Google stock... they always increase in price :D
     
  33. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I don't really play that game.
    Investments are not something I'm terribly interested in, nor is my goal the accumulation of more money and material wealth.
    It's an intriguing prospect, but stocks fluctuate, and you need to have proper information on the subject matter in order to get proper returns without losses.
    I can look into it a bit, but when you live in a world where stocks (and most other things) are in the end 'manipulated' by others, it just feels like wasting money either way.

    Inflation or not, the entire monetary system is going to burst relatively soon due to technological automation (and I can't wait for it to happen).


    Putting that aside...

    I was wondering if we have any solid information about which process the new AMD GPUs with HBM will be on exactly?
    I realize the articles I posted mention 20nm, but we don't know that for sure... it seems to be mostly 'whispers and rumours'.
    Some people seem to be convinced that 28nm is going to be the process of choice - which does seem somewhat logical, considering that Carrizo APUs, for example, will be made on that process (and Zen seems to be slated - at least in rumour form - for 14nm).
     
  34. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Stacked VRAM gives much greater bandwidth, but the question is how much extra FPS it will really give in 1080p games.
    The important stuff is the architecture pushing things through that extra bandwidth: the power efficiency, the increased performance per shader, etc. Bandwidth is only a small piece of the puzzle.
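To put rough numbers on why stacked VRAM is such a bandwidth jump, here's a sketch using the published first-generation HBM figures (1 Gbps per pin, 1024-bit per stack) against a 512-bit GDDR5 setup like Hawaii's. These specs come from the public spec sheets, not from this thread:

```python
# Why stacked VRAM moves the bandwidth needle: bus width, not clock.
# HBM1 figures (1 Gbps/pin, 1024-bit per stack, 4 stacks) are the published
# first-gen spec; the GDDR5 numbers match a 512-bit Hawaii card (R9 290X).

def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    # data rate per pin (Gb/s) * bus width (bits) / 8 -> GB/s
    return gbps_per_pin * bus_width_bits / 8

gddr5 = bandwidth_gbs(5.0, 512)        # R9 290X: 5 Gbps on 512-bit -> 320 GB/s
hbm1 = bandwidth_gbs(1.0, 4 * 1024)    # 4 HBM stacks: 1 Gbps on 4096-bit -> 512 GB/s

print(f"GDDR5 (512-bit @ 5 Gbps): {gddr5:.0f} GB/s")
print(f"HBM1 (4096-bit @ 1 Gbps): {hbm1:.0f} GB/s")
# A much wider bus at a far lower clock: more bandwidth per watt, which is
# the efficiency point above - raw FPS still depends on the GPU core itself.
```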
     
    triturbo likes this.
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Unless there's a bottleneck with the data in vRAM and the data that needs to be moved in & out of vRAM, faster vRAM won't help all that much as far as I see. It'll just allow for larger vRAM buffers and faster bandwidth without adding heat or taking up extra space on the card, which is good for desktops but great for mobile users. But in terms of raw performance? Better vRAM isn't gonna add a large amount of frames if what's already existing doesn't have a bottleneck there.
     
    Cloudfire and triturbo like this.
  36. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    True, and they already state that there's a bandwidth reduction with Tonga, so not much performance to be gained, if any - just efficiency. I just wish they hadn't screwed the deal with MSi over Apple. It's highly likely that there won't be any GX series anymore, at least not AMD based. Which is sad, since AMD is catching up CPU-wise just enough to be fine playing everything. Granted, it won't be the enthusiast's choice, but for sub-$2k those machines were packing quite the punch (especially with ES CPUs). Time will tell, but I have a bad feeling that FX7600P + R9-M295X is not going to be. The REALLY sad part? Apple would easily go nVIDIA next time, and AMD has (probably) lost the GX line. """"""Lovely""""""
     
  37. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Don't worry, triturbo. Once AMD releases the mobile 300 series with a low enough TDP, like the 7970M/M290X, MSI will most likely use them in the GX series, I bet.
     
  38. cryptodamus

    cryptodamus Notebook Consultant

    Reputations:
    44
    Messages:
    209
    Likes Received:
    55
    Trophy Points:
    41
    Eew, I don't want an AMD CPU; the GT72 should be able to run the M295X without Enduro crapduro.
     
  39. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Do people reckon stacked VRAM will enable high-end cards to be built in the MXM-A size instead of B? Pretty awesome if it did!
     
  40. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    You'll still need the extra space for the power circuitry. They fry like matches as it is; consider what would happen if they reduced it :)

    As for MSi + AMD, apparently there was talk about an MSi Z72, which is what cryptodamus is hoping for - i7 + R9-M295X in a GT72 chassis... but it won't happen. Yeah, I'm pretty sure now that AMD screwed themselves.......
     
  41. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    The R9 300 series launch is just too close, and I don't think MSI wants to upset people by releasing something now while something possibly much better will launch in just 3 months. If the rumors of 20nm + HBM are correct (HBM is already practically confirmed on the desktop side) then it will be much greater: a newer architecture improving efficiency, while 20nm + HBM would allow much greater performance at lower power consumption than the M295X. I hope we get the same core that will be in the R9 380X, which is rumored to be the first 300 series card to launch, and also to have HBM and outperform the GTX 980 (desktop) with about 3072 SPs. Slightly lower clocks and the 20nm process make it possible to have R9 290X performance in a laptop. And before anyone says it: no, the 980M is not as fast as Hawaii; it's not even as fast as a properly cooled 290.
     
  42. eeryanee

    eeryanee Notebook Consultant

    Reputations:
    141
    Messages:
    222
    Likes Received:
    13
    Trophy Points:
    31
    Will AMD mobile offerings be faster than the 980M?

    AMD R9 380X in February & R9 390X & 370X Announced
     
  43. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    February is kinda late, tbh; I was expecting desktop chips to hit around late December. This means mobile chips will arrive around May-June (if not later).
     
  44. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    They launched the GT72 with the 880M and just a couple of months later (even less) launched it with the 980M, so I don't think that's the case.

    Anyone else noticed that there are no 285Xs? Pretty sure that Apple got all of the chips.
     
  45. aaronne

    aaronne Notebook Evangelist

    Reputations:
    321
    Messages:
    466
    Likes Received:
    524
    Trophy Points:
    106
    I'm hoping for a new Rxxx series GPU to come out to put in my M18x R2 (the greens have serious issues at this time). Meantime, great job AMD: 2 beta drivers out in the last week, that's the way to make customers happy :thumbsup:
     
  46. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Dude, there was a good reason why there were 2 drivers... 14.11.1 was bloody broken and the driver kept crashing... installed 14.11.2 yesterday and now things are fine... However, AMD does have fewer driver issues than NVIDIA...
     
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    For the general user, not really. For the overclocker, apparently this is now true.
     
    Cloudfire likes this.
  48. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    AMD said there is no 285X; full Tonga will probably launch as the 370X, since only the 380 and 390 cards are supposed to launch with HBM. As for Apple, they probably bought a few months of initial Tonga XT production before it even launched, so AMD should have more chips by now.
     
  49. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  50. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I hope you see what I was talking about. Even at their best, they are quite behind in terms of market share (almost twice), yet their prices are still lower, while the wafer prices are the same for everyone ;)

    Full Tonga is launched and it is called the R9-M295X, and Apple bought whatever AMD has. This proves my beliefs for the first part. For the second, browse around and see (especially for iPhones and iPads): Apple has dropped contracts multiple times when a manufacturer fails to deliver in quantity and quality.
     