How did you get up to 1450 memory clock?
The furthest I can go on my 8970M is 1375 according to AMD Overdrive.
Sent from my baked potato
-
I really hope the 9970M is not a rebrand... that would be insane. The 8970M was already outdated by eons when it came out; if the 9970M is indeed a rebadge, AMD will be bankrupt by the end of 2014.
-
fatboyslimerr Alienware M15x Fanatic
I'm hugely disappointed we aren't on 20nm litho yet. Shocker!
Sent from my HTC One S using Tapatalk -
King of Interns Simply a laptop enthusiast
-
Anandtech posted some upcoming MSI notebook pictures and I found the above photo. 2GB M290X... -
-
Useless VRAM anyway...
-
A picture with the placard below the notebook (also linked a couple posts ago) says it's an A10-5750M.
-
OK It's another rebrand. Here's the AMD official specifications.
-
Nevertheless I was trying to show that the GX70 has 2GB of VRAM too.
-
They say Nvidia is good at rebranding, but AMD is actually worse.
Nvidia had GTX 680M first with 1344 cores.
Then they released GTX 780M with 1536 cores. Then 880M which is a rebrand of GTX 780M.
AMD released 7970M which had 1280 shaders.
Then they released the 8970M, which is identical to the 7970M. And now they've released another rebrand of a rebrand, the R9 M290X. Every step they just increased the shader clock by 50 MHz.
I wonder how long we have to wait until we get some -real- upgrades? I hope we don't have to wait 6 months... -
I mainly play BF3/BF4, so my decision to stay with AMD depends on BF4's Mantle performance.
-
-
*move along, nothing to see here*
I'm getting real tired of your s**t AMD :| -
Simples.
-
That's even worse -
-
Really excited to see what Mantle will bring to BF4, as better framerates are always welcome, especially on 120Hz or overclocked displays. -
Context:
A10 APU + 7970M = GTX 660M performance, because the APU bottlenecks the GPU
i7 CPU + 7970M = full 7970M performance, because there is no bottleneck
The devil might be in the details.
Video: //www.youtube.com/embed/yfXI1pn5los -
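The bottleneck arithmetic above can be sketched as a toy model: effective framerate is capped by whichever of the CPU or GPU takes longer per frame. All the millisecond figures below are made-up illustrations, not benchmarks of any actual APU or i7.

```python
# Toy model: frame rate is limited by the slower of the CPU and GPU stages.
# The millisecond numbers are illustrative, not measurements.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frame time is the max of the two stages; FPS is its reciprocal."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers: a slow APU vs. a fast i7 feeding the same 7970M.
apu_fps = effective_fps(cpu_ms_per_frame=25.0, gpu_ms_per_frame=12.0)  # CPU-bound
i7_fps  = effective_fps(cpu_ms_per_frame=8.0,  gpu_ms_per_frame=12.0)  # GPU-bound

print(f"APU + 7970M: {apu_fps:.0f} FPS (CPU-limited)")
print(f"i7  + 7970M: {i7_fps:.0f} FPS (GPU-limited)")
```

With these invented numbers the same GPU delivers roughly twice the framerate behind the faster CPU, which is the shape of the GX60 results being discussed.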
DX is unnecessarily heavy on the CPU and only evolves very slowly; that's where Mantle comes in: optimizing the memory model, making CPU-GPU collaboration much easier and more efficient, and exposing all of the GPU's functionality.
I'm not saying that AMD's x86 processors are magically better with Mantle, but the performance of games that make use of Mantle will see a very noticeable boost - because, yes, it should remove most of the CPU bottleneck.
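As a rough illustration of why a thinner API helps (a toy model only: the per-draw-call costs and draw counts are made-up numbers, not Mantle or DirectX measurements), cutting per-call CPU overhead raises FPS only while the CPU is the bottleneck:

```python
# Toy model of CPU-side frame cost: fixed game logic plus per-draw-call API
# overhead. Halving the API overhead only helps while the CPU is the
# bottleneck. All numbers are illustrative, not measurements.

def cpu_frame_ms(draw_calls, overhead_us_per_call, game_logic_ms=5.0):
    """CPU time per frame in ms: game logic + draw-call submission cost."""
    return game_logic_ms + draw_calls * overhead_us_per_call / 1000.0

def fps(cpu_ms, gpu_ms):
    """Frame rate capped by the slower of the CPU and GPU stages."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 10.0  # hypothetical GPU render time per frame
heavy = cpu_frame_ms(5000, overhead_us_per_call=4.0)  # "thick API" overhead
thin  = cpu_frame_ms(5000, overhead_us_per_call=1.0)  # "thin API" overhead

print(f"Heavy API: {fps(heavy, gpu_ms):.0f} FPS")  # CPU-bound
print(f"Thin API:  {fps(thin, gpu_ms):.0f} FPS")   # now GPU-bound
```

Note that once the CPU frame time drops below the GPU's, further API savings buy nothing; that is why a thin API mostly helps weaker CPUs like the APUs discussed above.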
The APU + 7970M brings fantastic price/performance that is unmatched by any Nvidia/Intel combo. -
Robbo99999 Notebook Prophet
-
The new mobility RADEONs leaked today are a big disappointment - rebranded again.
I have a 7970m and know they are good, but I don't see how this is going to encourage upgrades and new sales. -
-
Although it's a given that people should stay away from AMD APUs, anyone who buys one instead of Haswell really needs their head examined. -
To be fair, nVidia has done straight rebrands of their mobile cards before, e.g. 8800 GTX to 9800 GT (although they've never tried to milk the same exact chip in the high end for this long without a real refresh). -
Euh, isn't the 780M pretty much a rebrand of the 680M with higher clocks and GPU Boost? And if the rumors about the 880M are true, it will be a rebrand of the 780M with clock boosts, so they are in the same boat. The 680M was a better GPU than the 7970M though, with waaaay more OC potential; that is why it felt less like a rebrand.
-
-
GTX 680M had 1344 cores
GTX 780M had 1536 cores
You are right about GTX 880M though. That is a 780M rebrand -
I take it that's the memory clock?
Funny stuff, because that's what I run my 8970M at lol -
King of Interns Simply a laptop enthusiast
So long story short: get yourself a 7970M or 680M and save money? LOL
-
-
-
fatboyslimerr Alienware M15x Fanatic
Or giggle to yourself because your now 2-year-old GPU is still current gen, ha ha ha!
Sent from my HTC One S using Tapatalk -
-
-
It's just a 2GB version of the R9 M290X.
-
-
Just be careful that the memory on the M290X matches (type and amount) the memory on your 7970M. Otherwise the flash might brick your GPU.
-
-
-
inperfectdarkness Notebook Evangelist
AMD, I am disappoint. How is MSI supposed to be convinced to make a GX60 3k edition if you can't even make a GPU worthy of it?
-
-
Uuuuuuu, I personally love rebrands.
For one, I bought my 7970m's a while ago, and they've served me well and continue to do so.
Secondly, going a while without gigantic performance leaps will increase the customer base for that one specific product, and increase developer support and optimization at that performance level.
Third, I bought my cards at a very nice price ($325 each) before people expected a new gen, and now I'm sure I could sell them for $400 each easy, since they're still basically the current-gen flagship.
Fourth, my 2010 machine (4 years old) still has great CPU power and a pair of GPU flagships only marginally slower than a non-OC'd 780M SLI. Certainly nothing to cry about.
In a way, delaying the new architecture and fab process release for a while makes me happy. It maximizes my time and satisfaction on my cards, and when they do hit eventually, they'll kick all kinds of hiney. At this point it's pretty much guaranteed that the next gen will either come late 2014 or mid-2015, hit on 22, 20 or 18nm fab, and use the refined Volcanic Islands arch in a way that hopefully won't burn a hole through our machines the way the current revision does. Admit it, that's one hot chip. -
"I like rebrands because then I don't feel like my laptop is obsolete."
Have we really come to this?
Incoming: AMD 9970M
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Jul 15, 2013.