something i found
http://www.cbber.com/data/attachment/forum/201301/15/161128sh2h8hmm5l5u77z7.jpg
according to the people who posted this, they are getting blue screens because of driver problems
looks like they installed it in a Clevo:
http://www.cbber.com/data/attachment/forum/201301/15/145811fpfof0o4gglfc9ix.jpg
not sure if it's fake or real though!
-
I'm not sure if this is legit or not, but apparently our friends from China got their hands on the upcoming 8970M, which they put in a Sager P150EM.
As you can see from the GPU-Z screenshot, it's not able to read everything about the GPU, probably because it's not fully supported yet. But we do have the core frequency and memory speed there.
Compared to the 7970M, this baby got a bump up from 850MHz to 1000MHz on the core and from 1200MHz to a whopping 1900MHz on the memory. (Not sure if the memory reading is legit or a false read from GPU-Z.)
That 256-bit on 3GB also doesn't sound right. A false GPU-Z read? Is it mixing up the 7970M with the 8970M because it doesn't support the 8970M yet?
http://i.imgur.com/TgoYd.jpg
http://i.imgur.com/8yCL6.jpg -
Good job Cloudfire for posting this, I am excited to see it, but a jump to 1900MHz on the memory seems unbelievable to me. I mean it's doable, but you have to account for power and heat issues, unless they managed to drastically change that. I hope I am wrong. As for the 256-bit, I suspect it's a false reading, but I am glad they added an extra 1GB, good for modded Skyrim.
-
One thing that strikes me is their choice of 3GB of vRAM. I am sorry but why? Let us see what the future brings us
gj cloud
-
When I first got my "first" GTX 680M, the memory was read as 0MB by GPU-Z, and when the card died it was read as 4096MB by the same version of GPU-Z. I can't say much for the 8970M though, maybe it's a special combination?
-
Well if it is 3GB, it is either 192-bit or 384-bit.
The leak could be legit (or false), with GPU-Z just reading it wrong. Just look at the 7970M when it was leaked. 128-bit? Nope, it's 256-bit.
1000MHz for the core in the 8970M sounds about right to my ears. -
P150EM
here you go! -
Didn't see you made a thread about the 8970M before me, Silverfern. Sorry
-
This info seems fake though: 243GB/s memory bandwidth for a mobile card? Alienware? Could it be 2GB on a 128-bit bus + 1GB on another 128-bit bus? Those are pretty much the questions people are asking now.
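For what it's worth, that 243GB/s figure is exactly what falls out of the screenshot's numbers if GPU-Z is treating it as 1900MHz GDDR5 on a 256-bit bus. A quick sketch of the arithmetic (just a back-of-the-envelope check, nothing from the leak itself):

```python
# Back-of-the-envelope check (nothing from the leak itself): where a ~243GB/s
# figure could come from if GPU-Z reads 1900MHz GDDR5 on a 256-bit bus.

def gddr5_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s; GDDR5 moves 4 bits per pin per clock."""
    transfers_per_sec = mem_clock_mhz * 1e6 * 4       # effective data rate
    bytes_per_transfer = bus_width_bits / 8           # bus width in bytes
    return transfers_per_sec * bytes_per_transfer / 1e9

print(gddr5_bandwidth_gb_s(1900, 256))  # ~243.2 -> matches the screenshot
print(gddr5_bandwidth_gb_s(1200, 256))  # ~153.6 -> the stock 7970M figure
```

So the bandwidth line is at least internally consistent with the clock and bus-width readings, whether or not those readings are real.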
-
katalin_2003 NBR Spectre Super Moderator
-
That's why I think it's either a false read from GPU-Z or fake -
failwheeldrive Notebook Deity
-
you'll see funny business soon
-
failwheeldrive Notebook Deity
-
LOL, you know something we don't, jaug? Have they hired a magician?
-
failwheeldrive Notebook Deity
Jaug is a magician
-
Fat Dragon Just this guy, you know?
It also says it's a Thames GPU, which means no new architecture, unless GPU-Z is reading that wrong as well.
-
Can't wait to see the respective 3DMark Vantage and 11 scores!
-
failwheeldrive Notebook Deity
-
Would have to be 384-bit to hold 3GB VRAM, ignoring 192-bit as a possibility for their highest card. There's no way they've managed that and the insane clocks.
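The bus-width argument can be spelled out with a little arithmetic. Assuming one 32-bit GDDR5 chip per channel in the power-of-two densities that were common at the time (an assumption, not anything from the leak), a quick sketch:

```python
# Rough check of the "3GB needs 192-bit or 384-bit" argument. Assumption (not
# from the leak): one 32-bit GDDR5 chip per channel, standard power-of-two
# chip sizes. Mixed densities or clamshell setups would bend this.

STANDARD_CHIP_MB = {128, 256, 512}

def capacity_fits(total_mb: int, bus_width_bits: int) -> bool:
    channels = bus_width_bits // 32       # one chip per 32-bit channel
    return total_mb / channels in STANDARD_CHIP_MB

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit: 3GB possible = {capacity_fits(3 * 1024, bus)}")

# 192-bit (512MB/chip) and 384-bit (256MB/chip) work out evenly; a 256-bit bus
# would need an odd 384MB per chip, hence the suspicion of a bogus reading.
```

Mixed chip densities (like the 2GB + 1GB split floated earlier) would get around this, so it's only a rough check.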
-
failwheeldrive Notebook Deity
-
Not sure if the code name will be the same for 7970M and 8970M though -
(in case the picture doesn't work, which it probably won't on a lot of computers, here's the link: http://imgsrc.baidu.com/forum/w=580.../dbb44aed2e738bd46055b9eba18b87d6267ff9f0.jpg)
taken with an iPad lol
courtesy of 冷冻金枪鱼/Frozen tuna @ http://tieba.baidu.com -
8970M codename is "Neptune XT".
Another goodie from that picture: the GTX 780M has the same codename as the GTX 680MX (N14E-GTX). Another confirmation of my belief that the 780M will have 1536 cores. Awesome -
Edit: Nevermind, now seeing it through Firefox -
-
This card could be the most amazing thing for mobile gaming since VOIP, and I still would not purchase it...The competition could price their card at 50% more with no performance gain and I still would opt for the competitor.....
Welcome to the world of dealing with one of many UNSATISFIED customers as a result of the 7970m...... -
aaah nice, i was wondering when a thread like this would surface
let the guess-work begin!
ok so according to that wiki link the 8970M should be based off the desktop 8870, which is definitely in the realm of possibility if the given 160W TDP holds true. they could easily downclock that baby to 100W as they did with the 7870 before. going by the raw number of unified cores (1536 vs. 1280 = +20%), the speed bump in core clock (1000 vs. 850 = +17.6%) and some kind of bump in memory clocks (1.9 GHz does sound a bit far-fetched tho...), we could very well be looking at a performance jump of 30-40% compared to the 7970M. sounds realistic to me
all pure speculation at this point of course! just so much fun. i missed that...
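For reference, here's the raw scaling implied by those wiki numbers, spelled out as a quick sketch (pure arithmetic on the rumored figures, so take it with the same grain of salt):

```python
# Raw scaling implied by the rumored wiki numbers; real games never scale
# 1:1 with this, so treat it as an upper bound rather than a prediction.

old = {"cores": 1280, "core_mhz": 850, "mem_mhz": 1200}   # 7970M
new = {"cores": 1536, "core_mhz": 1000, "mem_mhz": 1900}  # rumored 8970M

compute_gain = (new["cores"] * new["core_mhz"]) / (old["cores"] * old["core_mhz"]) - 1
memory_gain = new["mem_mhz"] / old["mem_mhz"] - 1

print(f"raw shader throughput: +{compute_gain:.0%}")  # about +41%
print(f"memory clock:          +{memory_gain:.0%}")   # about +58%
```

Real-world gains rarely track raw compute 1:1, which is why 30-40% rather than the full +41% seems like the sensible guess.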
-
failwheeldrive Notebook Deity
You're guessing 30-40%? IDK, I've been guessing 15-25% without a die shrink.
-
that was just based on the numbers that wiki link provided
of course i have no idea if those will hold true or not, i was just calculating and throwing numbers around pretty much
-
failwheeldrive Notebook Deity
Yeah I know
The way I normally guess performance increases from raised clock rates is to average the percentage increase of the core and memory clocks. Like +30% core and +20% memory would be a 25% total increase. Of course it's never a 1:1 ratio between overclocking and framerates or benchmark scores. When you start changing stuff like cores and shader blocks it gets really confusing though lol.
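As a minimal sketch of that rule of thumb (just the averaging heuristic described above, nothing more):

```python
# Minimal sketch of the rule of thumb described above: average the percentage
# increases of the core and memory clocks. Purely a heuristic; overclocking
# never translates 1:1 into framerates or benchmark scores.

def estimated_gain(old_core, new_core, old_mem, new_mem):
    core_pct = new_core / old_core - 1
    mem_pct = new_mem / old_mem - 1
    return (core_pct + mem_pct) / 2

# The worked example from the post: +30% core, +20% memory -> +25% overall.
print(f"{estimated_gain(1000, 1300, 1000, 1200):.0%}")

# Plugging in the rumored 7970M -> 8970M clocks (850->1000, 1200->1900 MHz):
print(f"{estimated_gain(850, 1000, 1200, 1900):.0%}")
```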
-
Well, without a change of architecture, we'll probably be back to the good old days of +20% performance increase per generation. I hope I'm wrong though. Maybe there are enough advances in the new card to justify making another upgrade.
If those specs are real, though... well, something's gonna hit the fan of awesome -
If the leaked GPU-Z picture is real, then the card wouldn't be 100W; it would be something like 125W or 135W.
-
depends... maybe AMD was able to modify the silicon so as to lower the core voltage, thus maintaining the TDP envelope while raising the clocks
happened before with the 485M --> 580M
-
I honestly don't think the 7970M actually has a 100W TDP. It runs cooler than my GTX 460M card, which was supposed to have a TDP of 75W, and it does so with barely any core contact from the heatsink. I think I have 25% core contact (only one corner makes contact).
Also, I'm sure I saw a clevo factsheet at one point measuring the tdp of the 7970m as "between 75-100w".
I can honestly see 8970m fitting a 100w tdp even with the faster memory.
Not to mention that GCN 2.0, even if it's the same architecture in theory, could still have picked up upgrades in terms of efficiency.
Then there's the fact that I see some of you guys talking about that memory frequency as though you hadn't already put 300-400MHz of overclock on our current cards.
What's to prevent a different and cooler chip design for the memory on the new cards? -
tru that, up to 1600MHz is already possible with OCed 7970M cards without blowing up the respective machines in the process
-
while your reasoning surely makes sense hackness, you can't really compare desktop with mobile GPUs concerning their performance/watt ratio. a properly OCed 7970M easily surpasses the performance of a desktop 7870 (even the GHz edition) and sometimes even reaches the level of a desktop 580. that doesn't mean that in the OCed state the 7970M reaches a TDP of 200W+
the mobile GPU components are just trimmed for efficiency and power savings, way more than their desktop counterparts. same goes for mobile vs. desktop CPUs. and because of that efficiency the mobile hardware is just so damn expensive!
Sent from my Galaxy Nexus using Tapatalk 2 -
-
nobody said that GPU-Z screenie was real, we're all just speculating here
-
King of Interns Simply a laptop enthusiast
Highly binned it will be. More efficient than the 7970M, sure. However, if they clock it high at stock then it will be another watt-eating card. I hope it is clocked reasonably and gets us some power savings.
-
i want moah POWAH! less savings...
Sent from my Galaxy Nexus using Tapatalk 2 -
King of Interns Simply a laptop enthusiast
-
So I was doing some crappy math, and I thought of the following:
The AMD-displayed benchmark graphs on the new MSI machine equipped with the 8970m listed the performance ratio between the 8970m and the GTX-675M as exactly 2:1.
From Notebookcheck, the GTX-675M scores around 4000 for the 3DMark11 GPU score. Double that is 8000.
Dividing 8000 by 6200 (the average stock 7970M GPU score in 3DMark11 Performance tests with the most recent drivers), we get... almost exactly a 30% improvement.
Once more consistent with upgradelaptop's statements of around 30% improvement over 7970m -
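Spelling that math out (the inputs are just the figures quoted in the post, so the result is only as trustworthy as those numbers):

```python
# The "crappy math" above, spelled out. Inputs are the figures quoted in the
# post, so the result is only as good as those numbers.

gtx675m_score = 4000                 # quoted 3DMark11 GPU score for the 675M
implied_8970m = gtx675m_score * 2    # AMD's slide shows a 2:1 ratio
stock_7970m = 6200                   # quoted average stock 7970M GPU score

print(f"+{implied_8970m / stock_7970m - 1:.0%}")  # about +29%

# With the lower 3.5k figure for the 675M suggested in the next post:
print(f"+{3500 * 2 / stock_7970m - 1:.0%}")       # about +13%
```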
Never trust Notebookcheck too much; the average 675M scores 3.5k.
-
while the benchmark figures listed there are usually a bit lower than in real life (mostly due to using the drivers from around a GPU's release), the performance relations between GPU models listed on notebookcheck are always spot on
@sangemaru: if your calculations are correct, then i already managed to come within 3% of 8970M stock performance
Sent from my Galaxy Nexus using Tapatalk 2 -
Any chance we'll once again see high-end laptop GPUs with less than 100W TDP?