isn't the 880 just going to be a rebrand of the 780? i doubt it will drop that much if it is a rebrand. (780 is a 680mx so it isn't a rebrand of the 680m)
-
HopelesslyFaithful Notebook Virtuoso
-
Source: GTX 675M was a rebrand. GTX 680M was not. -
Of course it's too early to tell, and I can't back up anything I've said with credible sources, but rumors tend to be somewhat accurate...though, usually on the 'high side' of things.
EDIT: We should start hearing more about it in the next couple months. :thumbsup:
-
HopelesslyFaithful Notebook Virtuoso
from my understanding 20nm was not going to be ready by summer 2014, hence why i said rebrand. unless there is new news....
-
Intel does that with their CPUs. Ivy Bridge: 22nm. Haswell: 22nm, new architecture.
Of course 20nm will bring a lot of improvements on the efficiency scale, meaning they can build faster GPUs (especially in the mobile department due to thermal/power restrictions), but a new architecture in itself will also boost performance -
-
I'm not surprised that there is little news at the moment. NVIDIA usually launch the new high end mobile GPUs in summer. It was summer 2012 for the 680M and summer 2013 for the 780M. Logical for the 880M to be summer 2014. Seeing as AMD haven't provided any real competition yet, there's no need for NVIDIA to rush another GPU to market. Anyways, I think I'd rather have a fully-fledged Maxwell 20nm in summer than a 28nm Kepler rebrand just so we get it a few months earlier.
Sorry I know this thread is meant to be about AMD but I was just responding to the posts about 880M here. -
Meaker@Sager Company Representative
-
HopelesslyFaithful Notebook Virtuoso
-
If NV continue on the "122W is acceptable" path, will AMD follow?
-
HopelesslyFaithful Notebook Virtuoso
i hope so. a decent laptop with a decent cooling system can easily handle it
-
TSMC is starting mass production for partners in February 2014. Wonder how long it takes for them to reach the market? But we will undoubtedly have 20nm by summer 2014 for sure. What I wonder is if AMD and Nvidia will do the next architectures on 28nm first. If they do, we could have new GPUs just around New Year.
Read the recent reports about 20nm here:
http://www.xbitlabs.com/news/other/display/20131022230815_TSMC_Shares_More_Details_Regarding_16nm_FinFET_and_20nm_Progress.html
-
Nvidia is seeing greatly diminishing returns on 28nm. I don't know that they'd waste the R&D for a new architecture on this process.
-
King of Interns Simply a laptop enthusiast
Pretty sure the next big release will be around Spring next year? Probably includes any offering from AMD. That was when the 7970M was released anyways...
-
there will be leaks and a few ES cards going around but the masses won't be able to get it until june or july
-
Meaker@Sager Company Representative
The 780M is a 110W card by the way, not 122W.
-
HopelesslyFaithful Notebook Virtuoso
-
780m is ~ 20% increased power consumption over a 680m. So 680m is either an ~ 85W TDP card or the 780m is ~ 120W.
Source: Me.
http://forum.notebookreview.com/sag...s-sager-np8250-clevo-p157sm-review.html#power -
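A quick sanity check of that scaling argument (just a sketch; the ~20% figure comes from the measurements linked above, and the 100W baseline is the assumption being tested, not a confirmed spec):

```python
# If the 780M draws ~20% more power than the 680M, then both cards
# can't be 100W parts; one of the two TDP claims has to give.
SCALING = 1.20  # measured 780M-over-680M power increase (~20%)

# Case 1: assume the 680M really is a 100W card.
tdp_780m = 100 * SCALING        # 120.0 -> the 780M would be a ~120W card

# Case 2: assume the 780M really is a 100W card.
tdp_680m = 100 / SCALING        # ~83.3 -> the 680M would be an ~85W card

print(tdp_780m, round(tdp_680m, 1))
```

Either way, the two figures can't both sit at 100W, which is the point the post is making.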
HopelesslyFaithful Notebook Virtuoso
if i recall the 6990 was more power hungry than the 680 and 7970, that's all i am saying
-
dude you can't just drop a bomb like that and leave??? give us more details >.>
-
Meaker@Sager Company Representative
That's not a bomb, it's a wish.
-
It's said to come out early 2014 according to wccftech, so leaks are bound to happen anytime soon I suppose -
-
well they did announce R9 series on desktop side.
-
Desktop discussion isn't really for these forums, as we are only really interested in Notebook GPUs. -
I just thought that new Hawaii cards + the Clevo reseller listing 9970M was a pretty good sign. Sadly it didn't turn out that way.
Recent news now say early 2014 is when the mobile cards are coming out. Hopefully presented at CES
-
-
Robbo99999 Notebook Prophet
(Total Watts used by the system while gaming) - (CPU overhead/other component overhead) = (GPU Watt Consumption).
That's what I did for my system anyway, because I was curious how many Watts my 670MX was using when fully overvolted & overclocked. I worked it out as about 90W for my fully overclocked GPU. Working out the same thing for stock clocks, it only comes to about 45-50 Watts, and the 670MX is supposed to be a 75 Watt card. So, I think that the TDP listing of these cards by NVidia is not really indicative of how much power these things actually suck! I reckon the 680M consumed less than 100W, and I reckon the 780M consumes 100W, but that's mostly a guess on my part. I do know that Fermi used a load more power than the Kepler 600 series for their mobile chips - my previous card in this machine, a 560M, used a full 75W and that was listed as a 75W TDP. All this makes me think the 680M uses less than 100W, and the 780M very close to 100W.
(I used Throttlestop to work out my CPU Watt consumption, and I used Heaven Benchmark for load testing). -
Meaker@Sager Company Representative
-
Robbo99999 Notebook Prophet
-
HopelesslyFaithful Notebook Virtuoso
-
Meaker@Sager Company Representative
I never owned a 6990 so could not say, the 780M power control is specifically set to 110W though.
-
Robbo99999 Notebook Prophet
EDIT: Actually, according to screenshots NV Inspector lists the Power Slider in percentages, not in Watt values, so I'm still confused how you know power control for the 780M is set at 110W? -
It's probably somewhere in the vbios.
-
Robbo99999 Notebook Prophet
EDIT: @Meaker: where did you see this 110W figure you keep mentioning? How do you know the power control is set to 110W, where is that listed? You've been asked quite a few times now, by more than one person, I think we just want a simple answer from you. If you just heard it from someone then just say so. Anyway, I'll leave it at that - I find worming out of questions annoying, but I will leave it there even if I don't hear back from you. -
I'd like to know the power consumption of the card as well.
Isn't anything listed as power in the vbios a limit anyway? Like you have stock clocks, and then you can increase clocks as long as it's within that limit (TDP). -
-
Robbo99999 Notebook Prophet
1) System idle, 0% CPU usage = 30W, but CPU at 0% usage using 5W according to Throttlestop, therefore 'system overhead not including CPU' is 25W
2) Overclocked & Overvolted Heaven Benchmark = 140W
3) CPU usage during Heaven benchmark = 20% => shown as 25W being used by CPU in Throttlestop.
4) Using previously ascertained figures in steps 1 through 3: 140W - 25W - 25W = 90W
So, that's how I calculated 90W. It's the best I could do. I think it's reasonably accurate. It does take into account the things you mentioned though. -
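The four steps above boil down to simple subtraction. A minimal sketch of the same estimate (the wattage figures are taken from the steps above; the function name is made up for illustration):

```python
def estimate_gpu_watts(total_system_w, cpu_w, overhead_w):
    """Estimate GPU draw: wall power minus CPU minus everything else."""
    return total_system_w - cpu_w - overhead_w

# Step 1: 30W at idle minus 5W idle CPU = 25W of system overhead.
overhead = 30 - 5

# Steps 2-4: 140W total under Heaven, 25W CPU (from Throttlestop).
gpu_w = estimate_gpu_watts(total_system_w=140, cpu_w=25, overhead_w=overhead)
print(gpu_w)  # 90
```

The obvious caveat is that PSU efficiency and any load-dependent overhead (fans spinning up, VRM losses) get lumped into the GPU figure, so it's a rough upper bound rather than a precise measurement.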
HopelesslyFaithful Notebook Virtuoso
-
-
unityole likes this.
-
so now we know the 880m will likely be a rebrand of the 780m with 8gb vram instead of 4. harder to clock it higher because of more ram (memory clock). on to waiting for the 9970m or Rx 290m.
the desktop Rx 290 doesn't just replace the 8970, it completely outclasses it. there's a good chance we'll see it catch up to the 780m/880m, or perhaps be the better performer of this entire coming year. of course, unless nvidia goes and does something like an 880mx? -
No.
High-end GPUs on laptops are thermally bottlenecked. The new Rx 290X is simply bigger, not more efficient, so it won't make much difference on a laptop platform unless AMD decides to follow NV and increase the TDP limit. -
HopelesslyFaithful Notebook Virtuoso
I hope my next upgrade is a GPU in the 125-150W range.
This isn't counting if Dell decides to just merge the CPU/GPU heatsinks in the m17x/m18x systems. I bet that would give both a good bit of room.
-
@hopelesslyfaithful guess so, need another die shrink lol maxwell won't come until 2015 I guess -
I do agree with HopelesslyFaithful that mobile GPU TDP can be increased further.
Gaming notebook and mobile workstation designers, how about something like this next time? We might squeeze one XM CPU and one 100W dGPU into the next ultra portable if you guys do manage to beat the efficiency of some hobbyist's hand-made weekend project.
-
-
HopelesslyFaithful Notebook Virtuoso
-
Holy crap, that's a lot of copper. Almost like it's too much and blocking the air circulation.
What temp improvements did he get? -
The possibility of balanced cooling load is not realistic:
1. If both GPU and CPU are under load, sharing the heat doesn't help either party.
2. If, without loss of generality, the GPU is idle while the CPU is under load, the interconnected heat pipes will transfer some of the heat from the CPU to the GPU heatsink initially. However, due to the higher thermal resistance between the heatsink and the GPU as compared to between the heatsink and the fins, little of that heat will go into increasing the GPU temperature. As a result, the GPU fan will not be activated and the GPU heatsink quickly heats up, choking off any heat transfer from the CPU heatsink.
If anything, I would like to see someone come up with a viable idea for extending the exhaust fins. The issue would be to minimize the frictional pressure loss lest the exhaust air flow velocity (which is critical in forced convection) drop drastically.
Incoming: AMD 9970M
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Jul 15, 2013.