A 100W GPU at P4500 would be an epic fail. 100W is the limit of what they can push out to notebooks. If P4500 is the best they can produce right now with Kepler, they might as well quit and surrender to AMD altogether, since AMD has already made a P5800 GPU and there is no way people are buying the 680M.
P4500 at 100W would mean they have to wait six months or so for an improved architecture to get up to P5800, but by then AMD will have moved on to the 8000 series, rendering this new GPU useless.
P4500 at 75-80W would make more sense, because it would not compete with the 7970M but have a market of its own, plus it would take over for the 570M.
This is not the latest rumor. It's old, and if you had been paying attention the last two months you would have seen it before. That score is from a GK106, NOT the GK104 MXM you see in the same BS article from videocardz.
The latest rumor is that Nvidia is working on a new 680M, not this old stuff you keep dragging up again.
-
-
-
They are not stupid. They plan everything ahead of time. If they come out with a 100W P4500, I just don't see what that plan is. Catering to stupid fanboys who will buy Nvidia whatever happens? -
Cloud. How would you explain the GTX660M being 75W and as weak as it is? So a 75W GTX660M scores P2700 and a 100W GTX680M will score P7000? Where's the sense in that?
And the GTX680M will not be P4500 according to those leaks, but minimum P4900 and likely above P5000 with new drivers. -
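Rough perf-per-watt math on those numbers (just a sketch; all the scores and wattages below are the unconfirmed figures floating around this thread, not official specs):

```python
# Rough perf-per-watt comparison; every score and TDP here is a rumoured
# number from this thread, not a confirmed spec.
cards = {
    "GTX 660M (rumoured)":        {"score": 2700, "watts": 75},
    "GTX 680M @ P4900 (leaked)":  {"score": 4900, "watts": 100},
    "GTX 680M @ P7000 (claimed)": {"score": 7000, "watts": 100},
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['watts']:.0f} points per watt")

# ~36 vs ~49 vs ~70 points/W: going from 36 to 70 on the same architecture
# and node would be an enormous efficiency jump, which is the point of doubt.
```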
-
The P4900 score is also total bull. That score is from the GK106 680M overclocked. It came from the same leak where we saw the P4500 score. -
Please stop using P#### scores; they are not an indicator of GPU performance alone. A significant difference in CPU will always yield larger changes in the P score and smaller changes in the GPU score, at least for Vantage and 3DMark 11.
Anyway, Kepler architecture scaling is a bit wonky so far; it seems to hit its current best balance at the GTX670, but power consumption and performance do not scale linearly.
I think we will see the GTX660 Ti, or whatever the 1152 CUDA core part is, as the GTX680M. At least that seems most likely, since it would still be more powerful than the HD7970M while not being too heavily downclocked. -
3DMark 11 P scores are totally comparable; a difference in CPU score doesn't affect the P score much. At least as long as the processors are in the mobile i7 Sandy Bridge to Ivy Bridge range, there is at most about a 50-point difference between the lowest and highest.
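For what it's worth, here is a quick sketch of how a composite P-style score reacts to a CPU swing. It uses a weighted harmonic mean of the sub-scores, which is the general form Futuremark uses, but the weights and sub-scores below are illustrative assumptions rather than the published coefficients:

```python
# Sensitivity check: how much does a CPU (physics) swing move the overall
# P score if the composite is a weighted harmonic mean of the sub-scores?
# The weights below are assumptions for illustration only.

def overall_score(graphics, physics, combined, w_g=0.85, w_p=0.10, w_c=0.05):
    """Weighted harmonic mean of graphics, physics and combined sub-scores."""
    return (w_g + w_p + w_c) / (w_g / graphics + w_p / physics + w_c / combined)

# Same hypothetical GPU result, two quite different CPU (physics) results:
print(round(overall_score(graphics=4900, physics=7000, combined=4800)))  # ~5046
print(round(overall_score(graphics=4900, physics=9000, combined=4800)))  # ~5128

# A ~2000-point physics swing moves the overall score by well under 100
# points, so the graphics sub-score dominates the P score.
```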
---
Anyway, if this Turkish reseller put the GTX680M in their list, then we are likely to see the true specs and scores rather soon.
---
As for the power consumption of the GTX660M, I can do a test as soon as I get my W150ERQ and compare it to the HD7970.
The GT650M GDDR5 is identical to the GTX660M; only the clock speeds are different. That can be fixed. -
I think the 680M would have to run at around 700MHz @ 0.8V with 1344SP to fit the expected power envelope. -
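Quick back-of-envelope check on that guess, assuming dynamic power scales roughly with frequency times voltage squared. The desktop GTX 670 baseline (about 980MHz, ~1.0V, ~150W) is an assumption on my part, and leakage plus board power are ignored, so treat the result as a ballpark only:

```python
# Estimate GPU power after a clock/voltage drop using the rough dynamic-power
# relation P ~ f * V^2 (leakage and memory/board power are ignored).

def scaled_power(p_base, f_base_mhz, v_base, f_new_mhz, v_new):
    """Scale a baseline power figure by the frequency ratio and voltage squared."""
    return p_base * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

# Assumed desktop GTX 670 baseline: ~980 MHz at ~1.0 V for ~150 W.
print(round(scaled_power(150, 980, 1.0, 700, 0.8)))  # ~69 W

# ~69 W for the core alone leaves room for memory and board power inside a
# ~100 W MXM budget, which is why 700 MHz @ 0.8 V with 1344 SP seems plausible.
```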
-
-
Where did you see M?
I was taking the example from desktop cards, obviously. :GEEK:
-
But that's FPS only; there might still be microstuttering. -
I want to break into Nvidia's headquarters and steal that damn 680M. Man, I'm sick of this speculation.
Where do you guys reckon they keep the GPU? California, or some underground cave guarded by heavy military? A chopper circling the entrance, lasers on the walls to detect movement, automatic machine guns that shoot at anything that moves. If there is some Chinese dude reading this who either owns one or knows somebody who does, I am willing to pay big bucks for you to ship whatever engineering crap you have. -
I just want two, and for them to hit 45k+ in Vantage performance setting and run BF3 at a stable 120fps on maxed out settings (not the ultra preset), oh yeah, at stock.
That's what I would want.
Don't care if it's unrealistic ;D -
Four different scenarios, guys, but only two that fit the memory configuration of this card. Which one will it be...
#1 If the 680M has 768 CUDA cores, it is based on the 130W GK106 GTX 660 and has a 192-bit bus. That can't be true, since the Turkish site is quoting 4GB, which doesn't fit; a 192-bit bus would need either 1.5GB or 3.0GB (see the quick bus-width check after this list).
#2 If the 680M has 768 CUDA cores, it may be based on the 195W GK104 GTX 680 (1536 cores) but sliced in half, with a 256-bit bus, which fits the 4GB memory.
#3 If the 680M has 1344 CUDA cores, it is based on the 150W GK104 GTX 670 and has a 256-bit bus. That fits the 4GB memory.
GeForce GTX 670 pictured? Costs 3,800 kronor - Graphics cards - SweClockers.com
#4 If the 680M has 1152 CUDA cores, it is based on the 150W GK104 GTX 660 Ti and has a 192-bit bus, which also doesn't fit the 4GB memory.
Nvidia GeForce GTX 660 Ti Not Coming Until Q3 '12?
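For reference, a quick sketch of the bus-width/VRAM math behind #1 and #4. It assumes the common 2Gbit and 4Gbit GDDR5 chips of the time, with one chip (or a clamshell pair) per 32-bit channel, which is why 4GB only falls out of a 256-bit bus:

```python
# Which total VRAM amounts fit a given GDDR5 bus width?  Each 32-bit channel
# carries one chip or a clamshell pair; chip sizes assumed are 2 Gbit
# (0.25 GB) and 4 Gbit (0.5 GB) parts.

def possible_vram_gb(bus_width_bits):
    channels = bus_width_bits // 32
    sizes = set()
    for chip_gb in (0.25, 0.5):            # 2 Gbit and 4 Gbit GDDR5 chips
        for chips_per_channel in (1, 2):   # single-sided or clamshell
            sizes.add(channels * chips_per_channel * chip_gb)
    return sorted(sizes)

print(possible_vram_gb(192))  # [1.5, 3.0, 6.0] -> no clean 4 GB option
print(possible_vram_gb(256))  # [2.0, 4.0, 8.0] -> 4 GB fits
```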
------------------------------------------------
@Torment: LOL keep dreaming -
If you are Swedish: nicely done with the thread. -
-
I am from Norway. Thanks -
I think, given the power restrictions, the most likely scenarios are 1 and 4.
Or maybe it'll just be a rebranded 7970m. -
GTX 660 = 7870 according to some sites, btw. So the worst-case scenario should be a downclocked 7870.
VGA ZOL
If the 660 Ti can go in, so should the 670, R3d, since they share the same TDP and probably roughly the same power consumption too (?). The question is which of the two would give the best performance.
The GTX 660 looks to be the easiest route, since it's only 130W, but that would give a pretty "meh" performance against the 7970M. Oh god, please don't pick this one and let this be Fermi vs VLIW all over again. -
No special letters.
Sorry to everyone else here, just speaking some of my native tongue, Swedish to be exact. -
Nvidia releases two versions: one 2GB and the other 4GB, with two different memory interfaces.
Followed by the *1344* 685M.
BTW, the card is in Texas or California. -
But 2GB and 4GB both only work on 256-bit. It's gotta be 3GB and 4GB, or 1.5GB and 4GB.
That 1344 part may be our 685M, yes, but that leaves #2 as our only option for the GTX 680M. Well, it could be that the Turkish reseller got the memory amount wrong, like R3d says. -
113 pages of speculation with next to zero actual information. This thread is a testament to pointlessness and wasted time. Is it so you can say "I told you so" when it's actually released? How many hours have you invested so far for that chance?
-
Pointlessness schmoylessness. I think it's fun to speculate. -
The only thing I'm sure of is that it isn't a direct translation of any desktop GPU, and it won't be 192-bit.
-
-
Could anyone with a 7970M get some FPS readings for S.T.A.L.K.E.R. Call of Pripyat? That would be much appreciated.
-
^^ Probably 60+ fps, but who knows.
-
-
-
Which is harder to max out, Crysis 1 or 2?
-
Crysis 1 is *no tweaks* -
-
Here's GK104:
GK104 Block Diagram Explained | techPowerUp
I don't think they are able to shut down less than a single SMX at a time.
Not physically or economically reasonable, I think. Perhaps not even physically possible. -
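For anyone wondering where the 1536/1344/1152/768 figures come from: GK104 groups its CUDA cores into SMX units of 192 cores each (8 on the full die), and cut-down parts disable whole SMX units. A quick enumeration:

```python
# Possible GK104 configurations if whole SMX units (192 CUDA cores each) are
# disabled one at a time.
CORES_PER_SMX = 192
FULL_SMX = 8  # full GK104 die (GTX 680)

for active_smx in range(FULL_SMX, 0, -1):
    print(f"{active_smx} SMX -> {active_smx * CORES_PER_SMX} CUDA cores")
# 8 -> 1536 (GTX 680), 7 -> 1344 (GTX 670), 6 -> 1152, 4 -> 768, ...
```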
680M with 4 gigs? If that's true, AMD will follow and create a 7970M successor with 4 gigs. I think Alienware offers 7970M CrossFire. 580M SLI in the M18x gets 100 fps maxed out on BF3.
-
-
Well, since the 7970M can now be overvolted, this could be interesting...
-
With a good notebook cooler, 1100MHz on the core shouldn't be any problem.
-
I would love to see the 7970M in CF in an X7200 with a 990X, with max overvolt and max OC, to see what it would do. In an X7200, power and cooling for the cards wouldn't be a problem with 2x 300W PSUs. -
-
-
Why would anyone believe that the 680M would have double the RAM that Nvidia's flagship desktop cards have???
-
-
-
EDIT: nevermind -
-
So it's not too implausible. -
Indeed. There are plenty of high-RAM, low-performance cards in the notebook market, so that's not a valid point at all.