For gaming, I run my 680M at 875/2200 clocks. At those clocks I already get roughly a 7.3k GPU score.
If a stock 680M gets 24fps in a game, then a 780M will get around 30fps. Big deal. That's your 25% improvement. At any rate, yes, that's why we and others wait more than a few months between GPU upgrades. It usually takes more than a year, typically a new architecture, to get a really big performance jump.
Personally, since I got my current rig not long ago, Haswell and the 700M/HD 8000 series aren't really of interest to me. Super RAID with new higher speeds, a faster CPU and GPU are nice to have, but I already have a fast machine. Enjoy the new MSI machines; they sound like amazing beasts.
It will be good to compare 680m, 680mx and 780m once they are all out and ready.
-
-
NVIDIA GeForce GTX 680M video card benchmark result - Intel Core i7-3610QM Processor, CLEVO P1x0EM score: P7602 3DMarks
-
-
If we are talking average fps, yes it is. If it's minimum, then it hardly matters.
If it's maximum fps though..... *cries*
As has been said before, it is an improvement for stock-performance users. In my case, I would probably get 29fps vs the 30. -
*Sigh*... not impressive. Sure, OC'd it may be able to squeeze out a little more, but still... not worth the upgrade. I'm sticking with my trusty 7970M for another year.
-
You guys can say that from a very comfortable position. My 7970M, on the other hand, is so bad it can't even go beyond stock clocks.
-
@long2905 welcome to my side of the party (sucky sucky 7970M). On the other hand, I hope the 780M works in my M6600, as the 680M didn't do so well.
-
GTX 680M: 1344 cores @ 1.1V (example) @ 720MHz - - - TDP = 100W
GTX 680MX: 1536 cores @ 1.1V (example) @ 720MHz - - - TDP = 122W
GTX 780M: 1536 cores @ 1.05V (example) @ 720MHz (+GPU Boost) - - - TDP = 100W
The 700M series runs on lower voltage than the 600M series. The 700M series also has GPU Boost 2.0. I have been posting that for about a month now.
Why do you think the GTX 780M scores 10% better than the GTX 680MX in 3DMark11? That has to be because the 780M runs higher clocks (GPU Boost 2.0).
680MX 3DMark11
GTX 780M 3DMark11 -
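The spec comparison a few posts up can be sketched numerically. A minimal sketch; the clocks, voltages and TDPs are the forum figures quoted above (the voltages were explicitly labeled examples), not official numbers:

```python
# Relative shader throughput (cores x clock) and throughput-per-watt for the
# three cards, using the example specs posted above. All figures are forum
# estimates, not official NVIDIA numbers.
cards = {
    "GTX 680M":  {"cores": 1344, "mhz": 720, "tdp": 100},
    "GTX 680MX": {"cores": 1536, "mhz": 720, "tdp": 122},
    "GTX 780M":  {"cores": 1536, "mhz": 720, "tdp": 100},  # base clock, before GPU Boost
}
for name, c in cards.items():
    throughput = c["cores"] * c["mhz"]  # crude proxy for shader performance
    print(f"{name}: {throughput} core-MHz, {throughput / c['tdp']:.0f} core-MHz/W")
```

On these (assumed) base clocks the 780M and 680MX are identical, which is why the whole argument comes down to GPU Boost and voltage.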
Also, there is one more thing: this is a new generation. The 780M doesn't belong to the GTX 600M family, so we cannot judge its TDP 1:1 against the 680MX's specs. Yes, they are both Kepler, but the 780M is surely an optimized version of the 680MX. For the same reason they were able to clock the 580M higher than the 485M within the same TDP. I guess enabling a few more threads should also be possible, no?
-
I might be wrong, maxheap, but I think the 780M will have very few optimizations compared to the 680MX. They are most likely both GK104.
The GTX 485M was a GF104 core; the GTX 580M was a new chip, GF114, which came with optimizations, and therefore they could raise the clocks.
If the GTX 780M is GK114, however, then we will see lots of optimizations. Or maybe Nvidia has improved the GK104 silicon and baked the optimizations into the GK104 chip. We will just have to wait and see, I guess. -
(A) The 680MX was run in a Mac (hence the lower scores)
(B) Those are not official numbers for the 780M
So in essence you're saying they run 15% more cores at the same clock speed, with 22% less TDP, and a 25% higher score? -
10 pages later...
Cloud obviously won't budge on his opinion.
HTWingNut obviously doesn't agree with Cloud.
Let it go, fellas. Just wait until the card is released. -
They reduced the TDP from 122W to 100W and it performs 8% better, not 25% more. It's as simple as the GTX 680MX running at 720MHz and the 780M boosting up to 778MHz. GPU Boost 2.0.
The notebooks still ship with 180W PSUs. What does that tell you? They draw the same power. The TDP is the same. The voltage was reduced. I have provided you with 3DMark scores that show how they will perform.
"Those are not official numbers"
So you disagree with every number you see just because it isn't an official review by Anandtech or whoever? Even in the Haswell thread you don't believe screenshots even though they are from an MSI notebook.
"680mx was run on a MAC"
So? -
The 680M's 3DMark11 score is ~6100, right?
7600 is about 25% higher than 6100.
-
680M 1.10V 100W TDP: 1344 cores * 720MHz = 967680
780M 1.05V 100W TDP: 1536 cores * 780MHz = 1198080
1198080/967680 = 1.238 → +24%
It fits. -
-
GTX 680MX 1.10V TDP 122W: 1536 cores * 720MHz = 1105920
GTX 780M 1.05V TDP 100W: 1536 cores * 780MHz = 1198080
1198080/1105920 = 1.083 → +8.3% -
780M 1.05V 100W TDP: 1536 cores * 650MHz = 998400
680M 1.10V 100W TDP: 1344 cores * 720MHz = 967680
998400/967680 = 1.032 → +3.2%
It fits. -
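The ratio arithmetic in the two posts above can be checked directly. A quick sketch; the 780 MHz boost clock and 650 MHz base clock for the 780M are the poster's assumptions, not confirmed specs:

```python
# Cores x clock products quoted in the posts above. The 780M clocks
# (780 MHz boost, 650 MHz base) are forum assumptions, not official.
gtx680m       = 1344 * 720   # 967680
gtx680mx      = 1536 * 720   # 1105920
gtx780m_boost = 1536 * 780   # 1198080
gtx780m_base  = 1536 * 650   # 998400

print(f"780M (boost) vs 680M:  +{gtx780m_boost / gtx680m - 1:.1%}")   # +23.8%
print(f"780M (boost) vs 680MX: +{gtx780m_boost / gtx680mx - 1:.1%}")  # +8.3%
print(f"780M (base)  vs 680M:  +{gtx780m_base / gtx680m - 1:.1%}")    # +3.2%
```

So under these assumptions the ~24% gap over the 680M and the ~8% gap over the 680MX both fall out of GPU Boost alone.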
Let's take it again, HTWingNut. Are you getting old?
They reduced the TDP from 122W to 100W (-18%) and the score still increases by eight percent (8%). (680MX vs 780M)
Or
keep the same TDP but increase the score by 24% (680M vs 780M) -
King of Interns Simply a laptop enthusiast
Maybe, HTWingNut, chip makers are into Harry Potter and all that these days,
so anything is possible, right?
Only kidding, Cloud. I look forward to seeing the 780 in action. -
-
I personally see it adding up. But yeah, nothing to do than wait and see. -
Is there a book where I can place a bet here? Before that happens we need a third option: that you are both wrong. I'll be betting on that one.
-
Whoever loses must put "I bow down to you, [winner's name], almighty oracle" in their title or signature.
I'm down with that -
No, one will have "lucky" the other "not lucky" as their title.
-
http://www.maximumpc.com/article/news/nvidia_rumored_release_geforce_gtx_780_graphics_card_may_23 -
-
LOL... I remember last year when we had the same thread about the GTX 680M; that discussion ended with a lot of us getting banned...
-
^^ good times
-
Very nice score. Will it be possible to buy one separately and stick it into an MSI GX60?
-
Meaker@Sager Company Representative
-
Should be a nice, predictable incremental upgrade from the 680M. Why are some people bringing up compute performance for a gaming GPU? I guess if you're into Bitcoin mining, then by all means get an AMD card and have fun wasting electricity. For any serious GPGPU work, you're going to use CUDA.
Something important to consider is that CrossFire in its current form is severely broken and causes noticeable microstutter due to its runt frames and high latency. Single-GPU users won't care, but those in the Clevo and AW crowd should take note.
Sent from my GT-N7000 -
Remember how everyone said "Nvidia is going under. It will only release that P4500 GPU" and I had to tell everyone that they were working on another GPU to replace it? I do.
On the 7970M, however, you were spot on and I was way over "there". I didn't believe anything you said when it was announced. Don't act surprised, though. I have never been accustomed to AMD space technology, because that's how I see it. It's alien and strange to me, and I'm not sure I can trust it -
Just to kill the discussion before it even starts: TPU is wrong on the TDP. It's 100W. The rest seems to match.
NVIDIA GeForce GTX 780M | techPowerUp GPU Database -
failwheeldrive Notebook Deity
-
-
Total overkill for his setup unless he has like six monitors, except maybe the CPU. -
failwheeldrive Notebook Deity
-
Hey! You already have one, let the others have some candy!
-
failwheeldrive Notebook Deity
-
-
Meaker@Sager Company Representative
-
-
King of Interns Simply a laptop enthusiast
My word, joker. How could you afford two of those beasts?!
-
Meaker@Sager Company Representative
Way less than a seat upgrade on some cars; if you have a real income, then dropping $3-4k on a system is not tricky.
-
King of Interns Simply a laptop enthusiast
Depends on whether you have a family to support and a wife who might object!
That would be me, unfortunately. I have to win her over every year when I upgrade my GPU by saying that I make "most" of the money back by selling the old one...
-
failwheeldrive Notebook Deity
-
I read that dual GTX 690s in SLI will perform better than dual Titans. True? -
-
Meaker@Sager Company Representative
Unlikely, as 3rd and especially 4th card scaling is shaky at best, along with all the extra stutter and input lag it brings.
3DMark11 on the upcoming GTX 780M!!!
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 2, 2013.