For gaming I run my 680M at 875/2200 clocks, and at those clocks I already get roughly a 7.3k GPU score.
If a stock 680M gets 24fps in a game, then a 780M will get around 30fps. Big deal. That's your 25% improvement. At any rate, yes, that's why we and others wait more than a few months to upgrade our GPUs. It usually takes more than a year, and a new architecture, to get a really big performance improvement.
Personally, since I got my current rig not long ago, Haswell and the 700M/HD 8000 series don't really interest me. It's nice to have Super RAID with new higher speeds, a faster CPU and GPU, but I already have a fast machine. Enjoy the new MSI machines, which sound like amazing beasts.
It will be good to compare the 680M, 680MX and 780M once they are all out and ready.
-
Right. Here's my 3DMark 11 with stock voltage overclocks:
NVIDIA GeForce GTX 680M video card benchmark result - Intel Core i7-3610QM Processor,CLEVO P1x0EM score: P7602 3DMarks
-
I get what you're saying, but that specific scenario actually is a big deal.
-
If we are talking average fps, yes it is. If it's minimum, then it hardly matters.
If it's maximum fps though..... *cries*
As has been said before, it is an improvement for stock-performance users. In my case, I would probably have 29fps vs the 780M's 30. -
*Sigh*... not impressive. Sure, OC'd it may be able to squeeze out a little more, but still... not worth the upgrade. I'm sticking with my trusty 7970M for another year.
-
You guys can say that from a very comfortable position. My 7970M, on the other hand, sucks so bad it couldn't go beyond stock clocks.
-
@long2905, welcome to my side of the party (sucky sucky 7970M). On the other hand, I hope the 780M works in my M6600, as the 680M didn't do so well.
-
For the millionth time: lower voltage = less power consumption = less heat = lower TDP
GTX 680M: 1344 cores @ 1.1V (example) @ 720MHz - - - TDP = 100W
GTX 680MX: 1536 cores @ 1.1V (example) @ 720MHz - - - TDP = 122W
GTX 780M: 1536 cores @ 1.05V (example) @ 720MHz (+GPU Boost) - - - TDP = 100W
The 700M series runs on lower voltage than the 600M series, and it also has GPU Boost 2.0. I have been posting that for about a month now.
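If anyone wants to sanity-check that, here is a rough Python sketch. It assumes dynamic power scales with cores * V^2 * clock, which is a simplification (it ignores leakage and memory/board power), and the voltages are just the example figures above:

```python
# Rough dynamic-power model: P ~ cores * V^2 * f.
# Voltages are the example figures from this post, not confirmed specs.

def relative_power(cores, volts, mhz):
    """Relative dynamic power in arbitrary units."""
    return cores * volts ** 2 * mhz

gtx_680m  = relative_power(1344, 1.10, 720)
gtx_680mx = relative_power(1536, 1.10, 720)
gtx_780m  = relative_power(1536, 1.05, 720)

scale = 100 / gtx_680m  # normalize so the 680M lands on its 100W TDP
print(f"680M:  {gtx_680m * scale:.0f}W")   # 100W by definition
print(f"680MX: {gtx_680mx * scale:.0f}W")  # ~114W: more cores at the same voltage
print(f"780M:  {gtx_780m * scale:.0f}W")   # ~104W: the voltage drop claws most of it back
```

Not exact, but it shows the direction: dropping from 1.10V to 1.05V roughly cancels out the extra 192 cores.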
Why do you think the GTX 780M scores 10% better than the GTX 680MX in 3DMark11? That has to be because the 780M runs higher clocks (GPU Boost 2.0).
680MX 3DMark11
GTX 780M 3DMark11 -
Also, there is one more thing: this is a new generation. The 780M doesn't belong to the GTX 600M family, so we can't judge its TDP 1:1 against the 680MX's specs. Yes, they are both Kepler, but the 780M is surely an optimized version of the 680MX. It's the same reason they were able to clock the 580M higher than the 485M within the same TDP. I guess enabling a few more cores should also be possible, no?
-
I might be wrong, maxheap, but I think the 780M will have very few optimizations compared to the 680MX. They are most likely both GK104.
The GTX 485M was a GF104 core; the GTX 580M was a new chip, GF114, which came with optimizations, and that is why they could raise the clocks.
If the GTX 780M is GK114, however, then we will see lots of optimizations. Or maybe Nvidia has improved the GK104 silicon and baked the optimizations into the chip. We will just have to wait and see, I guess. -
Because for the millionth time:
(A) The 680MX was run on a Mac (hence the lower scores)
(B) Those are not official numbers for the 780M
So in essence you're trying to say that they run 15% more cores, at the same clock speed, with 22% less TDP, and get a 25% higher score? -
10 pages later...
Cloud obviously won't budge on his opinion.
HTWingNut obviously doesn't agree with Cloud.
Let it go, fellas. Just wait until the card is released.
-
Do you even look at the numbers I'm providing you?
They reduced the TDP from 122W to 100W and it performs 8% better, not 25% more. It's as simple as the GTX 680MX running at 720MHz and the 780M boosting up to 778MHz (778/720 is about 1.08). GPU Boost 2.0.
The notebooks still ship with 180W PSUs. What does that tell you? They draw the same power. TDP is the same. Voltage reduction. I have provided you with 3DMark scores that show how they will perform.
"Those are not official numbers"
So you disagree with every number you see just because it isn't an official review by Anandtech or whatever? Even in the Haswell thread you don't believe screenshots, although they are from an MSI notebook.
"680mx was run on a MAC"
So? -
The 780M 3DMark11 from screenshots was ~7600, right?
The 680M 3DMark11 is ~6100, right?
7600 is about 25% higher than 6100.
That tells me that it may be clocked slower.
No, I'm skeptical because it's from an unknown source. If someone showed you a photo of a bridge for sale, would you buy it based on their comments alone? It could have been overclocked, even at stock voltage. Show a screenshot with GPU-Z or something indicating clock speeds, not just the end result.
Macs are notorious for running slower with Windows software, that's all.
You're right. I give up. I will just wait and see instead of hashing out with unknowns. -
780M, 1.05V, 100W TDP: 1536 cores * 780MHz = 1198080
680M, 1.10V, 100W TDP: 1344 cores * 720MHz = 967680
1198080 / 967680 = 1.238, i.e. +24%
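For anyone who wants to check that arithmetic, here is a quick sketch (the 780MHz boost clock is the assumed figure, not a confirmed spec):

```python
# Naive throughput estimate: shader cores * core clock (MHz).
# The 780MHz figure for the 780M is an assumption (GPU Boost), not a confirmed spec.

def throughput(cores, mhz):
    return cores * mhz

gtx_780m = throughput(1536, 780)  # 1,198,080
gtx_680m = throughput(1344, 720)  # 967,680

print(f"780M vs 680M: {gtx_780m / gtx_680m - 1:+.1%}")  # +23.8%
```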
It fits. -
You just said 8%...
-
Siiiigh. Against the 680MX: 7100 vs 7700 points. 8%.
GTX 680MX, 1.10V, TDP 122W: 1536 cores * 720MHz = 1105920
GTX 780M, 1.05V, TDP 100W: 1536 cores * 780MHz = 1198080
1198080 / 1105920 = 1.08, i.e. +8% -
And where did you come up with 780MHz? I could have thrown 650MHz in there. More cores, more power required, right? Just because it's 1.05V you won't magically gain 15% more cores AND reduce TDP by 22% AND increase scores by 25% (over the 680M).
780M, 1.05V, 100W TDP: 1536 cores * 650MHz = 998400
680M, 1.10V, 100W TDP: 1344 cores * 720MHz = 967680
998400 / 967680 = 1.032, i.e. +3.2%
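Point being, the same naive cores * clock model swings wildly depending on which clock you assume. A quick sweep, where every clock is hypothetical until someone posts a GPU-Z screenshot:

```python
# Same naive cores * clock model, sweeping the assumed 780M boost clock.
# All of these clocks are hypothetical.

BASE_680M = 1344 * 720  # 967,680

for mhz in (650, 720, 780):
    gain = (1536 * mhz) / BASE_680M - 1
    print(f"780M @ {mhz}MHz: {gain:+.1%} vs 680M")
# 650MHz -> +3.2%, 720MHz -> +14.3%, 780MHz -> +23.8%
```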
It fits. -
Let's take it again, HTWingNut. Are you getting old?
It's reduced TDP, from 122W to 100W (-18%), with an EIGHT percent (8%) score increase (680MX vs 780M).
Or
The same TDP but a 24% score increase (680M vs 780M) -
King of Interns Simply a laptop enthusiast
Maybe, HTWingNut, chip makers are into Harry Potter and all that these days, so anything is possible, right?
Only kidding, Cloud. I look forward to seeing the 780 in action.
-
I'm not getting old, I am old. And I'll be happy to admit I'm wrong, but it's just not adding up. I will now officially shut up and wait for actual cards to show in the hands of reputable sources. I hope we can all do the same.
-
I personally see it adding up. But yeah, nothing to do than wait and see. -
Is there a bookie taking bets here? Before that happens, we need a third option: that you are both wrong. I'll be betting on that one.
-
Whoever loses must have "I bow down to you, [winner's name], almighty oracle" in his title or signature.
I'm down with that
-
No, one will have "lucky" the other "not lucky" as their title.
-
Just read that the GTX 780 (desktop GPU) is expected to be released on May 23rd, so we will know soon enough.
http://www.maximumpc.com/article/news/nvidia_rumored_release_geforce_gtx_780_graphics_card_may_23 -
Pffft, coward. You might base your expectations on luck; I base mine on the specifications out there.
-
LOL... I remember last year when we had the same thread about the GTX 680M; the discussion ended with a lot of us getting banned...
-
^^ good times
-
Very nice score. Will it be possible to buy one separately and stick it into an MSI GX60?
-
Meaker@Sager Company Representative
Cloud still owes me on that, since I predicted everything green and red correctly.
-
Should be a nice and predictable incremental upgrade from the 680M. Why are some people bringing up compute performance for a gaming GPU? I guess if you're into Bitcoin mining, then by all means get an AMD card and have fun wasting electricity. For any serious GPGPU work, you're going to use CUDA.
Something important to consider is that CrossFire in its current form is severely broken and causes noticeable microstutter due to its runt frames and high latency. Single-GPU users won't care, but those in the Clevo and AW crowd should take note.
Sent from my GT-N7000 -
Pft, you were far from spot on with the 680M performance, if I remember correctly. I, however, was bullseye: I predicted the 680M performance and could pretty much foresee any Kepler chip that came out.
Remember how everyone said "Nvidia is going under. It will only release that P4500 GPU" and I had to tell everyone that they were working on another GPU to replace it? I do.
On the 7970M, however, you were spot on and I was way over "there". I didn't believe anything you said when it was announced. Don't act surprised, though. I have never been accustomed to AMD space technology, because that's how I see it. It's alien and strange to me and I'm not sure I can trust it.
-
Just to kill the discussion before it even starts: TPU is wrong on the TDP. It's 100W. The rest seems to match.
NVIDIA GeForce GTX 780M | techPowerUp GPU Database -
failwheeldrive Notebook Deity
ermahgerd, this is off topic but I want your desktop lol. Can I borrow it for a year or two?
-
Funny thing is, all he does is play WoW at 6000 FPS and surf Pr0n.
-
Lol'd
Total overkill on his setup unless he has like six monitors, except maybe the CPU. -
failwheeldrive Notebook Deity
Titan SLI is perfect for 2560x1440/1600. It plays all current games at great framerates, and they'll be reasonably future-proof as well (not that anyone with two Titans will keep them longer than a year lol.) -
Hey! You already have one, let the others have some candy!
-
failwheeldrive Notebook Deity
lol but his is beeetttteeeeeerrrrr :'( -
Don't forget MLG minecraft. Gotta get that 9001 fps.
-
Meaker@Sager Company Representative
Errr, and I quote:
1160-1180MHz at 1.1V later... I almost hit the 9500 score you were joking about... The only reason I did not is that I need a system without Optimus and a CPU with voltage control. -
Get one more Titan and you'll be close enough
Wrong. At 1440p with no AA, maybe one Titan would be okay, but with AA added, two OC'd Titans can handle games like BF3 @ 1440p maxed out + AA and keep the fps from ever dropping below 60... that is true gaming bliss.
Yep, pretty much. Just for extra measure I OC mine hahaha. Can't say I miss my M18x at all. BTW, svl7 developed an unlocked vBIOS for the Titan as well, if you didn't know already; it gets rid of GPU Boost 2.0, the power limit is 350W and the voltage can be set to 1.215V. -
King of Interns Simply a laptop enthusiast
My word, joker. How could you afford two of those beasts!
-
Meaker@Sager Company Representative
Way less than a seat upgrade on some cars. If you have a real income, then dropping 3-4 grand on a system is not tricky.
-
King of Interns Simply a laptop enthusiast
It depends on whether you have a family to support and a wife who would perhaps object!
That would be me, unfortunately.
I have to win her over every year when I upgrade my GPU by saying that I make "most" of the money back by selling the old one...
-
failwheeldrive Notebook Deity
Awesome, I'll head over to TI to check it out, thanks for the heads up! I'm still debating whether I should pick up a second Titan or get a new case and go water cooling... I know what you mean though; the M18x is an incredible machine, but I don't miss mine much either (unless I'm lugging my desktop around the house for some reason).
-
Hence the rolleyes.
I read that dual GTX 690s in SLI will perform better than dual Titans. True? -
Yes. The TITAN is still a single card. The 690 is two downclocked 680s in one card, so it'll still outperform a single TITAN by a bit. But dual 690s are quad SLI; three or more TITANs will beat that, whereas you cannot go past two 690s.
-
Meaker@Sager Company Representative
Unlikely, as third- and especially fourth-card scaling is shaky at best, along with all the extra stutter and input lag it brings.