With the 690 you can run quad SLI (2x3072 CUDA cores, since 2x690 is roughly 4x680 minus a little clock speed), as it's a dual-GPU card. But with the TITAN you can run 3-way SLI (3x2688 CUDA cores). I guess triple SLI TITAN should perform much better, right? Damn, I miss having a desktop!
Also, as Meaker is saying, I think that in terms of efficiency at least, triple SLI is way better than quad SLI, right?
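To put rough numbers on the core-count comparison, here's a back-of-the-envelope sketch (base clocks are approximate reference values and SLI scaling losses are ignored entirely, so treat it as shader-throughput napkin math, not a benchmark):
[CODE]
# Napkin math: CUDA cores x base clock as a crude proxy for shader throughput.
# Clocks are approximate; SLI/quad-SLI scaling losses are not modeled.
cards = {
    "GTX 680":   {"cores": 1536,     "clock_mhz": 1006},
    "GTX 690":   {"cores": 2 * 1536, "clock_mhz": 915},   # dual-GPU card
    "GTX TITAN": {"cores": 2688,     "clock_mhz": 837},
}

def raw_throughput(card):
    return card["cores"] * card["clock_mhz"]

setups = {
    "4x 680 (quad SLI)":  4 * raw_throughput(cards["GTX 680"]),
    "2x 690 (quad SLI)":  2 * raw_throughput(cards["GTX 690"]),
    "3x TITAN (tri SLI)": 3 * raw_throughput(cards["GTX TITAN"]),
}

for name, total in setups.items():
    print(f"{name}: {total / 1e6:.2f}M core-MHz")
[/CODE]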
-
-
Meaker@Sager Company Representative
I would take 2 titans over 2 690s let alone 3.
-
Anything more than two-way SLI is just a waste, so I would not take 3 of anything. Other than that, the 690 is limited by RAM (2GB), so Titans for me.
-
failwheeldrive Notebook Deity
-
-
Wish I was THAT rich too hahaha
Waiting for the 780M and Haswell. I hope I won't be disappointed -
Meaker is right, the performance of dual 690s doesn't scale well at all. This is what I got with an OC using svl7's vBIOS for the Titan: http://www.3dmark.com/3dm11/6495702 and here's 2x690: http://www.3dmark.com/3dm11/3605389. With a large OC they can reach 37k GPU, which isn't the greatest scaling.
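(A quick way to put "scaling" into a single number is the multi-GPU score divided by n times the single-GPU score. The figures below are placeholders just to show the calculation, not the exact results behind the links.)
[CODE]
# SLI scaling efficiency = multi-GPU score / (n x single-GPU score).
# Example numbers are placeholders, not the scores from the links above.
def scaling_efficiency(multi_gpu_score, single_gpu_score, n_gpus):
    return multi_gpu_score / (single_gpu_score * n_gpus)

# e.g. a hypothetical 4-GPU setup at 37000 vs a single GPU at 12000
print(f"{scaling_efficiency(37000, 12000, 4):.0%} of ideal scaling")  # ~77%
[/CODE]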
Anyway back on topic I guess, we went way OT. -
TheBlackIdentity Notebook Evangelist
-
Titan SLI is clearly a much better choice than 690 quad SLI. Not just performance, but heat, power consumption, noise, latency, just about everything.
NVIDIA GeForce GTX TITAN SLI & Tri-SLI Review | techPowerUp -
failwheeldrive Notebook Deity
Oh, and clear your inbox. I can't PM you anymore.
-
grrrrrrrrrr, jealous!!!
-
TheBlackIdentity Notebook Evangelist
I am, however, thinking about two 780s or even a 790 if Nvidia makes it.
PS: If anyone wants to get that screen make sure to buy the P version. The one that doesn't have a P is useless for gaming. Here's the review on the P version. http://www.anandtech.com/show/6741/lg-29ea93-monitor-review-rev-125 -
You people are forgetting that game recording and streaming are a thing, and even though a Titan at 1080p should run most anything at 60fps, there are people with 120Hz and 144Hz 1080p monitors who stream at 720p 60fps and want all their games to hold 120/144fps constant, minimum. One Titan can't really do that (ESPECIALLY not at max settings, AA or not). Trust me. Even less so once 1080p 60fps streaming becomes standard as internet speeds improve worldwide and people need to squeeze even more power out of their systems. Currently the only good method is to build a second PC and hook your gaming machine up to a capture card, but capture cards simply don't grab 1080p 60fps right now (at least none that I know of), and you have to do a bunch of extra work to get the audio working.
The Titan should have been our desktop 680, the cut-down Titan should have been our 670, the 680 should have been our 660 Ti, the 670 should have been our 660, and our low-end desktop cards should have landed somewhere around the 660 Ti, 660 and 650 Ti, etc. That would have propelled gaming very far, because I'm going to assume the price points would have stayed similar and there'd be no $800 single GPU. That's also why our laptop cards are so close to the desktops: they're the midrange architecture. For 1440p gaming (especially with 120Hz screens and above) to take off, we'll need another jump like the one from the GTX 580 to the Titan, measured against the current Titan. Of course, if Maxwell can do that in 2014, the new consoles will probably start holding us back again not even a year after they release, but oh well.
On a side note, running multiple screens doesn't need AS MUCH memory as people make it out to... it does require a bit, but not a huge amount. If you're running a GAME on multiple screens though, like how people use 3 screens or even 6 screens for a huge panoramic view, then yeah. You're gonna need massive amounts of GPU memory for damn sure... but it doesn't happen very often and many games don't have the ability to run like that without community mods or even at all, mod or not. So I don't know how relevant those massive screen numbers really are with regards to memory since most gaming is going to be done on one monitor. -
King of Interns Simply a laptop enthusiast
I am pretty sure the human eye cannot perceive the difference between a steady 60 fps and something higher. Let alone more than double!
So why do they need their games to run at 120/144 fps constant? What we can perceive is frame speed drop even for short amounts of time. -
@king: You must never have played on a 120Hz screen to say that. The human eye can perceive even a 1/4000-second image or flash of light, depending on the contrast and lighting. It's only a matter of retinal persistence.
@d2: streaming is limited by the CPU, not the GPU. -
TheBlackIdentity Notebook Evangelist
-
-
Interesting, do you remember where you first read that? (Nothing sarcastic, I just wanted to read a bit about it.)
-
TheBlackIdentity Notebook Evangelist
-
Hey guys! the eye can only see 24 fps obviously!
I really hate this misconception, and it's one that simply won't die off no matter how many years pass -
I definitely did, though I didn't see any link to the US Air Force testing.
-
I found a summary of the USAF testing:
-
Kinda puts it into perspective. If the 780M is an underclocked GTX 680, then we're about to see some SERIOUS numbers being pushed.
I'm still waiting on the mobile TITAN to appear. >w> -
Thanks bro!
-
-
-
I've used a 120Hz screen. It's night and day between 60Hz and 120Hz in my opinion. 120Hz is so much better. To see the difference while gaming you need over 60FPS, so it's really important to have a good GPU with those screens.
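If it helps to see it as numbers, the per-frame time budget is just 1000 ms divided by the frame rate, so 120fps halves the time each frame stays on screen compared to 60fps:
[CODE]
# Per-frame time budget at common frame rates (simple arithmetic).
for fps in (24, 30, 60, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
[/CODE]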
-
King of Interns Simply a laptop enthusiast
Or two or more lol if you like to play Crysis 3 maxed out
-
Most people don't see the benefit of 120Hz screens, mainly because they either don't have a monitor capable of such a refresh rate or they don't have content created around those frame rates. It's easy to dismiss the benefit until you play around with it.
-
Quagmire LXIX Have Laptop, Will Travel!
With the early numbers seen on the 780M and 8970M, I'm thinking it's worth it if you're coming from a 580M or 6990M; pat yourself on the back for showing restraint by not upgrading to a 680M or 7970M from those (even though both gave a great boost, now you'll get a super boost).
A "Titan" mobile would likely be breaking some type of physics law. It would be the supercollider fears of ripping open a black hole all over again.
-
failwheeldrive Notebook Deity
The Titan is actually incredibly efficient given its performance, making it a great candidate for a mobile version imo. Since the desktop GK110 is only 250W, it shouldn't be too difficult to create a 100-150W mobile card with lower voltage, fewer cores, less memory bandwidth, etc. -
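Rough intuition for why that's plausible (a simplified model with made-up example numbers, not real GK110 bins): dynamic power scales roughly with active units x frequency x voltage squared, so dropping voltage and a few SMX clusters cuts power much faster than it cuts performance.
[CODE]
# Simplified dynamic-power scaling: P ~ active_units * frequency * voltage^2.
# All numbers below are illustrative guesses, not actual GK110 specifications.
def relative_power(units, freq_mhz, volts):
    return units * freq_mhz * volts ** 2

desktop = relative_power(units=14, freq_mhz=837, volts=1.16)  # Titan-like full part
mobile  = relative_power(units=10, freq_mhz=650, volts=0.95)  # hypothetical mobile cut

print(f"Mobile part at roughly {mobile / desktop:.0%} of desktop power")
# ~37% of 250W would land in the ballpark of a 90-100W mobile TDP
[/CODE]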
Quagmire LXIX Have Laptop, Will Travel!
No doubt a great Kepler and of course I was being facetious, but wouldn't that mean EVGA gets into the mobile market? Man, would I like to see that.
-
TheBlackIdentity Notebook Evangelist
-
-
(Nothing to see here)
-
-
-
Sent from my SPH-L900 using Tapatalk 2 -
I think the alarming part here is that he doesn't get paid for this, but comes up with such stuff all by himself.
-
Haha, he is just very enthusiastic
I think. Maybe he does get paid by nVidia hahaha.
Nonetheless, the cool thing about the 780M is that every single clock increase will yield more performance than on my 680M. OCing will be fun. Having full high-end desktop-level performance on mobile is quite an interesting idea... hmm... -
TheBlackIdentity Notebook Evangelist
-
Meaker@Sager Company Representative
Or they just rotate the core 45 degrees....
-
Oh man, I've been thinking:
GT 650M: 384 cores @ 850MHz. GDDR5 @ 900MHz
3DMark11 Graphic score: 2300
GT 750M: 384 cores @ 967MHz - 1100MHz. GDDR5 @ 1250MHz
3DMark11 Graphic score: 3012
30% higher score, 30% higher clocks
-----------------------------------------------------------------------------------------------------------
Here comes my worst thoughts:
GTX 680M: 1344 cores @ 720MHz. GDDR5 @ 900MHz
3DMark11 Graphic score: 6000
GTX 780M: 1344 cores @ 800-940MHz. GDDR5 @ 1250MHz
3DMark11 Graphic score: 7700
30% higher score, 30% higher clocks.
Do you guys see the similarity here?
What if the GTX 780M is nothing more than a GTX 680M with GPU Boost and lower voltage? Would they dare to do something like that? -
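The similarity really is just the arithmetic lining up. Using the numbers quoted above (taking the upper boost clock in each case):
[CODE]
# Percent increases from the figures quoted above (upper boost clock used).
def pct(new, old):
    return (new - old) / old * 100

print(f"GT 650M -> GT 750M:   clock +{pct(1100, 850):.0f}%, score +{pct(3012, 2300):.0f}%")
print(f"GTX 680M -> GTX 780M: clock +{pct(940, 720):.0f}%, score +{pct(7700, 6000):.0f}%")
# Both land around +30% on clocks and +30% on score.
[/CODE]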
It's part of what we mentioned before. It's possible for it to be an OC'd version, or likewise a 680MX with a proportional clock speed. The 680M certainly had enough headroom, so only time will tell.
When is it supposed to release? -
June most likely.
-
If I had to guess:
Announcement May 23rd along with the GTX 780, or
at Computex, June 4-8th.
If they just slap on GPU Boost and call it a day I'm not sure I want to upgrade. Haswell plus 3 SSDs in RAID 0 is pulling me toward it, but I have to consider it more carefully. It's nice to have a system that overclocks the GPU for me though, and lets me pick a maximum temperature, so who knows.
The GTX 680MX as the 780M is more exciting though, because I'd have more cores to play with. -
Meaker@Sager Company Representative
3 SSDs in RAID 0 won't help anything :/
-
Sure, just 1000MB/s faster than a single SSD and 500MB/s faster than my current RAID 0 card...
"Won't help anything" is a very vague reply, Meaker.
Meaker@Sager Company Representative
Straight-line speed improves, yes, but small-file speeds see no benefit from RAIDing, so Windows won't load faster, nor will most games.
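To illustrate the point (idealized numbers, not benchmarks of any particular drive or controller): striping multiplies large sequential transfers, but latency-bound 4K reads, which dominate OS and game loading, barely move.
[CODE]
# Idealized RAID 0 scaling: sequential throughput stacks, small random reads don't.
# Figures are illustrative, not measurements of any specific SSD.
single_seq_mb_s = 500   # large sequential read, one SSD
single_4k_mb_s  = 30    # 4K random read at low queue depth, one SSD

for n in (1, 2, 3):
    seq = single_seq_mb_s * n   # striping multiplies large-block throughput
    rnd = single_4k_mb_s        # latency-bound small reads see little benefit
    print(f"{n} drive(s): ~{seq} MB/s sequential, ~{rnd} MB/s 4K random")
[/CODE]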
-