The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    3DMark11 on the upcoming GTX 780M!!!

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 2, 2013.

  1. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    with the 690 you can make quad SLI (2 x 690 = 2 x 3072 CUDA cores, which is roughly 4 x 680 minus an epsilon) in terms of single-GPU cards, since it's a dual-GPU card. But with the TITAN you can make 3x TITAN (3 x 2688 CUDA cores). I guess triple SLI TITAN should perform much better, right? Damn, I miss having a desktop :(

    Also, as Meaker is saying, I think that in terms of efficiency at least, triple SLI is way better than quad SLI, right?
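
    If anyone wants to sanity-check that core math, here's a rough back-of-the-envelope in Python (pure core counts only, ignoring clocks, memory and SLI scaling losses, so it's an upper bound, not a benchmark):

        # Aggregate shader-core tally. Ignores clocks, memory and SLI
        # scaling losses, so treat this as an upper bound, not performance.
        cores_690 = 3072    # GTX 690 = 2x GK104, cores per card
        cores_titan = 2688  # GTX Titan, cores per card

        quad_sli = 2 * cores_690     # 2 cards, 4 GPUs to keep in sync
        tri_titan = 3 * cores_titan  # 3 cards, only 3 GPUs

        print(quad_sli, tri_titan)   # 6144 vs 8064
        # 3x Titan has ~31% more raw cores AND one fewer GPU to
        # synchronize, so it should scale better in SLI too.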
     
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I would take 2 titans over 2 690s let alone 3.
     
  3. Undyingghost

    Undyingghost Notebook Evangelist

    Reputations:
    78
    Messages:
    437
    Likes Received:
    33
    Trophy Points:
    41
    Anything more than SLI is just a waste, so I would not take 3 of anything. Other than that, the 690 is limited by its RAM (2GB), so Titans for me.
     
  4. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    It really depends on your monitor setup. It goes sort of like this: 1080p = Titan, 1440/1600 = Titan SLI, 5760x3240 and up = 4 Titans.
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    That would be insane. Wish I was rich... :D
     
  6. felix3650

    felix3650 Notebook Evangelist

    Reputations:
    832
    Messages:
    631
    Likes Received:
    224
    Trophy Points:
    56
    Wow, that is just WOW. A Titan is powerful by itself, let alone three of them, each one driving its own monitor. I guess you'd have trouble finding a suitable PSU :p
    Wish I was THAT rich too hahaha :)

    Waiting for the 780M and Haswell. I hope I won't be disappointed :D
     
  7. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Meaker is right, the performance of dual 690s doesn't scale well at all. This is what I got with an OC using svl7's vBIOS for the Titan: http://www.3dmark.com/3dm11/6495702 and here's 2 x 690: http://www.3dmark.com/3dm11/3605389. With a large OC they can reach a 37k GPU score, which isn't the greatest scaling.
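
    (To put a number on "doesn't scale well": divide the multi-GPU score by N times a single-GPU run. Quick Python sketch; the single-GPU baseline below is a placeholder, so plug in your own 3DMark11 GPU scores:)

        # SLI scaling efficiency = multi-GPU score / (N * single-GPU score).
        # The single-GPU baseline here is a placeholder, not a measured run.
        def sli_efficiency(multi_score, n_gpus, single_score):
            return multi_score / (n_gpus * single_score)

        single_gk104 = 11000  # hypothetical single-GPU 3DMark11 GPU score
        print(f"{sli_efficiency(37000, 4, single_gk104):.0%} of ideal")  # ~84%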

    Anyway back on topic I guess, we went way OT.
     
  8. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    Well, if you don't go insane with the anti-aliasing, one overclocked Titan is enough for 1440p, especially if it's on water. At that res I don't see much point going above 2x MSAA anyway. :rolleyes:
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  10. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    That's true, the Titan could be considered overkill for 1080p. I need a better monitor lol.

    Oh, and clear your inbox. I can't pm you anymore :p

    Absolutely correct. Nothing beats the Titan's efficiency or latency :thumbup:
     
  11. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    grrrrrrrrrr, jealous!!!
     
  12. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    I think a single Titan is a perfect match with this screen. Newegg.com - LG 29EA93-P 29" 5ms HDMI 21:9 UltraWide LED Backlight LCD Monitor, IPS Panel 300 cd/m2 5,000,000:1 4-Screen Split Built-in Speakers

    I am, however, thinking about two 780s or even a 790 if Nvidia makes it. :D

    PS: If anyone wants to get that screen make sure to buy the P version. The one that doesn't have a P is useless for gaming. Here's the review on the P version. http://www.anandtech.com/show/6741/lg-29ea93-monitor-review-rev-125
     
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You people are forgetting that game recording and streaming are a thing, and even though a Titan at 1080p should run most anything at 60fps, there are people with 120Hz and 144Hz 1080p monitors who stream 720p 60fps and want their games to all run at 120/144 fps constant, minimum. One Titan can't really do that (ESPECIALLY not at max settings, AA or not). Trust me. Far less when 1080p 60fps streaming becomes a solid thing as internet speeds steadily improve worldwide and people need to squeeze even more power out of their systems. Currently the only good method of doing this is to build a second PC and hook your gaming machine up to a capture card, but capture cards simply don't grab 1080p 60fps right now (at least none that I know of) and you have to do a bunch of stuff to get the audio working on it.

    The Titan should have been our desktop 680, the cut-down Titan should have been our 670, the 680 should have been our 660 Ti, the 670 should have been our 660, and our low-end desktop cards should have landed somewhere around the 660 Ti, 660 and 650 Ti, etc. That would have propelled gaming very far, because I'm gonna assume their price points would have been similar and there'd be no $800 single GPU. That's why our laptop cards are so close to the desktops: they're the midrange architecture. 1440p gaming (especially with 120Hz screens and above) is going to require a jump over the current Titan as big as the one from the GTX 580 to the Titan. Of course, if Maxwell can do that in 2014, the new consoles will probably start holding us back again not even a year after they release :p but oh well.

    On a side note, running multiple screens doesn't need AS MUCH memory as people make it out to... it does require a bit, but not a huge amount. If you're running a GAME on multiple screens though, like how people use 3 screens or even 6 screens for a huge panoramic view, then yeah. You're gonna need massive amounts of GPU memory for damn sure... but it doesn't happen very often and many games don't have the ability to run like that without community mods or even at all, mod or not. So I don't know how relevant those massive screen numbers really are with regards to memory since most gaming is going to be done on one monitor.
     
  14. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I am pretty sure the human eye cannot perceive the difference between a steady 60 fps and something higher, let alone more than double!

    So why do they need their games to run at 120/144 fps constant? What we can perceive is a frame rate drop, even for short amounts of time.
     
  15. edryr

    edryr Notebook Consultant

    Reputations:
    24
    Messages:
    258
    Likes Received:
    75
    Trophy Points:
    41
    @King: You must never have played on a 120Hz screen to say that. The human eye can perceive even a 1/4000 sec flash of an image or light, depending on the contrast and lighting. It's only a matter of retinal persistence.

    @D2: streaming is limited by the CPU, not the GPU.
     
  16. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    Oh great, this stupid misconception again. The human eye can detect up to 230 fps. Tested by the US Air Force.
     
  17. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Not this again. Every person perceives motion differently... and everyone has their own comfort zone. So whether it's 30 fps, 60 fps or 240, it doesn't matter.
     
  18. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Interesting, do you remember where it was you first read that? (Nothing sarcastic, I just wanted to read a bit about it.)

     
  19. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    Google it.
     
  20. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Hey guys! The eye can only see 24 fps, obviously! :rolleyes:

    I really hate this misconception, and it's one that simply won't die off no matter how many years pass :p
     
  21. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I definitely did, but I didn't see any link to the US Air Force.

     
  22. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I found a summary of the USAF testing:
    There is no original source, as far as I can find.
     
  23. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    It's a start though. :p Kinda puts it into perspective. If the 780M is an underclocked GTX 680, then we're about to see some SERIOUS numbers being pushed.

    I'm still waiting on the mobile TITAN to appear. >w>
     
  24. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Thanks bro!

     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Mostly, but not entirely. Games DO take an fps hit when streaming, even if the CPU isn't maxed out. This I can guarantee. For 60fps it may not be a problem, but like I said, when you want 120Hz at 1440p while streaming 60fps at 720p or 1080p...
     
  26. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Beyond the fact that most gamers who are accustomed to it can tell the difference between 60Hz and 75Hz (far less 120Hz), some games simply work better at higher framerates. Call of Duty, for example, works best at a constant 333 fps, followed by 250, then 125, then ~72, then 60, etc.
     
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I've used a 120Hz screen. It's night and day between 60Hz and 120Hz in my opinion; 120Hz is so much better. To see the difference while gaming you need over 60 FPS, so it's really important to have a good GPU with those screens.
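
    (The frame-time arithmetic shows why, just as a quick calc:)

        # Frame budget per refresh = 1000 ms / refresh rate.
        for hz in (60, 120, 144):
            print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
        # 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms.
        # A 120 Hz panel only shows a new image every refresh if the GPU
        # sustains more than 60 FPS, i.e. finishes each frame in ~8.3 ms.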
     
  28. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Or two or more lol if you like to play Crysis 3 maxed out
     
  29. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Most people don't see the benefit of 120Hz screens, mainly because they don't have a monitor capable of such a rate or higher, or because they don't have content created around such framerates. It's easy to dismiss the benefit until you play around with it.
     
  30. Quagmire LXIX

    Quagmire LXIX Have Laptop, Will Travel!

    Reputations:
    1,368
    Messages:
    1,085
    Likes Received:
    45
    Trophy Points:
    66
    Sure enough it's a start, and the yet-undetermined OC potential could be huge (I'd love to see it), but I was expecting more with the 780M being so removed from the 7970M (the 680MX and 680M already have higher GPU scores and typically better OC potential). Like a couple folks have posted though, some people have crappy 7970M overclockers; it's always the luck of the draw, more so when you don't want to adjust voltages or the vBIOS. In this case, based on the presented scores of one benchmark, you would clearly be buying a good GPU boost of about 22%, 780M stock vs. 7970M stock, and adding green-team goodies on top.

    With the early numbers seen on the 780M and 8970M, I'm thinking it's worth it if you're coming from a 580M or 6990M, and you can pat yourself on the back for showing restraint in not upgrading to a 680M or 7970M from those (even though both gave a great boost, now you'll get a super boost).

    A "Titan" mobile would likely be breaking some type of physics laws :) It would be super-collider fears of ripping open a black hole all over again.
     
  31. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56

    The Titan is actually incredibly efficient given its performance, making it a great candidate for a mobile version imo. Since the desktop GK110 is only 250W, it shouldn't be too difficult to create a 100-150W mobile card with lower voltage, fewer cores, less memory bandwidth, etc.
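
    (Rough scaling sketch of why that's plausible: dynamic power goes roughly with cores x clock x voltage squared. The desktop figures are the Titan's; the mobile cut is invented purely for illustration, not a leaked spec:)

        # Dynamic power ~ active cores * clock * V^2 (very rough model).
        desktop = dict(cores=2688, mhz=837, volts=1.162, tdp=250)  # GTX Titan
        mobile = dict(cores=1920, mhz=650, volts=0.95)  # hypothetical cut

        scale = (mobile["cores"] / desktop["cores"]) \
              * (mobile["mhz"] / desktop["mhz"]) \
              * (mobile["volts"] / desktop["volts"]) ** 2

        print(f"~{desktop['tdp'] * scale:.0f} W")  # ~93 W, mobile territory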
     
  32. Quagmire LXIX

    Quagmire LXIX Have Laptop, Will Travel!

    Reputations:
    1,368
    Messages:
    1,085
    Likes Received:
    45
    Trophy Points:
    66
    No doubt a great Kepler and of course I was being facetious, but wouldn't that mean EVGA gets into the mobile market? Man, would I like to see that.
     
  33. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    Actually, if they made a new MXM board they wouldn't even need to cut the memory bus. 2112 cores with a 384-bit bus and 3GB of memory would be doable. The new board would need to be wider though, because the RAM would need to surround the core on 3 sides to accommodate the lanes connecting it to the core.
     
  34. harmattan

    harmattan Notebook Evangelist

    Reputations:
    432
    Messages:
    642
    Likes Received:
    55
    Trophy Points:
    41
    My thoughts as well. Since there is really only one game (Crysis 3) that currently can't be maxed at 1080p by dual 680Ms, we are really getting into the territory of diminishing returns (unless you're looking for more 3DMark points, in which case you have more money than sense). I can, however, see the benefit if you're coming from a previous-gen chip, e.g. 485M/580M/675M or 6990M, or if you have a single 680M or 7970M and that's just not cutting it.
     
  35. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    (Nothing to see here)
     
  36. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Which is why I'm expecting MXM 4.0 with Maxwell.
     
  37. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    Those really intriguing posts of yours seem to be becoming pretty frequent... Maybe you should think about what you write before hitting the "reply" button; I think the overall quality of the threads and discussions here would definitely improve, especially if you did this every time you post something.

    Very unlikely, it's not like the industry wants even bigger cards (well, besides Asus and Samsung, who apparently aren't capable of putting all the necessary parts for a GPU on a current MXM board). With the introduction of PCIe 3.0 in revision 3.1 of the spec, MXM is up to date, and UEFI support has been there ever since the announcement of MXM 3.0. I can't really think of anything they need to add just now.
     
  38. dandan112988

    dandan112988 Notebook Deity

    Reputations:
    550
    Messages:
    1,437
    Likes Received:
    251
    Trophy Points:
    101
    How much does nvidia pay you per month?

    Sent from my SPH-L900 using Tapatalk 2
     
  39. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    I think the alarming part here is that he doesn't get paid for this, but comes up with this stuff all by himself.
     
  40. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Haha, he is just very enthusiastic :p I think. Maybe he does get paid by nVidia hahaha.

    Nonetheless, the cool thing about the 780M is that every single clock increase will yield more performance than my 680M. OCing will be fun. Having full high-end desktop-level performance on mobile is quite an interesting idea... hmm...
     
  41. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    Maxwell is getting GDDR6 so they can get away with a 256-bit bus. I think Volta will be the one that gets a new MXM board. They'll need more space for the core with the stacked memory surrounding it.
     
  42. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Or they just rotate the core 45 degrees....
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Oh man, I've been thinking:

    GT 650M: 384 cores @ 850MHz, GDDR5 @ 900MHz
    3DMark11 Graphics score: 2300

    GT 750M: 384 cores @ 967MHz - 1100MHz, GDDR5 @ 1250MHz
    3DMark11 Graphics score: 3012

    30% higher score, 30% higher clocks

    -----------------------------------------------------------------------------------------------------------

    Here come my worst thoughts:

    GTX 680M: 1344 cores @ 720MHz, GDDR5 @ 900MHz
    3DMark11 Graphics score: 6000

    GTX 780M: 1344 cores @ 800 - 940MHz, GDDR5 @ 1250MHz
    3DMark11 Graphics score: 7700

    30% higher score, 30% higher clocks.

    Do you guys see the similarity here?
    What if the GTX 780M is nothing more than a GTX 680M with GPU Boost and lower voltage? Would they dare to do something like that?
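
    (The pattern in a few lines of Python, numbers straight from the post above:)

        # Same core count, score tracking clocks almost 1:1.
        pairs = {
            "GT 650M -> GT 750M": (2300, 3012, 850, 1100),
            "GTX 680M -> GTX 780M": (6000, 7700, 720, 940),
        }
        for name, (s0, s1, c0, c1) in pairs.items():
            print(f"{name}: score +{s1 / s0 - 1:.0%}, clock +{c1 / c0 - 1:.0%}")
        # Both land around +30%/+30%, which is what you'd expect if the
        # 780M is just a clock-bumped 680M rather than a bigger chip.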
     
  44. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    It's part of what we mentioned before. It is possible for it to be an OC'd version, or likewise a 680MX with a proportional clock speed. The 680M certainly had enough room, so only time will tell.

    When is it supposed to release?
     
  45. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    June most likely.
     
  46. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    If I had to guess: announcement May 23rd along with the GTX 780, or at Computex, June 4-8.

    If they just slap on GPU Boost and call it a day, I'm not sure I want to upgrade. Haswell plus 3 SSDs in RAID 0 is pulling me toward it, but I have to consider it more carefully. It's nice to have a system that overclocks the GPU for me though, and lets me pick a maximum temperature, so who knows.
    A GTX 680MX as the 780M is more exciting though, because I'd have more cores to play with.
     
  47. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    3 SSDs in RAID 0 won't help anything :/
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Sure, just 1000MB/s faster than a single SSD and 500MB/s faster than the current RAID 0 setup...

    "Won't help anything" is a very vague reply, Meaker.
     
  49. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Straight-line (sequential) speed improves, yes, but small-file speeds see no benefit from RAIDing, so Windows won't load faster, nor will most games.
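
    (A toy model of the point: RAID 0 stripes large transfers across drives, but a small 4K read still hits a single drive and is latency-bound. Numbers are invented for illustration:)

        # Toy model: sequential throughput stripes across N drives;
        # latency-bound 4K random reads do not. Invented numbers.
        SEQ_MB_S = 500    # one SSD, sequential
        RAND4K_MB_S = 30  # one SSD, 4K random at low queue depth

        for n in (1, 2, 3):
            print(f"{n}x RAID 0: seq ~{n * SEQ_MB_S} MB/s, "
                  f"4K random still ~{RAND4K_MB_S} MB/s")
        # OS boot and game loads are dominated by small reads,
        # hence "won't load faster".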
     
  50. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    You do realize that the 680M comes with Boost, just like any other Kepler GPU?
     