The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    3DMark11 on the upcoming GTX 780M!!!

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 2, 2013.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Straight line? There is plenty of software, including scenarios where sequential speed comes into play, Meaker. Especially when working with big files (loading big maps in games too...). There are other benefits (and cons), but that belongs in a different thread, so let us not have that discussion. I like the bragging rights that come with it. Let's just leave it at that :p

    Except the 680M does not have GPU Boost 2.0, and it's not the same as the boost the current 600 series has...
    I find it strange that someone like you, who mods GPUs, doesn't know the difference.
     
  2. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    If it has the faster memory, then who cares? The only thing holding back the 680m was the slow RAM. It's like going from a 660 Ti to a 670. That's a pretty significant upgrade.
     
  3. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I think it will be 1536 CUDA cores alright, but I wouldn't upgrade if I had a 680m. I still may not upgrade if AMD is indeed releasing the 8990m on 20nm fabrication this year; that would be huge.
     
  4. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Especially at high resolutions with AA and other effects enabled, the extra memory speed should help.

    What Meaker means is that traditional Windows use won't really change much, because you are still bound by small-file I/O performance, not so much by huge files transferring in a single second.

    Still, it sounds mighty impressive!
     
  5. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Yeah, I think it is 1536 cores too. Even if an overclocked 680m could reach the same performance, from a marketing standpoint the fact that they already have the 680mx designed and ready gives them much more bragging rights with the average customer, for whom "more cores is better" regardless of everything else. It will be easier to sell at a premium compared to just an OC'd version.
     
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, the 660 Ti has a 192-bit bus while the GTX 670 has a 256-bit bus, so you get higher bandwidth while outputting the same heat on both, since the memory clock is the same :)
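
    To put numbers on that: peak GDDR5 bandwidth is the bus width in bytes times the effective data rate. A minimal sketch in Python (my illustration, not from the post), assuming the reference 6008 MT/s effective memory rate both cards ship with:

        # Peak GDDR5 bandwidth = (bus width / 8 bytes) * effective data rate.
        # Assumes the reference 6008 MT/s effective rate of the 660 Ti and 670.
        def bandwidth_gb_s(bus_width_bits, effective_mt_s):
            """Peak memory bandwidth in GB/s."""
            return bus_width_bits / 8 * effective_mt_s / 1000

        print(bandwidth_gb_s(192, 6008))  # 660 Ti: ~144.2 GB/s
        print(bandwidth_gb_s(256, 6008))  # GTX 670: ~192.3 GB/s
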
    If the 780M is a 680M with GPU Boost added, there won't be any other difference than having to do the boost manually. Or OC the memory. Lower voltage is always nice though.

    I'm hoping too, just like everyone else. I'm just open to the possibility of a 680M rebrand. AMD is doing it too, so who knows if Nvidia will follow.
    They doubled the core count on the GTX 760M compared to the 660M, so that might be a sign that they will increase cores with the GTX 780M too :)

    Several possibilities
     
  7. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    A full GK114 optimized for lower voltage and with 1200-1250MHz RAM would be pretty epic and would also leave AMD in the dust. This is just speculation, however, and even I'm starting to get bored of it. We'll see what it'll be at Computex. If it does end up crushing the 8970m, you can expect it to have a stellar price tag though.
     
  8. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    How do you get the idea that I don't know the difference? :D Seriously... that's exactly what I was talking about earlier today - use your brain before posting.
    Boost is just a vbios feature, and the new crap isn't any better than the old; that's why I disabled it in most of my mods, even on the Titan. It seems to me that you have never used anything with your much-praised Boost 2.0... you just see slides from Nvidia and conclude it must be something great.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Because you quoted me and said that the 680M has boost, while I was talking about GPU Boost 2.0, which is new for the 700 series (and Titan).

    "Use your brain before posting"
    Says the guy who doesn't understand that people have different needs. Seriously svl, that was a little short-sighted and rude of you to say.

    For the overclocker: it might be a hassle and not worth it, since overclockers have their own profiles for whatever they do anyway and control the temperature that way. I still think the temperature slider is a nice feature.

    For the average Joe: it's great, because it automatically overclocks the GPU based on temperature and available power and will never exceed the temperature target, while the system ensures that the user gets maximum performance from the GPU at all times. The user who wants better performance doesn't have to deal with Afterburner, create profiles, or overclock.

    For OEMs: great, as long as the base specs fit the thermal envelope of their notebook, whether it's slim, fat, big, or bulky. They now have a system that ensures the GPU gives the customer the best performance, without having to deal with fried GPUs or other components because they miscalculated TDP or put too hot a GPU in their system. With notebooks and their restricted thermal headroom, I'm pretty sure GPU Boost 2.0 will be gladly accepted among notebook OEMs.
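
    The gist of the mechanism, as a toy sketch (my illustration; Nvidia's real algorithm is proprietary, and the targets, step size, and clock limits below are made-up placeholders): clocks climb while both the temperature target and the power target have headroom, and back off otherwise.

        # Toy closed-loop governor in the spirit of GPU Boost 2.0 (illustrative only).
        TEMP_TARGET_C = 80    # Boost 2.0 exposes a user-adjustable temperature target
        POWER_TARGET_W = 100  # board power target, e.g. a 100W notebook TDP
        STEP_MHZ = 13         # real hardware moves clocks in small discrete bins

        def next_clock(clock_mhz, temp_c, power_w, base_mhz, max_boost_mhz):
            """Raise clocks while under both targets; back off when over either."""
            if temp_c < TEMP_TARGET_C and power_w < POWER_TARGET_W:
                return min(clock_mhz + STEP_MHZ, max_boost_mhz)
            return max(clock_mhz - STEP_MHZ, base_mhz)  # never drop below base

        print(next_clock(771, 70, 85, 771, 900))  # headroom left -> 784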

    No, I haven't tried it yet, but I have a few friends who have the Titan and overclock theirs, and I have seen the system in action.
    It's kinda like Nvidia's program that automatically picks the best settings in any game based on your hardware. Some hate it and want to find out what fits them best, while other people like it.
     
  10. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    No dude, GPU Boost 2.0 or any other variant is pure garbage concocted by NVIDIA to protect themselves and their AIB partners from RMA returns. They don't want people doing real overclocking, so they introduced power limits and b.s. "boost" that really just scales up to the card's pre-defined TDP (it never exceeds it, no matter how good your cooling).

    As a result, cards like the Titan come with weak VRMs, and that's why 1.215V is the hard limit of the card; to exceed it you have to do a zombie mod. If they had properly built a $1000 card, none of this would have been necessary. It's all about the $$$, but fortunately AMD hasn't gone down that road. See what you made me do? You made me make AMD look good, and now svl7 is gonna rub it in. :D

    P.S. svl7 is right, you should read up on what this stuff is about first. If anyone knows about boost and the inner workings of NVIDIA vbios, it's him.
     
  11. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You are welcome to tell me where I'm wrong, Joker, since you say I need to read about it ;)

    "NVIDIA to protect themselves and their AIB partners from RMA returns"
    And how is that not related to OEMs (which I talked about)? And how on earth can you guys compare desktops, which have almost unrestricted thermal capacity, against notebooks with a 100W TDP limit? It's like two totally different worlds.

    I understand the serious overclocker's concerns about the Titan. I know about the restrictions GPU Boost puts on the user who really doesn't want another feature to clamp down on overclocking. I wrote about it in the post above yours. You don't have to tell me this; I have been involved with different vbios for the Titan and have seen what they do. But it's wrong to say GPU Boost is not meant for notebooks when its pros seem tailored for exactly that sort of system.

    Not everyone is a hardcore overclocker. The system still allows you to overclock. I know many who will be happy with it and won't see it as crap. I see great potential in using this in a mobile environment. I'd like to see you guys come up with an argument for why it should not be used by the average Joe...
     
  12. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    You're wrong in presenting GPU Boost 2.0 as anything new or good for the consumer. It's a step backward no matter how you slice it. The only ones putting a positive spin on it are NVIDIA, their shills, and their AIB partners. NVIDIA fully gimped the Titan out of the box; it was the community, and especially svl7, that unleashed its potential. Same goes for the 680M: out of the box it's nothing really impressive vs the 7970M. None of this was needed before NVIDIA got even greedier and began handcuffing their designs.
     
  13. DaCM

    DaCM Notebook Evangelist

    Reputations:
    204
    Messages:
    576
    Likes Received:
    6
    Trophy Points:
    31
    As if a guy who wants triple RAID SSDs is an average Joe... The average NBR forum member fits into your 'overclocker' category, in my opinion, so for most of the people in this discussion Boost 2.0 only matters insofar as they can figure out how to disable or work around it.
    Also, I'm not really sure how you are in a position to criticise svl, given his contributions to the community and proven expertise with GPUs. It's also worthwhile to point out that you don't react to the posts that remark on your quite biased views on nVidia GPUs and your general behaviour on the forum. Giving an explanation for why you constantly exaggerate and discard opposing opinions might increase your credibility.

    Anyway, I think there is a very real chance that nVidia will just release an overclocked 680M, given that the 485/580/675 were the same card as far as I know, and that AMD doesn't seem to have anything that would beat the 780M significantly in this scenario either. nVidia is also not exactly known for releasing the best stuff they have when they don't have to.
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    How is it a step back for the average Joe who does not overclock? I'd like to see some arguments against that, please.

    Tell me, where have I said I was an average Joe? Where is your data showing that NBR members don't fit into that category? If we have both, shouldn't we cover both grounds in our arguments (which I did..)? There is room for both, but you seem to disregard one of them. Also, you clearly are not aware that overclocking is possible with GPU Boost active.

    As for your Nvidia bias BS, troll harder next time, eh? You are clearly flamebaiting.

    I didn't criticise svl7, I tried to have a debate against him. He does know his way around vbios and that stuff, but that does not mean he is always right. I disagree with him about GPU Boost 2.0. I'm allowed to have a different opinion, right?
     
  15. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Average Joes don't purchase $1000 cards. If they do, they are way too much of a minority to care about. This was purely about greed motivating NVIDIA at the expense of the enthusiasts, who ARE their biggest customers in that price segment.
     
  16. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, you do have a point, Joker. But on the other hand, I know several people who own notebooks with a 680M and have never tried to overclock it. I don't think they actually know how to do it. I'm sure they would gladly accept GPU Boost. :p
    I totally understand and agree that to some extent GPU Boost does mess with overclocking at some point. So I guess it's OK to have different views on it. Let's just leave it at that. I personally think it's interesting and welcome it. The rest can join forces with svl7 to get rid of the godforsaken crap :)
     
  17. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    You simply talked about "boost"; you didn't specify anything. That's exactly what I mean: can't you just stop being unclear and then picking on others' words when they don't get what you actually meant (as opposed to what you wrote)?


    See, that's another very important point. You shouldn't debate against someone, you should debate with someone.
    That's a very fundamental rule for a reasonable discussion.


    I would answer that with no, but I'm afraid that was a rhetorical question :p
    It's not as if I expect you to realize that anything Nvidia does could possibly not be the best idea for everyone, but that's how it is. We're talking about the highest-end mobile GPU here, which costs almost as much as a Titan, and the boost is just castrating it; Boost 2.0 is even worse than the first-gen stuff when it comes to this. You haven't used 2.0 yet, right?
    It's just as Joker says: boost has no business in the enthusiast market.
    Also, that stupid Nvidia policy that results in only reference cards being released... but that's another story.


    What? He clearly has a valid point here. Please don't just label anything that criticizes your favorite GPU vendor as "trolling". The 680m is seriously overvolted and also underclocked; they could have configured it with 0.95V and 800MHz from the very beginning. The 680mx is just another chapter of that story.
     
  18. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    BTW, did you guys hear that AMD plans to release their 20nm chips in 2013? They'll probably have a paper launch in December, with real cards (both mobile and desktop) hitting in early 2014, but it should be out on the market before NVIDIA can answer with their own die shrink.
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I have been talking about GPU Boost 2.0 and the 780M for over a month now. I'm sorry that you haven't followed the discussions lately, but I personally think I was pretty clear when I compared the GT 750M and GTX 780M. It's a forum, not an essay that will be corrected by some teacher. You drew a semi-conclusion that I was talking about regular boost; I did the same, thinking you didn't know the difference. It's OK.


    Sorry. I'm from Norway and English is not my native language.


    See, why don't you read what I write next time, eh? I wrote about overclockers, the average Joe, OEMs; how is that not everyone? I said it may not be ideal for some. You seem pretty quick to draw conclusions and just don't want to listen to what I'm writing.
    The Titan costs just as much as the 780M, yes. Well, that may be right, but not everyone overclocks it either. The same thing happened to some Titan owners: some grew tired of Boost, changed to higher power settings, tried tons of vbios; some didn't mind. Why should the 780M be any different?
    And I disagree with the whole "it costs $800 so they must be enthusiasts". I know several people who just want to game. They don't know much about hardware, they own a 680M, and they just want enough power for the newest games.

    Stop nitpicking my words. I responded to his "It's also worthwhile to point out how you etc etc". That is pure trolling.

    I have seen Prema's data on the voltage difference between the 650M and 745M. It got a pretty big voltage reduction, so yes, I can understand that they have a lot of voltage headroom to play with. Hence I had no problem believing a 680MX at 100W TDP.
     
  20. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    omg, these threads turn so quickly into reality shows :D
     
  21. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Very good point. Nvidia, just like any other company, is using marketing strategies here (to beat its rival AMD on paper); check this out:

    1. Last year Nvidia plans the 675mx as the 680m, leaking screenies to learn what AMD has in its pocket.
    2. AMD, seeing Nvidia aiming at a moderate increase in power, challenges them with the 7970m in an instant.
    3. Nvidia, realizing they are about to lose the high-end market to AMD, pulls next year's product, the actual 680m, forward a year.
    4. Nvidia, realizing they are about to lose a year of profit from the next gen, underclocks AND overvolts the 680m, so that it performs much worse than it could at stock (and comes in on par with the 7970m).
    5. Nvidia pulls the clocks and voltage back to normal on the 680m and re-releases it, for a huge gain in stock performance; a very nice illusion.
    6. Question is, will AMD seriously challenge Nvidia again with 20nm this year?

    I wouldn't call it trolling, but certainly profit-oriented :)

    +rep for the beautiful reasoning there Svl!

     
  22. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Hahahaha, I agree, maxheap. People, don't fight so much, and don't get so offended, Cloud. Enthusiasts will work to overcome the limitations that come with GPU Boost. Users oblivious to the whole deal will enjoy their high-end gaming :)
     
  23. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Yep, hehe. Running 900MHz GPU / 1100MHz vRAM on the 680m at stock voltage. It can run 950/1150, but that's near the borderline, so I'd rather play it safe. Still ~30% improvement at stock voltage and temps < 80C.
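
    (For scale, assuming the 680m's stock 720MHz core / 900MHz memory clocks: 900/720 is +25% on the core and 1100/900 is +22% on the memory, so ~30% in benchmarks that lean on both is plausible.)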
     
  24. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    At stock voltage, and it's a ~30% overclock on the core plus an additional memory overclock. It's massive. I have used 900/1100, but I usually default to lower clocks. With my latest repaste, I barely break ~73C when gaming, sometimes even less. Only very demanding games have ever pushed me into the 80s, and that's when overclocked to 900/1100.
     
  25. DaCM

    DaCM Notebook Evangelist

    Reputations:
    204
    Messages:
    576
    Likes Received:
    6
    Trophy Points:
    31
    No. However, it's also worthwhile to point out how you avoided giving an answer again.

    As others said though, I think nVidia can easily get away with a tweaked 680M this year, as the 8970M is pretty much an overclocked 7970M as well, and keep the 680MX for next year, or maybe release a 780MX later on :p
     
  26. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Unless nVidia wants to delay Maxwell, I think they will offer some form of optimized 680mx as the 780m this year, likely in the next month. Whether it will truly be a 680mx with just lower-voltage parts and lower clocks is yet to be confirmed. I can easily see them just bumping up the clock speed of the 680m, considering how easily it can overclock at reasonable temperatures, and maybe that's all it will be.
     
  27. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Power consumption does go up a lot with frequency though, so a 30% increase in frequency means drawing roughly 20-30% more power.
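
    To first order, CMOS dynamic power scales as P ∝ f × V². A minimal sketch (my numbers for illustration, not Meaker's; it ignores leakage and any voltage bump that higher clocks usually need, which is why real-world figures land in a 20-30% range rather than exactly 30%):

        # First-order approximation: dynamic power ~ frequency * voltage^2.
        # Ignores static/leakage power and voltage changes needed for stability.
        def power_ratio(f_new, f_old, v_new, v_old):
            """Relative dynamic power after a clock/voltage change."""
            return (f_new / f_old) * (v_new / v_old) ** 2

        # +30% core clock at unchanged voltage -> ~+30% dynamic power:
        print(power_ratio(936, 720, 1.0, 1.0))  # 1.3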
     
  28. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    True, but if the voltage is dropped a bit it may be able to maintain a 900MHz clock (just for example) at the same power as the 680m at 720MHz. I dunno. It's just that increasing shader count and clock speed while still maintaining 100W TDP and power consumption seems a bit unrealistic. Then again, it isn't really power that's the concern, it's TDP, right?
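
    (Using the same first-order P ∝ f × V² approximation from above: holding power constant while going from 720MHz to 900MHz needs V_new = V_old × sqrt(720/900) ≈ 0.89 × V_old, i.e. roughly an 11% voltage cut, which is plausible for a better-binned chip.)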
     
  29. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    TDP is power, though, really.

    The way boost works, a benchmark like 3DMark 11 is going to see the biggest increase, while a game benchmark like Crysis 3 will see the smallest gain.
     
  30. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
  31. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Yes, but that's based off rumored specifications; I think at one point it was claiming it would be Titan-based.
     
  32. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    I see. So as of now, the most reliable thing on the 780M is the 3DMark11 scores you found, carried out by someone in China?
    And I also think I read somewhere on here, from you or someone else, that the 8970M is a 7970M with higher clocks?
     
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    This is true. Just because the TDP is 100W does not mean 100W is consumed by the GPU. That's the problem with things like synthetic benchmarks: they take perfect advantage of the short boost period to artificially inflate scores.

    Wikipedia is guessing like everyone else is.
     
  34. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    I will wait for the next generation; this just isn't a big enough leap. My 680M reaches insane scores and runs everything at high-ultra settings.

    I can deal with it :D
     
  35. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    BUT BUT BUT (so many big buts)... you could play at 70 FPS instead of 60!!!
     
  36. senshin

    senshin Notebook Evangelist

    Reputations:
    124
    Messages:
    311
    Likes Received:
    11
    Trophy Points:
    31
    Yeah, that would give you 500% performance and your input lag will get better by 1ms!!!!!! Or something like that, lol.
    Also, I think with the 790M you can feel the graphics, you can feel water and gunshots, and then AMD goes bankrupt and buys Intel... cannot wait!!

    What hype.
     
  37. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I'm hoping Dell goes for the 4GB model this time. Games today are chomping up some vRAM, especially on high-res external monitors.
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I'm going from a single 280M (which can no longer overclock, even by 5MHz) to SLI 780Ms.

    How many FPS do I get? :D
     
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It won't matter. Even a high-end desktop at 1080p/1200p won't consume 2GB of vRAM.
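
    For rough scale (my arithmetic, illustrative only): the render targets themselves are tiny at 1080p; it's texture assets that actually fill vRAM.

        # Raw size of one 32-bit 1080p buffer; textures, not buffers, dominate vRAM.
        width, height, bytes_per_pixel = 1920, 1080, 4
        framebuffer_mb = width * height * bytes_per_pixel / 2**20
        print(round(framebuffer_mb, 1))  # ~7.9 MB per full-screen buffer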


    LOL. Don't get me wrong. Just going from a 680m to a 780m won't be that significant. From even a 580m, or anything less powerful, it will be a world of difference. But heck, 280m to SLI 780m is like ecstasy.
     
  40. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Wow, huge leap. Can't believe you still use it, lol!

    @HT: With multiple monitors it will. I'd still prefer 4GB over 2GB, especially if I don't plan to upgrade for a few years. Personal preference I suppose. :p
     
  41. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I said 1080p. For multiple monitors, why not just get a desktop?
     
  42. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    You ever been in a dorm room? A desktop would take up too much space... Maybe after I finish school, assuming I have time to game, I'll get myself a nice desktop setup.

    Edit: And if you were wondering how I have room for multiple monitors, I don't. I sneak into a lab at night and set up monitors to game, haha. Haven't gotten in trouble yet. :p
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    A desktop takes up too much space compared with something like an M18x? Something like this: Amazon.com: Cooler Master Elite 120 Advanced - Mini-ITX Computer Case with USB 3.0 Ports and Long Video Card Support (RC-120A-KKN1): Computers & Accessories

    16 x 9 x 8 inches. A smaller footprint than most 17-inch notebooks.

     
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Let's just say I got it for school in 2009 (and they lied about what they were going to teach and what we needed, otherwise this'd never have been purchased), and I haven't been able to get a job till earlier this year. So... yeah. SOOOO many SP games to replay with max graphics. Or hell, I might even use 3D for some stuff like Prototype 2 and Darksiders 2.
     
  45. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Haven't looked into those types. Can all the hardware fit in that thing? It looks small, like an X51.
     
  46. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Yep, that's what I've got. An mITX motherboard with support for Extreme CPUs, full-size cards (up to 12 inches, I think), and an ATX PSU. The limits are 2 RAM slots, a single PCI-e slot, 3 hard drives/SSDs, and a size-limited CPU cooler, but there are several good ones to choose from. For a gaming machine it is more than adequate. Throw in a GTX 680 (or even a 690), 2x8GB DDR3-2133, and an i7-3770K and you're ready to rock.
     
  47. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    Yup, you can go all the way up to 690s and Titans with Mini-ITX builds. People even fit full watercooling loops in them these days. You should also check out micro-ATX cases like the Corsair 350D, J.Dre. They're not much bigger than Mini-ITX cases and can fit multiple graphics cards. Pretty crazy what you can do with desktops these days.
     
  48. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Even the C70 mid tower I'm using is not that big or heavy. Not that I'd ever need to carry it around, but when I do need to move, it'll be very painless. I don't see myself buying another 12lb+ gaming notebook again. The next one I buy will be (if they develop it) a lightweight one that's 1" thick and uses an external eGPU slice to latch the GPUs on from the bottom. With a modular design, I can decide when and where to game with it. But that's a totally different topic.

    I think the GTX 780M will probably be the 680MX but tweaked a bit to meet the 100W TDP of most high-end notebooks. Of course you never know, NVIDIA could very well get greedy and just toss an overclocked 680M in there and still whip AMD. :D
     
  49. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I used to take my Shuttle "desktop" everywhere. It fit in a duffle bag, and I hung a 17" monitor off the side of it for use wherever I went. Shuttles are/were great; the problem is the proprietary motherboard, power supply, and form factor. So you're stuck with their products, and replacement parts are expensive. Plus they weren't always the best quality. But it was great to have.
     
  50. failwheeldrive

    failwheeldrive Notebook Deity

    Reputations:
    1,041
    Messages:
    1,868
    Likes Received:
    10
    Trophy Points:
    56
    Yeah, mid towers are the perfect size for most builds imo; they're big enough for SLI and watercooling but still easy to carry around. I was thinking about getting a C70 for a while because of its watercooling potential and carry handles, but I ended up going with the old full-tower Phantom for some reason... I guess I wanted a 630 but didn't want to spend $170 on a case, lol.
     