The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    My M18x turned into a zombie...so now I'm building that monster desktop I always talked about.

    Discussion in 'Alienware 18 and M18x' started by vulcan78, May 10, 2014.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    In some tests there is no difference. That said, so far, the only thing I have found that Windows 8.X.X is better at is Fire Strike graphics scores. Fire Strike was optimized for Windows 8. In spite of the optimization, the physics test results, even in that benchmark, are better under Windows 7. Futuremark, for whatever reason, decided to mostly ignore the physics results when calculating the overall Fire Strike score. In many tests Windows 7 fares better than Windows 8. In gaming I typically see no appreciable difference or benefit to using one versus the other.

    It seems like the folks that say Windows 8 performs better are either mistaken, repeating what they have heard others say, or both. I haven't found any basis for validating such comments, and the opposite seems to be true more often than not. I'd be guessing, but I suspect many that were running degraded Windows 7 installations that were bogged down by a year or more of crap, corruption and Windows Updates cancer probably felt a fresh install of Windows 8 was faster. In that case they were most likely right simply because a clean OS install almost always acts and feels peppier.
     
    vulcan78 likes this.
  2. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Fire Strike is weighted more toward the GPU and combined scores because people were arguing that the physics score gave too much of an advantage over people who used dual- and quad-core CPUs. So the compromise was to raise the weight on combined. Speculation of course.
     
    Mr. Fox and vulcan78 like this.
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    Yeah, I've heard that too, LOL. Sounds like something low-budget benchers with AMD CPUs would say instead of admitting they made a decision to cut corners and purchased an inferior product. If they (Futuremark) would just calculate the score using simple math instead of "magic," then people making excuses for poor physics performance wouldn't have a leg to stand on.
     
  4. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    No matter how it's sliced... the masses will always have something to say if their brand does not fare well. Benchers just get the best thing at the time and run with it till the next best thing hits the market, then swap out old parts for the new ones. Brand does not matter, as long as it's the best or near best at the time of purchase.

    This round cut things down by a lot, since money really did make a serious difference this round:
    CPU: $1,100
    Motherboard: $550
    RAM: $450
    GPUs: on average $600 x 4 = $2,400+
    and so forth....
    Same with laptops. Price is going up while the user's enjoyable experience is going down. (high-end only)
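Totaling the ballpark figures above (the post's approximate prices, not quotes; PSU, storage, and cooling are left out just as in the post):

```python
# Rough bench-rig cost from the ballpark figures in the post above.
# These are the post's approximate prices, not current quotes.
parts = {
    "cpu": 1100.00,
    "motherboard": 550.00,
    "ram": 450.00,
    "gpus (4 x ~$600)": 4 * 600.00,
}

total = sum(parts.values())
for name, price in parts.items():
    print(f"{name:20s} ${price:8.2f}")
print(f"{'total':20s} ${total:8.2f}")  # $4,500 before PSU, storage, cooling, "and so forth"
```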
     
    TBoneSan likes this.
  5. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    @Vulcan: It's surprising how little radiator thickness matters. Compare the performance of a 30mm rad to a 60mm one. Below 1400 fan rpm, the 60mm rad gives a measly 5% extra heat dissipation. It's not until you crank the fans all the way up to 2200 rpm aka jet engine level noise that the 60mm rad pulls ahead by 16%. At least within the Nexxos line of radiators, the main advantage of thicker rads that I can see is less flow restriction. The UT60 is less than half as restrictive as the ST30, which could be important if you plan on running a high restriction loop.
     
    vulcan78 likes this.
  6. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Update:

    Wow guys, I took your advice and returned the rear fan from exhaust to intake. In keeping with this approach I also turned down the 140mm fan I mounted in the PCI-E bracket area from 100% to 30% RPM (so as not to steal all the air from the side-panel fan feeding the secondary AIO), and peak temps at the end of Valley with the clocks at 1241 core / 1950 memory went from 55 and 56C to 49 and 49C!

    Idle temps went from 26 and 24C to 24 and 23C! This is with Chrome running mind you!

    I'm thinking of simply disconnecting the 140mm fan I have in the PCI-E bracket area, but right now both the primary and secondary GPU temps are relatively even with extended periods of Valley getting secondary 2-3C hotter than primary.

    I've experimented with flipping this fan from exhaust to intake, but to my dismay, while rooting out the 7C-higher secondary temps, I discovered that it was pulling in hot exhaust from the PSU, which sits RIGHT next to the PCI-E area in the Air 540. It was pulling that hot PSU exhaust right in, over the secondary GPU and through the secondary's AIO. Returning it to exhaust eliminated the 7C difference. I thought it was helping keep airflow moving over the copper heat-sinks I've positioned around the VRAM and out of the back of the case, but now I'm not so sure, as during the recent Valley runs the exhaust from this fan wasn't even lukewarm. It may be that, now that I've turned it down to 30%, the secondary AIO is able to overcome it and pull in all the air coming from the side-panel fan.

    Anyhow, thanks for the advice guys.

    Wow, while typing this, I'm looking at Hwinfo64 and current GPU temps have dropped yet again to 23 and 21C! This is a huge difference over what they were previously, talking about a 5C reduction in idle temps.

    These are STELLAR idle temps BTW; 30C is regarded as great idle temps, even for water.

    If memory serves me, I remember 40C being typical, average low idle GPU temps in the R2.

    Motherboard is currently at 25C and CPU avg. is also in the mid 20's.

    Edit:

    Just ran Firestrike, have a look at the temps:

    Firestrike side-panel fan.jpg

    NVIDIA GeForce GTX 780 Ti video card benchmark result - Intel Core i7-4930K,ASUSTeK COMPUTER INC. RAMPAGE IV BLACK EDITION

    I took about a 500-700 point hit overall going from driver 337.88 to 344.11 for whatever reason. The bench in my signature is with nearly identical clocks and is 700 points higher.
     
    Last edited: Nov 30, 2014
  7. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Yeah, if there's one thing I've learned with radiators, it's that running them at full bore only makes a few degrees' difference over running them at, say, 60-70% RPM. I'm really surprised at the marginal performance difference between 30 and 60mm radiator thickness; I was expecting nearly double the performance. Was this comparison done with push-pull, or with fans on only one side of the radiators?
     
  8. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Push only from bottom.

    Pretty much the only reason I'd be going for the 45mm and 60mm Nexxos rads at this point is because I plan on running 3-4 pairs of quick disconnects in my loop, which is going to add a ton of restriction so every little bit counts.
     
  9. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Oh, that might be why. I think with thicker radiators push-pull is pretty much a prerequisite to see any gains, which can be difficult depending on space limitations.

    Taking a look at your desktop, very nice! I almost pulled the trigger on a 500GB 840 EVO on Black Friday ($189, and it included a copy of Far Cry 4). I hesitated while researching transferring Windows to a new drive, and around 7PM PT the deal evaporated (12PM ET). I think I would've had to re-install Windows, which took me nearly all weekend the last time I had to do so, as I have a lot of peculiar tweaks going, as you can tell by my desktop snippet in my last post (running a custom dark theme for one, so much easier on the eyes).

    I'm certainly getting my money's worth out of the 256GB SSD that is my boot drive. It came out of the R2 and is going on 3 years now; it's holding up great, but doesn't have the 840 EVO's IOPS (74k vs 98k) or write speed (420 vs 540MB/s).

    I will wait for another deal and try to time upgrading to an 840 Evo or Pro after Win 10 gets ironed out and DX12 becomes commonplace. If I'm going to have to re-install Windows I am going to wait until I feel compelled to upgrade the OS.

    I've already looked into Acronis Migrate Easy and unfortunately it doesn't exactly work on Win 7, something about the partition.
     
  10. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,344
    Likes Received:
    2,350
    Trophy Points:
    181
    :)

    Excellent...
     
  11. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    TY! You have a real beast of a desktop!
     
  12. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Welp I severely underestimated how expensive watercooling can get. $350 spent on fittings alone. :eek:

    Granted, apart from the Koolance QD4 quick disconnects, everything else I bought is Bitspower, which is known to be pricey. I also deliberately doubled (and in some cases quadrupled) down on all the rotary fittings and quite a few other adapters and extenders, so I'm going to end up with at least $150 worth of redundant fittings (for a simple CPU loop). But still, I would not have expected the fittings to come out to THIS much.

    On the bright side, since I severely overpurchased on fittings, I'm probably only 2 GPU blocks away from a full CPU+GPU loop. Originally I only planned to get some redundant rotary fittings just in case, because Murphy's law, and the thought of putting my loop on hold because gosh darn it I'm missing that ONE crucial fitting would absolutely kill me. So eventually I just went "screw it, I'm gonna buy enough for a CPU + 2x GPU loop so I can definitely build a simple CPU loop in peace and in one shot".

    ...yeah you can tell I'm new at this.
     
    Last edited: Dec 2, 2014
    johnksss and vulcan78 like this.
  13. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Damn dude, I'm excited to hear you've taken the plunge into water cooling; sounds like you're doing it the right way too. Yeah, you'll likely end up adding a second Gigabyte 970 in time and eventually want all that heat expelled out of your case. I was actually OK on air with a single 780 Ti playing in 2D. It was only after adding the second 780 Ti, then discovering and falling in love with 3D Vision, that the consistent 98% loads in all my favorite titles, with commensurate load temps and the NOISE of the ACX coolers running full bore, made me feel compelled to research liquid cooling.

    One card in 2D, no problem; but if you ever get into 3D (Oculus Rift) or pick up a 4K monitor and add a second GPU, I believe liquid cooling is the only way to go.

    Post pics when you're done.
     
  14. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Rocking two Gigabyte 970s OC'd to stock 980 level already :p

    Heat is surprisingly OK. Even before I installed a PCI-mounted side fan, the top card would top out at around 74C, while the bottom one chills out at a cool 62C. With the side fan the bottom card stays put, but the top card doesn't go 1C over 70C. These Maxwell cards are amazing for thermals indeed.

    But yeah, the 970 is just a stop-gap solution for me. I'm either going to double down on GM200 or the 390X, depending on which offers better overall performance. (And if that 390X comes with "only" 4GB of VRAM, I'm going to throw a fit :mad:)

    Don't have a build log because this one was a "rush build" -- wasn't satisfied with the initial build in the Enthoo Luxe, so tore it down and ported everything over as fast as I could. But here's a few finished shots so it's something:

    (build photos)
    *The upper rear fan has been changed to an intake instead of an exhaust. Made no difference to thermals, but it made me feel better to have some air going over the mobo VRMs. Also just switched out the 840 Pro for the Extreme Pro today.

    Before you ask, yes the case is sitting on a ULine dolly, because the thing is just too freaking bulky to move around without one. And it being 70 pounds sure doesn't help one bit.

    A nice benefit of the dolly is the case gets some very good clearance for the floor fans. The opening actually lines up rather nicely with them too.
     
    Last edited: Dec 2, 2014
    octiceps likes this.
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    If the rumored 390X specs--20nm GloFo process, 4096 shaders, 1.25 GHz 4096-bit HBM = 640 GB/s bandwidth (!)--turn out to be accurate and it edges out GM200 perf/price wise, screw the 4GB VRAM (which is a first-gen HBM limitation), I'm still getting it. Single GPU only though. Done with multi-GPU BS, at least for now, unless something drastic happens that changes my mind.
     
    Last edited: Dec 2, 2014
  16. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Nice set-up; I like the added 140mm fan. I have what might be the same bracket, which I couldn't get to fit because of the SLI bridge. It might fit now that I've switched to the 2-way flexible alternative, but it doesn't matter any more as I have the 180mm side-panel fan.

    Yeah the lower TDP does seem to correspond to 5-10C lower temps Maxwell vs. Kepler.

    I'm also excited about AMD's counter-punch. I might actually go AMD when my 780 Ti's really get long in the tooth. Nvidia's quality control has taken a real dive as far as Maxwell's launch is concerned; all of the 344.xx drivers are problematic. For example, 344.65 and later (and presumably earlier, I've yet to try) cause Chrome to appear completely jet-black with 3D Vision enabled. 3D Vision support is becoming a "legacy" feature, with maybe one out of four quality titles supporting it now, PhysX as well. With these features disappearing, along with quality drivers, why am I paying more for Nvidia again?
     
  17. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    After going through two laptops with Crossfire AMD GPUs... I'm tempted to say never again, but I'm not that foolish. Certainly not in the near future though. Maaaaaaaybe their driver support is fine for single GPUs, but I've found it to be really lackluster for any multi-GPU configuration, on top of shoddy reliability and poor power efficiency.

    However, I really do hope that the 390X rumours are true. Even if I'm going to be buying nVidia, I still want AMD to be competitive to force nVidia to push their products forwards. I suspect nVidia's waiting to release their "Big Maxwell" GPUs (Titan 2/980 Ti) until the 390X is out, they have no reason to while the 980 is the top dog and selling well.
     
    vulcan78 likes this.
  18. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    This has got to be one of the better cases I have seen in a long time for cable management.
     
  19. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah cable management was a big reason for getting the case (the other being watercooling, since I knew I'd get the itch eventually). I don't know where I'd be without those velcro strips lol
     
  20. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    N=1 I see you raised it off that very fluffy carpet when testing it :D
     
  21. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Fox, I missed your comment earlier, your Wanted system specs in your sig is hilarious BTW, that would be awesome if it became reality.
     
    Mr. Fox likes this.
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    :yes: yes, it sure would... although, I would actually be just as happy with 1080p or 1440p for the display. 3K just looked fancy along side the other specs.
     
  23. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    The more I look, the more that would be the case I would get if I was going to do a case. No flash, strictly all business. And those velcro straps. Damn, priceless!!

    Seriously, everything in those pictures with the case name. All that came with it??
     
  24. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    When I saw those velcro strips I was hooked and knew that was my case right there.

    All those velcro strips you see come with the case, and you even get 2 spare (one long one short) in case you need more velcro. The PWM fan hubs you need to buy separately, but the case comes with a somewhat crude version installed. And yes the Enthoo Primo is a damn fine case indeed.

    On a separate note, I now understand what people mean when they say Bitspower compression fittings will peel the skin right off your fingers. Hot damn are these things tight. Thought it was hard screwing down the tube, but NOPE, to unscrew them I had to break out the pliers. Will definitely need to get some vice grips or a bench vise at this rate. Those barb fittings suddenly look a lot more appealing...
     
    johnksss likes this.
  25. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    LOL!

    They still look good, that's for sure.
     
  26. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I'm experiencing the exact same thing ATM; cards will not run a higher core clock than 993MHz or they crash. No idea what's up. It's not the driver.
     
  27. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Epic update:

    I replaced the 780 Ti's with a single 980 Ti. I'm seeing 1550MHz solid with hybrid cooling and about 90% of the performance of the outgoing VGAs:



    I mean 1549MHz, as can be seen in OSD, not 1249MHz. I just came from 780 Ti where 12XXMHz is about the most you typically see.

    Money permitting I'm also replacing the 680M's in the R2 with a single 980M. It's been an adventure but I'm done with SLI.

    I truly wish Nvidia sold a separate portable G-Sync module that was somehow compatible with many / all DTR displays. That G-Sync is awesome.

    I've been hanging out over at OC.net.

    http://www.overclock.net/t/1558645/official-nvidia-gtx-980-ti-owners-club/4160#post_24168171
     
    Last edited: Jul 14, 2015
    TBoneSan likes this.
  28. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Release the Kraken: Concluding Remarks (better lighting, thoughts on SLI, idle temperature)

     
    TBoneSan likes this.
  29. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Hey brother Vulcan! Good to see you.

    I ended up building an Air 240 build. It originally had 980 SLI, which I recently swapped in favor of a single 980 Ti. I wasn't enjoying the SLI experience on the desktop side of things; I've found running games around 80-100 fps on a single card to be more pleasurable than 140 fps in SLI. That, and the temp constraints of SLI on air in the 240 meant it ended up being a waste, with both cards running at only 80% TDP.

    My thread here
    http://forum.notebookreview.com/threads/build-progress-corsair-air-240-x99-sli.773760/

    Having a single 980 Ti was a good move for me for now, until I look into some options.
    I have an EVGA hybrid cooler on order (finally, 1 month away), which I wanted over the Kraken G10 route. I watched your video a while ago and thought the VRM cooling with little heatsinks looked problematic, hence waiting for a hybrid cooler. Seems you've come to a similar conclusion.

    I'm hoping to mount the hybrid radiator in the back section like they've done here in the last 2 pics.
    http://www.corsair.com/en/blog/2014/november/dennis_build_log

    By the way, I know you're a 3D fan. Have you played GTA 5 in 3D? It's breathtaking! If anything, that will force me to maybe go SLI again.

    UPDATE: just watched your video. Noticed you still have the Kraken... I misunderstood there. Glad it's getting you some good results.
     
    Last edited: Jul 15, 2015
  30. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    R.I.P. SLI
    2004-2015
     
    TomJGX likes this.
  31. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    It feels that way. Nvidia needs to make a concerted effort to show they take SLI seriously and that it's a legitimate upgrade option for gaming. Recently, benchers and 4K users are probably the only ones not looking like chumps. Forget any users that want 120+ fps that doesn't feel like 50 fps with a ton of input lag.
    With just about every release we have to fix things ourselves with Nvidia Inspector. In the off chance the game comes with a profile, we're left with cruddy scaling and horrid frametimes.
    Man, I'm keen for SLI again, it's brought me so much pleasure in the past, but not until Nvidia pull their slimy green finger out of Kermit's butt.
     
  32. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Never mind
     
    Last edited: Jul 16, 2015
  33. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Eh, I think they are just as much to blame. They are happy enough slapping their 'The Way It's Meant to Be Played' endorsements on just about every title. At least in those cases they need to take some ownership with some QC. I thought that would be common sense, or perhaps I expect too much.

    And frametimes of late...! AMD is faring far better in DX11 titles, in dual-card setups at least.
     
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I edited my post. Multi-GPU is a many faceted problem and on second thought I don't think it's right to point fingers.

    CrossFire driver support lags significantly behind SLI though, and it still suffers from the same issues in non-AFR-friendly games. Plus you don't have the benefit of something like Nvidia Inspector to try to fix things yourself. Although that is hardly a benefit, as it's a huge time waste to find alternate SLI profiles (trial and error and personal experience is all there is; there is no documentation for this stuff), and most of the time it's a game problem, not a driver problem, so it can't be fixed at all, or not without introducing graphical glitches. The current state of multi-GPU just sucks all-around.
     
  35. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Awesome! So you did put together a desktop; man, I had no idea, I guess I haven't been on here enough. What chipset are you running? Were you influenced by that PCGamer Air 240 build with the X99 and 980 SLI?

    http://www.gamespot.com/articles/building-the-ultimate-matx-sli-pc-with-intels-5960/1100-6423349/

    Yeah, isn't an aggressively overclocked 980 Ti nearly as fast as 780 Ti / 980 SLI, without the hassle of SLI? I'M LOVING IT.

    Yes, I moved one of the G10 + H55's from the outgoing 780 Ti's to the 980 Ti. I went with MSi's 6G as it has the best VRM / MOSFET cooling plate suited for this task. I have not used an IR thermometer, but by touch the backplate over the VRM area is nowhere near as hot as my 780 Ti's got running a modified vbios. With that, it was hot enough that you almost couldn't keep your hand on the back-plate over the VRM area, which is why I added all of the copper heat-sinks, reverted to the default vbios, AND switched from the hulking air-cooler to an H60 mounted in the front (to allow airflow down onto the back-plate from the ceiling-mounted 140mm fans).

    I do intend to get an IR thermometer to substantiate my feeling here, but I think that right now the G10 + H55 mated to MSi 6G with back-plate has everything covered.

    I did some more fine-tuning this morning, and dialed in 1560MHz on the core and 8100MHz on the memory (+550) at the 1.281V limit with 125% Power Target.

    21.8k GPU
    http://www.3dmark.com/3dm/7778806?

    (It should be over 22k GPU here but I'm being held back by Win7; I should pick up a good 500-1k GPU points when I switch to Win10)

    Compared to my overclocked 780 Ti SLI best with recent drivers, a difference of only 10% or so:

    23.4k GPU
    http://www.3dmark.com/fs/5048179

    It passed Heaven with no artifacts, whereas 1570-1575MHz is artifact city. It's still not getting above 50C even with this voltage, wattage and frequency (125% PT = 350W).

    I'm actually seeing 123% PT in Firestrike Extreme and ~115% PT in Firestrike Normal and Heaven.

    The video in my last post is what temps look like in GTA 5, with the temps going from 45C sustained to 50C in games with high memory load such as Titanfall. Reposted here for convenience:



    As I state in the video, I'm absolutely loving the 6GB of VRAM, I'm seeing 4-4.5GB avg. in Titanfall.

    For this game I run a more conservative OC of 1525MHz core / 8000 memory @ 1.243-1.255V (reporting error here). This OC, with its ~30mV less voltage and lower memory OC, drops my temps a good 3C, so I think this is a really happy medium and a good everyday OC.

    Yes if I was in your shoes I would absolutely go with EVGA's Hybrid kit. What seems to be the behavior of reference 980 Ti is that every reduction of 10C from 85C results in a 20MHz increase in core overhead with no additional voltage. This is why the Gigabyte G1 owners are typically getting away with more aggressive overclocks, i.e. 1500MHz at 65C etc. This seems to be the consensus over at the 980 Ti forum on OC.net. Most everyone has a maximum frequency going from say 1440MHz core max at 85C up to 1550MHz or so at 50C.

    This EVGA 980 Ti Hybrid review backs up this theory: both cards are reference PCB, with the Hybrid stable at 1514MHz and the reference at 1440MHz, the only difference being that the Hybrid is seeing only 50C on the core while the reference is seeing 85C (at default voltage).

    http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks/Page-2
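The rule of thumb above (roughly +20 MHz of stable core clock per 10 C below the 85 C reference, at the same voltage) can be sketched as a quick estimator. This is a linear, anecdotal model built from the numbers in these posts, not a measured curve, and the function name is made up for illustration:

```python
def est_max_core_mhz(baseline_mhz: float, core_temp_c: float,
                     ref_temp_c: float = 85.0, mhz_per_10c: float = 20.0) -> float:
    """Thread rule of thumb: every 10 C below the 85 C reference buys
    ~20 MHz of stable core clock at the same voltage. Purely linear and
    anecdotal -- not a measured characteristic of any specific card."""
    return baseline_mhz + (ref_temp_c - core_temp_c) / 10.0 * mhz_per_10c

# Reference-cooled card: ~1440 MHz stable at 85 C
print(est_max_core_mhz(1440, 85))  # 1440.0
# Hybrid-cooled card at ~50 C: the rule predicts ~1510 MHz
print(est_max_core_mhz(1440, 50))  # 1510.0
```

The 1514 MHz the GamersNexus Hybrid sample hit at 50 C lands close to what this toy rule predicts.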

    I'm at 50C on the core with another 46MHz core and 150MHz memory at the 1.281v limit with the same radiator surface area due to stellar airflow, two fans on the H55 push-pull, and possibly the IC Diamond I used. I believe the MSi 6G back-plate is also doing a far better job at cooling the MOSFET area but without an IR Thermometer and EVGA's Hybrid to compare it to this is just my assumption, take a look at the 6G's VRM plate:

    6g plate 3.jpeg

    As far as 3D Vision goes: I LOVE IT! But friend, I have to say we're a bit late to the party, with official support nearly non-existent. For example, Nvidia advertises 3D Vision in their GTA 5 optimization guide, even giving it a rating of "Excellent," but the experience of the end-user has been somewhat different. All of us are experiencing some bizarre CPU bottleneck with 3D Vision active vs. 2D, dropping CPU utilization from 55-70% down to 34-50% in my case (4930K)! GPU utilization with 780 Ti SLI wasn't even at 100%; the problem was that 3D Vision was inducing a CPU bottleneck! And for whatever reason, anything less than 60FPS in 3D Vision was a total stutter-fest; 50FPS was the bare minimum for a playable experience. In the end I simply reverted to playing the game in 2D and marveled at the 120FPS that G-Sync, all-settings-maxed 2D afforded:

    https://forums.geforce.com/default/...keep-gta-discussion-here-/?offset=314#4515874

    The GeForce 3D Vision sub-forum is a great place to haunt BTW, if it weren't for the help of the community there I wouldn't have been able to get Shadows of Mordor and Alien Isolation working with 3D Vision.

    YOU ABSOLUTELY MUST PLAY THE AFOREMENTIONED TITLES IN 3D VISION. 100% better than 2D, HANDS DOWN.

    Oh, and one good thing about ditching SLI is that many if not most of the 3D Vision fixes break with SLI; it always seems to be an issue with shadows only rendering in one eye. Which reminds me, I need to try Dragon Age: Inquisition and possibly Skyrim in 3D Vision again. Oh, speaking of DA:I, that was one of the main and recent reasons I finally decided to replace 780 Ti SLI with a single 980 Ti: I was seeing SLI-induced stutter at like 90 FPS in DA:I! I didn't even know stutter was possible at 90 FPS! Returned to the same scene and the stutter is gone and the framerate is the same! (Again, 90% of overclocked 780 Ti SLI performance on a single card is MIND BLOWING.)

    Here are those Shadows of Mordor and Alien: Isolation fixes.

    Don't play these games until you get the fixes going, or replay them in 3D if you've already beaten them in 2D:

    http://helixmod.blogspot.com/2014/11/middle-earth-shadow-of-mordor-dx11-3d.html

    http://helixmod.blogspot.com/2014/10/alien-isolation-dx11.html

    I've been spending most of my time in the MSi 6G sub-forum over on OC.net; if you're on there, stop by and say hi!

    http://www.overclock.net/t/1561999/...ard-overclocks-and-pictures/810#post_24175810

    I will be back when I save up enough money to swap out the 680M's in my R2 for a single 980M. Hopefully that will coincide with Win10 becoming a worthwhile upgrade, with sufficient beta testing having been done by the early adopters. :)



    Octiceps, nice to see you again!

    SLI is totally dead. A 50% improvement in performance is not at all worth the cost and the additional heat:

    http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/
     
    Last edited: Jul 16, 2015
  36. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    +1

    Even more so if you're talking 780 Ti/980 2-way SLI vs. a single 980 Ti/Titan X. GM200 is exactly 50% bigger than GM204, so when clocked identically (which is easy, as you've discovered by juicing the 980 Ti to 1500+ MHz) it will perform exactly 50% faster when GPU-bound. SLI needs much higher FPS than a single GPU to feel as smooth due to AFR's much worse frame times, so at 50% scaling the 50% faster single GPU is the clear choice. Only when you consistently get 80% scaling or higher does SLI start to make sense.
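
    The break-even arithmetic above can be sketched in a few lines. This is a back-of-the-envelope illustration, not measured data; the 60 FPS baseline is a made-up number, and the point is only the ratio between configs:

    ```python
    # Rough comparison: 2-way SLI of a smaller GPU vs. one chip that is 50%
    # bigger (e.g. GM204 SLI vs. a single GM200 at identical clocks).
    # The baseline FPS is illustrative, not a benchmark result.

    def sli_fps(single_gpu_fps, scaling):
        """Effective FPS of 2-way SLI for a given scaling factor (0.0-1.0),
        where 1.0 would be perfect 2x scaling."""
        return single_gpu_fps * (1 + scaling)

    base = 60.0              # hypothetical FPS of one smaller GPU
    big_single = base * 1.5  # the 50%-bigger chip at the same clocks

    for scaling in (0.5, 0.8, 1.0):
        print(f"SLI at {scaling:.0%} scaling: {sli_fps(base, scaling):.0f} FPS "
              f"vs. single big GPU: {big_single:.0f} FPS")
    ```

    At 50% scaling the two come out dead even on raw FPS, and the single GPU wins on frame times, which matches the 80%-scaling rule of thumb above.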
     
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    If their plans are, as they appear to be, molesting desktops with the asinine tree-hugging focus on power saving and efficiency that the industry has become literally OCD about with laptops, it's going to be even more of a blow to monster multi-GPU desktops. The race to the bottom is a much shorter sprint than the one that leads to the top. I suppose if you have really short legs, and no stamina, that could be a blessing in disguise. I really don't appreciate how the bar has been lowered by NVIDIA to the point it has, but I guess that's to be expected when their one and only rival is a runt that is more interested in supporting game consoles than developing earth-shattering high performance hardware for PC enthusiasts.
     
  38. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    You're not kidding about the industry now focusing on efficiency instead of MOAR performance.

    Check out the Broadwell-H review, 65W TDP LOL!

    http://www.guru3d.com/articles-pages/core-i7-5775c-processor-review-desktop-broadwell,1.html

    Yeah, all of us with like 1kW PSUs are REALLY concerned with our CPUs consuming 150W (or 90W for the 4790k). We want to get that down to 50!

    Who cares about, oh you know, maybe hitting 5GHz stable? No! I want my 65W TDP CPU!

    These people have gone completely off the deep end with the efficiency nonsense. They must be patting themselves on the back thinking they're improving public perception of them as a "Green Industry" — the same idiots who recycle every last piece of aluminum foil their steak burrito was wrapped in and then get in their hybrid or electric car, chock full of planet-insulting lithium and everything else, after work.

    The irony is overwhelming.

    At least their hearts are in the right place.

    I suppose.
     
  39. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    That's an exact summation of the current state of SLI. I'm hearing rumors that DX12 will introduce improvements in SLI scaling, but it remains to be seen.

    Speaking of which. Are you moving to Win10 when it releases at the end of the month or are you going to wait?
     
  40. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    On second thought, improving CPU efficiency is actually a good thing for everyone primarily rocking a DTR and limited to a 330W PSU...

    So yeah part of what they are doing makes sense.
     
  41. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    No, not really. None of this was a problem until they arbitrarily decided that it was. Now that their newer drivers throttle the living crap out of my GPUs I don't even need a dual 330W AC adapter setup because the drivers try to limit me to 180W and everything is a slide show unless I use 345.20 or older drivers. A single 330W AC adapter is fine for people playing games at stock GPU clock speeds. In fact, 330W is good up to about 1007/1500 with 780M SLI. It's also enough for 980M SLI running stock clocks. Those that want to overclock can take care of themselves. The imbeciles calling the shots from Micro$haft, Intel and NVIDIA are just an impediment when they start trying to make decisions for us.
     
    TBoneSan likes this.
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'll be waiting until the end of the year or early 2016 to install W10. Reason being that it will very much be a WIP at release, and I fully expect M$ to make significant changes to the OS in the first 6 months to 1 year as they gauge public reception. Besides, the earliest DX12 games won't be arriving until the holiday season, so I see no real reason to jump on the wagon this soon.
     
  43. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    There is no point in "upgrading" to Windows 10 at the moment. It taxes the CPU just as badly as 8.1, if not worse. Windows 7 still taxes the CPU the least of the three, to the point where a 4.2GHz 3920XM in Windows 7 would require 4.5-4.6GHz in Windows 8.1/10 for equal performance.
     
    vulcan78 likes this.
  44. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Actually I had a lot of influences which led me to choose the Air 240. The main one was that I wanted power per liter; at the time I wasn't sure how much longer I'd be in Japan, and I wanted to be able to get it back home easily, so the Air 240 fit the bill. Now that I've decided to stay on a couple more years I am considering changing to a less compromised case. However, since SLI is sucking so bad right now, I think it's fine as is on 1 GPU with the CPU at 4.5GHz (I'll likely just add an AIO to the GPU and be done with it).

    Yeah, I certainly am late to the 3D Vision game. I knew full well, but snapped up a good deal on eBay. I was knocked off my seat by the quality of the 3D. Immersion on that level was totally new to me — I haven't said that in a while.
    Yeah, GTA5 works great, but it needs to allow separate convergence settings for 3rd and 1st person. I normally switch between the two, so playing in 3D forces me to choose. Also, any competitive gaming is out of the question.

    As for wanting to switch to Windows 8 to get your 3DMark score up a little - don't do it!!!

    I re-used the Windows 8 key I had bought for the m18x R2 when I upgraded it to 980M SLI. I didn't have a W7 key/CD at the time I built my PC, so I used what I had (W8.1), and I bloody hate it more and more every day.

    It seriously screws around with CPU stability. I haven't managed to nail the gremlin in my system. I have my suspicions. I can't prove anything yet, but I think it has something to do with UEFI and Windows 8.1 being up to shenanigans. Every few weeks I have to do a CMOS drain for my OC to run stable; it's absolutely bizarre and happens suddenly. I'm almost certain it's Intel XTU not working right with my chipset in Windows 8.1 and doing something goofy with the BIOS on its own, causing massive instability. Add to that, Nvidia drivers appear to be over-reaching with system power handling operations too, not just on laptops. My suspicion is it might have something to do with desktop G-Sync and/or DX12 preparations. Either that or they are simply trying to manufacture consent for more system privileges than required, to perhaps exploit later *hat off*.

    Long story short: Windows 7 in Legacy doesn't do any of these sneaky things (excluding the Nvidia driver stuff). Check out your 3DMark 11 physics score in Windows 7, subtract 10-15%, and guaranteed that's what you'll be looking at under Windows 8.1.

    I hope windows 10 is decent. I've been holding off buying a new Windows 7 key until I at least try it out. If it turns out to be crap, I'll finally buy a Windows 7 copy and be a happy chappy.
     
    Last edited: Jul 17, 2015
    vulcan78 and Mr. Fox like this.
  45. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    T-Bone, good choice with the 5820k! Yeah, I'm hearing mixed things about 8.1; some say they are seeing better performance with it, but then others are relaying your sentiment. I'm in line with Octiceps: there really isn't any point to upgrading to Win10 so early when no games at present support DX12.

    As for 3D Vision, I highly recommend the following games:

    Tomb Raider 2013
    Max Payne 3
    Alien: Isolation (with fix)
    Shadows of Mordor (with fix)
    Metro: LL and Redux
    Talos Principle
    Path of Exile

    Anyhow, if you have a reference-PCB 980 Ti I definitely recommend picking up EVGA's Hybrid cooler. I posted this in the MSi 6G forum; basically, I'm getting away with 1526MHz core / 2000MHz memory (8k) with no additional voltage, PT at 120%, and temps are a good 3-4C lower than my previous OC of 1551MHz core / 2050 memory (8.2k) at 1.281v, while performance is virtually indistinguishable. Oh, and there's that peace of mind that comes with not having to worry about your voltage being at the 1.281v limit!

    http://www.overclock.net/t/1561999/...ard-overclocks-and-pictures/910#post_24180522

    Update:

    Holy crap guys, I don't know if the card is "broken in" or what, but having seen some measure of stability in Titanfall for about 45 minutes at 1526 core / 2000 memory / 1.218v (default?), I simply ran Graphics Tests 1, 2 and Combined in Fire Strike Extreme in an infinite loop for just over 45 minutes, nearly twice the amount of time Gamers Nexus used in the very same test:

    http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks/Page-2

    No crash and a peak core temp of 43C, 4C lower than that seen with the voltage at 1.281v and the clocks at 1550 core / 2050 memory. What is also interesting here is that PT didn't exceed 112% with the reduced memory OC. It seems that extra 50MHz on the memory really adds wattage, and/or, since I don't fully understand the relationship between voltage and wattage, maybe the reduced voltage is resulting in less power being drawn.
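
    For what it's worth, the usual first-order rule for CMOS dynamic power is that it scales roughly with frequency times voltage squared, so a voltage drop cuts power faster than the same relative clock drop. Here's a rough sketch using the clocks and voltages from this post; it's only a ballpark model (board power limits measure more than core dynamic power), not a measurement:

    ```python
    # First-order CMOS rule of thumb: dynamic power P ~ f * V^2.
    # Plugging in the two configs from the post to estimate the power ratio.
    # This is a rough model, not a measured figure.

    def relative_power(f1, v1, f2, v2):
        """Power of config 2 relative to config 1 under P ~ f * V^2."""
        return (f2 * v2**2) / (f1 * v1**2)

    # 1551 MHz @ 1.281 V  vs.  1526 MHz @ 1.218 V
    ratio = relative_power(1551, 1.281, 1526, 1.218)
    print(f"Lower volts/clocks draw roughly {ratio:.0%} of the higher setting")
    ```

    Under that model the 1.218v config lands around 10% below the 1.281v one, which would fit the lower PT readings, even before accounting for the memory clock.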

    However, 1526MHz held steady while gaming and benching so whatever is happening I'm locking it down here and crossing my fingers that it behaves this way next time I start up my computer.

    Here are those final Fire Strike scores.

    Normal (21.4k GPU)
    http://www.3dmark.com/3dm/7786555?

    Extreme (10k GPU)
    http://www.3dmark.com/3dm/7786576?

    I still can't believe I'm getting away with 1526MHz core and 2000MHz memory with no additional voltage. I think the postulation that extra voltage might not help increase OC headroom on Maxwell might have some truth to it. It feels exactly as it did when I was dialing in my CPU, where a certain frequency became a threshold beyond which any more required an immense increase in voltage. For my 4930k this is at 4.5GHz, where it can run with 1.362-1.375v, only slightly up from the 1.344v needed for 4.4GHz. But 4.6GHz? It needs 1.47v! That extra 100MHz suddenly requires an additional ~100mV, whereas the preceding increase from 4.4 to 4.5GHz only required maybe 30mV. That's how you know you're at the limits of a chip.
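
    The voltage-wall arithmetic in that paragraph can be laid out like this, using the figures quoted above (taking the upper end, 1.375v, of the 4.5GHz range):

    ```python
    # Incremental voltage needed per 100 MHz step on the 4930K,
    # using the numbers from the post.
    steps = [
        (4.4, 1.344),  # GHz, volts
        (4.5, 1.375),  # upper end of the quoted 1.362-1.375v range
        (4.6, 1.470),
    ]

    for (f1, v1), (f2, v2) in zip(steps, steps[1:]):
        print(f"{f1} -> {f2} GHz: +{(v2 - v1) * 1000:.0f} mV")
    ```

    The jump from ~31mV per step to ~95mV per step is the wall: each extra 100MHz starts costing roughly triple the voltage.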
     
    Last edited: Jul 17, 2015
  46. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    I don't know why it's hitting mobile GPUs so hard, but the latest hotfix, 353.49, is working fine for me. All I've done is disable hardware acceleration in Chrome to avoid the TDRs, and so far, going on 5 days now with my 980 Ti and this driver, I've yet to experience a single TDR.

    I'm even tempted to turn HA back on just out of curiosity.

    Oh and this with the insane overclock I'm getting away with!

    BTW, I love seeing you taking it to Nvidia in the driver feedback threads, they've been doing a crappy job and completely deserve it.
     
    Mr. Fox likes this.
  47. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Update:

    I managed to sell both 780 Ti's on eBay today, amazingly within only a few hours of posting them under "Buy It Now" with "Best Offer" for $300 each, making their replacement with the single 980 Ti nearly a free upgrade. I also managed to get not only Batman: Arkham Knight but also Metal Gear Solid: Phantom Pain, since Nvidia has replaced the former with the latter as a promotional item due to the broken state of Batman: AK, with an ETA for it actually being playable somewhere between Sept. and Spring 2016.

    + $100 for a brand new 980 Ti with double the VRAM and 90% of the performance of 780 Ti SLI (21.4k GPU Fire Strike vs. 23.4k, and 3DMark's SLI scaling is better than that of the majority of games out there; except for Tomb Raider 2013, nothing has better scaling than that game), all on a single card, with half the heat being pumped into my room, DX12 ready, Maxwell-exclusive features, and the 850W PSU no longer running on the ragged edge. I should be good well into the end of Pascal, when a non-reference 1080 Ti arrives sometime in 2017.

    10/10 would do again.

    Oh, and 1513MHz and 8GHz on the memory (+500) with default voltage is still running rock solid, with average in-game temperatures a good 7C lower than those seen with the AIO-cooled 780 Ti's, around 43-47C.

    In other news, I'm sure you guys have heard, but here's the first review of 980 Ti Kingpin, she's a beauty!

    http://www.hardwareluxx.com/index.p...reviewed-evga-geforce-gtx-980-ti-kingpin.html

    Edit:

    Oh, I almost forgot: hotfix 353.49 has been nothing but bullet-proof for me, at least with a clean install via DDU and this card. Not a single TDR, and I've even turned hardware acceleration back on in Chrome!

    No throttling or any other funny business either.

    In case you missed it, here's that video of 1549MHz with a little voltage, again with no artifacts, throttling or anything else:

     
  48. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    To give you NBR guys an idea of how awesome it was to sell my 780 Ti's and pay only $100 up for a 980 Ti with two free games: that's like selling both the 680M's out of my R2 for $350 each and then paying the $100 difference up for a 980M. It's that awesome.

    Unfortunately, I don't think I will find a buyer for my 680M's when I eventually replace them with a single 980M once Win 10 has been sufficiently beta-tested. Assuming the 980M works with Win 10 like it does with 8.1.
     