The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]

    Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Doing it that way used to work a lot better on my other Clevos with G.SKILL 3000 and 3200, and that is how I run my craptastic Kingston HyperX Impact 2400 at 3000 in Machete. It will not boot with XMP 3000, but works great with manual settings.
     
    Last edited: May 16, 2018
    Papusan and Robbo99999 like this.
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
  3. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    I am assuming you aren't running 1.45v outside of benchmarks?
     
    Mr. Fox likes this.
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    ASUS has the most exhaustive motherboard settings and is generally popular, which is usually why. For example, Gigabyte is generally just as good an overclocker, but their BIOS is horrrrribleeeee, so people avoid it like the plague. Those Strix boards are just slightly souped-up PRIME Zx70-A boards with added features like WiFi etc., as far as I remember. They're not worth their cash. The Maximus X Hero is the one-stop shop too; if you're going beyond that you want the Apex (which only has two DIMMs, which makes it not useful for some of us).

    As for why so many use the ASUS, it's probably as the bottom post in this short thread explains: https://www.techpowerup.com/forums/...-maximus-x-hero-vs-asrock-z370-taichi.241511/

    I wouldn't recommend any non-OC Formula ASRock board Z270 and prior, though. I saw a noted improvement in their feedback and general acceptance once Zen came out, and they make up the bulk of my recommendations now (mainly due to being cheaper, too). I mean, come on, the only difference between the Maximus X Hero and the Maximus X Code is a freaking STICKER that ASUS charges like $50 USD more for. ROG tax ish a no-no. Not that you'll get problems with a Hero, mind you. I just don't really think it's so needed.

    I haven't seen any EVGA board get openly praised since the X299 Dark, so I admittedly know less than I'd like in that category, but if they're just as good then hey more power to that, I'd not mind having more to recommend around
     
    Papusan, KY_BULLET and Mr. Fox like this.
  5. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I was referring to EVGA GPUs. You also have to consider warranty. EVGA is supposed to be a pleasure to deal with, and their GPUs rock. Their Classified motherboards are also excellent.

    The Taichi might be a fantastic motherboard, but the warranty is one year and only to their Authorized Distributor (not the end user). The ASUS warranty is generally 3 years (or up to 5 years on some) for their enthusiast motherboards, and you don't have to deal with a third party.

    https://www.asus.com/us/support/article/678/

    https://www.asrock.com/support/index.asp?cat=RMA

    I also think there is something to be said for following the lead of master overclockers and seeing what motherboards are consistently being used at the top of the food chain. That said, it takes nothing away from the Taichi being an excellent product.
     
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh, I was only talking about motherboards. GPUs, I don't think it matters much on Pascal once you hit 1080Tis from what I remember reading/hearing. XOC BIOS and any GPU is any GPU I think?

    Interesting about the warranty being only one year, that's quite low... I wonder if they're only doing one year because they can get away with it in the US *insert thinking face*

    Ah well. Maybe I'll send an e-mail to ASRock and ask them why
     
    KY_BULLET and Mr. Fox like this.
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I run 54x6 with 1.456V almost all of the time. My idle temps are 28°C-32°C and load temps are 60°C-75°C depending on the benchmark. If my house is hotter than the normal 68°F-70°F, then I will drop down to 53x6 and 1.410V to keep the temps in the same range.
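    (For reference, a minimal sketch of the arithmetic behind the "54x6" shorthand - a 54x multiplier on all 6 cores - assuming the stock 100MHz BCLK, which the post doesn't state:)

        # Core clock implied by a CPU multiplier and base clock.
        # The 100 MHz BCLK is an assumption; the multipliers are from the post.
        def core_clock_ghz(multiplier: int, bclk_mhz: float = 100.0) -> float:
            return multiplier * bclk_mhz / 1000

        print(core_clock_ghz(54))  # 5.4 GHz on all 6 cores at 1.456V
        print(core_clock_ghz(53))  # 5.3 GHz fallback for hot days at 1.410V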
     
  8. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I'd also be curious why they put you off and force you to deal with the place where you bought it. That kind of sucks, having to deal with a third party. I could see an RMA taking a helluva long time to resolve with that kind of silly nonsense going on. I did not know that until today, and knowing it's one year and only works through an authorized distributor, I will not likely ever consider buying anything from ASRock, even if it is a good product.
     
    Last edited: May 16, 2018
    D2 Ultima and KY_BULLET like this.
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    You're only running about 50mV more than me on my Skylake CPU, and your temperatures are pretty much the same as mine, so I'd probably be happy with that safety level too. The recommendation for Skylake was up to 1.45V as a 'safe' everyday maximum; is it the same recommendation for Coffee Lake?
     
    KY_BULLET and Mr. Fox like this.
  10. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I don't know. Probably about the same maximum, but I never pay that much attention to recommendations like that. I focus on the temps more than anything else. That's where the greater likelihood of damage exists.
     
    Papusan, KY_BULLET and Robbo99999 like this.
  11. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,460
    Likes Received:
    12,841
    Trophy Points:
    931
    He would be fine with a small fan over the RAM and still be able to keep his AIO intact.

    We totally forgot to test those sticks on my board.

    Yeah, XMP is a big hit or miss most of the time. That comes from prior experience. Look at these 3600s I have. I thought they were junk when I had my RVE. They would not even run at 3400MHz. Now they run at 4260MHz and can run as low as CL15 or CL16 depending on the BIOS.

    Glad I did not junk them when I had the chance.

    How much did you gain?
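    (A rough sketch of why those speeds matter: first-word latency in nanoseconds is roughly 2000 x CAS / data rate in MT/s. The CL16-at-3400 figure below is purely illustrative, since the post says the sticks wouldn't even run there back then:)

        # First-word latency (ns) ~= 2000 * CAS latency / data rate (MT/s).
        def latency_ns(cas: int, mt_s: int) -> float:
            return 2000 * cas / mt_s

        print(round(latency_ns(15, 4260), 2))  # ~7.04 ns at 4260 CL15
        print(round(latency_ns(16, 3400), 2))  # ~9.41 ns at an illustrative 3400 CL16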
     
  12. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Good good, and I know you'll let us know if your CPU ever gives out, but I doubt it will - you're not one to not say!
     
    Convel, KY_BULLET and Mr. Fox like this.
  13. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Well, with those temps I understand. I currently keep my 5820K idle at the same temp @ 4.6GHz, with max temps hitting 62°C on an AIO @ 1.316V.

    I've hit 5GHz @ 1.46V, but temps rose to 90°C when benchmarking, which I don't like, so here's to cool temps :cool:
     
    Papusan and Mr. Fox like this.
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Well next time we have a benching party you can test these Corsair 4000 sticks and see how high they can go. :D
     
    jaybee83, Papusan and Johnksss like this.
  15. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,460
    Likes Received:
    12,841
    Trophy Points:
    931
    For sure.
    I totally changed everything. I bought all new fittings for the chiller, plus an inline temp sensor. New tubing in clear. New water block. New bench table, but there seem to be some technical difficulties with that part. LOL

    Now the system drops below 40°F (lowest being 32°F = 0°C) under full load. That is about a 5°C water temp. So I'm pretty sure it would run the 8700K at 0°C/32°F, but then the board would need to be insulated around the CPU.
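    (The mixed units above convert as follows - a quick sketch:)

        # Fahrenheit to Celsius for the chiller readings quoted above.
        def f_to_c(f: float) -> float:
            return (f - 32) * 5 / 9

        print(f_to_c(32))            # 0.0 C -> the frozen lower bound
        print(round(f_to_c(40), 1))  # 4.4 C -> "below 40F ... about 5C water temp"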
     
    Last edited: May 16, 2018
    iunlock, bloodhawk, Papusan and 2 others like this.
  16. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Aw, man... looks sweet, Brother John. I can hardly wait. Let's get together again in the latter half of June, either your place or mine.

    Did you get a Kingpin 1080 Ti?

    Where did you get that inline H2O temp sensor?
     
    Johnksss and KY_BULLET like this.
  17. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,460
    Likes Received:
    12,841
    Trophy Points:
    931
    KY_BULLET and Mr. Fox like this.
  18. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Seems to fit here as a point of reference.
     
  19. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    @Mr. Fox has the new 1440p G-Sync monitor arrived yet?
     
    KY_BULLET and Mr. Fox like this.
  20. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Yes, as a matter of fact, it has, Brother @Talon. It's totally awesome. No backlight bleeding and no dead or stuck pixels. Runs 165Hz with no effort. Loving it. I haven't played around with G-Stink yet. Not sure what "ULMB" technology is. Never heard of that before. Going to have to Google it and see what the purpose of that is and decide if I care. The red is not as tacky looking as I expected. I'm still not in love with that, but it looks a lot more gaudy in photos, including my own. It's not nearly as "matador" orangey-red tacky looking as photos portray it. But, I am very pleased with it overall.

    IMG_20180517_165056.jpg
     
    iunlock, Johnksss, Robbo99999 and 6 others like this.
  21. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    @Mr. Fox ULMB used to be called LightBoost during the pre-G-Sync days. As the name states, it uses backlight strobing to reduce motion blur or ghosting during fast motion. It's supposed to make LCDs basically on par with CRTs in terms of motion clarity, but the downsides to ULMB are a lower max refresh rate (typically 120Hz or 144Hz), lower brightness, and flicker. That said, many competitive gamers still prefer ULMB at a lower refresh rate to G-Sync at a higher refresh rate.

    Edit: Here is a demo. The difference in motion blur with and without BFI should be obvious. You can also play around with the controls.
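    (A rough sketch of the persistence math behind strobing - perceived blur while the eye tracks motion is roughly pan speed times how long each frame stays lit. The 960 px/s pan speed and 2 ms strobe length are assumed numbers, not measurements:)

        # Perceived motion blur (px) ~= pan speed (px/s) * persistence (s).
        # Sample-and-hold persistence is a full refresh period; a strobed
        # backlight (ULMB/LightBoost) only lights the panel for the strobe.
        def blur_px(speed_px_s: float, persistence_ms: float) -> float:
            return speed_px_s * persistence_ms / 1000

        speed = 960  # px/s, an assumed pan speed
        print(blur_px(speed, 1000 / 120))  # ~8 px at 120Hz sample-and-hold
        print(blur_px(speed, 2.0))         # ~1.9 px with an assumed 2 ms strobe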
     
    Last edited: May 17, 2018
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Thank you. Very good explanation. I can see where changing the settings between fixed refresh rate, ULMB and G-Sync changes what the browser test tells me, but I am not able to recognize a difference (visually) in how the motion looks on my screen. It looks the same to me regardless of which settings are applied in NVCP, and I do not see any motion blur. Everything looks buttery smooth no matter what settings I choose. But, I will try tinkering more with the browser controls. Maybe I can induce it or something. :vbthumbsup:

    Edit: OK, now when I do this configuration there is a CRAZY amount of difference that can be noticed. The bottom row looks super smooth like the default browser test, but the lines above it really suck. BAADDD motion blur, LOL.
    Test.png
     
    Last edited: May 17, 2018
    KY_BULLET likes this.
  23. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    What is the "browser test" you're referring to? Make sure your NVCP is set to fixed refresh rate or G-Sync for full screen mode, not to ULMB.

    Try the default configuration again. Make sure it's in fullscreen (green arrows at the top-right). Half frame rate (top) should clearly have more motion blur than half frame rate + Black Frame Insertion (bottom).

    If I add Full Frame Rate UFO to the configuration, the middle line (half frame rate + BFI) still has less motion blur than the bottom line (full frame rate).

    When I change the config to 5 UFOs at 1/5th frame rate, all the UFOs flicker noticeably, but the progressive improvement in motion blur as more black frames are added is even more obvious.
     
    Mr. Fox and KY_BULLET like this.
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    The link you provided https://www.testufo.com/blackframes runs in my web browser. So, that's what I meant by "browser test" LOL. :vbwink:

    This one is really trippy. If I stare straight ahead I see vertical black bars, but if I let my eyes track the movement from left to right, the black bars disappear and there is a colored background.

    https://www.testufo.com/eyetracking#pattern=stars
     
    Last edited: May 18, 2018
    KY_BULLET likes this.
  25. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Papusan, Convel, Mr. Fox and 2 others like this.
  26. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Congrats on the new monitor! ULMB makes a massive difference to motion clarity, even vs 180Hz - which I've tested on my monitor. The problem is the lack of G-Sync when running ULMB, so you have to use either normal V-Sync or live with tearing. There are also the other negatives that yrekabakery mentioned above, but in my experience they're not such big issues as the lack of G-Sync or the needed reliance on high-latency V-Sync. I prefer high refresh rate G-Sync to ULMB, even in competitive online shooters. I also find that ULMB makes tearing more noticeable. G-Sync is the bomb Mr. Fox, you shouldn't be calling it G-stink! ;-)
     
    Last edited: May 18, 2018
    Mr. Fox and Talon like this.
  27. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    This. While I've tested and used ULMB before, I don't like how dark it makes the monitor, and I only see maybe a slight improvement in blurring. For me, high refresh rate hardware G-Sync is amazing. I've been using my monitor around 2 years? now and I don't have any desire to swap it out anytime soon. I don't like HDR from what I've seen on my 120Hz Sony 4K HDR TV. I think the issue I've had is that it probably uses HDR 400, which makes the picture dark and less desirable IMO. Maybe HDR600 or HDR1000 would change my mind.
     
    Mr. Fox and Robbo99999 like this.
  28. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Nice... what's the DRAM voltage at? (Along with VCCSA.)
     
    Mr. Fox and KY_BULLET like this.
  29. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    1.4v DRAM (XMP default)

    1.125v VCCSA and VCCIO
     
  30. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    That's actually pretty impressive for 4000MHz 16-17-17-38.
     
    Papusan, Mr. Fox, Talon and 1 other person like this.
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Thanks, I really am impressed with it. Awesome monitor, and I am generally difficult to impress with something like a monitor. I normally don't get too excited about them, but this is an exceptional product. The only way I was able to convince myself to spend this much money on a monitor was to consider the fact that it is a one-time purchase that should last me at least 5 years. If I spread the cost out over 5 years, it was actually pretty inexpensive ($120/year). That's about what I usually pay for a cheap monitor that does everything I want it to, except for improving my overclocked benchmark scores. I haven't found a monitor that can do that yet.
    I played some Gears of War 4 last night for a while. I could not tell any difference with G-Sync enabled or disabled. Everything looked exactly the same to me. It was that way on my G-Sync laptops as well. Maybe I am doing something wrong, or maybe I am just very insensitive to screen tearing unless it is abnormally severe. The only difference I could see was that my framerate was limited to 165 FPS in my OSD. But, G-Sync was not really a deciding factor in my purchase. It was more of a nice feature freebie thrown in at no extra charge, as I selected this one based on specs and price.

    I will continue to tinker with it to see if I can see anything special about it with other game titles. Perhaps Gears of War 4 runs too smooth for me to see a difference, and that was the only title I have tried playing so far. It definitely needs to be disabled for benching, as having it enabled lowers the scores. I'll play Crysis 3 and Battlefield 1 this weekend and see if those look better with G-Sync enabled.

    I played around with ULMB and decided that was kind of silly. Don't really care for it.
     
    Last edited: May 18, 2018
    Papusan, Convel and KY_BULLET like this.
  32. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331

    You won't see a difference at 165fps; where you're going to be able to see the difference is when your FPS drops, or the game stutters. G-Sync will rapidly adapt and match the refresh rate to your FPS, not allowing tears and making the stutter less perceivable. Basically, your hardware is most likely far too powerful for the games you're testing.
     
  33. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Yes, that is why (besides liking them) I decided on Crysis 3 and Battlefield 1. If I play those at 1440p and Ultra settings, it will be more taxing than a UWP filth game like Gears of War 4, and I may be able to see some benefits to G-Sync that I haven't seen so far. (That being said, I do love Gears of War, and that is the only reason I am able to ignore the fact that it is a UWP filth title.)
     
  34. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The tearing at 165Hz is very minor, so if your FPS never drops you might never notice tearing even with G-Sync off, unless you happen to be sensitive to it.

    Also, if you're locked at 165 FPS, you're actually using VSync, not G-Sync. To properly set up to use G-Sync, you need to not only enable VSync but also limit your frame rate to 3 less than your maximum refresh rate. So limit to 162 FPS. That lets G-Sync take over and eliminates the input lag of VSync. You need to enable VSync when using G-Sync because otherwise there is tearing when you limit FPS to below the refresh rate.
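    (A sketch of that cap rule applied to a few common refresh rates - nothing here beyond "refresh minus 3", per the advice above:)

        # G-Sync frame cap: stay a few FPS under the refresh ceiling so the
        # G-Sync module, not traditional VSync, paces the frames.
        def gsync_fps_cap(refresh_hz: int, margin: int = 3) -> int:
            return refresh_hz - margin

        for hz in (120, 144, 165):
            cap = gsync_fps_cap(hz)
            print(f"{hz}Hz -> cap at {cap} FPS ({1000 / cap:.2f} ms/frame)")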
     
    Papusan and Mr. Fox like this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    Oh, maybe I am doing it wrong. I thought VSync has to be DISABLED for G-Sync to work properly. I always have VSync disabled for everything. If it needs to be enabled, perhaps that is why I have never (on any system) seen a difference other than lower framerate with G-Sync enabled.
     
    Papusan and Talon like this.
  36. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

    Best explanation I've seen of why some use V-Sync On in NVCP in addition to G-Sync. I don't use the toggle on, and run just G-Sync. But...

     
    Convel, Papusan and Mr. Fox like this.
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I will probably do the same as you. But, I will still test it both ways to see how it works. I have never liked using V-sync. I've only used it when I absolutely had to because the game was so messed up it was unbearable without V-sync enabled. Since I have very low sensitivity to screen tearing, I may or may not be able to see a lot of difference, and it ultimately may not matter that much to me. Only one way to find out, though... do it.
     
  38. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    @Mr. Fox maybe I wasn't entirely clear.

    When you enable VSync and your frame rate is inside the G-Sync window, it's not actually behaving like traditional VSync. As Jorim Tapley wrote, turning on VSync enables the ability of the G-Sync module inside your monitor to eliminate tearing caused by frame time variances. VSync has no input lag penalty when you're inside the G-Sync window; its only function is to prevent tearing. VSync only adds input lag when you don't limit your FPS to 3 below your refresh rate, in which case it'll sit at 165 FPS and traditional VSync takes over instead of G-Sync.

    There's absolutely zero point to having G-Sync enabled if you don't also enable VSync because then there is tearing inside the G-Sync range.

    I use mobile G-Sync, so there's no G-Sync module present. But I still need to enable VSync when using G-Sync, otherwise it tears inside the G-Sync range. This is really apparent in the G-Sync Pendulum Demo.
     
    Papusan and Mr. Fox like this.
  39. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    I am going to test it. Thank you for the clarification. It could be that I have never seen any value in G-Sync because I always have V-Sync disabled. That is part of my setup routine. Under normal circumstances I despise V-Sync. But, if it makes G-Sync work and I like what I see, I may temporarily enable it or modify individual game profiles to automatically do it for me.
     
  40. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    You could do what I do.

    When I want to game, I turn on G-Sync for fullscreen mode in NVCP and set VSync to On in global 3D settings.

    When I want to bench, I turn off G-Sync in NVCP, and that automatically turns off VSync as well (sets it to use 3D application setting).

    For all my games I've set a 117 FPS limit using either their built-in limiter or RTSS, because my screen is 120Hz.
     
    Papusan likes this.
  41. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    It seems the mobile "software" version of G-Sync is different, and I seem to remember needing to do exactly what you're talking about when I used my laptop G-Sync screen. On desktops with the "hardware" version of G-Sync this isn't necessary at all, as demonstrated by the Pendulum demo. I can leave V-Sync OFF in the NVCP and turn G-Sync on, and the tearing is immediately removed in the Pendulum demo, versus the V-Sync and V-Sync OFF modes inside the demo.
     
  42. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    What about with the test pattern?

    The tearing with NVCP VSync off was more obvious to me with the test pattern than with the pendulum.

    Maybe VSync is needed on desktop G-Sync only when there are sudden frame time variances. If frame times are consistent there is no tearing with VSync off. On mobile G-Sync I need VSync on to prevent tearing at all times.
     
    Papusan and Talon like this.
  43. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    Zero tearing. If I switch the test pattern to V-Sync off, tearing appears immediately (obviously). G-Sync On and V-Sync off in NVCP gives me zero tearing.

    Now if I play a game where I am getting more than 144fps and I go outside of the G-Sync range, then sure, tearing would most likely show up, but that is why Nvidia came up with Fast Sync. That way you can have unlimited FPS (unlike V-Sync on) and maintain G-Sync quality in the G-Sync range, and get low input lag and unrestricted GPU performance above the G-Sync range.

    So:

    0-144 FPS: G-Sync On
    144+ FPS: Fast Sync On

    Still, I just use G-Sync with V-Sync Off in NVCP. I prefer to let the GPU run unrestricted, and tearing is so minimal at high refresh rates and FPS. For me, G-Sync is there to save me in the event my FPS tanks or I get the occasional FPS stutter.
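    (As a sketch, that split reads like this - purely illustrative, with the 144Hz range taken from the post, and Fast Sync above the range being Talon's preference rather than a universal rule:)

        # Which mechanism handles a given frame rate on a 144Hz G-Sync panel.
        def sync_mode(fps: float, refresh_hz: int = 144) -> str:
            if fps <= refresh_hz:
                return "G-Sync (monitor refresh tracks FPS, no tearing)"
            return "Fast Sync (unrestricted FPS, excess frames dropped)"

        print(sync_mode(90))   # G-Sync handles it
        print(sync_mode(200))  # Fast Sync handles it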
     
    KY_BULLET likes this.
  44. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Very interesting. In your case I think the optimal setup is G-Sync + 141 FPS limit as Blur Busters recommends, VSync on or off doesn't matter. Fast Sync adds a frame of input lag due to triple buffering and microstuttering due to the dropped frames: https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/8/
     
    KY_BULLET, Robbo99999 and Talon like this.
  45. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    All this talk of how to set up G-Sync optimally, and how it works etc. - the best resource and bible on this is Blur Busters (which I've linked before, and yrekabakery linked above): https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/

    Read that entire article, flipping through all the different pages. It's the bible; it's based on their thorough testing & they really know their shiz! Don't take anyone else's word for it!
     
    jaybee83 and KY_BULLET like this.
  46. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    OK, played Crysis 3 for a little over 2 hours with V-Sync and G-Sync enabled. I could see some signs of it seeming to be just a little smoother in a few places, but it is kind of hard for me to gauge since I never noticed any issue without those features all the years I have been playing it. May try BF1 or CODAW later this weekend.

    Running Crysis 3 at 2560x1440 165Hz was certainly nice. All settings were maxed out, and I rarely ever saw less than 120 FPS. It looks like the CPU peaked just over 170W while playing at 5.3GHz. My modded 1080 Ti looks to be pulling over 400W running stock clocks and voltage. (The mod causes the GPU to report less than half the watts it is actually pulling, so the real draw is more than double what HWiNFO64 is showing.) Total system power reported by the UPS is north of 750W.
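    (A back-of-envelope sketch of reading true power through that shunt mod - the 2.2x factor below is an assumption consistent with "more than double", not a measured value:)

        # Estimate true GPU draw from a shunt-modded card's under-reported
        # sensor reading. The exact factor depends on the shunt resistors used.
        REPORT_SCALE = 2.2  # assumed under-reporting factor

        def true_watts(reported_w: float, scale: float = REPORT_SCALE) -> float:
            return reported_w * scale

        print(true_watts(185))  # ~407 W from an illustrative 185 W HWiNFO64 reading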

    Crysis3_2hrs.JPG
     
    Last edited: May 19, 2018
  47. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,650
    Trophy Points:
    931
    So, I had what might be a brilliant idea this morning as I was dropping the kids at the pool.

    I am thinking about using a small six-pack-sized cooler with a bunch of coiled copper tubing inside of it, filling it with water, and freezing it into a solid block of ice, then connecting it to my loop with quick-connect fittings. Brothers @Johnksss and @bloodhawk, have you ever heard of anyone doing this before? I think it should work great.

    I am also thinking about using the leftover parts from my butchered Riing and EVGA cooling systems (pump/block and 120mm radiator) to FrankenKool the P870DM-G.
     
    jaybee83, Johnksss, Papusan and 3 others like this.
  48. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Sounds like a good idea. I suppose it's possible that the water could start to freeze in the copper pipes and may eventually restrict or totally block flow, or do you use an additive in the water (like antifreeze) to stop it from freezing? If the copper pipes don't freeze up, then as time goes on the copper piping is likely to form cavities in the ice block where it melts it away. Because ice is less dense than water, these defrosted cavities will be partly filled with chilled water and partly with air/nothing - so it's possible that as time goes by the cooling efficiency would decrease due to lack of contact with the copper pipes. These are some of the problems I can imagine, but while it works I imagine it would take huge amounts of heat out of the loop - which is good.

    EDIT: you could probably achieve better results by having a saline solution (salt to lower the freezing point below normal) of ice cubes & water in the cooler, in which you would have the copper pipes - it would be mainly ice cubes with enough water to allow good contact and heat transfer from the copper pipes. The convective action of the water is more efficient than just the conductive action of a solid block of ice, plus you don't have to worry about the cavity formation I talked about in the previous paragraph.
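    (For scale, a quick estimate of how long such an ice block could soak up a loop's heat, using the latent heat of fusion of water, ~334 J/g. The 3 kg block and 500 W load are assumed numbers, not from the posts:)

        # Time for a solid ice block to melt under a constant heat load.
        # Ignores warming the meltwater afterwards, so this bounds only
        # the "ice-cold" phase of the run.
        LATENT_HEAT_J_PER_G = 334  # latent heat of fusion of water

        def melt_time_minutes(ice_grams: float, load_watts: float) -> float:
            return ice_grams * LATENT_HEAT_J_PER_G / load_watts / 60

        print(round(melt_time_minutes(3000, 500)))  # ~33 min for an assumed 3 kg block at 500 W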
     
    Last edited: May 19, 2018
    KY_BULLET likes this.
  49. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    If I'm understanding this correctly, it should work for flash cooling, or until the block of ice melts and the liquid reaches ambient temperature.

    However, if there is water in the copper pipes and that's frozen, it will restrict the flow a lot.

    Something similar can be achieved by dunking a radiator in a tub of saline water and ice cubes.

    The second thing that will decide how well it will cool is how many heatpipes are making how much surface contact with the heatsinks on the DM1 (surface area). Basic pumps should be enough to keep the liquid flowing at a decent rate, unless there are too many bends.
     
    Last edited: May 19, 2018
    Johnksss, KY_BULLET and Papusan like this.
  50. KY_BULLET

    KY_BULLET Notebook Evangelist

    Reputations:
    802
    Messages:
    655
    Likes Received:
    794
    Trophy Points:
    106
    Fixed my wires up the best I could and added another Thermaltake 140mm (pull) to the bottom of my case. This is the other fan you sent with the case, @Mr. Fox. Thanks for that!

    Also, I used automotive wire loom for the bulk of the wires in the back, but I ran out of it, so I couldn't make it look the best that it could've looked. When I get time to make a run to the auto parts store, I'm gonna get some more. It actually looks pretty good. Well, I should say it looks better than the rat's nest of wires that I did have back there.

    Also, I'm gonna have to get a Corsair RGB controller for those pesky red fans I have in the front. Or I could go with the Superman theme lol!

    http://imgur.com/gallery/t44Y0tK
     
    Johnksss likes this.