@Raiderman - It depends on what games you play whether the slightly larger die or the faster memory will be worth it. But also remember that what is good today may not be good tomorrow. HBM2, HBM3, and GDDR6 (coming this spring) are all above the 8 Gbps limit of the memory on the 1070 Ti; the GDDR5X at least stays in the running a bit longer. That means slightly more future-proofing if memory bandwidth is needed over the lifetime of the card.
Meanwhile, I consider the 1070 Ti a bastard middle product that serves little purpose. It was a retort to the Vega 64. As such, I really cannot recommend it, unless that $50 literally breaks the project budget.
-
-
I have to agree. In fact, I never recommend going with a second-fiddle GPU to save a few bucks. Maybe with a disposable BGA turdbook that you don't expect to keep past the warranty it kind of, sort of, almost makes sense, but otherwise the concept just seems foolish to me. I wouldn't buy anything less than a 1080 at this point in time. The 1080 is entry-level awesomeness. As long as the budget is not set in stone, going for a 1080 Ti makes even more sense. Only at the extreme cost end of the spectrum do I make an issue over price. Paying $1,200 or more for a Titan is far more than I think is reasonable for benching and playing games, especially when the top-dog Ti GPU matches or beats it in most scenarios. Nothing wrong with the Titan... in fact, I would love to have two or three of them, but not at the premium charged for them.
-
Aside from that, with Ngreedia now charging $3K for a Titan V and banning GeForce cards from data centers (except for crypto mining), they are overreaching out of pure greed. They added 40% more transistors to a monolithic die to get Volta's performance increase, which means they really are just sitting on their hands. Greed like that pisses me off; development should be about doing your best, not just beating the competition. That is where competitive incentives become perverse and retard progress.
-
Charging $3,000 for a Titan V rates close to the level of despicableness of the sub-human scumbags that are addicted to molesting the dead bodies that do not resist their advances or press charges against them for their Satanic crimes. It doesn't matter what the specs are, there is nothing that even remotely resembles good value in the beyond-asinine price tag. It has "stupid sucker" written all over it, LOL.
-
-
I say get the best card you can afford based on your monitor. If you are a 1080p 120 Hz (or less) gamer, get a 1070. If you are at 1440p, get an AIB 1080 or RX Vega 56. If you game higher than 1440p or, like me, ultrawide, get a 1080 Ti.
-
My take on the price tag of Volta is that they are trying to sell as few of these as they can. They only need to produce the half dozen or so required for reviews and not worry about a major production run, supply issues, etc. So we can complain all we want; the company can just sit there and say the card is out as promised.
-
It's an MSI 27" curved 1080p 144 Hz screen. I have been able to set the resolution to 4K in Windows, but I don't know if that translates into gaming.
-
After setting it in Windows and the NVIDIA Control Panel, you need to change it in the game's video settings as well. Otherwise, the game will play at the prior setting.
-
And, even still some games will refuse to play nice with a custom resolution. Some will work fine. Some will crash at launch. Some will identify the native resolution and refuse to go higher. And, using a custom resolution does not change the number of pixels in the panel, so it's not the same amount of overhead as a screen with the actual hardware resolution.
-
For those that want to fix your GPU sag, cooler master sells a nifty little stand that helps support large graphics cards. Newegg sells them for 10USD plus shipping, and are SLI/crossfire compatible.
-
This video provides a nice example of why overclocking your GPU to play games is not particularly useful. If you need to overclock to have a nice, smooth framerate, then it is time for a GPU upgrade, or time to choose playable quality settings for your GPU. @D2 Ultima
-
-
I just bought this case, and most Neweggers refute his review of it. That's why I went ahead and ordered it. I will let those who are interested know how it really is.
Other Thoughts: Not to bash Gamers Nexus, because I love that channel, but their review of this case is bogus. I ran my own tests with an 8700K and a 7700K, both delidded. Neither showed any significant temp difference with the front panel on or off. I used IntelBurnTest (run three times for each CPU and each case configuration). I monitored temps with HWMonitor AND HWiNFO64. The max package temp and all core temps were within ±1 degree Celsius, with some temps actually being higher in the open-front-panel configuration. I chalk this up to margin of error. So for people concerned with airflow, rest easy, because this case is absolutely amazing in that respect.
Other Thoughts: I’ve been following this case ever since it debuted earlier this year. After seeing some reviews I was a little skeptical (Gamers Nexus) but decided to go for it anyway after seeing some actual user reviews. I’m glad I picked this one. It seemed like it got bashed because it was overhyped and didn’t deliver in the eyes of some people. If you’re willing to pay the price, by all means go for it. I got it on sale for $130 and I don’t regret my purchase at all.
Other Thoughts: Was I disappointed when I saw the first reviews of this case? Yes.
After actually handling and using this case, do the case's flaws merit the blowback it got? No.
Gamers Nexus clearly had a bone to pick, and Steve clearly felt it was his personal responsibility to deflate the hype rather than review the case objectively. He admits as much in his review. Apparently he isn't the only one to feel this way, based on the number of unverified owners reviewing this case. -
Those work really well, too. My 1080 Ti has one supporting it.
-
I always take the info from reviews with a healthy dose of skepticism. They are mostly useful for getting a better look at things that you cannot touch or handle in person. Heck, that even applies to movie reviews. If the "professional" reviewers hate it, I know chances are really pretty good that I am going to think it is awesome. So, your experience being different comes as no surprise to me.
-
I have one thing to say to this:
It appears you have something wrong with that there CPU core eh
@Johnksss @Papusan @bloodhawk have a good chuckle
----------------------------
In the majority of cases you're totally right, but in the days of Maxwell, when people had +30-50% overclocks over stock and, say, a single 980 Ti (reference sub-1000 MHz, easy OC to 1450 MHz+) was the best you could get (ignoring the Titan X for the moment), people gaming at high resolutions/framerates pretty much had no choice. Pascal really doesn't overclock that far off stock, so the bumps are more like 3-5%, nothing to write home about, so right now you're totally right. The question remains whether Volta/Ampere will once again overclock far, and how dependent that is on memory rather than core. I feel like Intel and Nvidia have decided to take the AMD GCN route of launching products as close to their maximum sustainable daily overclocks as possible. Of course, this doesn't mean breakneck overclocking for a short benchmark is impossible, so it doesn't much affect you.
Edit: your average framerate only jumped a couple percent (in line with the small overclock, I'd say), but your minimum framerate jumped 10 fps, and that's a really big deal. I'd overclock just for that minimum-FPS jump alone; it's a lot smoother the less your frames ever sag. -
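To put rough numbers on that point, the minimum-framerate jump is much larger in relative terms than the average gain. A quick Python sketch with hypothetical figures (the 10 fps minimum-FPS bump is from the post above; the stock/OC numbers are made up purely for illustration):

```python
# Hypothetical numbers for illustration (not measured from the video):
# a small core overclock, average FPS up a couple percent, minimum FPS up 10.
stock_avg, oc_avg = 100.0, 103.0   # average framerate before/after OC
stock_min, oc_min = 55.0, 65.0     # minimum framerate before/after OC

avg_gain = (oc_avg - stock_avg) / stock_avg * 100
min_gain = (oc_min - stock_min) / stock_min * 100

print(f"average FPS gain: {avg_gain:.1f}%")   # small, in line with the OC %
print(f"minimum FPS gain: {min_gain:.1f}%")   # much larger relative jump
```

Even when the average barely moves, a double-digit relative gain in the minimum is what you actually feel as smoothness.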
LoL, I must have forgotten to apply the change from M to G on that core. Good catch.
Edit: I bet it has been that way all along (since I set up the desktop when it was brand new) and nobody ever noticed. -
-
No wonder my temps were so good after a 7-hour binge on Wolfenstein II: The New Colossus today. I can't believe how fast that 7 hours went by, LOL.
BTW: Happy New Year everyone!
-
hahahahaa
-
Robbo99999 Notebook Prophet
Ha, awesome, that video was a lot of fun, especially the intro! I enjoyed the strange technicolor ghost of the once Head AMD Tech Guy too!
I think Gamers Nexus do a great job; I only really found them a few months ago, if you can believe that (!). Temperatures of components in cases can vary greatly depending on what type of coolers you have and how they are configured; perhaps you have a very different configuration to the one tested by Gamers Nexus. -
Robbo99999 Notebook Prophet
Hang on, Mr. Fox, I don't get this: was that core really at 5.2 MHz? That can't happen. You guys are just running a little ongoing joke about this, right?!
-
No, it was not running at 5.2 MHz. I think 800 MHz is the lowest possible speed. I customized HWiNFO64 to show GHz instead of MHz and forgot to change the M to a G on that core. With 6 cores, that makes my OSD 18 characters narrower on that line. @D2 Ultima noticed the M.
-
Yeah, I think they are my favorite YouTube channel. I am envious of the dude's hair. Mine used to be long and thick like that and now it is short and gray, LOL. I had to get a haircut when I switched from a blue- to white-collar job back in '86 and have never stopped regretting chopping it off. I grew it back once and was treated like an outcast by all of the self-righteous rednecks I worked with.
-
Wow, that's pretty fancy looking. A little too much info for me, though. I generally only include basic AIDA64 sensors that reflect my core specs (System, CPU, GPU and RAM).
-
Ya, I just purchased AIDA64 so I can have a system monitor on the LCD screen of my Logitech G19s. Older keyboard, but fun to tinker with. I can put the AIDA64 OSD on the 320x240 LCD screen. Pretty cool.
-
Yes, that is really cool. It also works with some Logitech chassis/case-mounted LCD panels. I love AIDA64. The guy that supports it is named Tamas Miklos and he is an awesome person. Any time I get access to a new laptop model or unsupported motherboard, I send a report from AIDA64 and he follows up with me right away. Almost always get an updated version with added support to test within a day or two.
-
Robbo99999 Notebook Prophet
Ha, well that makes sense now! I've never had long hair, but it is indeed starting to go grey! -
Mine is a little ugly at the moment...
Would you mind sharing your LCD profile?
Thanks -
Want SLI?
Yeah, it's a CPU
"Intel at the original launch did state that they were using Core-H grade CPUs for the Intel with Radeon Graphics products, which would mean that the CPU portion is around 45W. This would lead to ~55W left for graphics, which would be in the RX 550 level" It's a CPU
Intel Core i7-8809G "Kaby Lake + Vega" MCM Specs Leaked Again, Indicate Dual IGP
![Intel 8th-gen CPU with Radeon graphics](images/storyImages/intel-8th-gen-cpu-discrete-graphics-2_678x291.jpg)
Intel with Radeon RX Vega Graphics: Core i7-8809G with 3.1 GHz Base, 100W Target TDP, Overclockable
Edit.
Intel reveals details of Core i7 8809G proc with Vega M Graphics & standard IGP
"The GPU this is Vega based, and the "Vega RX M GH". What's hilarious is that the processor also holds the standard Intel IGP, an HD 630 and that means this one processor has the two integrated graphics units. Other specs on say compute units etc are not shared, neither is HBM2 memory mentioned. Looking at the fact that this info is shared now, close to CES .. is indicative of an announcement next week. There already have been a number of leaks on what seems to be this Frankenstein CPU/GPU build"
-
Wow, looks like a real abortion. Not sure why anyone in their right mind would do something idiotic like this. BGA filth on steroids, in search of stupid suckers that like to waste money on disposable gonad-free tech garbage.
One thing we are missing in social media is a "hate" button. At least YouTube has a thumbs down, LoL. -
Mine is a little ugly at the moment...
Would you mind sharing your LCD profile?
Thanks
Still working on mine
-
That keyboard is selling for $399 on Amazon, LOL. If I remember correctly, it was like $160 new while it was still being manufactured.
-
I think I paid $250 for the Microsoft Entertainment Keyboard 8000 when it first came out.
Ya, it's ridiculous. eBay is selling them for upwards of $500. I found a manufacturer refurb on eBay for $99. It's like brand new. Here is the link if you are interested. Limited quantities left!
https://www.ebay.com/itm/Logitech-G19s-Gaming-Keyboard-Multi-Key-Wired-Keyboard-Panel-Screen-USB-2-0-Port/372129622467?ssPageName=STRK:MEBIDX:IT&_trksid=p2057872.m2749.l2649 -
I wish the new Logitech RGB mechanical keyboards had the LCD display. It's a nice feature to have. My old XPS M1730 had one between the keyboard and display hinges and I always liked it.
Thanks for the link. I am going to have to try hard to resist my gluttonous geek fetish for gadgets. I have like 5 keyboards already and even threw away a couple of expensive keyboards that nobody wanted not too long ago. I could not even give away the Dell wireless BT mouse and keyboard combo and M$ Wireless Elite mouse and keyboard combo that I threw into the garbage. Everyone I offered them to said "no thank you" LOL. -
I know exactly how you feel with your geek fetish....lol. I just couldn't resist the keyboard, as it has been so long since I've had a desktop, and all the cool stuff that has come out during my hiatus. Gaming laptops are cool and all, but the fun factor is -10 compared to a desktop. God help me, and my addiction.
Edit: For those looking for some backgrounds, I found these on the net. The problem with these backgrounds is they are all meant for quad-core CPUs. I make my own for 6- and 8-core.
-
LOL. I don't really want to be cured, but I definitely need God's help to not be a slave to it.
$99 is a really good price for that Logitech.
I am really happy with the K95 Gaming RGB keyboard. I wish it did not have the 18 "G-Keys" on the left side. I have no use for them, but it was less expensive than the K95 RGB without them. I like that it is square (no odd shapes) and has a thick anodized aluminum top.
The Gamdias MEK M1 mechanical keyboard that iBUYPOWER included for free with my desktop is also really nice. It also has a metal top plate with a gunmetal finish, but it is only 7-color LED, not customizable RGB. I wish it had configurable RGB. That would make it a lot nicer.
-
Uninstall XTU. Or, at least kill it in the task bar. You cannot have any 3D processes running when you enable/disable K-boost or SLI. It will give you that error every time. Sometimes you will see it if Windows 10 Store app is updating in the background as well.
I say uninstall it simply because there is no reason to use XTU with your amazing ASUS BIOS. XTU is missing too many useful features. XTU does not allow me to adjust BCLK, voltage or memory settings with my ASUS motherboard. It is basically totally worthless. Even the ASUS-branded version of XTU included on the motherboard support CD is totally worthless.
You also have the ASUS AI Suite 3 available, which has way more overclocking features than XTU, including custom fan profiles. -
Poke around in the BIOS and make sure onboard audio did not get disabled. There are settings for that.
Also confirm your speakers are plugged into the green 3.5mm jack in back. -
Been running my Corsair 3200 MHz CL16 LPX RAM at 3466 MHz CL16 for 2 days now without issue. Any suggestions on a good memory benchmark to test this out?
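For a real before/after comparison, a proper tool like AIDA64's cache & memory benchmark (or a stability tester such as HCI MemTest) is the right answer. Purely as an illustration of the idea, here is a stdlib-only Python sketch that gives a ballpark copy-throughput number you could compare between the two memory speeds:

```python
import time

# Rough single-threaded copy-bandwidth sanity check using only the stdlib.
# NOT a substitute for a real memory benchmark; it just produces a ballpark
# number you can compare before and after a RAM overclock.
SIZE = 256 * 1024 * 1024  # 256 MiB buffer
src = bytearray(SIZE)

best = float("inf")
for _ in range(5):
    start = time.perf_counter()
    dst = bytes(src)           # one full copy: reads SIZE and writes SIZE bytes
    elapsed = time.perf_counter() - start
    best = min(best, elapsed)

# 2 * SIZE bytes moved (read + write) per copy; report the fastest run
print(f"~{2 * SIZE / best / 1e9:.1f} GB/s copy bandwidth")
```

A single-threaded copy will not saturate dual-channel DDR4, so treat the number as relative, not absolute.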
-
Be careful with that G19 on eBay: the product description says it is for a G510s, which is not the same keyboard.
-
Huh? I purchased from this seller, and it is a G19s. I have the keyboard! Must be something messed up in his description.
Edit: That is because of eBay's stupid "match what you are selling" description. Half the time you can't find what you are trying to sell. No biggie; they are G19s keyboards, and mine looked brand new when I received it. -
Thanks! Looks like I'm right about where it should be, then. Nice getting well above rated speeds at the stock 1.35 V.
-
-
Update: the Cooler Master H500P arrived today, and I slapped it together real quick. I have to say it is a nice case, and it looks sharp. It has the ability to mount your graphics card sideways, but I have not done that yet. So far there is no temperature difference.
-
Looks great! Congrats!
I have the bracket to mount the GPU vertically. It was included with the View 71 case, but I have not wanted to spend the money on the PCIe x16 riser cable. I'm not sure what benefit, if any, there would be in doing that, and I am not sure if the temps would be better or worse. As it is now, my GPU's back plate gets blasted with a constant flow of air from the 360mm radiator; mounted vertically, it would receive a lot less air circulation. But I think it looks kind of cool mounted vertically. -
The effects of these major software updates are unknown, though early estimates have placed the performance hit at between 5 and 30 percent; newer Intel processors do have features that are said to reduce the slowdown. If these reports are true, Intel is in for a lot of trouble. The performance drop will depend on how much a task depends on kernel access, which is where the slowdown occurs.
https://overclock3d.net/news/cpu_ma...patch_expected_to_have_a_performance_impact/1
Intel messed up so badly that the kernel has to be redesigned on Windows and Linux to fix it! A 5-30% performance drop for anything that needs kernel access. Says a lot!
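As a toy illustration of why the hit is workload-dependent (the fix adds overhead on every user-to-kernel transition), here is a Python sketch contrasting a syscall-heavy loop with a pure-userspace loop. The absolute numbers are meaningless across machines; only the contrast between the two loops is the point, and `os.getpid()` is used on the assumption that it maps to roughly one kernel entry per call:

```python
import os
import time

# The page-table-isolation patch taxes user->kernel transitions, so
# syscall-heavy code pays far more than pure userspace computation.
N = 200_000

start = time.perf_counter()
for _ in range(N):
    os.getpid()            # each call is (roughly) one kernel entry
syscall_time = time.perf_counter() - start

start = time.perf_counter()
total = 0
for i in range(N):
    total += i * i         # pure userspace arithmetic, no kernel involvement
compute_time = time.perf_counter() - start

print(f"syscall loop: {syscall_time:.3f}s, compute loop: {compute_time:.3f}s")
```

Gaming is mostly userspace math on the CPU side, which is why early reports suggested games would be hurt less than I/O- and syscall-heavy server workloads.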
@Mr. Fox @Johnksss @Papusan @hmscott @Rage Set @D2 Ultima @Raiderman @bloodhawk -
Started a discussion thread here:
Intel CPU Bug Cannot Be Fixed With Microcode Update | Cripples Performance Up To 30 Percent
http://forum.notebookreview.com/thr...rmance-up-to-30-percent.812424/#post-10657313
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.
