I'm not so sure about that. Flashy lights + the famous Skull is too tempting to say no to. Their customer base is brand loyal, and that is what matters. Why change what they are good at?
But some have seen the light and figured out that there has been a change.
-
-
@Poppysan | Reddit link to the JimmyToe Failienware video was a hoot ... Leave it to ol' Frank to sell the cheapest on the market & charge the highest for it. Lemme take you back to Jan 2019 F.Azor Reddit AMA (KFury's my nom de guerre)
The percentage of customers that bow out after the basic 1-year warranty helps the pyramid scheme stay afloat, & he knew it, banked on it; EVGA has a 3-year warranty, & a 4-year Dellware warranty isn't free either, it's an add-on of ~$400 (Frank on truth serum: cheap cards are cheap to replace in the long run, easiest to mark up at a premium & make profits on right now). So can you imagine lying to your customers out in the open like that? Spinning a yarn that a junk Alien card that 'crinkles' can double as the best possible solution for shock, vibe & drop testing, that the rock-bottom budget Ti is in fact the very thing that helps meet 'stringent quality specs'? At least the gaslighting & neuro-linguistic programming were transparent ... Ahoy! & good riddance
In other news, the AMD 3950X is coming to the Aurora this month; turns out AMD recommends a 280mm AIO cooler minimum, yet the Aurora only has a slim 120x30 cooler (which dates back to the 6700K but which the 9900K is currently saddled with); the shill media reviews surrounding the junk cooler should be interesting, & I expect the Aliens to blame an 'experimental BIOS!' this go-around based on whatever failures happen
edit: Speak of the Devil ... Tom's review of the Failienware Aurora 3950X: "JimmyToe 2080 Ti & it needs better cooling"
- Verdict: Inadequate cooling, Noisy under load
- Cooling & OvrClk: 120mm AIO for a 16-Core?
- 'We tried overclocking but didn’t get very far'
PCMag: "A dated-looking (though clearly new) yellow & blue motherboard, home to some function-first (ugly) heat spreaders & few bells & whistles. An Alien-branded liquid cooler is installed over the CPU, with its small 120mm radiator connected on the top panel."
Last edited: Nov 14, 2019
-
You have a point. To some people the experience or "journey" is more important than whether or not the product functions well. They value form over function and culture more than core values. Look no further than crApple for the best example. Alienware has carved out their niche for the gamer kids and wannabe enthusiasts that don't mind spending extra for broken garbage as long as it looks fancy. Owning an Alienware has some loose similarities to MSBP (Munchausen Syndrome by Proxy). They need to live in a constant state of crisis and feel like a savior.
A person with MSBP often:
- Has medical (technical) skills or experience.
- Seems devoted to his or her child (laptop).
- Looks for sympathy and attention.
- Needs to feel powerful and in control.
- Does not see his or her behavior as harmful.
-
This is pretty freaken cool! A little on the expensive side, however.
https://www.newegg.com/p/N82E16813157897 -
Wow, yes that is super expensive. Super fancy, too. But it seems like way too much to spend on a mobo if the CPU has almost no overclocking capabilities. Not sure what could be gained by spending so much on it versus an average but high-quality mobo.
-
Nothing really to be gained using an AMD CPU. Was just browsing the MB selection and stumbled onto it.
After being away for a while, and updating my AGESA, I am now able to run stock speed on my Trident Z's. Will see if I can tighten the stock timings.
-
Definitely looks like a well-built mobo. If it can really run the memory at 5200MHz that would be pretty amazing. From what I can see, none of the other ASRock boards claim that capability. I didn't even know there was RAM available that is rated for 5200MHz. That's really crazy (in a good way).
Awesome that the Agesa update allowed your RAM to run correctly... definitely a step in the right direction.
It's really a shame that Ryzen doesn't support meaningful overclocking. If they did there would be nothing stopping them from achieving true greatness in the eyes of overclocking enthusiasts. It kind of ruins everything for the folks that care about that. Not much fun without the opportunity to stomp the snot out of peers that are running the same hardware.
-
I think I am going to get some 4400 RAM to play with. If that works out, I will offer my 4000 RAM for sale. If not, I will return the 4400. Here's the 4000 @ 4200. (Not sure 7960X will do 4400 or not... maybe it will.)
This is kind of meh... 780 Ti needs to be on water to stay cooler so I can push the core clock higher, but it's nice not seeing spastic erratic behavior like Pascal and Turing. Stays locked exactly where I tell it to.
-
-
This really shows how far the performance has evolved in 3 generations of GPU. Ran a Gears 5 Benchmark. Allowed it to set automatically for 780 Ti performance, which is mostly medium and 1080p. For 2080 Ti everything maxed out and 1440P. Still a MASSIVE difference in FPS with 2080 Ti. Crazy.
Attached Files:
-
-
Hi, so I got back to testing and I realized that the 880M is not even being used! The low clocks were not because of 2D usage but because the GPU was not being used at all! GPU 0, meaning my Intel HD, is used exclusively, so do you have any clue as to why the GTX is not used at all?
Unreal, I actually got it working! After months of getting nowhere I came up with an idea: doesn't Windows 10 have some weird setting now that lets you pick which GPU an application uses?
So I tested it on my games, set the GPU to NVIDIA, and voila! It's used perfectly fine and normally!
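For anyone who would rather script that fix than click through Settings > System > Display > Graphics settings, Windows 10 stores the same per-app preference in the registry. A minimal sketch as a .reg fragment (the exe path below is a made-up example; substitute the actual path to your game, and double-check behavior on your own build):

```
Windows Registry Editor Version 5.00

; Same values the Graphics settings page writes.
; GpuPreference=2 = "High performance" (the dGPU), 1 = "Power saving", 0 = "Let Windows decide".
; The value name is the full path to the application's exe (example path is hypothetical).
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Games\\Example\\game.exe"="GpuPreference=2;"
```

After importing, restart the game so it picks up the new preference.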
Finally it works even for GTX 880M users.
-
Ah yes, I had to do this same thing recently on my Dell Precision M4800 with the FirePro M5100 as well.
-
Yeah, they just keep making Windows 10 suckier and suckier. There is hardly anything desirable about it... really crappy product.
Here is the image from your post that wouldn't display correctly.
-
-
Robbo99999 Notebook Prophet
Nice, that was a tempting thought for me, because I game at high refresh rates, so it could help out in CPU/platform-limited scenarios, but I think your sticks are single rank rather than dual rank... and dual rank imparts a big positive effect on performance when you have just 2 RAM sticks like I have. Yeah, and there's also my motherboard; I'm wondering if both my CPU & motherboard would be able to tweak to 4000MHz to offset any single-rank negative effects. Hmm, tempting, but on balance I'll stay with my dual-rank, lower-MHz sticks. Buying from a notebookreview guru like yourself is a smart thing to do, though, especially when we know exactly what those RAM sticks can do!
-
getting better ...
Attached Files:
-
-
Well, I am sending this stuff back. No good for some reason. Slower than my 4000 RAM running at 4000 default XMP with looser timings, and it will not POST with either XMP profile. I can get it to POST at 4000 with higher voltage, but it won't POST at 4200 using the same voltage and settings as my Corsair. Cannot get it to POST above 4000 at all. And using settings that allow it to POST at 4000 disables two of my PCIe slots that have NVMe SSDs in them. Not sure if it is defective or if something with the SPD is incompatible. They are also single rank. (I think that is fairly common for RAM at 4000 and higher.)
I even tried it with two sticks (dual- versus quad-channel) and still won't boot. Seems the Corsair Vengeance LPX is a better product, at least for the X299 Dark.
https://www.newegg.com/patriot-16gb-288-pin-ddr4-sdram/p/N82E16820225144?Item=N82E16820225144
Looking good Brother @makina69
-
rofl, 36.6ns. Good thing AMD is still above 65ns. Gonna need some real good optimization before considering an AMD machine for OC.
-
Yeah, I agree... AMD doesn't really offer anything of keen interest to an overclocker. You get good value and decent enough performance if you just want to run everything stock with BIOS defaults, but there's really no joy in that if you are into overclocking. I really cannot see myself ever purchasing anything from Team Red unless or until that changes dramatically, and it doesn't seem like overclocking is something they care about at this time.
I think you meant to quote @makina69's post. Yes, those are really impressive results he is getting. Crazy good.
-
Back to my Corsair Vengeance LPX with a little bit of tweaking. I actually used the Ryzen DRAM Calculator, like @ajc9988 mentioned, trying it for Intel, and that worked out pretty well. And, considering this is W10, which has poorer memory performance than W7, it is impressive that this beat my best memory read score (which was set using W7) by a small margin.
And, @Robbo99999 will like the lower latency... not bad for meshed quad-channel.
-
Ryzen DRAM calculator for Intel huh? Might also give that a try...
-
This is what it produced. The few settings I could not identify clearly in my BIOS I left set to "Auto" and it seems stable that way. Not remarkably different than what I was using before. More conservative on a couple of settings, but I might have gone too aggressive on some because the more conservative settings seem a little faster. Maybe there was some error correction that needed to occur (slowing things down a bit) with the more aggressive settings.
-
More #1 scores for the 9750H in the BGA turdbook class...
https://hwbot.org/submission/4285452_
https://hwbot.org/submission/4285455_
https://hwbot.org/submission/4285439_
https://hwbot.org/submission/4285440_
-
I think the issue with AMD right now is that they are trying to squeeze out as much performance as they can from the factory (kind of like a slight OC from the factory already), which is why we are not seeing much OC headroom on those chips. At the same time, they still can't match Intel CPUs in frequency. I can hit 4.7GHz on my 3920XM from 2012; AMD CPUs are now barely getting to those frequencies on the new chips.
Here's my BGA turdbook lol
5,578 Single Core Avg
29,257 Multi-Core Avg
https://browser.geekbench.com/v4/cpu/14953084
https://browser.geekbench.com/v4/cpu/14953074
https://browser.geekbench.com/v4/cpu/14953063
While monitoring with Intel Power Gadget, it didn't seem like the benchmark was properly taking advantage of the CPU. I'll try Geekbench 5 next.
-
-
Tweaked the RAM a little more and gave it a little more juice for 1T and got 4th place worldwide DDR4 read speed...
https://hwbot.org/submission/4285877_mr._fox_aida64___memory_read_ddr4_sdram_132324_mbs and @Robbo99999 will like the latency a little better, too
@ajc9988
Also ran this... not too shabby for 5.0GHz @ 102 BCLK (5.1 effective).
-
That could be, but if they are already pushing them to near their functional limits, then that is not a particularly positive statement about the quality of their silicon or architecture. By that I mean the clock speeds are not very high. And that raises the question of what they would run like if they weren't doing that. How high (or low) would their clock speeds actually be, and how would they perform that way? I have heard this theory a number of times. It might be spot-on accurate, but I am not entirely sure how to digest or interpret it if it is. It doesn't give me any warm fuzzies, and the idea is more discouraging than exciting.
-
For that, there are a couple of GN videos showing scaling with cold. It may not be the overclocking you want, but those show it scales similarly to NVIDIA's boost. Unfortunately, that means for competitive overclocking on AMD CPUs, the primary performance factors will be 1) the silicon lottery (and having to buy top bins) and 2) cooling (from chillers to mods, etc.). Tweaking and optimizing settings does help, but I'm unsure that they have fully given the necessary controls to allow optimizations related to boost (which is why all-core is still king of OC on Ryzen).
Hopefully, they move from their pre-selected cores to allowing a user to bench each core and select their own voltages and speeds for the core they determine to be best, etc., at some point in the future.
But it is getting closer to being like OCing GPUs (which I personally am less adept at), with a lot riding on the lottery. Glad that recommendation worked. I know a couple of settings need to be set slightly differently between Intel and AMD for optimizations, but I also know any little bit helps. -
Well, maybe once they've gotten their product deeply embedded in the market they can take a bit of a breather, then shift gears and focus more seriously on the tuning expectations of overclocking enthusiasts while Intel is playing catch up on what they missed while their pants were down and they were not paying attention.
I hope they do not follow NVIDIA's lead with gimmicky crap like "boost" and similar nonsense. NVIDIA's flavor of overclocking is pretty nasty, not to mention mediocre. Pert near worthless if you're not willing to break out a soldering iron to fix their messes. That really sucks. -
Yeah, I'm doubting that will happen, unfortunately. I'm betting Intel introduces a similar boost algorithm soon, if I'm being honest. It's sad seeing the hobby under so much attack when tech is better than ever.
Let's see if Intel gets their 7nm process in good shape by 2021-22. If they do, the IPC increases with it will be astounding (estimated in the 40-50% range). If the rumors on Zen 3 are true and Zen 4 is comparable in IPC gain, AMD will have roughly equal IPC gains over Zen/Zen+. Getting 50% IPC in a 4-5 year span across the industry, by all parties, is great performance growth, ignoring frequency for a moment. We don't know if Intel solved the 10nm frequency problem, or if TSMC or Intel will have frequency issues on 5nm and 7nm, respectively.
Where I see things going is Raja having boost on Xe graphics, with Intel using that research to do similar on their CPUs. NVIDIA will introduce multi-GPU cards around 2021/22, which Raja is already making for Intel. AMD will likely use the I/O die to mask the NUMA for the GPU to bring multi-die to gaming, but they have said they could already do it for commercial. And VIA, although it has an x86 license, is still around the 14/16nm node.
But with all that, it spells a bad time for overclocking moving forward. If you just want to pop in a chip/card and get performance, there has never been a better time in history, especially with the ASIC integration coming for CPUs and GPUs. But I am sadly coming to the realization that software is a bottleneck, and unless I want to learn HVAC or go LN2, I may slowly dwindle off on my hobby. With that said, I will still optimize the heck out of my hardware (and post results, of course).
-
I am starting to lean more in that direction as well. It's hard to get excited about what seems to be the trend. It always sucks when the leaders and brain trust of any industry embrace an inferior agenda. The only satisfaction left for the resistance in that scenario is the smug adoration of their own superiority.
But yeah... it's going to eventually become pointless and an exercise in futility if things continue on their current path. And, if that is what ultimately happens I will not waste any more money buying high end cookie-cutter garbage just because it is more powerful. I'll just buy whatever is adequate to perform a specific task at the lowest cost possible and find an alternative hobby more worthy of my excitement and passion.
If it becomes too difficult to distinguish one's results from those that are achieved by click-and-run jockeys, there's no longer any joy in owning the best as far as I am concerned. But, I will wait and see where it ultimately leads before rendering a verdict. If there is any chance that overclocking awesomeness will endure, it certainly won't hurt my feelings any.
-
That's why I am wanting to learn video editing, networking, etc. Some of the high-end equipment is now at a point where render servers, large NAS configurations, and high-speed networking are coming to the home. CAT8 cables support up to 40Gbps for 30m runs, meaning a well-planned home network may be able to utilize those speeds for much of the house (or at least a server-cabinet-to-office scenario and a couple of other rooms). And CAT8 is backwards compatible and does 10Gbps for 100m runs.
So building a pro-grade, server-based network for streaming owned content, etc., has never been easier. Still learning and troubleshooting; it's just an adjacent hobby to OCing.
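To put those link speeds in perspective, here is a back-of-the-envelope sketch. The 90% protocol-efficiency figure is just an assumption for illustration, not a measured number; real throughput depends on NICs, switches, and disks:

```python
def transfer_seconds(size_gib: float, link_gbps: float, efficiency: float = 0.9) -> float:
    """Approximate wall-clock seconds to move a file over a network link.

    `efficiency` is an assumed fraction of the raw line rate left after
    protocol overhead (a rough guess, not a spec value).
    """
    bits = size_gib * 1024**3 * 8          # GiB -> bits
    return bits / (link_gbps * 1e9 * efficiency)

# Moving a 50 GiB video project between rooms:
for gbps in (1, 10, 40):
    print(f"{gbps:>2} Gbps: ~{transfer_seconds(50, gbps):.0f} s")
```

At 40Gbps that 50 GiB lands in about 12 seconds, versus roughly 8 minutes on gigabit, which is why the CAT8 runs are tempting for a render-server setup.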
Or picking up calibration gear for stereos and displays (my other hobby).
People will always need something to do. If they take away the overclocking, we might as well focus on things similar but that increase enjoyment of other regular things we do. Warning on display calibration: once you do it (especially with a spectrometer and a colorimeter), you may not like watching anything on an uncalibrated display.
-
I used to be primarily into PC gaming before I got addicted to the joys of overclocking. I've nearly lost interest in gaming over the years because overclocking is so much more fun. That (gaming) would most likely be where I would end up going as an alternative, unless I just abandoned the idea of doing anything for fun that involves computers. I'm still holding out in hopes that it doesn't come to that and good old fashioned overclocking mayhem endures the test of time. I guess I will cross that bridge when I get to it. Don't really like the idea of going back to gaming if I can avoid it. Too much of a time-waster to get hooked on that again.
If using a controller didn't suck so bad compared to keyboard and mouse I'd probably just keep a cheap laptop for work (like I already have done) and switch to an Xbox for gaming. I just loathe using a controller, though, and despite my best efforts to embrace them, I am having a hard time hating them any less than when I started trying years ago. I've got an Xbox Elite Wireless controller with the little keyboard. Cool device and really well made, but good Lord, it still sucks trying to play games using anything less than a keyboard and mouse. That being the case, I had to get a slightly better laptop than what I needed for work. Doesn't take a lot of money or horsepower to have fun playing games (if you don't count getting gouged on the price of games for a console or new releases on PC... but I don't normally pay new-release prices).
Thanks for the advice on display calibration. I will likely steer clear of it because I do know that ignorance is bliss. Can't miss something you don't know about. Probably the safest bet for me, LOL. The last thing I need is to get addicted to that, too.
-
Sorry, but I have to call BS on this, no two ways about it.
What you're saying, in essence, is that you want AMD to deliberately detune their chips so that you can have some fun figuring out how to get that performance back, without having to literally get your hands dirty. Put another way, you want everyone else to get less than the best possible performance, right? And yes, I do expect Intel to do the same thing. AMD's just plain holding their feet to the fire, and that's a good thing. I don't think Intel's going to have a hammerlock on that 16-ish-core, high-memory-and-I/O segment for very long, but I'm glad to see Intel being forced to slash its prices to compete. It's about time.
AMD isn't imposing frequency/multiplier locking on any of their chips, even the Ryzen 5's. You can play with frequencies, multipliers, and voltages to your heart's content. The fact that you're not getting a lot of return on that work doesn't mean that AMD's producing an inferior product. Quite the contrary, in fact: it indicates that they're producing the best products that they can and not hobbling their customers, the vast majority of whom don't want to get fancy.
You can overclock Ryzen 3000 chips, you know. The fact that someone got in excess of 5 GHz on a 3950X all cores indicates that it can be done. Sure, it took liquid nitrogen to do it. But you can get there, if you're willing to work hard enough at it. You can't do that with locked Intel chips, you realize. Intel puts locks on those chips to prevent you from overclocking, no matter how hard you want to work at it. AMD doesn't. Want a faster CPU or GPU? Be a sport, and break out the soldering iron and liquid N2 chiller. If you want an avocation where you don't risk burning or freezing your hands, take up something else, like programming. There are endless open source projects out there that would love to have someone do some serious performance tuning on them (and trust me, there's more than enough need for that!). As a project lead myself, and a contributor on other projects, I can tell you that first hand.
And this whole clock frequency mishegoss is the same thing. The top line chip frequency means nothing without taking into account architecture, memory, and all that. And I don't just mean IPC. IPC isn't very meaningful without a lot of context, including what those instructions are doing. For software that can utilize AVX512, Intel can blow AMD out of the water. Everything else -- isn't it interesting how AMD wins just about every benchmark other than AVX512? Because whatever they're doing with clock frequency, they're doing the system as a whole right. And that's what it comes down to -- system performance.
If anyone here bought a Ryzen 3900x and is disappointed that they can't overclock it, and the chip works at the advertised speed, I'm willing to pay $400 right now for it, where it will enjoy a much more appreciated home. But don't wait too long, because Intel's counterattacking and will force down the Zen 2 prices if AMD doesn't get there first with Zen 3 and I won't be interested in buying a used chip when I can buy a new one for the same money. -
Ryzen's overclocking limitation is more architectural than anything. Let's be clear on that. If you really want to debate that, there are many people on Real World Tech that would be happy to oblige. Before I even got into overclocking, I was (and still am) a real geek about the inner workings of CPUs, GPUs, and semiconductors. Intel only started selling unlocked procs because the market found ways to extract more (free) performance from supposedly locked procs. Intel artificially limited their procs for market segmentation, and we fought back. It is true that overclocking is much easier and far safer now than in its origins, but the folks in this thread desire pushing their hardware beyond normal means. However, that doesn't mean we should be limited by exotic cooling or an architecture that is maxed out.
EDIT: With all of that said, I am happy that AMD delivered a knockout to Intel. I am strongly considering replacing my 1950X with the 3960X and new mobo.
-
^^^Yeah. ^^^What he said.
Brother @rlk you can call BS on it all you want to, but doing so reflects a lack of understanding of the sport and the people that enjoy it. Manufacturers giving consumers the best performance possible with conventional cooling is not the same as providing robust hardware that can do more with better cooling and more tuning settings available in the firmware. The problem is not that stock performance is better than it ever has been. That's great for consumers. What sucks is hardware that may as well be locked because it responds very little to increases in power limits, voltage and cooling. The trouble with AMD is not that their stuff doesn't run well, it's that it doesn't respond to extraordinary conditions outside of the norm which require knowledge, trial and error, and special cooling. That is what overclocking is all about, and ordinary consumers can't and shouldn't get the same performance as the people that are willing to take the time and trouble to do greater things that are outside of the norm. And, they wouldn't appreciate it if they could. I would label that a "socialist computing" mentality and anything that smells like that really sucks. It's not a benefit to consumers, it is a limitation to enthusiasts. Consumers don't care and don't need more to do what they want to do, so that is just hollow rhetoric, bro. It's OK to be an AMD fanboy. Nothing wrong with that. Just don't make lame excuses for their inability to respond favorably to overclocking efforts.
That is the lack of understanding that I am talking about. Nobody wants them to detune anything. What overclockers want is for AMD CPUs and GPUs to scale well with more advanced tuning and thermal solutions just like AMD CPUs used to do and the way Intel CPUs still do today. Sadly, they don't. They do a little bit, but not very much. You'd not get very far with overclocking any CPU or GPU if you try to overclock the snot out of it without knowing what you're doing and taking special (non-consumer level) steps to keep it cool.
-
I don't disagree that it's architectural. What I disagree with is that it represents an architectural flaw: I think it represents an architectural strength, in that the chip and its accompanying firmware are architected to always try to deliver the best performance commensurate with environmental conditions. Looked at another way, AMD tries to overclock the chip as much as possible itself. That inevitably leaves less room for others to find more without resorting to something more exotic.
The fact that you're actually happy about the new chips speaks to that. -
I think everybody is happy about how they run stock. Who wouldn't be? You can call it a flaw or something else. Whatever you choose to call it, it is clearly a functional limitation that is not remedied to an appreciable extent through advanced tuning or cooling. Helps a little bit, but nothing worth writing home about. Not scaling clocks and performance well when conditions are more favorable is not what I would call a strength. I'd say that is an Achilles heel. They're not "factory overclocking" anything. They're optimized for normal conditions and don't respond in a fantastic way under extraordinary conditions. That's awesome for the masses, but it's a bum deal for overclockers that would prefer to use AMD if they could, but there's no point in bothering.
-
There, in a nutshell, is where I call BS. You're saying outright that "ordinary consumers" should not be able to achieve the same performance as people who want to do more with it.
There are two ways that could be true:
- AMD locks down the frequency. If AMD did that, I wouldn't have a problem with your saying that you want to do more. That's what Intel did (and still does except for the K- and X-series processors).
- AMD programs the firmware to always try to deliver the best possible performance under the ambient conditions, allows you to try to do better, but simply doesn't leave you a lot of room to do so without resorting to exotica. That's where I call foul on you. The problem isn't that AMD is limiting your performance, it's that they're boosting the performance of everyone else.
-
Yeah, a good calibration can take 2-8 hours, depending on the display and the method being used. Then any firmware update for the set can require another calibration. For equipment: for a decent colorimeter (get the OEM variant of the i1 Display Pro, which goes up to 2000 nits) you will pay around $200-300. Pick up a used i1 Pro 2 for $200-600 (new costs $1200+); the i1 Pro 3 was announced, so i1 Pro 2s should be sold off by places doing print and display work for commercial purposes, or fabric color matching. That is all it takes. You can use the freeware HCFR, or pay $140 for the consumer CalMAN software, which is not a bad buy and has a built-in Dolby Vision pattern generator.
With that, you just try to optimize the gray scale to as low a Delta E from the standard as possible, then move around the saturation, hue, and luminance levels of the primary colors, and sometimes the secondary colors as well. And because you're doing it at home, you don't have to compromise on optimizing like a person you pay to calibrate for you, meaning you can get to where it is the best your equipment can achieve.
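For anyone new to this, the "as low a Delta E as possible" target can be made concrete. A minimal sketch using the simplest formula (CIE76), which is just Euclidean distance between two L*a*b* readings; the measured/target numbers below are invented for illustration, and modern calibration tools typically use the more elaborate CIEDE2000:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two L*a*b* triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical 50% grey patch: meter reading vs. the target white point.
measured = (52.1, 0.8, -1.2)
target = (50.0, 0.0, 0.0)
print(round(delta_e_cie76(measured, target), 2))  # 2.55
```

A Delta E around 2-3 is commonly cited as roughly "just noticeable"; calibration is the process of nudging the display's controls until numbers like this drop as close to zero as the panel allows.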
Now, reference equipment (going full professional here) costs around $6,900 for a Klein K10-A reference colorimeter, then around $7,700 for a CR-250 spectroradiometer, followed by $1000-2500 for a good pattern generator and around $2000 for the software to run that professional equipment. But that is at the level of what true professionals with certifications use, what broadcast studios use, etc. At the highest end, the top spectros cost $16,000+ for the CR-300 and the Konica Minolta CA-2000, which can calibrate laser projectors and have the smallest light-wavelength differentiation.
But that should give an idea. For speakers, use REW software and pick up an SPL sound meter and a calibration microphone from Cross-Spectrum, which calibrates those instruments against a NIST-traceable microphone. That is about a $250 buy-in.
So for about $1500-2500, you can do all of your calibrations for setting up stereo and audio equipment, sinking hours into optimizing it. Plus, once you show how it performs after, the wife might be cool with it. Now I'm not responsible for the cost of buying better A/V equipment after you start, though.
For doing computer displays, I'd just use displaycal which is a free 3D LUT generator, create the .ICC/.ICM for windows (and load that into your browser), then translate that into a file for madVR plugin for MPC-BE for media (I think Kodi can use it also), and finally a reshade LUT which can also be used with filters for gaming.
It is an addiction and if you like things to be perfect, you will love it. Plus there is a learning curve, meaning that until you learn enough about how to calibrate, each time you do it you will find new ways to get it better. The art comes in after that when balancing the trade offs between the optimized pretty charts and other aspects of image fidelity.
But you'll have months between having to pull out the equipment to do it again.
Then you also get to pull out your old movies and watch them as new. And with HEVC content for HDR, you get to calibrate for SDR, HLG, HDR-10, and Dolby Vision for each newer TV, so not just a single calibration for the device.
But that is my addiction (I need to do it again on one of my TVs).
And I prefer a controller. Can't play competitively, so I focus on campaigns. Just haven't done keyboard and mouse since high school and get owned either way, so...
-
It's actually very much possible to overclock Ryzen processors, it's just that AMD does as much of that already as they safely can, leaving less on the table for anyone else. AMD itself describes PBO as "an opportunistic automated overclocking mechanism found in various AMD processors that pushes the system power budget beyond its rated specifications in order to allow Precision Boost to act more aggressively and achieve higher performance". See that? Right there, it's overclocking out of the box. Doing the same thing manual overclockers do -- adjust voltage and frequency to gain performance. The difference is that it can do so on the fly and respond dynamically to ambient conditions.
I don't know about you, but I think clocking up a 3.5/4.7 GHz processor to 5.3 GHz all core is doing pretty nicely. Sure, it took liquid nitrogen to do that, but that's all part of the challenge, right? -
No, actually I would consider that pretty lame and functionally limited. If it takes LN2 to go from 4.7 to 5.3GHz it's not worth it. At least not to me it's not. Challenging yes. Return on the investment, not so much. If you'd like to think it is because they left less on the table for overclockers rather than a functional limitation, that's fine. It's always nice to have a rationale available in case someone asks for an explanation. It's all the same in the end no matter how you want to look at it. Kind of like glass is half-full or half-empty. Has the same amount of water in it either way.
It's OK bro. I'll just let it go. I'm fine if you want to call BS on it, with the understanding that you're of a different mindset and can't relate well to people with a different perspective. If putting a Zen CPU on my water chiller and tuning it only yields, at best, a 10% overclock, that's just not good enough for me. And, I won't want one or support the brand because of that. Too boring as far as I am concerned.
I am having a lot of "stupid" fun playing COD on Xbox and PC... with a controller AND keyboard/mouse combo depending on who I am playing with. While the keyboard/mouse players have an advantage, I have been beaten quite badly by some controller users when using my keyboard/mouse, and I would consider myself in the top 25% in competitive shooters.
-
You mean with more cores? Or doesn't core counts matter?
Did I miss something?
-
You'd have fun against some of my friends that were competitive Halo players years ago. Those cats could do stuff with a controller I couldn't imagine. Compared to them, when I'd play with them, they could get pinned down because I wasn't moving fast enough (in the games that allow friendly fire, there were times they would kill me just to get past tougher areas quickly without dying). Lol
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.