Beast! 38ns is right up there with some of the best 9th-10th gen latency, and I'm sure you can tune lower with time. And you have the IPC advantage. The loss of 2 cores doesn't really matter; the difference is tiny, and the single-core performance is what matters for gaming. Unfortunately most reviewers just took it out of the box and hit go. Can't wait to see how bad they make Alder Lake out to be with its "slow gears". Then when we get our hands on it we will tune it into a proper beast.
-
electrosoft Perpetualist Matrixist
The EVGA Z590 Dark is designed and tweaked for Gear 1. Now we're seeing a truly great board show what Gear 1 can do even in the early stages. I'm sure you might be able to drop that even more.
As you've found out @tps3443 , 11900k is a good chip and has great performance as long as an 8/16 architecture will fill your needs.
I've been singing the praises of the frankenchip for quite a while (look back to when I first started testing the 11700k), as has @Talon, and I declared it the winner vs the 5800x months ago. I still stand by that assessment... as long as you have the VRMs and cooling, of course.
SierraFan07, tps3443, Mr. Fox and 1 other person like this. -
The wording is taken directly from Silicon Lottery... "(the 11900K is essentially a binned 11700K, so with the 11900K we’re binning what has already been fairly heavily binned)".
11700K is an i7
Beat or chew on that one, LOOL
I remember when Intel added the i9 moniker for their 8th gen 6-core high-end mobile BGA. I became darn disgusted and said so loudly.
Yep, Intel followed the same recipe as with the BGA chips, but it should have been branded as an i7. A higher model number like 11900K can show the difference well enough. I have no problem paying the premium for the better bin, but name it for what it is.
Edit. Have fun with your new HW brother
Last edited: Sep 29, 2021
Clamibot, electrosoft, Ashtrix and 2 others like this. -
Frequency determines performance, and performance determines which cheesy moniker they get awarded! Lol.
The 11700K's IMC only natively supports 2933MHz at the 1:1 ratio; 3200MHz is a Gear 2 option on the 11700K. The memory controller is weaker, and the silicon is worse. Intel also removes Thermal Velocity Boost on the 11700K. So it's an i7 lol.
I’ll take the better silicon and better IMC!
As for the i9 moniker, I say the 11900K qualifies. This is a very good indication of what future Intel chips may offer.
I know they're close in performance. But IMC and silicon quality are very important here for 11th gen. The 11700K is still a great chip for the money.
electrosoft and Papusan like this. -
Yep, true, but to be honest the PC DIY space is the ONLY segment that doesn't have the insane level of paid shilling the mobile garbage does.
These are the reviewers I personally rely on; always use multiple sources.
For mainstream coverage: GamersNexus and Hardware Unboxed (HU has some AMD lean for both GPUs and CPUs, so note that, especially after the Nvidia feud they had). Both do bone-stock PL1/PL2 tests with gaming and a broader benchmark suite; same for AMD AFAIK. HU also does nice monitor reviews. OptimumTech (Ali) also includes OC results in his reviews, not just the crappy bone-stock config.
For OC: Luumi for in-depth guides, solid content as a whole; Buildzoid for motherboards, memory kits and a ton of other content; der8auer for some interesting stuff.
For written content: TechPowerUp is the usual standard, OC and stock etc., and Guru3D for nice written content with charts. AnandTech as well for their technical overviews and some in-depth architecture details.
But AnandTech still does some BS, especially in their Apple coverage, be it phones or the new M1: too much reliance on SPEC scores. Their Apple Mac Mini M1 review was total biased trash too. They compared ST performance on the M1 vs the 10900K vs the 5950X, but when it came to SMT/HT they deliberately left those out entirely, and guess what they put in instead? Low-voltage garbage Ryzen and Intel processors: Intel's first experimental trash silicon from 10nmSF (SuperFin) at 28W TDP max, and the Zen 2 based Ryzen 4800U, another power-starved x86 part, probably because they sit in the same wattage bracket. But if they wanted to avoid a high-power-consumption socket / processor, they shouldn't have included the real desktop non-BGA processors at all. Every single PC DIY guy knows x86 processors scale linearly in multicore with Hyperthreading / SMT and power consumption, thanks to high clock speeds and more complex I/O and cache.
Here are the ST benches and SMT benches from that AnandTech review.
Rage Set, Clamibot, Mr. Fox and 1 other person like this. -
Hehe. Intel added a couple of extra features to the 6-core i9-8950HK over the i7-8750H, but that shouldn't have earned it the i9 tag they throw on the BGA junk. Sorry, but this doesn't fit in my book. Not that I'm saying the 11900K is a bad product
Last edited: Sep 29, 2021 -
The whole i9 thing was ruined with the 8950HK. Such an overpriced piece of crap CPU lol. Not a bad CPU I guess… but they were so expensive it was insane.
i9 was HEDT branding, and mainstream took it over for some reason. Now it's kinda worn out lol.
I guess if you did have an 8950HK though, you could own any i9 from the time. Those things were like $4,000-$5,000+. That's a costly BGA laptop. -
Because they knew the dumb-dumbs would buy it based on nothing but the i9 branding. It didn't need to be any good because, dammit, it's an i9. LoL.
Rage Set, Papusan, tps3443 and 1 other person like this.
-
It's really hard to find any benchmarks that highlight the real performance difference of the i9 11900K in games versus the i9 10900K.
It seems like everyone is either trying to make the 11900K look really bad, or they don't know what they're doing. The benchmarks I'm finding are testing both CPUs in GPU bound scenarios, so of course there won't really be a difference.
For those of you who actually have the 11900K, what is the real performance uplift under CPU bound scenarios? I'm a high framerate gamer, so I'm more likely to run into CPU bottlenecks than GPU bottlenecks (provided I have a powerful enough GPU). -
electrosoft Perpetualist Matrixist
-
Haha. You're right.
I run my 3090 Kingpin HC on the 520 watt BIOS daily. I usually run around 2,160-2,175 on the core and 23Gbps on the memory.
^With that being said, I'm almost always CPU limited. I was going into an 11900K with low expectations. I read some good stuff from a few enthusiasts who own them and tweaked them a little, so I figured for actual, realistic gaming usage this is probably the fastest and cheapest option available right now, and it has really proven to be exactly that. I could feel the speed difference immediately on the 11900K. Now, I have been on Intel 14nm for about 5 years straight, and all of those chips had very similar R15 single-threaded numbers: 6600K desktop @5.12GHz, 7820HK laptop @4.7GHz, 8086K laptop @5.0GHz, 8086K desktop @5.3GHz, another 8086K cpu [email protected], 7980XE desktop @5.0GHz, 10900K [email protected]. All of these processors performed similarly in R15/R20 single-threaded. Same CPUs with slightly different flair to them lol.
Intel finally improved their IPC, and I could tell! Especially in CPU-limited situations you'll see a drastic difference in frame rate.
Most of the reviews do show GPU-limited situations. Except for the Flight Simulator review: it shows the 11900K managing a solid 20%+ lead over a 10900K, stock for stock. That game only uses a few cores (like a lot of games do), and it gives you an idea of what is possible.
Anyways, now that I'm on this new Z590 Dark with the 11900K, I have increased my DDR4 memory bandwidth by +35%, and I have also reduced my DDR4 memory latency down to 36ns. This thing is wide open in games now. I was happy before, but now I'm just kinda in absolute disbelief that this chip is so poorly regarded by so many reputable reviewers and people.
I run a 2560x1440 165Hz monitor, and this thing is pushing the hell out of my overclocked 3090 KP, which is anywhere between 13-19% faster than a box-stock 3090 FE. That just makes the CPU bottleneck even worse, as the GPU usage number is even lower than with a standard clock-dropping 3090 FE…
I'm happy with it though, I really am; a lot of reviews give the chip a bad rap. Not sure what all the bad memory latency stuff is about either. That's certainly not the case at all.
These chips also received a microcode update after all of the initial launch-day reviews came out, and that microcode apparently boosted performance even further. My IMC is average (3733MHz Gear 1 stable), and at this point I couldn't care less; I don't see how faster memory would even help me anymore. This is a lil rocket chip, aka Rocket Lake.
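(For anyone who wants to sanity-check that +35% bandwidth figure: the theoretical peak for dual-channel DDR4 is simple arithmetic. A minimal Python sketch, purely illustrative; AIDA64 reads typically land somewhat below the theoretical peak.)
```python
# Theoretical peak bandwidth for dual-channel DDR4:
# transfer rate (MT/s) x 8 bytes per transfer per channel x channels.
def ddr4_peak_bandwidth_gbps(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000  # GB/s

for speed in (2933, 3733, 4800):
    print(f"DDR4-{speed}: {ddr4_peak_bandwidth_gbps(speed):.1f} GB/s peak")
# DDR4-3733 -> ~59.7 GB/s and DDR4-4800 -> ~76.8 GB/s, which lines up with
# the 75GB/s+ AIDA64 readings mentioned later in the thread.
```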
Clamibot, electrosoft, Mr. Fox and 1 other person like this. -
I looked up some info about this fantastic sweet little graphics card. But it doesn't look so promising for gaming
I can't find it in the 3DMark search. Maybe because it's a rare card for gamers?
Last edited: Sep 30, 2021
Rage Set, Clamibot, electrosoft and 1 other person like this. -
I will be surprised if there is a listing for it on HWBOT.
Rage Set, Clamibot, electrosoft and 1 other person like this.
-
It's on the bot https://hwbot.org/hardware/videocard/geforce_gt_705/
The GT 705 - EVEN WORSE Than a GT 710
Could be worse. The GT 605 OEM was the last one with 512MB of VRAM
https://www.techpowerup.com/gpu-specs/geforce-605-oem.c359
Last edited: Sep 30, 2021
Rage Set, Clamibot, electrosoft and 2 others like this. -
OK then. I'm surprised, LOL. They must have added it to the database back in the day when they still had admins that were actually engaged and paid attention to things that matter.
I had this greeting on my crash dummy OS this morning. So, I took time to respond.
-
electrosoft Perpetualist Matrixist
EVGA has turned off their queuing system, as demand has probably overwhelmed even them into the land of "most likely not going to deliver for the next year."
-
Maybe they'll start a new queue when the Super cards come
Not a good idea to continue the old queue for the 1st gen Ampere. It would become a mess
Best Buy to Sell Nvidia RTX 3000 GPUs in Stores on Friday pcmag.com
https://www.pcmag.com/news/best-buy-to-sell-nvidia-rtx-3000-gpus-in-stores
-------------------------------------------------------------
New Reports indicate Amazons New World-MMO is again bricking NVIDIA Geforce cards guru3d.com : 09/30/2021 09:28 AM
Amazon's long-awaited massively multiplayer online game New World was released on Tuesday. We may call the game a resounding success because, according to the Steam database, it has already surpassed the maximum of 700,000 concurrent users. It's unfortunate that for some players, the experience has been fraught with difficulties.
Intel back on track with more normal SKU naming (more phone cores)
https://hothardware.com/news/lenovo-teases-legion-9000k-alder-lake-desktop-pc
------------------------------------------------------------------------
VBS IN WINDOWS 11 - UL Benchmarks indicates a loss of performance
http://forum.notebookreview.com/threads/windows-11.836230/page-66#post-11121081
Last edited: Sep 30, 2021 -
Thank you for this information. This is exactly what I was looking for.
The discussion about bad latency regarded the increased inter-core latency compared to previous generations. I don't know how much it actually affected performance though. Any increase in latency is bad, but if the IPC gains are enough to mask it and provide an additional performance uplift over it, then it's not a big deal.
From your own experience, there was in fact a significant performance increase, therefore the bashing against the 11900K is unwarranted. It does exactly what it was made to do: game. This is not a productivity CPU like it seems everyone wants it to be. Even the 10900K isn't really a productivity CPU, but it can be used as one. I'm starting to regret purchasing a 10900K, but I can't use an 11900K in my X170SM-G yet anyway. I listened too much to the negative voices this time around.
You could also probably get additional framerate gains by using AMD GPUs from now on. Nvidia's drivers have an overhead that significantly decreases performance in CPU bound scenarios. Nvidia's GPUs are better for higher resolutions since they're more powerful in an absolute sense, but AMD GPUs are better for higher framerates in CPU bound scenarios. -
The 11900K lost stock-to-stock vs the 10900K in GN's and HU's gaming and production suites. Everyone else probably did the same thing and blasted the CPU, which got famous everywhere and made Ryzen out as the best, yet nobody ever talks about Ryzen's WHEA problems at release, the CPU RMAs, and how the boost clocks were downgraded over time with AGESA BIOS updates. The CPU RMAs were shocking to me, unacceptable in my book; those are apparently fixed now, but the USB dropouts remain.
Coming back: at TPU the 11900K is on par with CML in gaming (only when run with proper Gear 1 memory and fully unlocked power limits), but it never beats the 10900K even OCed to 5.0GHz, and it consumes insane power. The 10850K at TPU maxes out at 300W; the 11900K with ABT and a full unlock goes over 350W, even 400W. Next, Ali's video review with unlocked power limits: the 11900K is equal to the 10900K / 5900X and sometimes trades blows in gaming, then in production it gets beaten by CML, and AMD crushes both as expected.
As for the inter-core latencies, which affect the IMC on the 11900K heavily: in the OCN 11900K thread everyone abandoned Gear 1 and started posting only Gear 2 results, with benching being their main focus, so that makes sense. This was on the ASUS Apex XIII and Maximus XIII, which were popular there; at the time the EVGA Z590 DARK wasn't even out. Then once the Z590 DARK did arrive, there were still many improvements to be made. Note that by this time the Apex was already leading, because so many users got the ASUS board versus EVGA's, which was very late; the Apex had that lead time. The reason I mention this is that EVGA Z590 DARK owners still have issues that need ironing out, meaning the BIOS needs updates. Some of them can't even run the DRAM at 3600MHz Gear 1, some can, and some gave up and moved to Gear 2 on the EVGA forums; a few folks tried many kits as well, you can check for yourself. EVGA also really needs to tune the Z590 BIOS for 10th gen like all the other companies have.
Gear 1 is definitely helpful for games and preferred. As for the RKL die, it's bigger than CML, so it dissipates heat faster, but it puts out a ton of heat too. So when OCing a 10900K vs an 11900K, the RKL chip will output more heat for sure; it needs a beefy cooler.
I do not think the 10900K is an inferior product at all. It's definitely class leading in some games, and in others the 11900K pushes ahead or just matches the 10900K thanks to the IPC gain. The only real inferiority of the 10900K is the lack of PCIe 4.0, which only matters if you play old games at super high frame rates at 1080P, since high fps needs the PCIe 4.0 bandwidth. AND once you run an NVMe SSD off the CPU lanes on a 10900K, the GPU's PCIe link splits to x8 at 3.0 speeds. A bonus for 11th gen, though it only gets a single PCIe 4.0 SSD vs AMD X570.
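(If anyone wants to sanity-check that link-width tradeoff, per-lane PCIe bandwidth is easy to approximate. A minimal Python sketch; one direction, after 128b/130b encoding overhead, protocol overhead ignored.)
```python
# Approximate usable bandwidth per PCIe lane (GB/s, one direction):
# raw GT/s x 128/130 encoding efficiency / 8 bits per byte.
GBPS_PER_LANE = {3: 8 * (128 / 130) / 8, 4: 16 * (128 / 130) / 8}

def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    return GBPS_PER_LANE[gen] * lanes

print(f"PCIe 3.0 x16: {link_bandwidth_gbps(3, 16):.1f} GB/s")  # ~15.8
print(f"PCIe 3.0 x8 : {link_bandwidth_gbps(3, 8):.1f} GB/s")   # ~7.9 after the split
print(f"PCIe 4.0 x16: {link_bandwidth_gbps(4, 16):.1f} GB/s")  # ~31.5
```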
As for the Clevo laptops and memory, I can't speak much, but the DRAM on those laptops will be limited on 10th gen anyway due to the SODIMM sticks; we can't run 4000MHz+ C15 speeds there, unfortunately. On 11th gen I think you can run Gear 1 3200MHz at low latency. I think it will be a side grade for you from the 10900K to the 11900K.
For SMT-heavy productivity workloads there's no contest: the 11900K loses out to the 10900K due to the loss of physical cores, and to AMD as usual. For a flagship product there needs to be some multitasking grunt, to be honest. At least for me, because it's a dead-end socket and I have a lot of REMUX files I need to run MKVToolNix on to remove the extra audio tracks. For that, a fast multicore processor is needed. Plus you can run more programs easily in the background too.
RPCS3 is where the 11900K trumps everything; on that PS3 emulator only Intel parts run fast, perhaps due to the monolithic design and none of the latency issues of Ryzen, where the I/O die plays a central role. And notice: despite the massive IPC increase on the AMD parts, they cannot beat Intel in that emulator.
The major sour taste probably came from two things: the loss of physical cores (binning an i7 into an i9 vs the 10900K) and the high power consumption / heat from the 14nm process node backport. If they had made it on the new 10nmESF / Intel 7 like ADL, I think they could have gotten the power consumption lower and maybe even added an extra 2C/4T. But Intel is a greedy pig; they wanted to get the ADL big.LITTLE garbage silicon onto laptops and dumpster thin-and-light junk first, that was the plan (don't forget the Win11 M$ money bank), and then slowly ramp up the LGA1700 silicon. They also have the Sapphire Rapids silicon, which is already in clients' hands, meaning they had to relegate a big share of the 14nm yields to 11th gen parts. Since it does well in gaming and people will buy anything, they threw it onto LGA1200 as a last hurrah to milk out 14nm with extra SKUs for the SI market and prebuilts. So all in all it wasn't profitable for Intel to make RKL on 10nmESF / Intel 7. I hate Intel for that mishap more than I hate the CPU.
For OCers it's a disappointment as well, since it's 8C/16T max and there's no way it can beat the 10900K in Cinebench or any production workload, and it's definitely nothing useful against Ryzen 5000, so all in all it became the refresh everyone hated. Then you factor in the Zen 3D refresh on top, with ADL etc... Makes sense if you ask me, considering all the aspects. Again, the 11900K serves its purpose well despite its flaws.
Last edited: Sep 30, 2021 -
https://www.igorslab.de/en/liquid-m...ow-the-magic-burns-in-safely-goes-tutorial/4/
He got this to melt (3.47C lower than the 2018 version of Kryonaut at 250 watts?). Results are similar to liquid metal, except it's easier to remove but harder to get working? -
-
electrosoft Perpetualist Matrixist
That is a very fair assessment from start to finish.
GN and HU ran the 11th gen gimped. GN absolutely drives me bonkers with that banal BS. As @tps3443 alluded to, once opened up and tweaked properly, the performance gains are very meaningful. I said as much months ago. The MSI, Gigabyte and Asus boards I used immediately open up MCE or ask you if you want to open it up (in the case of MSI, they ask what cooling you're using and adjust the PLs accordingly).
When pushed to their logical extremes, the 10900k is still the king of gaming overall. That just shows the power of the 5-6 year old architecture (which Intel milked till it was dry). The 10900k is a great product. I said it before, but if Intel had managed to offer a 10nm RKL with a 10/20 design, a lot of arguments would never have happened. For me personally, it was the better lows with the 11900k that stood out, since I'm already gaming on the edge with dips below 60fps. When my 5800x went low... it went LOW, but it also had spurts of higher fps than the 11900k. Running 4k @ 60hz puts more emphasis on keeping fps up to avoid noticeable stutter/chunk than on pushing high fps.
I think 10th, 11th and 5000 series are all fantastic chips but tweaked vs tweaked 11900k > 5800x. 5900x/5950x are the logical choice for anything meaningfully multi threaded. I mean it just isn't even up for debate. Anything multi-threaded and the 5900x and 5950x will absolutely smash the 11900k.
ABT is an absolute crap shoot due to variance in silicon. I've never seen a chip go over 400w, but I definitely saw one of mine hit 330w. When running the same 5.3 all-core frequency and watching the power pull, having one chip pulling 370w+ and the other 290w is insane, but even at stock auto settings the difference can be 30w+ too.
Depending on use case / workflow, the 11900k is definitely an upgrade, side grade or downgrade. Compared to the 10900k, I don't think the gap is significant enough to trigger buyer's remorse in just about any scenario.
If Intel could have gone with a 10/20 11900k and named it i9, I'm sure they would have, but they couldn't, and that's understandable on a 14nm fab. Calling binned chips i9's? Ehhhh, that's where I grab their hand and get my ruler. They could have made an i7-11700k with 8 cores and no HT that would have run much cooler. That would have put some space between the chips, like they did with the 9700k and 9900k. Not the best answer, but at least more differentiating than what they opted to do, which is pretty insulting. I do think if they had built it on a 10nm process they could have managed 10/20, but as you said correctly, Intel is a very greedy pig. All of them are. That is the nature of their existence. Shareholders and profitability above all.
In single-threaded benching the 11900k is a monster, which for me truly shows an architecture's true power. I've always thought this way: multi-threaded is just a cluster of single cores working in tandem. But with that stated, for those who look for the big multi-core numbers, it isn't even in the ballpark with AMD... not even in the parking lot. If I'm focused solely on extracting as much multi-threaded performance as I possibly can for benching, workflow or both, I'm not even looking at Intel in any way, shape or form. I'm focused squarely on the 5900x and 5950x.
Zen3D and ADL should provide some more exciting times ahead! -
electrosoft Perpetualist Matrixist
In the end, I think they're both good enough and close enough on average that it is going to come down to your workflow to potentially see a downside of using a 10900k vs an 11900k. IMHO, in the X170 chassis, a binned 10900k is a luxury, while a binned 11900k is a must if you want any chance of running a fairly cool system that isn't thermal throttling. The 11900k is a hot monster that requires more tweaking, tuning and binning to get it running as well as a 10900k will in the SM. -
electrosoft Perpetualist Matrixist
-
This is at 4,800MHz (Gear 2) CL18-18-38 on the Z590 Dark. My memory is capable of better than this, but I run this ^ for stability
Gear 2 seems totally fine to me! 42ns with 75GB/s+ of memory bandwidth is certainly decent. However, I'm shocked by the Gear 1 latency. My 11900K managed like 35ns latency at 3733MHz (I couldn't hit that type of latency on my 10900K for some reason). The best on my 10900K and Z490 Dark was usually 37-38ns, and I tried all the BIOS versions.
I keep hearing people say that the prior gen 10-core was faster. An overclocked Comet Lake (10/20) CPU is indeed faster in some multithreaded applications (barely faster).
I still have a 10850K here that does 5.2GHz all-core reliably, and with fully tuned memory and cache its multithreaded performance is 6.5% faster than a 5.1GHz all-core 11900K.
This 10850K was fast though (2,900 in R15). It's quite an impressive chip, considering it's bargain-bin silicon after all. I ran 5-core loads at 5.3GHz and all-10-core loads at 5.2GHz.
One really strange thing: I couldn't push the 10900K/10850K bandwidth as high as what my 11900K is putting out.
(11900K memory performance)
3733MHz @ 35ns (Gear 1)
4800MHz @ 42ns (Gear 2)
This is on Z590 Dark BIOS 1.02. I'm gonna install BIOS 1.05 and see what it's like. (This latency is wild though, and the bandwidth is something else. Certainly not bad.)
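(To put the Gear 1 vs Gear 2 numbers in context: the DIMM-side first-word CAS latency follows directly from CL and the transfer rate. A minimal sketch; the CL14 Gear 1 timing is an assumption for illustration, not from this post, and the AIDA64 figures above measure the whole core-ring-IMC-DRAM path, which is where Gear 2's extra divider shows up.)
```python
# DIMM-side first-word CAS latency in nanoseconds. The memory clock is half
# the DDR transfer rate, so: latency_ns = CL * 2000 / (MT/s).
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    return cl * 2000 / mt_per_s

print(f"DDR4-4800 CL18: {cas_latency_ns(18, 4800):.2f} ns")  # ~7.50 ns (the Gear 2 kit above)
print(f"DDR4-3733 CL14: {cas_latency_ns(14, 3733):.2f} ns")  # ~7.50 ns (assumed B-die timing)
```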
Last edited: Sep 30, 2021 -
At 3:10 into the video, and again at 17:00, Luumi validates that my dual-rank 32GB G.SKILL kit with Samsung B-die (16GB modules) sucks on Ryzen, because all dual-rank memory does. The Ryzen IMC can't handle high memory overclocks with dual-rank sticks. But maybe the X570 Dark could change that? Still watching the video, but it's good to know I am not alone in my pain, LOL.
It's going to be very difficult to say no to this mobo. It makes me hate my Crosshair VIII already, and I am not even done with the video.
Last edited: Oct 1, 2021 -
-
Finally got around to installing Linux. Pop!_OS with Cinnamon.
Ashtrix, electrosoft, Papusan and 1 other person like this. -
electrosoft Perpetualist Matrixist
What voltage are you running for your memory? The Z590 Dark really is tuned nicely for Gear 1, more than I've seen from most other boards. Gear 2 isn't bad either, but while the numbers are great, the real-world translation still lags below 5000MHz. -
electrosoft Perpetualist Matrixist
Best Buy had a drop on Wednesday and I managed to snag a 3070 Ti. I was able to apply my 10% off birthday code, and 5% rewards on top of that made the total cost ~$570.00. They still only do in-store pickups on GPUs. I've hardly been playing WoW at all for weeks now; I've been playing the snot out of Diablo II: Resurrected. I think it will handle that just fine
. Might be selling my KPE3090 so we'll see.
-
A 14-hour run for the Gold
https://hwbot.org/submission/482807..._gt_705_14h_0min_32sec_544ms?recalculate=true
https://hwbot.org/submission/4828082_papusan_gpu_reference_frequency_geforce_gt_705_1125_mhz
Last edited: Oct 1, 2021 -
While that's a great alternative option, you sure you really wanna do that?
-
I’m impressed with the performance of the Z590 Dark. Never in my life have I ever seen a motherboard change a CPU like this. Someone did a great job on this thing.
Comparing actual stable memory settings between 10th gen and 11th gen, my 11900K + Z590 Dark is actually beating the latency my 10900K + Z490 Dark KP was capable of.
So far I have managed to get the DDR4 to 5,000MHz in Gear 2 with 1.575V memory voltage. This is with timings CL19-20-20-42. It's actually very close to being stable.
I do quick and easy memory stability testing first.
I launch this sniper game that pops and stutters like crazy if anything is wrong with your system memory or memory timings lol. If it doesn't skip, I run the system as normal, playing Rust etc. for a couple hours. If it gets through those items first, then I run HCi Memtest.
The most I could get on my 10900K and 10850K on the Z490 Dark KP was 4933MHz-4960MHz on the memory. (Nice to see the memory wasn't the limiting factor here.)
I will get some latency and bandwidth numbers up tonight.
Also, I want to test Gear 1 vs. Gear 2 in games and apps. So far, Gear 1 seems to perform better in R15/R20 etc. This platform is easily gonna take another week of daily tinkering until I figure out which option I'm gonna run daily. Memory testing is so time consuming.
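(Since the game-then-HCi-Memtest routine came up: the core idea behind pattern testers is just write a known pattern, read it back, compare. A minimal Python sketch of the concept, not a replacement for HCi Memtest; buffer size and pass count are illustrative.)
```python
import numpy as np

# Software memory pattern test, conceptually: fill a large buffer from a
# seeded PRNG, regenerate the same stream, and compare. A mismatch means a
# bit flipped somewhere between the write and the read-back.
def pattern_pass(size_mb: int = 1024, seed: int = 0) -> bool:
    n = size_mb * 1024 * 1024 // 8  # number of 64-bit words in the buffer
    buf = np.random.default_rng(seed).integers(0, 2**31, size=n, dtype=np.int64)
    expected = np.random.default_rng(seed).integers(0, 2**31, size=n, dtype=np.int64)
    return bool(np.array_equal(buf, expected))

for i in range(4):  # real stability runs go for hours, not four passes
    print(f"pass {i}: {'OK' if pattern_pass(seed=i) else 'MEMORY ERROR'}")
```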
Last edited: Oct 1, 2021
jc_denton, Clamibot, Papusan and 1 other person like this. -
electrosoft Perpetualist Matrixist
I've already played through CP2077, and WoW is quickly dying for me this xpac, as the 9.1 patch is a snooze fest and I don't see things getting better (wife is still hardcore in the last xpac). I hate to see expensive hardware being underutilized. I'm pretty frugal overall. I'm the type that drives budget cars and leans toward small domiciles, and to say I dress down and love coupons would be an understatement. I have never understood dropping insane amounts of money on expensive clothes or shoes. I give myself a bit of leeway with computing technology, but still keep track of yearly buy/sell with the hope of breaking even (or close to it; I usually run a small loss year to year) while getting to enjoy my addiction... er... hobby.
But just like with the Aorus 3070 (now in my wife's system) I may end up installing it and going, "NOPE!" and run back to the KPE3090.
Ashtrix, jc_denton, SierraFan07 and 4 others like this. -
-
-
electrosoft Perpetualist Matrixist
Holy Watts Batman....can't wait to see some results.
How's the coil whine? -
It is on air! I may or may not get the only water block available for this card.
EDIT: I can now see why these 6900 XTUs can go head to head with a 3090. I am glad I used my 1600W PSU for this test bench and not my 1000W.
It is going to be limited by the air cooler, I can tell you that straight away. Again, at stock the hot spot is already at 85 to 87C.
Last edited: Oct 1, 2021
Ashtrix, jc_denton, electrosoft and 2 others like this. -
Let's see if HWBOT has anything to say about me using Linux. I read the rules and they are silent about Linux being the OS, LOL. They only mention W10 restrictions because of its functional defects.
https://hwbot.org/submission/4828196_mr._fox_cinebench___r15_ryzen_9_5950x_5276_cb?recalculate=true
This was not using the chiller, and it beat my 7980XE high score by roughly 25 points, with the 5950X running 500MHz slower than the 7980XE.
Ashtrix, jc_denton, SierraFan07 and 3 others like this. -
https://hwbot.org/submission/482820...r11.5_ryzen_9_5950x_58.72_cb?recalculate=true
I didn't submit these because the scores are not high enough to beat my prior submissions.
Ashtrix, jc_denton, electrosoft and 2 others like this. -
-
More Gold from the "new" old
https://hwbot.org/submission/4828241_papusan_3dmark___fire_strike_geforce_gt_705_483_marks
https://hwbot.org/submission/4828246_papusan_3dmark___cloud_gate_geforce_gt_705_4162_marks
https://hwbot.org/submission/4828227_papusan_aquamark_geforce_gt_705_102086_marks?recalculate=true
https://hwbot.org/submission/482821...tion___1080p_xtreme_geforce_gt_705_133_points
https://hwbot.org/submission/4828268_papusan_vrmark___orange_room_geforce_gt_705_187_marks
https://hwbot.org/submission/4828259_papusan_vrmark___cyan_room_geforce_gt_705_81_marks
https://hwbot.org/submission/4828270_papusan_vrmark___blue_room_geforce_gt_705_40_marks
Last edited: Oct 1, 2021
Ashtrix, jc_denton, Spartan@HIDevolution and 4 others like this. -
electrosoft Perpetualist Matrixist
6900 is a rasterization monster especially at 1080p and 1440p. I really do like the specs and look of that Asrock!
jc_denton, Spartan@HIDevolution, Rage Set and 1 other person like this.
In with the old, out with the new.
jc_denton, SierraFan07, Rage Set and 1 other person like this. -
Doesn't matter. Gold it will be
https://hwbot.org/submission/4828277_papusan_3dmark2001_se_geforce_gt_705_37347_marks
https://hwbot.org/submission/4828283_papusan_3dmark05_geforce_gt_705_11582_marks
https://hwbot.org/submission/4828289_papusan_3dmark06_geforce_gt_705_6279_marks/
Last edited: Oct 2, 2021
Ashtrix, jc_denton, Spartan@HIDevolution and 2 others like this. -
-
Looks very good! I don't think the 7980XE was running 500MHz slower though.
The 7980XE does 1 point per 1MHz in R15: 4,800 points = 4.8GHz, 5,000 points = 5.0GHz.
You were breaking 5,200 in R15 on your old 7980XE at 5.2GHz.
I was managing right at 5,000 in R15 at 5.0GHz.
electrosoft and Spartan@HIDevolution like this.
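(The 1-point-per-MHz rule of thumb above makes the sanity check on those runs trivial; a throwaway sketch, only meaningful for the 7980XE's R15 scaling as described above.)
```python
# Rule of thumb from the post above: the 18-core 7980XE scores roughly one
# Cinebench R15 point per MHz of all-core clock.
def r15_estimate_7980xe(all_core_ghz: float) -> int:
    return round(all_core_ghz * 1000)

print(r15_estimate_7980xe(5.0))  # ~5,000, matching the run quoted above
print(r15_estimate_7980xe(5.2))  # ~5,200, matching the 5,200+ run
```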
It was. I am unable to bench the 5950X at 5.0GHz. The Linux readings are off because the sensors are bugged. Notice that Cinebench is reporting over 7.0GHz, LOL (we wish). The 5950X is running 4.7GHz on half the cores, 4.65GHz on the other half as set in the BIOS. 5.0GHz is the limit I set for the core ratio to help keep the VID higher than core voltage.
So, yeah, I beat my 7980XE high score with the 5950X running 500MHz slower.
Below is an example of why it is so hard to take Linux seriously as an overclocker. I loathe Windows 10/11 more than words can say. Absolute trash. I want to move to Linux, but the "enthusiasts" that run Linux are enthusiasts about Linux, not hardware. When you compare the view count to the post count in both of the threads I created trying to find evidence of life on the other planet, the ratio is pathetic. Water, water, everywhere and not a drop to drink. It's getting better very, very slowly. But it is not likely it will ever be worth mentioning to overclockers, because Linux enthusiasts largely don't care about overclocking, performance tuning or benching. They are mostly like the lowest common denominators on the Windoze side, only with more advanced technical skills (because they need them to run an OS like Linux). The community activity is about as stimulating as watching paint dry.
Last edited: Oct 2, 2021 -
Soooo, comparing the best of my watercooled 6800 XT vs an air cooled Asrock Formula OC 6900 XT at stock (nothing adjusted in the Radeon software), the 6900 XT beats my 6800 XT. That shouldn't be a surprise. However, I know the 6900 XT has more in the tank, especially if I get it on water.
I am happy I got this card. It is indeed a monster. Out-of-the-box performance on the P vBIOS pushes this card higher than the majority of 6900 XTs. Asrock didn't go the conservative route, power wise. My only hang-up is that warranty limitation. I know Asrock can't enforce it in the US, but it would be a battle nonetheless if I watercooled the card and the card somehow died. Still debating.
EDIT: To answer your other question, no coil whine, and it's on an open bench too. My 6800 XT does exhibit coil whine in certain benchmarks though.
Last edited: Oct 2, 2021