Yeah, I'm not sure how Nvidia thought this was for miners. Even miners don't want "mining" GPUs, because once they're no longer worth mining on you're stuck with a crap conversation piece, or you're contributing to e-waste. You could argue miners just want whatever value is left in them, and I'd agree, but keep in mind that the second-hand market has always been beneficial.
Almost all my AMD cards are second hand, and while I suppose it's technically true that all my Nvidia cards are second hand as well it wouldn't be fair to not mention they were closer to a hot potato lol
Honestly, if I could buy the RX 570 X2 cards, or even an RX 5700 X2 if one existed, I probably would, mostly because they are still good options, though I'd be under the impression that they are just bin rejects. If priced accordingly, of course.
-
@electrosoft
@Mr. Fox had a GEM on his hands!!
This 11900K runs 1.195V at 4.8GHz all cores, 148 watts in R15!!!!
0.919V in the BIOS after resetting to default settings.
(No delid on this chip at all)
HOLY COW! This is a great sample so far!! With default testing it really blows my chip out of the water.
Now it’s time for some overclocking.
-
I am so thrilled that it appears to be a nice silicon sample. Since I never even took it out of the plastic clamshell, I had no idea how it would turn out. I am very happy for you, bro. Had I not decided to sell it, that CPU would have lived a very boring life powering my work computer and doing very mundane things that a stock 6700K could handle without breaking a sweat.
After the difficulty I had installing Windows 7 on the X570, the constant irritation of USB ports that cut in and out, horrible RAM latency, and nonstop WHEA errors, and now a Z590 system with the same level of OS install difficulty, M.2 SSDs that are invisible to Windows 7, and BSOD issues unless the M.2 ports are disabled in the BIOS, I find myself on the brink of not caring any more. It feels like pulling a chair up to the table to enjoy a juicy mesquite-grilled ribeye steak only to have someone urinate on my plate as I reach for my fork.
-
How does one manage to come across such a perfect 11900K? This 11900K that I received from @Mr. Fox is not only a GEM, but the IMC is a GEM too.
It will post 4,000MHz Gear 1, and still climbing!! My last 11900K would clap out after 3,882.
And it consumes a mere 0.920V stock in the BIOS. My old 11900K would use around 1.025V.
This super sample 11900K only pulls 148 watts in R15.
This is an extremely good chip. Probably as good as it gets, really. I've only dreamed of such a chip.
Is this real? Lol.
-
LoL. Just my luck, eh? My first-ever silicon lottery winner and I sell it dirt cheap.
tps3443 said: ↑ How does one manage to come across such a perfect 11900K? This 11900K that I received from @Mr. Fox is not only a GEM, but the IMC is a GEM too. [...]
I am glad someone I know and care about can enjoy it. Who knows what kind of goofball might have ended up owning it had it sold on Mercari or Craigslist before you snagged it.
-
The new 11900K does 5.3GHz all cores at 261 watts with 1.371V
^ What a BEAST!!
I'm running some really terrible B-die memory right now as a temporary solution. So once the Corsair Dominator Platinum 3600 CL14 DR kit shows up, I should be able to get some nice numbers up. I'm hoping to tune out a really good 3900 CL13, which would easily compete with even the fastest DDR5 setups in gaming.
I'm gonna be super careful when delidding this chip. I can run this new 11900K at 5.1GHz with auto voltage and still use less power than my other 11900K (while it's stock!). That's 300MHz faster on all cores, less power, and lower voltage. If that's not impressive, I dunno what is lol.
This is a nice chip!
I spent a long time playing with the IMC, but I only tested the core OC ability for a couple of minutes. Based on [email protected], the chip should run 5.4 even without being delidded; it hasn't even reached that voltage demand point yet. I imagine 5.4GHz at 1.405-1.425V would be easy, and after a delid this puts it right in the territory of a 5.55-5.60GHz daily OC.
I know 5.5-5.6 sounds crazy. But it's just that good.
-
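A quick sanity check on the 5.4GHz projection above, using the first-order dynamic-power rule P ≈ C·f·V². This is illustrative only: the f·V² scaling and the 1.415V figure (a guessed midpoint of the 1.405-1.425V range) are assumptions, and real chips add leakage and thermal effects on top.

```python
# First-order CPU dynamic power model: P scales with f * V^2.
# Illustrative only; ignores leakage current and temperature effects.

def scale_power(p_watts, f_old_ghz, f_new_ghz, v_old, v_new):
    """Project a measured power figure to a new frequency/voltage point."""
    return p_watts * (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

# Measured in the post: 261 W at 5.3 GHz / 1.371 V.
# 1.415 V is a guessed midpoint of the 1.405-1.425 V range.
projected = scale_power(261, 5.3, 5.4, 1.371, 1.415)
print(f"~{projected:.0f} W at 5.4 GHz")  # ~283 W
```

So a modest voltage bump to hit 5.4GHz should land somewhere in the high 200s of watts under the same load, which is at least in the same neighborhood as the numbers reported later in the thread.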
Not quite as good as my best score on chilled water, with a higher CPU and GPU overclock, but still very good.
This is pulling just short of 1100W from the wall and my 3090 KP peaked at 865W.
-
Mr. Fox said: ↑ Not quite as good as my best score on chilled water, with a higher CPU and GPU overclock, but still very good. [...]
OMG. That’s some ridiculous GPU power lol
Nice! -
Indeed. On this one the GPU pulled about 970W... https://hwbot.org/submission/4675148_mr._fox_catzilla___1440p_geforce_rtx_3090_43317_marks
tps3443 said: ↑ OMG. That's some ridiculous GPU power lol Nice! [...]
That is with the CPU at 5.5GHz and the GPU at 2235 on core and 11,500 on memory and 1.425V on the GPU.
-
electrosoft Perpetualist Matrixist
I'm not even sure if EVGA is going to release a KP 3090 Ti, but if they do I expect it to clock in at $2,300-2,500. GPU prices have officially reached a point where I no longer have a desire to own or use a top-dog model; the prices are simply outlandish. Just a few years ago I would balk at paying >$600 for a GPU. I know selling off 3-4 GPUs covered the $1,200 I paid for the 3080 Strix from the Newegg Shuffle, but it still grinds my gears.
Papusan said: ↑ Bro Fox. Will you replace your 3090 Kingpin card with a refresh if EVGA decides to release a successor?
Never buy a non-Ti (e.g. 3090/4090) if these rumors become reality. Or, if you only want the very best cards, you'll miss out on a year of gaming on the new arch, LOOL
NVIDIA GeForce RTX 3090 Ti confirmed to feature Micron's 21 Gbps GDDR6X memory videocardz.com
tps3443 said: ↑ @Mr. Fox had a GEM on his hands!! This 11900K is 1.195V at 4.8GHz all cores, 148 watts in R15!!!! [...]
5.3 all core @ 1.371V is pretty righteous for CB23!
tps3443 said: ↑ The new 11900K does 5.3GHz all cores at 261 watts with 1.371V [...]
That's awesome! Let's see how far it goes with your monster cooling setup! Make sure to loop it and see where it bonks out: for single runs I could get ~1.365 testing in my bud's M13, but any type of creeping heat (as you know with RL) introduces instabilities, and 1.365 for single runs ended up needing ~1.395 to loop CB23 and pass RB 2.56 stress. Neither of us has the setup you have, though, so you should really be able to press that puppy and keep it cool.
That's exactly what happened as I climbed the 11900K ladder: you realize how much better the new chip is, or how much worse the original one was. Excellent cooling can make a mediocre chip look good and a good chip look great. I'm happy for you, man! Can't wait to see how far you can push it. With your past luck with certain things, you deserve a good break and some fortune!
Personally I'm limping along on this Gigabyte Z590 that was part of the Newegg Shuffle bundle when I picked up the 3070 for the wife's system in August, and it is just garbage; I'm done with the OC stuff for quite a while now. This Gigabyte solidifies that I will never buy another Gigabyte board again, as this is three in a row (2x Z390, 1x Z590) that have had problems. Seeing the massive swing in power pull and stability going from the MSI to the Gigabyte, and then testing in an M13 that dialed it down even further with the exact same chip, does show there is merit to buying higher-end boards with more refined and granular control, but the costs add up. I would be curious to see how your chip runs in my Gigabyte for a proper analysis and comparison, or to at least plug yours into an Asus AI-enhanced board to see what SP rating it spits back.
In a "time out!" moment I ended up selling off just about everything, including the 3060 I picked up and my still-sealed Dark Z590, and returned my 12700K and MSI Z690 to take a much-needed financial pause while this market swirls around the drain and Z690 BIOS release after BIOS release is issued to fix problems. The holidays are here, and the catalytic converter went out on our Nissan Quest, so we have a window to sort out our next family vehicle before inspection, where it will fail in spectacular fashion.
She lasted 12+ years and served us well.
I'll revisit the state of my tech in January.
And yes, please, PLEASE don't wipe out the SMDs on this one!
I'm super happy for you to get a proper 11900k to really clock in some killer benches.
DON'T KILL IT!
-
electrosoft Perpetualist Matrixist
Here you go @tps3443. You said you wanted one of those LTX Golden 10900k's.
https://www.ebay.com/itm/194522028227 -
And, if you use PayPal Credit, it will only take 2 years to pay off a CPU that is already two generations obsolete. What a sweet deal! Although, if it overclocks as well as advertised, it might take another generation or two for the incumbents to catch up. Competing with 20 threads running 5.7GHz without having to employ sub-zero cooling is like trying to stop a runaway train.
electrosoft said: ↑ Here you go @tps3443. You said you wanted one of those LTX Golden 10900k's.
https://www.ebay.com/itm/194522028227
Since it seems to be an extraordinary sample in a rather fragile ecosystem, it would be worth waiting to see if a more refined process or new tool surfaces to do so with less opportunity for a mishap.
electrosoft said: ↑ And yes, please, PLEASE don't wipe out the SMDs on this one!
I'm super happy for you to get a proper 11900k to really clock in some killer benches.
DON'T KILL IT!
-
This 11900K is a 5.3GHz CPU, easy. It runs 1.353V per the BIOS and CPU-Z (auto voltage).
At 5.4GHz the voltage begins to creep up. However, the CPU still passes R20 runs at 95C per core even at 5.4GHz. This chip is not fazed by heat; my other 11900K starts going wonky if more than a few cores go beyond the 75-80C range at once.
This CPU can run 5.4GHz more easily than my last 11900K could manage 5.2GHz.
It's so crazy how different this CPU is from my other 11900K. It's like a different processor model entirely lol.
-
If you happen to have a spare IHS ready, or you intend to go direct die and don't mind grinding down the original Intel IHS, you could sand down the center and then pull the rest out (you won't wipe the SMDs that way).
-
Wasted money if you want points in 3D scores or Globals in 2D. Wasted money even in normal usage like gaming. $1,700 would pay for a 12900K + motherboard + RAM sticks.
Mr. Fox said: ↑ And, if you use PayPal Credit, it will only take 2 years to pay off a CPU that is already two generations obsolete. [...]
You are still not forced to use Win 11 with the new hybrid chips from Intel. Yep, still no need to go with two-year-old Intel chips just to avoid having to use Win 11.
A slightly OC'd 12th gen with phone cores vs. [email protected] in TS CPU.
-
I'm not even able to buy an RX 5600 XT, which costs close to $700, and in the RX 6000 series even a 6600 XT costs $1,000 including tax. Damn, my AGA has been collecting dust.
Papusan said: ↑ Bro Fox. Will you replace your 3090 Kingpin card with a refresh if EVGA decides to release a successor?
Never buy a non Ti (as etc 3090/4090) if this rumors become reality. Or If you only want the very best cards then you'lll miss out one year of gaming on the new arch, LOOL
NVIDIA GeForce RTX 3090 Ti confirmed to feature Micron’s 21 Gbps GDDR6X memory videocardz.com
Edit. Or just use a printer, HaHa
https://www.techpowerup.com/forums/posts/4656934
Could be Nvidia wants a bit more from the mining craze before it dies. Can't let AMD steal their profits.
Sapphire GPRO X080 cryptomining card with Navi 22 GPU and 10GB memory leaked, costs 750 EUR
AMD board partners are actively selling GPUs to miners, except they do not disclose this information publicly to avoid complaints from their fans. Those mining cards are not listed on any manufacturers’ website, so the details on them are not always available. Luckily for us, El Chapuzas Informatico gained access to two datasheets from Sapphire, featuring GPRO X080 and X060 models.
I wonder how many of the graphics cards from Nvidia and AMD have gone directly to miners (right out of the factories). I guess we will never know. [...]
-
Agreed. That was a stupid price even back when the 10900K was the latest and greatest. Only a moron would pay that much (300% of MSRP) for something that will be obsolete in less than a year.
Papusan said: ↑ Wasted money if you want points in 3D scores or Globals in 2D. Wasted money even in normal usage like gaming. [...]
-
electrosoft said: ↑ Here you go @tps3443. You said you wanted one of those LTX Golden 10900k's.
https://www.ebay.com/itm/194522028227
Since the seal is already opened, I think one will need an ASUS mobo to check the SP rating, plus skill enough to push that CPU far enough to see whether it's actually a legit LTX sample or not. But damn, that price is really insane; too much for a 2-year-old CPU. $1,700 is a lot of money; we could get a full Xeon in a used Dell server chassis and have a small home server running for that cash. Tech depreciates so fast. Better to bin ourselves: buy a lot from retail, get the ASUS board running, check the SP, and keep the best one.
I was just checking the ASUS forums as usual, seeing if there's any fix for the ALC408x Realtek USB audio problems (static buzz AND lag/latency; it's on GB and MSI as well). Apparently no fix yet, and it has been a year already. The best part is that all Z690 mobos have the same audio codec, except ASRock. And you know what's even worse? Some people disabled the nasty POS in the BIOS and it shaved 25 seconds off their boot time. UGH. MSI has it fixed? Some are reporting so.
electrosoft said: ↑ In a "time out!" moment I ended up selling off just about everything... to take a much needed financial pause while this market swirls around the drain and Z690 BIOS release after BIOS release is issued to fix many problems [...]
However, the worst is not that... it's that dreaded WHEA again, on the Intel Z690 platform. @Mr. Fox, looks like what you said is 100% true. Newer is getting ridiculous nowadays. I really feel stupid: when I had the chance at a 9900K and Z390 DARK, I waited and expected more from the 10C rumors and whatnot. I really feel bad; it's exactly where TiN left off after working on the platform at EVGA. Time is ticking, and I have to get a machine with Win7 up and running.
I have more bad news about the Crosshair VIII Dark Hero, which I asked you about a while back when you mentioned the boot issue (apart from the PCH heating up on the Crosshair VIII Hero; don't worry, the Maximus XIII Hero has the same PCH heat-up lol). Apparently that mobo is also having the exact same problem, and it's not fixed yet. That was the only mobo worth the money on X570 because of the passive cooling, ALC1220, no Intel I225-V, Win 7 support, 8x SATA ports, and solid looks. BUT a big problem. WTF?
Now the new Windows 10/11 BS issues, along with RKL; pair it with the ALC codec, add the Intel I225-V Foxville LAN which is still having problems initializing even on the brand-new revision (it had unfixable problems across 2 revisions for the entire year of 2020), the big.LITTLE earth-saving-cores BS, the DDR5 joke, etc. etc. ... BTW, Intel has a FW update for the I225-V it seems, and it fixes the issues on ASUS and GB.
In the ROG forum some say ASPM in the BIOS and Windows power settings fix it, some say they don't, and for some PCIe 3.0 is a fix. What is this nonsense, really? Intel has some issues; AMD has a whole other can of worms, with issues like USB, WHEA, and crap firmware along with no documentation, etc. Looks like tech is taking a dump, or am I just being too OCD and noticing every issue?
Click on the links below, there are more reddit pages where this is being reported.
WHEA-LOGGER ID17!!!help!
Poor guy, look at the BS Intel reply below...
New 12900K build, getting thousands of WHEA 17 errors a minute
张金羿, you are very welcome, we are glad to help.
"I consulted Gigabyte a few days ago but they haven’t responded to me yet, so I still have a few questions", perfect, that was the best thing to do in this scenario in order to confirm from their side what might be the source of the error.
"If this WHEA error points to a problem with the motherboard's PCIE interface, shouldn't my graphics card be unusable?", In theory, yes, but keep in mind that the error could be related just to a portion of the motherboard's PCIE, meaning that does not necessarily indicate that the graphics card will work or not or that the port is defective, it could be just an error related to it but that still allows it to work.
"Do I still need to contact my graphics card supplier?" Since the graphics card seems to be working properly, then we would suggest to wait for Gigabyte's response. Still, if you have the opportunity to do so, it is a very good option to try, maybe at this point to check directly with them if they have some sort of reports about this issue that could indicate what might the root of the problem in this case.Click to expand...
I’ve tried the following:
- disabling PCI-E link power state management seems to reduce the amount of WHEA's, but it still crashes under heavy load
- Next, I removed the GPU and ran the system off of the iGPU. No crashes, no WHEA's, running prime95 and cinebench without any freezing whatsoever.
- Next, I set the top PCIE slot at gen 3 in bios – same issue persists.
- Finally, plugging the GPU in the lower PCI-E slot works fine without issues. But now my ASUS 3090 TUF OC is running at PCIE 3 and lower bandwidth – this is a bandaid and not a fix.
Problem seems isolated to the top PCI-E slot. I tried reseating the card and still had the issue. Problem persists using a riser cable or direct plug in. Also, problem exists for many different people.
When I swap all components back to my backup computer (Z390 Gigabyte Designare, 9900K CPU) I have zero issues and full performance.
Event Viewer shows the following details:
A corrected hardware error has occurred.
Component: PCI Express Root Port
Error Source: Advanced Error Reporting (PCI Express)
Primary Bus:Device:Function: 0x0:0x1:0x0
Secondary Bus:Device:Function: 0x0:0x0:0x0
Primary Device Name: PCI\VEN_8086&DEV_460D&SUBSYS_86941043&REV_02
Secondary Device Name:
-
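For anyone decoding a log like that: the "Primary Device Name" is a standard Windows PnP hardware ID, and VEN_8086 is Intel's PCI vendor ID. A small illustrative parser, assuming the usual VEN_/DEV_/SUBSYS_/REV_ field convention (the helper name is my own, not from any tool mentioned in the thread):

```python
def parse_pnp_id(hardware_id):
    """Split a Windows PnP hardware ID such as
    PCI\\VEN_8086&DEV_460D&SUBSYS_86941043&REV_02 into bus type and fields."""
    bus, _, rest = hardware_id.partition("\\")
    # Each '&'-separated part is NAME_VALUE; split on the first underscore only.
    fields = dict(part.split("_", 1) for part in rest.split("&"))
    return bus, fields

bus, fields = parse_pnp_id(r"PCI\VEN_8086&DEV_460D&SUBSYS_86941043&REV_02")
print(bus, fields["VEN"], fields["DEV"])  # PCI 8086 460D
```

Looking the DEV value up against Intel's published device ID lists then tells you exactly which root port is throwing the corrected errors.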
electrosoft Perpetualist Matrixist
I was posting the 10900k more as a joke, as I couldn't see paying anywhere near that for a 10900k, or at this point more than $500, and that would have to be an SP100+ chip.
Ashtrix said: ↑ Since the seal is already opened, I think one will need an ASUS mobo to check the SP rating... But damn that price is really insane, too much for a 2 year old CPU. [...]
And yep, I still have all those issues with the 408x on my Gigabyte Z590. I had zero audio issues on my MSI, but it was a budget board that used a lesser audio chip. I especially love the random thumps from my speakers when starting up and booting. The lag is always nice, especially when it goes off the rails and goes full blast, sounding like old TV static on bath salts...
-
electrosoft said: ↑ I was posting the 10900k more as a joke as I couldn't see paying anywhere near that for a 10900k or at this point more than $500 and that would have to be an SP100+ chip. [...]
It would be fun, that's for sure. Honestly, you can buy binned 10900K's on Reddit; there was an SP123 on there for around $1,200 or so. I don't advise it, other than for the simple fact that it's just pure fun overclocking them lol.
All of the golden sample 10900K's are apparently SP115 or greater, though.
Some worse than others. I received a message from a guy saying he had a "bad sample" that only did 5.6GHz on all cores. (I would not consider that a bad sample, but these gold LTT chips are super good and will apparently do 5.7GHz on all cores with direct die and proper cooling.)
After testing that 10850K, it would take a super good bin to make it worth it over that 10850K. (By the way, that 10850K is actually en route to @Mr. Fox, so I'm curious to see what he thinks of it, and to find out what the SP rating is per his Asus board.)
I know the LTT gold chips are not worth it; however, before 12th gen dropped it was literally the fastest gaming processor one could own. I tried my best to find one for a reasonable price, and it didn't work out. 5.7GHz on 10c/20t would be pretty fun to test out!!
-
Mr. Fox said: ↑ Agreed. That was a stupid price even back when the 10900K was the latest and greatest. Only a moron would pay that much (300% of MSRP) for something that will be obsolete in less than a year. [...]
The world we live in is more messed up than it ever has been in my lifetime. Absolutely nuts. Governments (especially ours) are in a state of unprecedented leadership corruption and incompetence. People are more dishonest than they have ever been. Bad people are more common than ever before. Standards are at record lows. Challenging the status quo and calling attention to what is messed up is viewed as either criminal or racist. Stupid has never been more common. Technology is right in the middle of this screwed-up mess. Expecting tech products to be of high quality and reasonably priced, offering anything that remotely resembles value, and the notion that an average consumer possesses sufficient intelligence to identify a problem are probably all unrealistic under these circumstances. But all of the above is still totally inexcusable.
Ashtrix said: ↑ @Mr. Fox, looks like what you said is 100% true. Newer is getting ridiculous nowadays. [...]
-
This is the new 11900K right here. It consumes 110+ fewer watts than my other direct-die 11900K.
This is running the exact same profile in the BIOS too: 3,878 Gear 1 CL14, everything identical. And it managed to sip through R15 at 310 watts at 5.4GHz.
This chip is so amazing. Once I delid this 11900K, I'm gonna be running 5,500MHz minimum on this thing, probably 5,600 on a few cores (easily).
I have not tried 5.5GHz yet. I feel confident it can do it without the delid, but I don't want to abuse it with high heat before delidding. This is a special 11900K.
Apparently it came from Walmart. Walmart is known for bullying companies around and getting better-quality products all for reduced cost lol. Maybe I'll get my next CPU from Walmart.
-
electrosoft Perpetualist Matrixist
With your setup and that chip, once you go DD, it is going to be an absolute monster.
tps3443 said: ↑ This is the new 11900K right here. It consumes 110+ fewer watts than my other direct-die 11900K. [...]
310W @ 5.4 all core and topping out @ 76C is absolutely righteous. I wouldn't say it is "sipping," though, as that is still some beefy wattage no matter what.
I absolutely agree: do not abuse it with high heat or VCC till then, to be safe.
Not as good as a few over on the OC forums (I think Sugi had 5.4 all core pulling in the 290s), but it is damn good, and he purposely picked well-binned chips while you just randomly bought one from @Mr. Fox and scored a nice one.
How is the IMC @ 4000 G1 going?
-
Was that with the multiplier set to 54, or with a lower clock speed pushed up with BCLK?
electrosoft said: ↑ 310W @ 5.4 all core and topping out @ 76C is absolutely righteous. [...]
@tps3443 could you also post numbers with 54x on all cores? And enjoy your SL chips
-
Yeah it’s a pretty darn cool 11900K!electrosoft said: ↑With your setup and that chip, once you go DD, it is going to be an absolute monster.
310w @ 5.4 all core and topping out @ 76c is absolutely righteous. I wouldn't say it is sipping through as that is still some beefy wattage no matter what.
I absolutely agree do not abuse it with high heat or vcc till then to be safe.
Not as good as a few over on the oc forums as I think Sugi had 5.4 all core pulling in the 290's but it is damn good and he purposely picked well binned chips while you just randomly bought one from @Mr. Fox and scored a nice one.
How is the IMC @ 4000 G1 going?Click to expand...
I was only testing BIOS post and Windows boot with the memory in Gear 1 at 4,000MHz; I can't actually run any good RAM settings/configurations yet. I sold off my 4x8GB DDR4-4000 CL15 kit, and my temporary RAM kit is just terrible, terrible, terrible: it's DDR4-4000 CL21-25-25.
I've put off any memory testing until my dual-rank Dominator Platinum 3600 CL14 set shows up. I'll be able to do something then.
So far the IMC is looking great compared to my other 11900K, but I won't know for sure until I get my new RAM.
-
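As an aside on why that stopgap CL21 kit hurts so much: DDR first-word latency in nanoseconds is the CAS count times the clock period, and since the memory clock is half the data rate, one cycle is 2000/data-rate ns. Comparing the incoming 3600 CL14 kit against the temporary 4000 CL21 kit (illustrative arithmetic only; real access latency also involves tRCD and other timings):

```python
def first_word_latency_ns(data_rate_mt_s, cas_latency):
    """DDR true CAS latency: CAS cycles x clock period.
    The memory clock runs at half the data rate, so one cycle = 2000/data_rate ns."""
    return cas_latency * 2000.0 / data_rate_mt_s

print(f"3600 CL14: {first_word_latency_ns(3600, 14):.2f} ns")  # 7.78 ns
print(f"4000 CL21: {first_word_latency_ns(4000, 21):.2f} ns")  # 10.50 ns
```

So despite the lower data rate, the 3600 CL14 kit is roughly 25% quicker to first word, which is why a tuned 3900 CL13 profile can hang with much faster-clocked DDR5 in gaming.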
Yep.
tps3443 said: ↑ So just the x54 multiplier, you mean?
You could of course add the sub-104 BCLK on top in a second bench to compare the wattage.
-
Now I really want the Premamod BIOS on my Clevo X170SM-G to get an update to support Rocket Lake CPUs so I can buy this CPU from you and stick it in my laptop. Man, that'd be amazing!
tps3443 said: ↑ The new 11900K does 5.3GHz all cores at 261 watts with 1.371V. ^ What a BEAST!! [...]
Single-core performance is what matters most in the majority of my use cases, so this beast would make my laptop last longer for me.
-
-
I reduced the BCLK to 100.00 and increased the multiplier to 54.
Papusan said: ↑ Yep. You could of course add the sub-104 BCLK on top in a second bench to compare the wattage. [...]
I personally prefer BCLK overclocking with Rocket Lake, especially with my last 11900K: the silicon was not that great, so BCLK helped it drastically, to the point where it either could or could not do something. This new chip doesn't seem to care one way or the other.
-
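For readers unfamiliar with the two routes just discussed: the effective core clock is simply BCLK times the core multiplier, so a straight 100 x 54 and a BCLK-assisted ~103.85 x 52 land at nearly the same frequency. A minimal sketch (the 103.85 MHz figure is an illustrative value, not one taken from the posts):

```python
def effective_clock_mhz(bclk_mhz, multiplier):
    """Core clock = base clock (BCLK) x core ratio (multiplier)."""
    return bclk_mhz * multiplier

# Two routes to roughly the same all-core clock:
print(effective_clock_mhz(100.0, 54))   # multiplier route: 5400.0 MHz
print(effective_clock_mhz(103.85, 52))  # BCLK-assisted route: ~5400.2 MHz
```

The catch with the BCLK route is that the base clock also feeds other domains, which is why it can help a weak bin at one point and cause instability at another.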
Yeah it’s really great for gaming, and single application loads.Clamibot said: ↑Now I really want the Premamod BIOS on my Clevo X170SM-G to get an update to support Rocket Lake CPUs so I can buy this CPU from you and stick it in my laptop. Man that'd be amazing!
Single core performance is what matters the most in the majority of my use cases, so this beast would make my laptop last longer for me.Click to expand...
Just wait until I delid this thing. My last 11900K dropped 10-12% in power consumption while running it stock, after delid. I am assuming the same will apply to this better sample 11900K.
That should put it right at 275 watts at 5.4Ghz during an R15 run. Or only 130 watts while running stock.
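The ~130-watt stock projection is just that 10-12% delid reduction applied to the 148 W stock R15 draw reported earlier for this sample; a quick sanity check:

```python
# Project post-delid package power from a pre-delid measurement and the
# 10-12% reduction range observed on the previous 11900K after delidding.

def post_delid_watts(pre_delid_w, reduction):
    return pre_delid_w * (1 - reduction)

lo = post_delid_watts(148, 0.12)  # best case, 12% drop
hi = post_delid_watts(148, 0.10)  # worst case, 10% drop
print(f"{lo:.0f}-{hi:.0f} W")     # in line with the ~130 W estimate
```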
Now I really understand why @electrosoft was chasing these really low power consumption numbers with these things.Last edited: Dec 1, 2021Clamibot and electrosoft like this. -
This new 3090 Ti (3090 Super) looks more and more real. More mess from Nvidia... GeForce 497.09 graphics driver released, with support for the GeForce RTX 2060 12GB. Yep, their old Turing cards now come rehashed with more VRAM than a 3080.
Papusan said: ↑Bro Fox. Will you replace your 3090 Kingpin card with a refresh if EVGA find out they will release a successor?
Never buy a non-Ti (e.g. 3090/4090) if these rumors become reality. Or, if you only want the very best cards, then you’ll miss out on one year of gaming on the new arch, LOL
NVIDIA GeForce RTX 3090 Ti confirmed to feature Micron’s 21 Gbps GDDR6X memory videocardz.com
Edit. Or just use a printer, HaHa
https://www.techpowerup.com/forums/posts/4656934
Could be Nvidia wants a bit more from the mining craze before it dies. Can’t let AMD steal their profits.
View attachment 197824
Sapphire GPRO X080 cryptomining card with Navi 22 GPU and 10GB memory leaked, costs 750 EUR
AMD board partners are actively selling GPUs to miners, except they do not disclose this information publicly to avoid complaints from their fans. Those mining cards are not listed on any manufacturers’ website, so the details on them are not always available. Luckily for us, El Chapuzas Informatico gained access to two datasheets from Sapphire, featuring GPRO X080 and X060 models.
I wonder how many of the graphics cards from Nvidia and AMD have gone directly to the miners (right out of the factories). I guess we will never know.Click to expand...
http://forum.notebookreview.com/threads/nvidia-geforce-driver-497-09-whql.837248/
------------------------------------------------------
@Ashtrix
Intel 12th-Gen Core Alder Lake Architectural Benchmark techspot.com
E-Cores Suck for Gaming
In short, the E-cores are a mistake for gaming, and if called upon they will reduce frame rates. So for gamers the 12900K and 12700K are 8-core/16-thread CPUs and nothing more. The E-cores might be able to help with background tasks, but frankly on the desktop they'd be better taken care of by two more P-cores.
See also... The reason Intel limited the interconnect between the E-cores is to make them more efficient, both in terms of power usage and the amount of die space they require. For sequential workloads like what we see with rendering, for example, where there’s very little core-to-core communication, the E-cores work well, and this is why Intel used SPECrate2017 to make their Skylake efficiency claim.Last edited: Dec 1, 2021Ashtrix, Clamibot, tps3443 and 1 other person like this. -
electrosoft Perpetualist Matrixist
There is such a wide variance in power consumption and heat with them that it is astounding. When you manage to get one or two that are literally pulling 100w+ less at the same settings, it boggles the mind, let alone ones that at stock settings not only pull fewer watts but, more importantly (for myself), run cooler and aren't bleeders. I've seen chips pull low and medium watts yet run hotter than chips pulling more. For potential laptop use that is a deal breaker.
tps3443 said: ↑Yeah it’s really great for gaming, and single application loads.
Just wait until I delid this thing. My last 11900K dropped 10-12% in power consumption while running it stock, after delid. I am assuming the same will apply to this better sample 11900K.
That should put it right at 275 watts at 5.4Ghz during an R15 run. Or only 130 watts while running stock.
Now I really understand why @electrosoft was chasing these really low power consumption numbers with these things.Click to expand...
The 9900ks I had in my P870TM pulled an ok amount of watts but watt for watt ran 10c+ cooler than the 9900k golden chip I had and that made it perfect for my laptop use.
I applied the same criteria this time around for the 11900K, but with no timetable for Prema on the X170KM, and it looking like 11th gen won't come to the X170SM anytime soon... ugh. I feel like I'll eventually upgrade to a 12th gen or a 3D chip and place this Cool Hand Luke on the shelf to stare at for an extended period of time, and it might not get used at all unless I pick up a KM and choke down the limitations.
With Alderlake desktop not likely appearing in laptop form, the 11th gen will be the end of the line for a true DTR notebook. :-(
What does your V/F curve in XTU show?Last edited: Dec 1, 2021 -
Papusan said: ↑This new 3090 Ti (3090 Super) looks more and more true. More mess from Nvidia... Geforce 497.09 graphics driver released. Supports the GeForce RTX 2060 12GB. Yep their old Turing cards now come re-hashed with more vram than 3080.
http://forum.notebookreview.com/threads/nvidia-geforce-driver-497-09-whql.837248/
------------------------------------------------------
@Ashtrix
Intel 12th-Gen Core Alder Lake Architectural Benchmark techspot.com
E-Cores Suck for Gaming
In short, the E-cores are a mistake for gaming, and if called upon they will reduce frame rates. So for gamers the 12900K and 12700K are 8-core/16-thread CPUs and nothing more. The E-cores might be able to help with background tasks, but frankly on the desktop they'd be better taken care of by two more P-cores.
See also... The reason Intel’s limited interconnect between the E-cores is to make them more efficient, both in terms of power usage and the amount of die space they require. For sequential workloads like what we see with rendering, for example, where there’s very little core-to-core communication, the E-cores work well and this is why Intel used SPECrate2017 to make their Skylake efficiency claim.Click to expand...
For people who don’t own a 3090, or a GPU at all, this is really a great option, if it’s not priced much higher. I bet the memory is good at overclocking; that’s all I am interested in lol, these new 2GB modules. So I guess the whole water-cooled backplate market will disappear into the abyss again.Papusan and electrosoft like this. -
Don’t expect this new milking SKU to be available at a normal price tag. It’s meant to please gamers who are angry that Nvidia prefers miners for the 3000 cards.tps3443 said: ↑For people who don’t own a 3090, or GPU at all This is really a great option, if it’s not priced higher. I bet the memory is good at overclocking. That’s all I am interested in lol, these new 2GB modules. So I guess the whole water cooled back plate market will disappear in to the abyss again.Click to expand...tps3443 likes this.
-
Which is odd; I have some, and my 5700 XTs smoke RTX 3xxx cards in terms of efficiency.Papusan said: ↑Don’t expect that this new milking SKU will be available with normal price tag. It’s meant to please gamers that is angry due Nvidia prefer miners for 3000 cards.Click to expand...
-
electrosoft Perpetualist Matrixist
I always do appreciate your "glass half full" approach to things.tps3443 said: ↑For people who don’t own a 3090, or GPU at all This is really a great option, if it’s not priced higher. I bet the memory is good at overclocking. That’s all I am interested in lol, these new 2GB modules. So I guess the whole water cooled back plate market will disappear in to the abyss again.Click to expand...
-
I think @Mr. Fox is gonna tell us the SP rating of that 10850K soon; I’m so curious. I imagine it would have to be somewhat reasonable. That CPU pulled sub-150 watts stock, and around 300 watts at 5.3Ghz. For a 10850K, that is pretty astounding, I think.
Mr. Fox, Clamibot, Papusan and 1 other person like this. -
electrosoft Perpetualist Matrixist
Never underestimate the power of proper cooling and a proper dialed in motherboard.tps3443 said: ↑I think @Mr. Fox is gonna tell us the SP rating of that 10850K soon. I’m so curious. I imagine it would have to be somewhat reasonable. That CPU pulled around SUB 150 watts stock. And around 300 watts at 5.3Ghz. For a 10850K that is pretty astounding I think.Click to expand...
Having used three different Z590 motherboards (well, two, with occasional access to the Asus) over the last 7 months, the differences are crazy. What crashed on the MSI at 199w while trying to dial in Vcore ran without a hitch on the Gigabyte, with the Gigabyte pulling 20w lower at stock. That same pull (~179w) on the Gigabyte dropped to ~160w on the Asus M13. Both the Gigabyte and Asus (I'll assume) also had/have room to spare to dial it in even more. Mix that up with such a wide variance in silicon quality and it is pretty astounding. While budget boards may handle the heat and the power draw, they are cheaper for a reason once you want granular control and to really start dialing in everything; you want a proper motherboard that takes itself out of the equation as a problem (EVGA and Asus have entered the chat...).
The first 11900k I used was a dog, pulling ~224w at stock CB23 on the MSI (same chip that hit 370w+ @ 5.3 all core). Plugging it into my boy's M13 just confirmed it: after flashback and microcode updating it spit back an SP63, but it did manage to run the same settings @ 206w. With my current chip, I could tell it was miles better before even taking it over to his place, and plugging it in, flashing back, updating the microcode, and having it spit back SP95 just confirmed it.
I know you don't have an Asus board, but at least with XTU we can see the first part of your V/F table.
Between the three boards (MSI Z590-A Pro, Gigabyte Z590 Aorus Pro AX, Asus Z590 M13 Hero), it was no coincidence each board in order ran cooler and tighter than the previous using the exact same CPU.
I hemmed and hawed over the Z590 Dark, tempted to crack it open, but I know the end game for me is to pop this 11900k into an X170. I like SP ratings, but I like testing before even knowing the rating to see if the results correlate with it, and then getting a chance to pop it in and see.
If I move to 12th gen, outside of benchmarking, I'll be turning off e-cores and overclocking the snot out of the P-cores and give them full access to that 30mb of cache.
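Disabling E-cores in the BIOS is the clean route; for quick tests on a hybrid setup, a similar effect can be approximated by pinning a process to the P-cores. A Linux-only sketch, assuming the common Alder Lake enumeration of P-core threads as logical CPUs 0-15 — that layout is an assumption, so verify your actual topology first:

```python
import os

# Assumed Alder Lake layout: 8 P-cores with HT enumerate as logical CPUs
# 0-15, with E-cores following (16-23 on a 12900K). Verify on your own
# system (e.g. lscpu) before relying on this mapping.
P_CORE_LOGICAL = set(range(16))

def pin_to_p_cores(pid=0):
    """Restrict a process (default: this one) to the assumed P-core CPUs.

    Linux-only: os.sched_setaffinity is not available on Windows/macOS.
    Returns the CPU set actually applied.
    """
    available = os.sched_getaffinity(pid)
    target = P_CORE_LOGICAL & available  # never request CPUs that don't exist
    if target:
        os.sched_setaffinity(pid, target)
    return target
```

On Windows the rough equivalent is Task Manager's affinity dialog; in normal use, the scheduler and Thread Director handle this placement automatically.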
Historically, I've had the best results using EVGA motherboards over the years, but YMMV. There is NO WAY I would have used this Gigabyte board except that it was part of the combo deal. -
New parts!!
2x16GB DDR4 3600CL14.
![[IMG]](images/storyImages/D53-F30-C4-824-D-4-C65-95-A5-5-CB870363479.jpg)
I like the Matte black finish.
Last edited: Dec 1, 2021electrosoft and Papusan like this. -
electrosoft Perpetualist Matrixist
Dom Platz!
-
DDR4 vs DDR5 with [email protected]: how it affects the CPU and Combined score https://www.3dmark.com/compare/sd/6279383/sd/6272669#
DDR5 https://hwbot.org/submission/485111...___sky_diver_geforce_rtx_2080_ti_99347_marks/
DDR4 https://hwbot.org/submission/486678...__sky_diver_geforce_rtx_2080_ti_100429_marks/
Spartan@HIDevolution, Ashtrix, Rage Set and 4 others like this. -
pathfindercod Notebook Virtuoso
DDR5/12th gen is Intel's version of M$ Vista.
Spartan@HIDevolution, Ashtrix, Mr. Fox and 4 others like this. -
electrosoft said: ↑Dom Platz!
Click to expand...
Yeah! They had a sale on these. $200 for a Corsair Dominator Platinum 32GB Samsung B-die kit was not too bad.
Corsair mailed them directly from Taiwan, and they made it to NC in 3 days flat. Not bad for free shipping.Spartan@HIDevolution, Mr. Fox, electrosoft and 1 other person like this. -
Seems as though dual rank is much harder to stabilize than single rank!! This is my first time experimenting with dual rank memory.
(The only dual rank memory I ran before did not work at all, which was due to it being Oloy brand.) -
electrosoft Perpetualist Matrixist
2x single rank is almost universally easier to overclock and stabilize vs dual rank and/or 4x, but that dual rank will pay dividends when benchmarking, and usually gaming too, at lower clocks than single... it's just getting to that point.tps3443 said: ↑Seems as though Dual rank is much harder to stabilize than single rank is!! This is my first time experimenting with dual rank memory.
(Only dual rank memory I ran before did not work at all) (Which was due to being Oloy brand)Click to expand...
Spartan@HIDevolution, Papusan and tps3443 like this. -
electrosoft said: ↑2x single rank almost always is universally easier to overclock and stabilize vs dual rank and/or 4x but that dual rank will pay dividends when benchmarking and usually gaming too at lower clocks than single....it's just getting to that point.
Click to expand...
Yeah, my IMC may not be all that fantastic after all; dual rank is totally different than single rank. I think I’m hitting a wall at around 3,800Mhz or so. The system will POST fine at much higher frequencies, but it just freezes at the Windows spinning circle, and the reset button on the motherboard doesn’t respond, so I have to flip the switch on the PSU.
I will say this though: dual rank is incredible!! I’m already breaking into 36ns at only 3733Mhz. Very impressive so far!
So far the CPU core is incredible, and the cache is phenomenal as well. The IMC is now beginning to show its weaknesses with dual rank DDR4, but I can live with that. Going to tune out the best performance I can.Spartan@HIDevolution, Clamibot and electrosoft like this. -
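As an aside, the ~36 ns figure above is an AIDA-style round-trip latency; only part of it is the CAS component, which falls straight out of the transfer rate and CAS latency. A small sketch, assuming the 3600CL14 kit holds CL14 at its 3733 MT/s overclock:

```python
# First-word (CAS-only) latency in ns from transfer rate and CAS latency:
#   t_CAS = CL / (MT/s / 2) * 1000  =  2000 * CL / (MT/s)
# This is only one component of the ~36 ns round-trip AIDA-style figure,
# which also includes controller, fabric, and row-access delays.

def cas_latency_ns(mt_s, cl):
    return 2000 * cl / mt_s

# Assuming the 3600CL14 kit holds CL14 at its 3733 MT/s overclock:
print(round(cas_latency_ns(3733, 14), 2))  # 7.5
print(round(cas_latency_ns(3600, 14), 2))  # 7.78
```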
Last edited: Dec 1, 2021Spartan@HIDevolution, Ashtrix, Rage Set and 2 others like this.
-
electrosoft Perpetualist Matrixist
It's hard to lock down a good bin that has both good clocks and a good IMC, but this looks like a killer frequency chip that runs much faster and cooler than your other 11900k while having a weaker IMC. Did you test your first 11900k with the dual rank kit for comparison?tps3443 said: ↑Yeah my IMC may not be all that fantastic after all, totally different than single rank. I think I’m hitting a wall at around 3,800Mhz or so. The system will post fine with much higher frequencies but it’s just freezing during the windows spinning circle. And the reset button on the motherboard doesn’t respond, I have to flip the switch on the PSU.
I will say this though, dual rank is incredible!! I’m already breaking in to 36NS at only 3733Mhz. Very impressive so far!
So far the CPU core is incredible, the cache is phenomenal as well. The IMC is now beginning to show its weaknesses with Dual Rank DDR4. But, I can live with that. Going to tune out the best performance I can.Click to expand...
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.
![[IMG]](images/storyImages/71-C6314-C-3-BAD-4-DCA-9739-3-C7-E3-D50-CFAD.jpg)
![[IMG]](images/storyImages/7-D6-FD189-F23-F-4-CB1-872-E-565-B3-B5-E75-F5.jpg)