You're trying to overclock your CPU. This is an overclocking thread. You're not hijacking anything.
Is the 10600K delidded? That is probably the only way you're going to extract more performance out of that chip.
That voltage is way too high. I wouldn't go over 1.45V, and that is reserved for benches. 1.3 to 1.35V is the highest I'd go for daily use.
-
-
Well, that is a lot better than this Ryzen 2600X I am using while the 5950X is being delidded. It is not bootable above 4.2GHz on all cores (literally makes a 00 Q-code for CPU failure) and it takes 1.352V for that wimpy wuss-boy clock speed. And, I can't get the memory higher than 3400MHz even with 1.550V. It's a real turd, and that is sad because it seems like a "golden turd" for that particular CPU. It's a really lousy SKU I guess.
What kind of Cinebench score do you see with it? It is 6 cores/12 threads, exactly like the 2600X.
Golden Turd: https://hwbot.org/submission/4837272_ ...pretty sad... but #1, LOL. About half the time it shuts off before Cinebench finishes.
I agree... 100%. Totally on-topic.
electrosoft, Ashtrix, Papusan and 2 others like this. -
Thanks for that reassurance. I just felt like I was starting to make an excessive amount of posts.
My 10600K is not delidded, but I'm going to be popping my 10900K into my X170SM-G soon for (hopefully) significantly better results. I didn't really plan on doing much to my 10600K. I was using it to take my new laptop for a test run. It just happened to come with my system, and it was cheaper to get the laptop with that CPU and the 10900K I bought on top of that than to spec the laptop out with a 10900K from the shop.
I'm going to apply liquid metal to my 10900K and install a full copper IHS over it. Every increase in cooling capabilities counts!
Golden turd, LOL!
I've been running only 4 cores to see the absolute highest speed I could achieve on the 10600K. With 4 cores running at 5 GHz, my CB15 score was 1070. I suppose I should run it again with all 6 enabled. -
-
Yeah, I had to use 1.45V for 4.3GHz on my 2700X and just left it that way since. I think I dialed it back down to 1.38V and 4.2GHz since it's not my main machine anymore.
-
Thanks man, I really appreciate it. It came pre-delidded, but unfortunately the person who sold it to me has a different definition of "delidded" than what it normally means.
The CPU arrived without a lid, with the solder still on the die. I acquired a Rockit Cool copper IHS so the 10900K would have an IHS (which thank goodness comes with quicksilver solder remover), and I also got some liquid metal. I'll be cleaning up the solder and then installing this CPU later this week.
electrosoft, Papusan and Mr. Fox like this. -
@Papusan Apple's new Silicon is here.
Apple M1 Pro and M1 Max: check out the graphs below.
Apple's new TSMC 5nm SoC is a 57Bn-transistor design. Yeah, massive. And very advanced vs TSMC 7N / Samsung 8nm (a.k.a. 10nm) / Intel 7 (a.k.a. 10nmSF).
Now the performance. This time they actually included comparisons, BUT note these are Apple's own relative-performance graphs against unnamed benchmarks.
i7-11800H, 8C/16T crippled BGA junk: PL2 at 70W, PL1 at 48W, with a 4.6GHz max boost clock. "Hey, it runs games at full FPS so it's good," says the usual YouTube crowd, just like GN, with everyone running an 11900K at stock PL1 and PL2 anyway. Apple claims they are faster than this CPU. Ignore the 4C processor, as it's 15W Cezanne APU trash. Horrible; I wonder what would happen if they compared with a 7700K.
For GPU performance, their laptop comparison was a castrated mobile RTX 3080. Checking NBC's numbers for that laptop, it maxes out around 28-31K Firestrike GPU score, so RTX 2080-class performance from a laptop SoC?
Will Anandtech claim this as the universe's most powerful laptop? Also, it's the world's first laptop with a ****ing notch. And their laptop has a 140W power brick.
Edit: Already started at Anandtech. They simply copy-pasted the comparison laptops and called it a day, with no mention of TSMC 5nm, no real x86 processor, no good laptop like the X170SM, and nothing on how efficiency drops when the M1X takes a full load. It's evident on every CPU on the planet, from iPhones to Snapdragons to Intel and AMD processors and Nvidia and AMD GPUs: more performance load = more power required. Basic physics, but that doesn't matter. Apple is the lord of their money flow, so let's shill to the max. After it was pointed out, they added the TSMC part to the die size and transistor count.
Last edited: Oct 18, 2021
electrosoft, Clamibot, Rage Set and 2 others like this. -
electrosoft Perpetualist Matrixist
-
The problems I can see with it at first blush, besides the origin of the suspicious "benchmark" graphs, are:
- It is made by/for crApple
- It is sold by crApple
- It runs on OS X
- It won't overclock (and they won't let it)
- It is in a thin/light MacBook chassis
- Based on all the above, it's gonna suck... real bad
Ashtrix, SierraFan07, Clamibot and 1 other person like this. -
electrosoft Perpetualist Matrixist
I thought the same thing. It looks like an MSI card that got splashed with bleach. I do not find it aesthetically pleasing at all.
My favorite cards for aesthetics are:
1. Nvidia 3090 Founders Editions - Still my favorite but that coil whine....oy
2. K|NGP|N 3090 - I still love the look of the Hybrid and the flip up screen
3. Asrock 6900XT OC Formula - Something about the curves, lines and colors floats my boat
4. Gigabyte Aorus 3090 LCD edition - Love the integrated LCD screen and overall look
5. Sapphire Toxic Extreme LC 6900XT - Reminds me of the Asrock OC but a different take -
electrosoft Perpetualist Matrixist
The very first time I had to delid a 9th gen, I used a plastic razor (never metal) to shave the solder away as thin as possible, then a touch of liquid metal, agitated to lift it off, and finally some Flitz to make it shine like a mirror.
That quicksilver solder will make it easy peasy to remove. -
Alright boys, just pulled the trigger on an old K610M so I can start benching in Win7 for better scores.
I have no clue why these things were made, as they seem to really suck, but I guess they found some use during their lifetime, and for me right now.
It still blows my mind that Nvidia has finally cut driver support for both Kepler and Win7. It feels like just a few months ago I was learning about tech, and how used Kepler cards were still good budget performers if you couldn't get your hands on a Pascal card during the 2017-2018 crypto craze, which was kind of the predecessor to the situation right now.
Clamibot likes this. -
It's just so they can throw something into the systems for volume enterprise sales.
Despite how these large corporations had at one point (legend has it) shown great tenacity to get where they are, many of these IT directors, sales managers, acquisition supervisors and project managers now know very little to nothing at all about the tech they are purchasing, because at the end of the day they are a manager with purchasing power.
For instance, I go through a great deal of Precision 5540/5550/5560, 5810/5820/7810/7820, and 3630/3640/3650 systems. Maybe one of those will be used for an MRI machine; the rest will go into storage until I pick them back up 4 years later for decommissioning. The full-tower systems all have no iGPU, so they throw in a base-model GPU just for display output.
Recently they ordered a dozen 7820s with monitors carrying a 10K price tag and rebranded 6700 XTs (for medical usage). They returned the monitors and towers and kept the 6700 XTs.
It's annoying when people think they are being smart when really they are telegraphing their intentions. Not that I care; it's their corporate account to begin with. -
Why do you need a K610M to bench Windows 7? I bench Windows 7 with an RTX 2080 Ti and a 3090 with no issues.
A little-known secret: older NVIDIA drivers almost always produce higher benchmark scores, despite the "newer is better" mantra. Unless you have some crappy game that needs a newer driver to fix its bugs, having the newest driver is also totally unnecessary for gaming. Sometimes drivers get crappier because the games they are released to support are crappy. A new driver might make the game(s) less buggy, but it doesn't make the OS run better; often the opposite is true.
Tenoroon likes this. -
Yeah, I remember it used to be the basic mode of operation: if everything is working, don't update. Adding a variable to a sound equation makes little sense.
Back to my earlier post: our IT director was complaining that management was getting on his case about the low sales from decommissioned systems. Me and my boss's boss went over to check it out, and lo and behold, a fully gutted EPYC server and 2 other file servers worth about 40 grand each. -
The 1070 doesn't get along with this computer's firmware very well. I'm lucky enough to have BIOS access, but if I turn Legacy Boot on, I can no longer boot into any drive, and the BIOS starts to get finicky. I've tried installing Win7 without Legacy Boot to no avail, so it's just easier for me to spend 20 bucks on something I can plop in whenever I want to benchmark stuff.
-
This "golden turd" will run 4.35GHz (43.5x) with 1.565V...
Max core temp was 69°C in wPrime 1024M, so it's not getting hot.
https://hwbot.org/submission/4837739_
https://hwbot.org/submission/4837741_
Ashtrix, 4W4K3, electrosoft and 7 others like this. -
It looks like the small wimpy fan on the GT 705 has failed (that little fan makes a difference for stability). Yep, the graphics card runs a bit hot.
I wouldn't run it like this if it was a 3090, LOL.
https://hwbot.org/submission/4838193_papusan_3dmark03_geforce_gt_705_15116_marks?recalculate=true
https://hwbot.org/submission/4838157_papusan_3dmark___night_raid_geforce_gt_705_1886_marks
https://hwbot.org/submission/4838158_papusan_3dmark___sky_diver_geforce_gt_705_1767_marks
https://hwbot.org/submission/483819...6p_geforce_gt_705_1787_marks?recalculate=true
Last edited: Oct 20, 2021
Ashtrix, Spartan@HIDevolution, electrosoft and 4 others like this. -
electrosoft Perpetualist Matrixist
Giving the Gigabyte 3070ti from BB a workout.
Zero coil whine (3 for 3 with Gigabyte cards, 2 for 2 with EVGA cards... all the rest I've tested had whine).
Using GeForce Experience optimized profiles, nothing is going to save ray tracing in WoW in many locations: 26fps in certain areas with RT enabled at Ultra settings. The same area with the 3090 is ~43fps. But with the GFE optimized settings and RT turned off, I'm getting 60fps everywhere with the 3070 Ti @ 4K Ultra. I'll fly out to Ardenweald later for the final test.
I had to reactivate my subscription to try out the settings since I haven't played in a while.
GFE also has a Diablo II: Resurrected profile I'll try later, but so far everything is looking up.
Ashtrix, Papusan, Rage Set and 1 other person like this. -
electrosoft Perpetualist Matrixist
Lol... a turd that is the best of the best amongst a pile of turds...
-
This is exactly what I did. That solder comes right off. Very soft.
Clamibot, Papusan and electrosoft like this.
-
Running Far Cry 6 maxed out at 2560x1440. Averaging around 150FPS, usually 138-158FPS.
This game runs well on an 11900K and 3090 KP.
The reviews of this game showed a massive CPU bottleneck, with the AMD cards running the show. Well, these numbers would say otherwise.
Clamibot, Papusan, Mr. Fox and 1 other person like this. -
electrosoft Perpetualist Matrixist
Stock Time Spy 3070 Ti run. I can't get over how quiet this card is. Even with ye ole paper towel tube, I can't hear anything. On the other cards I could at least pick up very light sounds if I was right up on it pinpointing certain areas, but this 3070 Ti makes zero noise.
I like how even the Time Spy CPU test couldn't push the 11900K past 56°C, but considering CB23 can't push it past 59°C... SP95 ice cold edition. I'm seriously hoping @Prema one day will add 11900K support to the X170SM-G.
-
A couple of ribbon strips fixed it. Strapped an old fan below the graphics card, so I'm back on track with this little beauty.
https://hwbot.org/submission/4838540_papusan_catzilla___720p_geforce_gt_705_1368_marks
https://hwbot.org/submission/4838541_papusan_catzilla___1080p_geforce_gt_705_752_marks
https://hwbot.org/submission/4838543_papusan_catzilla___1440p_geforce_gt_705_387_marks
-
electrosoft Perpetualist Matrixist
Retooled the 3070 Ti build as it has passed the WoW/D2R litmus test. Threw a light bar back in there for shiggles. Took the 420mm and tried to move it up top (since the specs say it will fit), but it really doesn't fit up top at all; you would at minimum have to remove the rear fan, so back to the front it went.
I'll be putting my KPE 3090 up for sale on fleabay in a few days. Prices there are climbing for some reason, to $3k and higher even for used models, and it doesn't get enough use from me to keep it. Plus, October is the end of my household fiscal yearly budget, where I lick my wounds, clear out a lot of stuff and prepare for the new year. I already have it back in its box so I won't be tempted to keep it.
-
More Gold from the old for @Prema Team
https://hwbot.org/submission/4838623_papusan_3dmark11___entry_geforce_gt_705_1302_marks
https://hwbot.org/submission/4838625_papusan_3dmark11___performance_geforce_gt_705_744_marks
https://hwbot.org/submission/4838624_papusan_3dmark11___extreme_geforce_gt_705_220_marks
Spartan@HIDevolution, Ashtrix, Rage Set and 2 others like this. -
electrosoft Perpetualist Matrixist
Looks like the 12900k, even on a 10nm fab, is still potentially pulling monstrous power:
https://videocardz.com/newz/intel-c...mance-cores-reportedly-consumes-330w-of-power
"The leaker claims that the Core i9-12900K with a clock speed of 5.2 GHz and voltage of 1.385V has a power consumption of 330W, which is 2.64x more than the ‘stock’ PL1 value of 125W. Unfortunately, no proof of this power use has been provided."
I know the 11900K can pull 330W too in the right conditions. The 12900K was clocked at 5.2GHz all-core. I've had a few 11900Ks pull 330W up to 370W doing 5.2GHz all-core, and that's on a 14nm backport.
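As a quick sanity check on the quoted figures (plain arithmetic only; the 330W / 125W / 1.385V numbers come from the leak above, and the current figure is just P = V * I at the core rail, which oversimplifies real VRM behavior):

```python
# Arithmetic check on the leaked 12900K numbers quoted above.
PL1 = 125.0          # "stock" PL1 in watts, per the article
leak_power = 330.0   # leaked all-core package power in watts
vcore = 1.385        # leaked core voltage

ratio = leak_power / PL1            # article says 2.64x
implied_amps = leak_power / vcore   # rough current implied by P = V * I

print(f"{ratio:.2f}x PL1, roughly {implied_amps:.0f} A at {vcore} V")
```

Call it roughly 240A; sustained current on that order is why board power delivery matters as much as the cooler at these clocks.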
I would have expected (hoped?) the 12900K would be a bit less thirsty (1.385V) and watt-crazy (330W) at those clocks. That is full-on 11th gen 11900K territory.
Ashtrix, Clamibot, Rage Set and 1 other person like this. -
Imagine this chip with 16 real cores. Of course Intel had to put in 8 baby cores with no HT and no AVX-512 instructions. I'm sure Intel will try to use its new hybrid chips as bait for all it's worth. Their future is still the OEMs and lapjokes/tablets and 2-in-1s.
And expect their Alder Lake mobile chips to run hot. They'll have to if they want to compete with AMD and 16 real mobile cores (16C/32T): https://videocardz.com/newz/amd-ryzen-7000-mobile-series-might-include-16-core-zen4-raphael-h-chips
DDR5 Will Probably Cost 50% to 60% More Than DDR4
Yep, the full package, if you want the new hybrid chips, will cost you an arm and a leg.
With the small cores they hope Apple will take the bait...
![[IMG]](images/storyImages/csm_Untitled5625_0532c9ad19.png)
Intel CEO praises the Apple M1 but hopes to win Mac business back with better processors
And then we have Microsoft. They also have the need for the small phone cores...
Microsoft could copy Apple with bespoke Surface laptop
https://hwbot.org/submission/4838635_papusan_3dmark_vantage___performance_geforce_gt_705_3148_marks
![[IMG]](images/storyImages/2579136.jpg)
https://hwbot.org/submission/4838636_papusan_3dmark_vantage___extreme_geforce_gt_705_1278_marks
![[IMG]](images/storyImages/2579137.jpg)
https://hwbot.org/submission/4838640_papusan_geekbench4___compute_geforce_gt_705_6431_points
Last edited: Oct 20, 2021
Spartan@HIDevolution, Ashtrix, Rage Set and 1 other person like this. -
I've never been one to care too much how much power it draws and I don't really understand the logic when I see it presented as a matter for concern. As long as it is a good silicon sample, overclocks well and runs like a bat out of hell, nothing else matters. In fact, I think it's kind of cool and awesome to see a chip pulling 2, 3, 4 or even 5 times its rated TDP running balls to the wall. More watts has historically meant better overclocking and higher benchmark scores, and I don't really think that has changed (yet). Kind of fits my motto to "do more with more" in mockery of the popular "do more with less" mindset that almost never works out well anywhere one might attempt to apply it.
It gives me great satisfaction to look at the meter where my power cord is plugged in and see 1kW+ in a 3DMark 11 or Time Spy run, or to look at HWiNFO64 sensors after an epic benchmark run and see that my 3090 peaked at ~800W. And I used to love seeing that my 7980XE peaked above 1kW, or that my 10900K/KF peaked out over 500W. Sadly, I haven't seen similarly insane power draw from the 5950X, and I do feel that is one of its Achilles' heels. I think it could "do more with more" but may not have more to give than what I am prying from it in a painfully tedious manner. Maybe it is a lot more fragile than an Intel CPU, and if it wouldn't survive the exercise I should be happy it can't, but it is disappointing in some ways to see it unable to run wild and free as long as you keep giving it the voltage it needs to get the job done and keep it cool enough.
One fact that I think professional reviewers, unofficial commenters and casual observers frequently forget is that Intel's TDP spec is for stock NON-TURBO base clocks, not a spec anyone should expect to see honored in an overclocked scenario. If you limit it to TDP instead of popping the cork, it's going to run like crap.
Last edited: Oct 20, 2021
SierraFan07, Ashtrix, Clamibot and 3 others like this. -
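For anyone wondering why the TDP-limited case runs so poorly: Intel ties the published TDP to base clocks through the PL1/PL2/Tau scheme, where the chip may draw up to PL2 while a moving average of package power stays under PL1. Here's a toy simulation of that idea (a simplified EWMA model, not Intel's actual firmware; the 125W/250W/56s values and the 10W idle seed are illustrative, not any specific SKU's datasheet):

```python
import math

# Illustrative limits: sustained (PL1), short-term (PL2), time constant (Tau).
PL1, PL2, TAU = 125.0, 250.0, 56.0

def simulate(demand_watts, dt=1.0):
    """Return the per-second power the package is allowed to draw."""
    avg = 10.0                        # EWMA of package power, seeded at an assumed idle
    alpha = 1 - math.exp(-dt / TAU)   # EWMA smoothing factor for step dt
    drawn = []
    for want in demand_watts:
        # Turbo up to PL2 is allowed only while the average is at or under PL1;
        # once the budget is spent, power is clamped back to PL1.
        cap = PL2 if avg <= PL1 else PL1
        p = min(want, cap)
        avg += alpha * (p - avg)
        drawn.append(p)
    return drawn

# A 5-minute all-core load asking for 240W: boost first, then settle to PL1.
trace = simulate([240.0] * 300)
boost_s = sum(1 for p in trace if p > PL1)
print(f"boosted above PL1 for {boost_s} s before settling")  # ~TAU * ln(2) ≈ 39 s
```

In this sketch a sustained load gets roughly TAU x ln(2) seconds of turbo and then pins to PL1 forever, which is why capping an overclocked chip at its TDP number throws away nearly all of the sustained clock speed.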
I seem to have burned this CPU in very, very well now. I am still greatly limited by water flow within my entire custom loop; I'm trying to push a single D5 on a large external rad.
Anyway, figured I'd share some tests. My 11900K is 100% bone stock with auto voltages. It's doing very well, besides the low water flow. I've been too lazy to install a second D5 pump.
R23 uses 194 watts. It was 186 watts last week (not sure what's different, though I did switch thermal paste).
R15 uses 185 watts max. (Auto voltages)
@electrosoft I noticed your 11900K dips below 5Ghz in the Timespy 3DMark GT1/GT2. For some reason mine isn’t doing this, not sure why.
11900K Stock/Auto voltages.
Thermal velocity boost= off
Intel ABT= off.
MCE= enabled/power limits off
DDR4@3867MHz CL14-14-14-34-266-1T
VCCIO@ 1.550V
VCCIO MEM@ 1.550V
EVGA 3090 Kingpin Hydro Copper (520 KP LN2 bios) default settings/ stock.
Last edited: Oct 21, 2021
Mr. Fox likes this. -
electrosoft Perpetualist Matrixist
Mine varies in CB23 too, ~173-183W. Before I updated the BIOS it was 165W, all on auto. Everything I do is on an Arctic LF II 420mm, no custom water over here, and it's an AIO known for so-so flow. I knew the chip was righteous when even the EVGA CLC 360 wasn't going max fans, just purring, before I switched. On the EVGA, a few of the previous 11900Ks had the fans ramping up and down between 25% and 50% on the desktop. Those were the same ones pulling 330W to 370W @ 5.2GHz, and on auto stock CB23 they were hitting 68-71°C with the EVGA CLC 360mm. Luckily you have a decent chip with pretty nice custom cooling.
I didn't even notice the dips, but you're right. I went back and looked, and I don't know why those dips were happening unless I had stuff running in the background or the 3070 Ti wasn't giving it enough to do continuously. Right after that earlier run, I ran the KPE 3090 at pure stock for a sale verification capture (it's a Hybrid, not a Hydro) for when I list it. I just pulled that capture up and those monstrous dips were gone. I reran it with the 3070 Ti and they're back. I then ran CB23 and it is right where it should be for all stock.
-
Yep, that is a fantastic chip you have. I ultimately decided to keep this one, especially with it being so close to 12th gen; I feel like my chances of actually getting a better one are pretty slim. So my current 11900K is going out Monday for a delid. Also, I am really surprised to see you selling that 3090 KP. -
electrosoft Perpetualist Matrixist
I'm just not using it. These entire last 7-14 days it did nothing (in the Gigabyte 3070 Ti's defense, it sat there sealed in the box for over two weeks). I don't cryptomine at all either, and there are zero games on the horizon I'm looking to get. I'm sure there is a bencher, miner or more hardcore gamer who will put it to better use than I would. It is an absolute beast of a card, no doubt, and gorgeous to look at and use, but still. I just opened up the 3070 Ti today and re-subscribed to WoW because my daughter is playing again for the first time in over 18 months. I plan on tinkering with the X170SM-G this weekend after the (extensive) HoneyDo list is complete.
With your setup, cooling won't be too much of a problem with 12th gen. Those initial power numbers really do feel like RKL. -
Are you sending it to Rockit Cool?
They received my 5950X on Thursday and I haven't heard from them since. I emailed earlier today asking if they could provide an ETA on completion and am waiting to hear back.
Interestingly enough, the AMD Ryzen delid service is no longer available on their web page. Accessing it shows a 404 Page Not Found error.
Ashtrix, Papusan and electrosoft like this. -
I was originally trying to hold out for Alder Lake and DDR5 RAM before building a new system. Now I'm really glad I didn't and got my Desktop Killer when I did. Sky high prices? No thanks. Even if the price increase had a linear relationship with the performance increase, there's only so much money an individual is willing to spend on just one part.
Now I get to enjoy my new laptop as the market is devolving into a crapstorm. Hopefully things have settled by the time I want to build another system.
I think this weekend is prime time for messing with that 10900K I keep telling myself I'm going to install in my laptop. I keep playing games on it instead.
-
electrosoft Perpetualist Matrixist
RUH ROH!
Hope your CPU isn't the one that broke the camel's back.
Papusan likes this. -
electrosoft Perpetualist Matrixist
The good thing is the 10900k is sitting there waiting for you when you're ready.
I still haven't dug into the X170SM-G I picked up from @Terreos. It is still sitting in the box on the table. I'll get a chance to dig in on what is now looking like Monday, after reviewing this weekend's task list (repair deck, repair door, repair light fixture, assemble two desks, set up a new desk and data drop for my daughter's work-at-home station, finish configuring her mom's new laptop, repair/tweak her laptop and set up the docking station, redo the entertainment center and set up AppleTV... take them to the boardwalk and dinner Sunday).
-
About the 11900K you wanted to put in the X170SM-G you acquired: is there any reason it wouldn't work? I thought Z490 motherboards all supported Comet Lake and Rocket Lake CPUs by default, or does the Prema-modded BIOS need an update to support an 11900K on the X170SM-G?
Spartan@HIDevolution and electrosoft like this.
-
electrosoft Perpetualist Matrixist
I know the Alienware Area-51m R2 won't work with 11th gen, as it clearly needs a BIOS update. I'm not sure about the X170SM-G, but I would suspect it needs one too.
Clevo and/or Prema probably won't release an update for marketing/seller reasons, since it could cut into X170KM-G sales. That seems to be the only logical explanation, as just about every Z490 motherboard maker has updated their BIOSes for 11th gen support. I don't know of anybody who has tried yet.
If I dig in there to try, I'll probably consult @Mr. Fox to make sure I don't shank the mods he designed to be sure. -
The hardware mods should not be affected. You may need to purchase additional thermal products to put everything back together if you remove the heat sink and find things have hardened, but nothing major. You can PM me if you want to know what to expect before you dive in. It's certainly nothing you can't handle with your eyes closed.
Man, I sure hope that nothing went wrong, LOL. But, I communicated with James Holbrook before deciding to take the leap of faith on the delid. He assured me they had it down to a foolproof process and even showed me a video of how it is done. He offered to send me the tool and let me do it. I decided it would be better to let him do it for me since it is his process and if he breaks it, he buys me a new one. If I break it, it's on my dime. I have seen enough complaints about crappy silicon quality on 5950X that I would not be enthusiastic about rolling the dice in the lottery. Even though my sample wasn't awesome, it seems like there are plenty out there that make mine look good.
The tool itself is simple, yet complex in that you need 3 hands to operate it. You have to use a heat gun and laser thermometer to heat the IHS to just the right temperature to melt the solder without burning up the CPU, and activate the tool to pop the lid while heating it. Since I only need one done (and probably won't buy another Ryzen CPU in the future), it just makes more sense to pay him to do it for me.
I am sure it will turn out nice, but he is slow to respond to emails. He's probably very busy and not checking email often because he has bigger fish to fry. Rockit Cool is a small part of an established machining and fabrication operation (if my understanding is correct).
If I don't hear something later today, tomorrow I may see if I can find a phone number under his main business name and give him a ring. I don't need it back yesterday with an order of fries, but I would like a status and a ballpark ETA on when I should get it back. It certainly sucks using the 2600X, but it's not going to kill me if I have to put up with it for another week.
Last edited: Oct 21, 2021
electrosoft, Ashtrix, Clamibot and 1 other person like this. -
Another ADL rumor category: the i9 max turbo clock leak from Videocardz saying the 12900K tops out at 5.0GHz is probably not 100% accurate; it might be for the 12700K. Notice the base clock is 3.6GHz, unlike RKL at 3.5GHz and CML at 3.7GHz. However, there's that 5.3GHz with TVB/ABT kinds of boosting on all these processors. Anyway, here's more fresh news on the OC capabilities and power consumption of the 12900K.
I think ADL might be exactly like RKL: too hot and too power-hungry to OC and tune for clocks beyond the 5.0GHz+ RKL could do. Also, the E cores probably won't be overclockable; the P and E cores probably don't share the same power plane, but a fixed set for the E cores and the P cores can be tuned. It will be really interesting to see the performance of this CPU with E cores disabled vs RKL and CML, and of course against Zen 3. Also notice that 233W is probably PL2, which is better than the 10900K/11900K's 250W on 14nm++.
Ultimately this makes sense of why Silicon Lottery quit. 10nm ESF is too hot at the OC ceiling and hard to bin vs 14nm, which had already shot through the roof given RKL's performance advantage over the 10900K and its clock speed and power scaling when OCed. And the 11900K is basically already a binned 11700K. Looks to me like Intel is hitting some wall.
This means the SPR-based HEDT won't be clocked super high, no doubt. Plus it has only P cores, with none of the E core drama, or as @Papusan calls them, baby cores and lapjokes, lmao... It will be mighty interesting to see both HEDTs duke it out. Here's AMD's new leak of Cloudripper / Chagall / Genesis Peak / Threadripper 5000, coming through at 4.3GHz clocks.
Apple is a done deal. I want to know what Pat Gelsinger is smoking, really. He says it's over for AMD and now wants Apple back on track. He is an engineer; he should know much better than anyone out there, especially as the CEO of Intel now.
Apple's M1 Pro and Max are monsters in transistor counts...
M1 Pro / Max - up to 57 Billion TSMC 5N, the most expensive silicon to manufacture for sure.
Nvidia RTX 2080Ti / TU102 - 18.6 Billion TSMC 12nm
Nvidia RTX 3090 / GA102 - 28 Billion Samsung 8N / 10nm
Intel i9 11900K - 6 Billion Intel 14nm++
AMD Zen 3 - 4.1 Billion TSMC 7N for each 8C chiplet, so a 5950X with two chiplets would be 8.2Bn (plus the I/O die).
For fun, add the Snapdragon SD888 - 11 Billion Samsung 5nm, and the Apple A15 - 15 Billion TSMC 5N. The A15 figure is without a modem, while the SD888 includes its 5G modem; Apple uses a separate Qualcomm X60 5G modem with the A15 in the iPhone 13.
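Plain arithmetic on the counts listed above, scaled against the 11900K die (figures as posted in this thread; the M1 Max entry uses the 57Bn quoted earlier, and nothing here accounts for process node or die area):

```python
# Transistor counts from the list above, in billions.
transistors = {
    "M1 Max":              57.0,  # TSMC 5N, per the 57Bn figure quoted earlier
    "RTX 3090 (GA102)":    28.0,  # Samsung 8N
    "RTX 2080 Ti (TU102)": 18.6,  # TSMC 12nm
    "A15":                 15.0,  # TSMC 5N, no modem on die
    "SD888":               11.0,  # Samsung 5nm, modem included
    "11900K":               6.0,  # Intel 14nm++
    "Zen 3 8C chiplet":     4.1,  # per CCD
}

baseline = transistors["11900K"]
for name, count in sorted(transistors.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} {count:5.1f}B  ({count / baseline:.1f}x an 11900K)")
```

The M1 Max lands at nearly ten 11900K dies' worth of transistors, which is the scale gap the post is pointing at.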
So look at that M1 Pro / Max: it's in a whole other league completely, and that's not counting the unified 64GB LPDDR5 memory. And Apple will come back? He is dreaming hard. Apple got first dibs on 5N at TSMC; no one else put that much cash into the fab, and they will tout the next-gen M2 on TSMC 3nm next year and push more efficiency figures. That will obviously only hold in the Apple utopia where everything else is inferior, since our x86 parts scale with power and cores, and the same goes for GPU design, while the Apple junk has specialized workloads with direct silicon-to-software optimization.
Last edited: Oct 21, 2021
Rage Set, Papusan, electrosoft and 2 others like this. -
No doubt my Apple MacBook Air M1 16GB/1TB is impressive in that it can actually run games smoothly at 30fps on battery with no fans (this is pretty cool). However, I like power and high fps, and an 11900K + 3090 Kingpin destroys that device.
The X170 should be interesting.
Selling the 3090 Kingpin makes sense if you're not using it much. However, the stock performance of the 3070 Ti doesn't seem right; my 2080 Ti FE was managing around 24-25% higher graphics scores in Time Spy.
What is the power limit like on your 3070 Ti?
Clamibot likes this.
Yes, Rockit Cool. I purchased the service about a week ago, and I wasn't planning to mail it off until next week. My screen says 404 and the service is gone as well.
I do hear Rockit Cool is horrible at responding to customer inquiries and messages. They also closed down the website and took a vacation not too long ago. They had some announcement saying "We need a break," "we're taking a vacation," "closing the site for a week."
^ Something like that. They may just be backed up a little. -
You mean scores from this post?
http://forum.notebookreview.com/thr...rs-welcome-too.810490/page-1250#post-11124594
2080 Ti FE review https://www.guru3d.com/articles_pages/geforce_rtx_2080_ti_founders_review,33.html
With a GPU score 25% above stock, you should see around 17,000 in graphics based on the Guru3D review. This would put a stock 2080 Ti within the top 50 on the bot with a proper CPU overclock.
Last edited: Oct 21, 2021
Rage Set likes this. -
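Spelling out the arithmetic behind that claim (no new data; just working backwards from the 17,000 target and the ~25% uplift figure in the posts above):

```python
# The claim: a stock-review graphics score plus ~25% should land near 17,000.
# Working backwards gives the stock baseline that claim implies.
target = 17000
uplift = 0.25

implied_stock = target / (1 + uplift)
print(f"implied stock graphics score: ~{implied_stock:.0f}")
```

That is, the 25%-uplift claim is consistent with a stock review score in the mid-13,000s.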
Yeah, I got a 17,600 Time Spy graphics score with my 2080 Ti FE.
I broke 17,003 on the FE air cooler though.
@Mr. Fox was doing around 17,800, maybe even 18,000. I could not beat his water chiller, lol.
The 2080 Ti easily had 25-35% more performance potential (they were heavily watered down).
This was the reason why a 3090 KP was the only upgrade path.
Mr. Fox likes this. -
-
-
I am quite curious to see how far Apple will let that M1 Max SoC stretch its legs. The graphs show 30W, but the cooling suggests a higher wattage.
We can't compare the Max SoC to anything else because you're right, there isn't anything else out there like it. A vertically integrated solution designed for specific workloads will always beat a "general purpose" x86 processor in those specific tasks; think ASIC vs FPGA.
There is no doubt in my mind that x86 is dying, and I am of the opinion, good riddance. Apple is years ahead of many of these chip designers: AMD, Nvidia and Intel. However, Intel is one of the few (Samsung is another, if they can start producing better SoCs) that, if they gather themselves, can compete and start moving in the direction of ARM/RISC-V. -
OK, I gotcha. Yeah, I only consider the potential of the hardware; stock performance of most hardware isn't the best.
Especially if you go look at any 11900K review, LOL! This tells the story pretty well.
Papusan likes this.
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.
![[IMG]](images/storyImages/57-BC7-ED5-DDD7-4-C22-9-BB3-C07-B628-FA130.png)
![[IMG]](images/storyImages/90-A03-C19-0367-46-FB-B0-AA-CDEB8-EFA8640.jpg)
![[IMG]](images/storyImages/5-C307-D9-E-5143-47-DE-B460-80710-D9-A944-E.png)
![[IMG]](images/storyImages/0919780-C-8-D42-4-F1-C-BA90-D33000-C8793-E.jpg)