Intel Core i9-12900K "Alder Lake" Beats Ryzen Threadripper 2990WX at Cinebench R23 nT techpowerup.com | Today, 11:11
An alleged Intel Core i9-12900K "Alder Lake-S" sample is shown beating the 32-core AMD Ryzen Threadripper 2990WX HEDT processor at AMD's favorite benchmark, Cinebench R23, in its multi-threaded (nT) test. At this point it's not known whether the i9-12900K is overclocked, but the CPU-Z instance in the screenshot reads 5.30 GHz, which could very well be the processor's stock Thermal Velocity Boost frequency. The sample scored upward of 30000 points, putting it above the Threadripper 2990WX reference score in Cinebench.
The 2990WX is based on the "Zen+" microarchitecture and was released in 2018, but as a 32-core/64-thread chip it should have ripped through this rendering workload. The i9-12900K, on the other hand, has eight "Golden Cove" performance cores with HyperThreading, in addition to eight "Gracemont" efficiency cores that lack HTT. This benchmark was run on Windows 10, which lacks awareness of the Intel Thread Director, a hardware component that optimizes utilization of the two kinds of CPU cores. Windows 11 is known to feature better awareness of hybrid core architectures. The i9-12900K sample is possibly installed on a Gigabyte Z690 AORUS Ultra motherboard, and has 32 GB of DDR5-5200 memory (two modules, logically four 40-bit channels).
---------------------------------------------------------------
NVIDIA GeForce RTX 3080 SUPER, 3070 SUPER and 3060 SUPER get rumored specifications videocardz.de | 22nd Sep 2021
3090 Super comes with 2.5% more cores. Maybe faster VRAM as well. What a nice upgrade. Why bother making it?
-
Wow, let me find $2000+ USD for that GPU micro-upgrade right now, LOL.
-
Here is another joke. Nvidia seems to have had a chance to grab a few more GDDR6X chips from Micron. I wonder how much further they'll be able to increase the MSRP with this change.
And will this mean Nvidia will finally offer GDDR6X chips for their high-end mobile Super cards? -
I'm the one who wants that SATA storage haha. No NAS here, so I'm going to stick it all in the main computer (right now using a UASP USB 3.0 dock to power my WD Reds). I will try to build a small homelab once I have enough technical knowledge to do so. I genuinely prefer SATA SSDs to NVMe because of too much heat on the latter, and because the high-capacity, high-endurance drives are only available in SATA. MLC tech is going away; the last of its kind is the Samsung 860 Pro, and sadly they cancelled the 8TB variant of it, so 4TB is the max. For future products we will get QLC, PLC and other garbage technology shoved in. Also, the performance boost is relegated to a few workloads only. For games it's even worse: there's no improvement except a few seconds of loading time, very minimal. That new RTX IO / DirectStorage still has to be shown in action to prove its worthiness. I hate how Intel abandoned Optane completely, which meant one more company leaving that fantastic memory technology. Micron's 3D XPoint is also gone.
The 870 QVO QLC SATA SSD is great on paper, but once the SLC cache fills up and all those tricks run out, the drive will choke hard and drop to mechanical HDD territory. Ultimately that keeps mechanical drives more relevant for cold storage, which is where my WD Red CMR drives come in, at 5400 and 7200 RPM.
Intel finally caught up to AMD's SMT performance with phone cores? A year after the Ryzen 5000 debut, I'm impressed to see the phone trash cores beating a HEDT chip, but I'm not really liking the stupid phone-core direction on desktop (yeah, they are blown out on power budget, plus the bigger pie is those BGA trash dumpster silicon), as they are keeping the real 14C tiles for Sapphire Rapids Xeon and HEDT silicon.
I want Intel to prove out their lineup. RKL had an IPC improvement, as shown in single-thread Cinebench, but it failed to translate to real-life workloads and lost to the extra physical cores of the 10900K due to DRAM gearing and inter-core latencies. Now I really want to see if this translates to real workloads AND DRAM scaling of DDR5 (a pure beta product at this point in time), and also how it fares vs the Zen 3D refresh on AM4.
I think Fall 2021 is going to be mighty interesting: ADL, DDR5, PCIe 5.0, Zen 3 Threadrippers, Zen 3D refresh for EPYC-X Milan and Ryzen AM4, and Ngreedia's pointless refresh to milk the last drop out of Ampere and Samsung 8N and ride the crypto market... heck, the 3090 cannot justify its 2x cost over the 3080 for a mediocre 6-13% max improvement.
NVIDIA Ada Lovelace AD102 with up to 2.2 GHz clock, 384-bit interface including GDDR6X and over 80 TFLOPs in 5nm?
As we know, Nvidia changed their FP32 compute calculation with Ampere, so the quoted TFLOPs don't translate into a 1:1 performance comparison anymore. Still, looking at that, it feels like an insane jump in performance AND power consumption: 400-500W by default if we look at this massive chip, and factor in AMD's RDNA3 going MCM on 5nm+6nm at 500W+ as well. It'd be great for those who are going to wait for 2022 Q4, which is the estimated launch window for both GPU brands. I'm ignoring Intel; I do not trust them at all, and the latest rumor for Xe shows their target is entry level to midrange, which is why Ngreedia is refreshing the RTX 2060 lmao. Plus the crypto market has to bust, or else the same thing will keep on repeating, a.k.a. that Ethereum proof-of-stake change which has been in discussions for more than 3-4 years at this point.
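For context, the headline TFLOP figure in rumors like this is just shader count × clock × 2 (one fused multiply-add per shader per cycle). A minimal sketch, assuming the 18432 FP32 shaders that were rumored for AD102 at the time (that shader count is an assumption, not stated in the post above):

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # Peak FP32 = shaders * clock * 2 FLOPs (one fused multiply-add per cycle),
    # with the result converted from GFLOPs to TFLOPs.
    return shaders * clock_ghz * 2 / 1000

# Rumored AD102 configuration (hypothetical figures): 18432 FP32 shaders
# at the 2.2 GHz clock mentioned in the headline.
print(f"{fp32_tflops(18432, 2.2):.1f} TFLOPs")  # -> 81.1 TFLOPs
```

Since Ampere, that shader total counts both FP32 datapaths per SM, which is exactly why paper TFLOPs roughly doubled without game performance following 1:1.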
Unfortunately for me, none of these cards will run on Windows 7 and AAA gaming is even more depressing than ever due to political BS. One thing makes me happy is Consoles are going to be left in complete dust and smoked to oblivion. -
This is the Patriot 4000 B-die with my custom overclocking profiles. The RAM that I ordered should be delivered sometime today. We shall see if it is a keeper or an RMA.
Prepared by Thaiphoon Burner Super Blaster -
G6X is actually cheaper than regular G6 right now, because of shortages and because AMD only uses G6 and the 3060 uses a lot of G6. That's why Nvidia made the 3070 Ti: so they didn't lose as much money on each card
-
I find it funny that the 3060 Super is going to be more powerful than the 3060 Ti. The 3060S is just a small step below the 3070.
Lesson learned from Turing and Ampere: wait for the Super refresh of future GPU generations to buy a new GPU. You get a significantly better value that way. -
Then you'll have a whole new generation of graphics cards 12 months later
Not all second-gen cards give you a huge increase in performance. Skipping, e.g., the first-gen Ampere 3090 for a +2.5% performance gain with the 3090 Super (if real) is quite stupid if you ask me
You'll lose 12 months of fun (if you were able to find cards at release date).
Yep, it depends what you go for. Waiting one year for the 3070 Super will give you a huge gain, LOOL
-
Nvidia's product line still makes little sense to me :/
Anyone mess with any of those X99 Chinese e-waste boards? Was thinking about what to do with my 1620 v3s. Been looking into them today to see if they support bifurcation, but nothing solid yet. -
electrosoft Perpetualist Matrixist
That's much better, and based on your 1.257 V and 203 W your chip is now running like the Best Buy one I picked up (1.260 V and 204 W w/ CB23). It required 1.525 V in the BIOS, pulled ~1.539 V under load, hit ~370 W doing Cinebench R23, and thermal throttled pretty quickly on my 360mm AIO. Start climbing in .25 intervals and see what temps and pull (W and V) you get at each interval.
5 months ago puts that around late April, so that is pretty early, and there have been subsequent BIOS releases from many of the top-end manufacturers to tune up 11th gen support.
Start at 3600 Gear 1 with those XMP timings, make sure SA/IO is @ 1.4. It could come down to the motherboard and Z490 playing nicely with 11th gen. Did you try 1x, 2x, 4x DIMM combos too? If you swap back in your brother's 10850k and everything is hunky dory but the 11th is all over the map then you kind of have your answer and you could reach out to EVGA. I don't know if anyone on the EVGA forums has used the Z490 Dark with the 11th gen series. You could check there too. If the Z490 Dark just won't play completely nice with the 11th gen, a 10900k may be in your future. The EVGA Z490 Dark wouldn't be the first board to have quirks and issues with 11th gen. The one thing you do know is that it is a monster board with 10th gen.
This is G.Skill 4x8GB B-die, Gear 2 (shudder), XMP stock settings at 4000, on the 11900K:
-
Get an EVGA Mouse With 16,000 DPI for Under $30 tomshardware.com | Today
EVGA accessories have been lighting up our deals coverage lately. But the company hasn't stopped at huge gaming keyboard discounts, as its ergonomic gaming mouse is on the chopping block too.
At Newegg, the EVGA X17 gaming mouse is now just $29.99 after a huge $50 price cut — that’s a 63% discount! -
electrosoft Perpetualist Matrixist
Anyone who says 11th gen has been "smooth sailing" is either lucky, runs completely stock, or has some rose-tinted shades firmly planted on their face. I've enjoyed trawling inside the BIOS as always with a new chip, just as I completely enjoyed the 5800X. Something new and different... warts included.
@tps3443 you might have reached a point of either going back to a 10900k or investing in another motherboard if you truly want to let the 11th gen shine and go with a Z590.
Thanks for the heads up! I ordered three of them (the most you could order) as backups. We really like the X17s. -
Yeah, I played around with this BIOS on the 10850K and I could get some really good memory performance out of it. However, the 10850K was only overclocking and applying the frequency to 8 of 10 cores. After investigating, BIOS 2.01 is forcing core 9 and core 10 to run at 800 MHz, as it doesn't see them properly. I found a workaround and managed to get a decent OC on all 10 cores with the 10850K, and on the memory. But this BIOS is just way unstable on Comet Lake. It'll just power off or restart in games randomly.
With the 11900K, the 2.01 BIOS doesn't have these issues; it is stable in games and works OK. But I have no memory bandwidth and terrible memory latency haha. The motherboard also screams and squeals at me when I move the mouse lol.
I can’t get Gear 1 to stick at all, even up to 1.525 VSA / 1.450 VCCIO. I have the exact same memory as you. [email protected] (4x8GB) I only run two DIMMs, so this memory is capable of some pretty ridiculous frequencies.
All of these problems point to Z490 Dark. I am tempted to just grab the Z590 Dark.
Anyone have any good discount codes for EVGA? The Z590 Dark is in stock! -
electrosoft Perpetualist Matrixist
You can always use my associates code (or someone else's) to get a discount: JL6DSXQIKJ0DVZP
I'm not sure how much it is, but it's something.
You could also check your local Best Buy or Micro Center to snag a board like an Asus Z590, so you get an SP rating on it while getting it working today. -
Not sure if you know or not, but for 11th gen memory, VCCIO2 is the one we should change; VCCIO should be left at default.
-
I’m probably gonna go through the Amazon ordering process to try and land a good 11900K, and I will just buy/return the OEM trays until I find a good one. This one may very well be OK though. This motherboard and BIOS just aren't right for getting any real read on the IMC or the core overclocking.
I can’t find much on any other people running this setup either. -
Less than a month and we will hear from Intel about Alder Lake. You can't get a CPU that works on your current MB?
-
I heard the news!!! (NVIDIA TAKE MY MONEY)!!!
I went ahead and panic sold my 3090 Kingpin Hydro copper.
Just kidding around. Yeah that’s pretty silly of Nvidia. -
My RTX 3090 will beat the dog out of that new RTX 3090 Super. There is nothing super about it at all. (Maybe it'll be super shiny.) That's what Nvidia did with last-gen Super models anyways. (They made them super shiny.)
Usually a fresh-gen graphics card right at launch is the way to go. You get about 24-30 months before a new architecture is available. And they only get better and faster as they near the end of their life cycle with driver and other optimizations. -
Yeah, I suppose you're right. What am I doing? The 12900K looks great.
I still have a 10850K here. So far the 11900K works at least. I just wish EVGA would have actually tested this BIOS on the Z490 Dark.
They assumed no one who owned a Z490 Dark KP would actually buy into Rocket Lake, and they were right lol. Besides the one single person who did. (Me)
You would be much better off going back to 10900K and sticking with the Z490 Dark. Less expensive, better performance and fewer headaches... not to mention 2 more cores, 4 more threads. Just a better product no matter how you slice it.
-
I agree. I am going to build out a rig with my 11900K and sell it. No point in keeping it.
-
When I first ran this CPU I tested some single thread, IPC, etc., and I was impressed. After really digging in and checking it out well, I am struggling.
I love the way the CPU acts, and I like the IPC. I especially like how well it can manage the 3090 Kingpin with a full OC on it in games; I am almost always CPU limited at 2560x1440 while gaming. If you take a stock-boosting 10850K and a stock-boosting 11900K, there is over a 20% IPC jump right out of the gate. This really translates in games. This is how I didn't even realize my memory was running like a turd the first night.
However, it is not nearly at its full potential. It’s a very capable chip.
And my thinking is: is the risk worth it? Is the additional time worth it? The risk and additional time being grabbing another motherboard with the correct chipset and giving it another chance, tearing everything down and rebuilding, possibly just to be let down.
@Mr. Fox
I originally wanted to just grab another 10900K. I wanted a good one though. It just didn’t work out.
Also, what is this I keep hearing about Newegg not returning CPUs? Do they only exchange? -
The return policy varies by product. Some things they accept returns for, and other things they will only exchange if they are defective. Stop and think about how many people are looking for nicely binned CPUs. What is there to stop a person from ordering several of them, keeping the best one of the bunch, and turning the rest back in for a refund? The seller is screwed if they allow it. They are not defective, just not the best of the bunch. They're brand new, but they can't be sold as new because the box has been opened and the CPU has been installed in a motherboard. And if they charge a restocking fee and make the buyer pay for the return shipping, then everybody calls them the bad guy for not absorbing the loss and pretending it never happened.
-
If they keep that up it's going to bite them in the butt, if it hasn't already. Fool me once, shame on you; fool me again, shame on me. If they keep screwing their customers like that, those customers are going to stop buying the first release and wait for the "$uper" version that the original should have been.
Heck, for all we know they're crippling the first version slightly for no reason other than having a repackaged version to sell for more money later. Or even if it's not more money, just a tiny bit better because they didn't cripple it to have something extra to sell. They already have plenty of evidence to show them that most of their customers are truly stupid. They're taking advantage of that stupidity.
It's an extremely dishonest approach to business. Sometimes one of the downsides to being the best is you might become arrogant, self-righteous, dishonest, and stop caring about the people whose money pays the bills.
What good is it if a man gains the world if he loses his soul? -
electrosoft Perpetualist Matrixist
-
With Turing they played the cards well, basically shifting the whole silicon ranking of the stack. The 2070 got TU106; that was the sign of how badly Nvidia was shafting consumers. With the SUPER refresh they moved the SKUs back to normal: the 2070S got TU104, and the 2080S got a fully enabled TU104. With AMD's lack of competition, that's what happened. But the game changed with Ampere. Ngreedia was literally forced to go all out, at least more than they wanted. They even had plans for a 3080 with 20GB of G6, not G6X; I saw an article about those cards ending up in the hands of a few folks in Russia for mining. They do not have any drivers, and the silicon doesn't have any support for games lol.
Now the inevitable greed strikes again. Part of the reason is probably Intel Xe entering the entry+midrange mix; that's the only reason they plan to bring Turing back. But they know this dried-up market is going to sell any GPU no matter what. And looking at AMD AIB cards, damn, they are insanely expensive, more than Nvidia cards, which I don't get, because Ampere is the mining beast. I bet they hate the RTX 3080 so much: despite having a ton of SMs deactivated on the GA102 silicon, its price-to-performance is literally the 1080 Ti of this generation and will last for years. So refresh the whole stack and remove it, the FE specifically, since FE cards except the Ti don't have LHR. But LHR or not, miners and demand will soak up the market, plus they cleverly capped mining at 50%, not a total 0%. Bonus: Nvidia got a cheap deal on wafer costs at Samsung 8N.
Ultimately, vs Turing, this Ngreedia company doesn't just make AI-leadership and GFX compute beasts; their thinking is next level too: a SUPER refresh in a drought market, low wafer costs, the LHR 50% cap + CMP HX cards. Triple profits here. Add dumpster silicon to the Nintendo Switch OLED milking refresh lmao, that was horrendous, ugh.
The real shame is how locked down their vBIOS is on all that powerful silicon. Unfortunate. -
ohhhh ok. Well at least I can exchange it.
Thanks for sharing.
Also, is it possible that I have an abnormally bad IMC? I'm thinking that's probably not the case. I managed to get Gear 1 working. However, I can only manage 3100 MHz.
So I am literally running 3100 MHz Gear 1 at CL10-11-11-30-240-1T-16 tFAW. Any more frequency than that, and it'll just throw code 55. I have 1.400 V on both VSA and VCCIO
The cache does 46-47 range though. Not sure if this is actually stable. But it runs Rust for 3-4 hours. -
Tinkering with the new RAM. It has the same Samsung B-die IC part number (which is excellent) as the Patriot Viper Blackout and the 2*16GB G.Skill Ripjaws V dual rank sticks.
-
Very nice pair of sticks bro. I wonder if I should get something similar or stay with the 4800 8GB TridentZs I have.
-
When the big boy fails vs its little brother
Yet another overpriced GPU in limited supply. But nice to see Nvidia screwed up its own GPU line-up, LOOL
Asus ROG Strix LC GeForce RTX 3080 Ti Review: The Fastest Card We've Ever Tested
Extreme performance and lots of RGB
![[IMG]](images/storyImages/YSaCRPPqpfhuhYz2HV2A2j-1920-80.png)
![[IMG]](images/storyImages/ZyAC4JYk8zaJABgBct2bp7-1920-80.png)
-
It is difficult to say. Even though all three of my memory sets have an identical Samsung B-die IC they do not overclock the same (yes, binning variance sucks). The Patriot kit is actually better than either of the G.SKILL kits, but I RMA'd two identical Patriot kits because they were rubbish. The dual rank 2*16GB G.SKILL kit also sucks compared to the 2*8GB G.SKILL and Patriot modules. I don't know if it is due to being dual rank, higher capacity, or both. I suspect it has both things working against it. The 32GB kit won't run 4200 with 1.600V and XMP default CAS on the X570, but it ran as high as 4500 CL17 on the Z490 Dark.
I used the 32GB kit in the physics run below.
https://www.3dmark.com/3dm11/14615935 | https://hwbot.org/submission/4822915_
The 32GB kit requires at least 6 for tCKE and 2T. It will not boot with 1T or tCKE 1. Again, I am guessing because it is 2*16GB and dual rank. Perhaps more than the 5950X memory controller can cope with.
Question: does your Gigabyte BIOS have adjustable tREFI anywhere? I've turned the Crosshair VIII BIOS inside out and it appears there is no access to change tREFI. I have always run with that maxed out on Intel systems.
Well I’m glad you managed to get some low latency and good bandwidth. Now I am in your situation.
My G.Skill sticks continue to impress me, though, running CL9-11-11 @ 3066 MHz. After the situation with my last $370 memory purchase, I put new memory purchases away entirely. I did return those bogus OLOy sticks to Newegg, so at least I will get a refund.
I’m contemplating a Z590 board. I can see a faint light at the end of the tunnel with a well-tuned 11900K. At least my platform works so I can run it and decide on what to do for a bit. -
The more expensive card beats the cheaper card. This is normal right? -
The big boy should still beat its smaller sibling
And the desktop cards aren't castrated unevenly the way the mobile graphics cards are (some 3070s heavily beat 3080s due to castrated TGP).
-
The 3090 FE is heavily castrated as is the 3080Ti FE.
I could just barely match a 3080Ti FE with my 2080Ti in tests like Time Spy Extreme, because my 2080Ti could pull 550+ watts while the 3080Ti's 335 watt power limit would make it collapse and its boost fall.
The box stock 3090 FE only has 350 watts. Not to mention, it has always been common for AIB Ti models to beat Titan cards in the past.
The 3080Ti is spec’d very similarly to the 3090. They aren’t that far off. So you can’t compare them with one on water with a high TDP availability, and the other burning up with its 100% 350 watt TDP. We all know which direction that’s gonna go.
I mean, a better comparison would be my 3090 vs that 3080Ti lol. My card probably has a cheaper or similar MSRP, and both are water cooled.
But even then, the 3080Ti is fast, and there's nothing anyone can do about it. It just has half the memory. When all else is equal (power, cooling, etc.) and you run them head to head, there isn't really a gap at all.
The 3080Ti FE is only slower because it has a 335 W TDP vs the 3090 FE's 350 W. So when you remove Nvidia's power politics, the 3080Ti is gonna outperform the 3090. -
pathfindercod Notebook Virtuoso
-
Well, "low latency" and "good bandwidth" are relative terms when talking about Ryzen and Intel 11th Gen. Coming from amazing Intel products to the current flagship processors, I will use the terms "better" or "improved" relating to the very tedious and time consuming progress I have made. Compared to X299 and Z490, the latency is poor and the bandwidth is OK, but not great.
I am giving up on the 32GB dual rank G.SKILL kit. I burned a ton of calories and those sticks just do not play nice with the 5950X. They're OK with just a little bit of tuning, but have minimal overclocking and timing-tweaking potential compared to what the single rank 8GB G.SKILL and Patriot sticks have. The dual rank performs better at the same clocks and timings, but gets left in the dust due to the limited tweaking potential. The single rank 8GB sticks end up outshining the performance of the more expensive and higher capacity dual rank modules for that reason. On the Z490 and X299 systems the dual rank is better for some reason.
Speaking of progress on the 8GB sticks, I made a little bit of headway. I think this is the functional max unless I run loose timings and send the latency off the charts (65ns+ with high clocks and looser timings). I programmed it to a custom XMP profile so that I don't have to remember all of the settings. Seems like 4333 CL16 is the sweet spot unless I can find something else to tweak.
4000 CL 15
4333 CL16 and IF@1900
4333 CL16 and IF@2000... It really blows that setting IF above 1900 causes WHEA errors. Performance and latency dramatically improve at the expense of errors and random glitches.
4400 CL16
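A quick cross-check on why 4333 CL16 can be a sweet spot: absolute CAS latency in nanoseconds is CL × 2000 / transfer rate, so 4333 CL16 is actually tighter in real time than 4000 CL15. A rough sketch over the profiles listed above:

```python
def cas_latency_ns(transfer_rate_mtps: int, cl: int) -> float:
    # DDR clocks at half the transfer rate, so one memory clock = 2000 / rate ns;
    # multiply by CAS to get the absolute first-word latency.
    return cl * 2000 / transfer_rate_mtps

for rate, cl in [(4000, 15), (4333, 16), (4400, 16)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
# DDR4-4000 CL15: 7.50 ns
# DDR4-4333 CL16: 7.39 ns
# DDR4-4400 CL16: 7.27 ns
```

This only captures CAS, not the full timing set or bandwidth, but it shows why a higher clock with one extra CAS cycle can still be a net win.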
-
-
Remember, the 3090 has more cores + runs with faster GDDR6X VRAM. The 3080 Ti could utilize 25W more power. Still a failure that the so-called weaker card beat the best Nvidia can offer. That's just sad.
This gives some flashbacks and reminds me too much of all the different castrated high-end Ampere mobile cards.
Edit.
In stock. Without the Norwegian tax, aka at U.S. prices, the card would cost around $1900
https://www.komplett.no/product/118...s_Cards&utm_content=nVidia&utm_term=in_stock#
PS. A lot of Ampere graphics cards for sale here at home. I expect more inventory in other places of the world too, and this means prices have to drop sooner rather than later. -
I’m not following your point here. The RTX3080Ti and RTX3090 are nearly identical in performance. Power is the primary factor for how fast one will perform, even the 3080 non Ti is close to a 3090.
So saying an AIB 3080Ti with higher power limits, and better cooling is faster than a 3090 is pretty obvious is all I am saying.
I can go flash a Galax HOF 3080Ti BIOS on my 3090 Kingpin and wouldn't see the difference at all lol. I could probably still break 15,300ish to 15,500ish in Port Royal.
The 3090 only offers double the Vram as a beneficial feature. However, it’s still cheaper than that Strix LC 3080Ti which makes the Strix a bad deal in my eyes.
The 3080Ti was proclaimed as the worst value ever at $1,199. I always thought it was a pretty good deal. It’s a 3090.
Also, 25 watts is substantial. Not to mention it has half the memory modules to feed, so that 25 watts separates them a lot. The 3080 only has 15 watts less TDP than the 3080Ti, and only 15 watts separates the 3080Ti FE and the 3090 FE. -
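Taking the TDP numbers quoted in this thread at face value (they are the poster's figures, not verified spec-sheet values), the FE-to-FE gaps are small in percentage terms. A quick sketch:

```python
# TDPs as quoted in the posts above (unverified forum figures), in watts
tdp = {"3080 FE": 320, "3080 Ti FE": 335, "3090 FE": 350}

def headroom_pct(lower: str, higher: str) -> float:
    # Extra power budget of the higher-TDP card, relative to the lower one
    return (tdp[higher] - tdp[lower]) / tdp[lower] * 100

print(f"{headroom_pct('3080 Ti FE', '3090 FE'):.1f}%")  # 4.5%
print(f"{headroom_pct('3080 FE', '3080 Ti FE'):.1f}%")  # 4.7%
```

A ~4-5% power-limit gap rarely buys more than a couple of percent in clocks, which lines up with the "nearly identical in performance" point above.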
Ngreedia's absurd BS design choices are at play. The FE cooler is a massive improvement in build quality, ease of use and user servicing vs Turing, BUT insanely castrated in terms of backplate VRAM cooling, especially for the big boy 3090 GA102. I was shocked, like WTF, to see aftermarket thermal-pad mods reduce temperatures from 100C to 70-80C, a 20C+ drop from basic modding and a repaste. Nvidia treated VRAM cooling on the FE as an afterthought on that high-density PCB design; the whole PCB is literally just 20% of the card volume. That's real insanity, Nvidia saving pennies on these high-priced GPUs. I mean, they know Samsung 8N is inferior to TSMC 7N, but they also know supply problems mean Ampere would end up like RDNA2 cards in the DIY market space. So they pull these scummy moves on consumers because of the damn demand.
But it's really inexcusable to make users modify brand-new cards like the $700 3080 / $1200 3080 Ti / $1500 3090 out of the box with a thermal pad mod. They just let the cards run at high temps because G6X is fine at that mark, and if we do not do the mod, the core and memory will downclock and throttle the card heavily. I bet TPU didn't do a thermal pad mod on these cards for its stock performance scaling; with the mod they would definitely perform better, not just in games and benches but also in mining. Also note that the Strix is insanely overpriced vs the FE, with LHR castration, reduced VRAM capacity, and overall reduced SMs for the $500 price boost, but the ASUS Strix here got a nice vBIOS and more breathing room than that power-castrated and cooling-starved FE card.
Tired of Nvidia's antics. I feel robbed and hate to support them, really... but what other option did I have? -
25W higher peak power but fewer cores and slower VRAM shouldn't make for very big differences (PS. the 3080 Ti Strix LC was tested at stock VRAM speed). The performance scaling vs the slightly increased power just isn't there. The only ways the 3090 will shine are with ray tracing, and if the games you play need more than 12GB. You can of course OC the 3090 for benching, for more points on the bot (you need better cooling than stock = more $$$ out of your wallet), but that's not the same as gaming. The only real flaw with the 3080 Ti is the capped-down VRAM; 12GB is still too little. And all Ampere cards are overpriced.
Edit. The 3090 Founders Edition costs $240 more here at home than the tested Asus card, and it still performs worse; at best, with OC, it may come to the same level in games as tested by Tom's Hardware. Yep, the 24GB of VRAM comes at a nice premium but will be more future-proof. As I said several times, it's sad the big boy failed against its smaller brother. Not the way hardware should perform.
Ok, I went ahead and purchased an OEM tray 10900K on Amazon.
-
Well comparing AIB vs an FE probably isn’t the best example here. Because that AIB is overclocked.
I’ve got about 19% on that 3090 FE. So let’s compare AIB vs AIB lol. -
That card is terrible value. More expensive than my faster 3090 KPE. $2000 is what a 3090 KPE costs; no amount of overclocking will let it beat a 3090 KPE, and it has half the VRAM.
-
Yep, but you won't get 3090 KPE or 3090 cards for $2000. Maybe a few EVGA Elite members in the U.S. can. We don't even have a chance to get it here. Plus most of the cards, 3090 or whatever Ampere cards, don't follow MSRP. All 3000-series cards are very bad value nowadays. As I posted, I'd have to pay $240 more for less performance here at home.
-
That’s how it works. You’re lucky to have gotten a 3090KP though. And so am I.
if it wasn’t for @Rage Set , I wouldn’t have one. And the KP block too for that matter.
I’m actually curious how many exist in the wild.
I’m certainly hanging on to this one for dear life. Unfortunately, it’s the only PC component I have that hasn’t let me down lol. -
Smart move. Even if 11th gen didn't have so many undesirable issues, the uptick in IPC is too small to compensate for the extra cores/threads in a CPU-heavy workload. Even if the tray CPU is average in terms of ASIC quality and overclocking potential, it is going to be a better option. Your previous point about IPC in single and low core count scenarios has merit, but that gets overshadowed by the list of disadvantages.
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.
