I have, I guess, some form of insomnia. I take almost 750mg of diphenhydramine with a light snack on an otherwise empty stomach to sleep at night, so nighttime coffee is a no-go, but I don't mind it in the morning or afternoon after getting some water in me first. I probably should get a sleep study done, but my early exposure to anything in healthcare taught me one thing: don't get healthcare. Too expensive.
The second time I was hit by a car was literally on the block the hospital was on, and it still would've cost me $10,000 according to the drivers. So I walked, until my sister nearly finished me off for walking to the hospital, and she took me to see my uncle, who is still an RN in the emergency wing of one of the hospitals in the area.
-
Sorry to double post, but I have an inquiry.
I am trying to expand my knowledge pool in the IT space, as I would like to use it to build out my resume and/or start a business. I need to repurpose the Precision 7810 I am using now into a Proxmox virtualization box, since I want to start playing with clusters. I have the following:
6x HP T630s, each with:
AMD quad-core 2.2GHz
16GB DDR4
128GB M.2
1GbE
Various other parts as well; I have a huge supply of X99 parts, just no motherboard. I just need something that can natively handle 4 GPUs (with risers) and/or supports PCIe bifurcation, officially or unofficially.
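Not sure if this helps with the cluster side, but here is a minimal sketch of what joining the six T630s into one Proxmox cluster looks like. `pvecm create` and `pvecm add` are the real Proxmox VE cluster commands; the cluster name and IP address below are made-up placeholders.

```python
def cluster_commands(cluster_name, first_node_ip, n_extra_nodes):
    """Build the shell commands that form a Proxmox VE cluster:
    `pvecm create` runs once on the first node, then `pvecm add`
    runs on each joining node, pointed at the first node."""
    create = f"pvecm create {cluster_name}"   # run on node 1 only
    join = f"pvecm add {first_node_ip}"       # run on each of the other nodes
    return [create] + [join] * n_extra_nodes

# Hypothetical lab: one T630 creates the cluster, the other five join it.
for cmd in cluster_commands("t630-lab", "192.168.1.10", 5):
    print(cmd)
```

The commands themselves still have to be run on the right box (the joins on the joining nodes, not the first one); this just shows the shape of the workflow.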
I'd much appreciate any direction on this in advance. -
Yep, Nvidia really cares about the gamers.
Remember, gamers helped Nvidia become what they are today. See the final conclusion from the editors at the bottom.
Put simply, the 12GB RTX 2060 was never intended to be sold to gamers, instead Nvidia is looking to capitalize on the current situation and increase supply to miners. The longer this goes on, the less important gamers will become to GPU makers as their intended market.
More VRAM, But for Who? Nvidia RTX 2060 12GB Review techspot.com
As of writing, it’s possible to purchase a 6GB RTX 2060 from retailers like Newegg for $620, which is nearly an 80% premium over the $350 they sold for back in 2019. We could only find a single 12GB model on offer, at $830. That's a 34% premium over existing 6GB cards, and it’s a similar story in the other markets we checked.
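The review's percentages are easy to sanity-check; a quick sketch using the article's own prices:

```python
def premium_pct(current_price, reference_price):
    """Markup of the current price over a reference price, in percent."""
    return (current_price / reference_price - 1) * 100

print(round(premium_pct(620, 350)))  # 6GB card vs. 2019 MSRP: 77, i.e. "nearly 80%"
print(round(premium_pct(830, 620)))  # 12GB model vs. current 6GB street price: 34
```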
What We Learned
There you have it... the super unexciting 12GB version of the GeForce RTX 2060. Of course, had this graphics card managed to come in at a lower price point with strong supply we’d be super excited about its release, but that was never going to be the case. This product was always intended for miners.
If you appreciated this depressing review, spread the word and share this feature, so more gamers can learn why they should avoid the 12GB 2060.
------------------------------------------------------
Asus Admits Fiery Z690 Motherboard Flaw, Starts Recall Program tomshardware.com
Here's Asus' statement:
"To our valued ASUS Customers, ASUS is committed to producing the highest quality products and we take every incident report from our valued customers very seriously. We have recently received incident reports regarding the ROG Maximus Z690 Hero motherboard. In our ongoing investigation, we have preliminarily identified a potential reversed memory capacitor issue in the production process from one of the production lines that may cause debug error code 53, no post, or motherboard component damage. The issue potentially affects units manufactured in 2021 with the part number 90MB18E0-MVAAY0 and serial number starting with MA, MB, or MC." -
I was thinking of going with the 3970X (since it appears TRX40 is only going to get one CPU generation) and I noticed the prices are crazy. Even used 3960Xs are going for $1400+. I wonder if I should sell while everything is still priced so high, or keep it.
-
My 11900K at 5.5GHz all cores pulls a maximum of 278 watts in Battlefield 2042 after several hours of playing. It is amazing how smooth this game is. This power consumption doesn’t seem bad to me at all.
11900K Max power= 278 watts
11900K Max temp= 59C-65C
11900K Avg temp= 48C
I run this CPU set to 5.5Ghz at 1.500V daily. I hope it continues running this well.
-
Hard to believe the 12900KF is only $589 and readily available literally everywhere. I wonder how AMD is gonna compete when the next-gen AMD CPU lineup launches. Intel seems to be capable of keeping inventory flooded with 10th, 11th, and 12th Gen processors. I am really curious how it’s gonna go, and if AMD will be able to maintain inventory.
-
If you consider how difficult it was to get Ryzen 5000 series up until the past few months, AMD has a lot of work to do. It is a good and bad thing. AMD procs are going to sell, especially the XT versions that are incoming. They are going to be hard to find. Bad news for anyone that wants to upgrade one more time on AM4. The good news is they don't have to worry about overpaying for DDR5 or Z690 mobos.
Remember, AM5 is coming next year, so Intel is going to have to deal with that. Just adding additional E-cores won't be enough. -
This is exactly what I was thinking. I think it’s gonna be rough for the next AMD launch. At least we have options now though. When AMD 5000 series launched, it really did take a minute for people to get one.
I will say this: it was refreshing to see that Intel 12th Gen was unable to be scalped and sold for outrageous prices. -
This is my 11900K bone stock; memory is at 4000 Gear 1 (CL14-15-30-296).
Max power usage in BF2042 is (122 watts) This is pretty darn efficient!!
It’s boosting all over the place 4.8-5.3Ghz on all cores.
And only 122 watts. It averages about 100 watts. This is with no power limits at all.
I remember my 5GHz 8086K (6/12) in my laptop absolutely blowing past 140 watts easily in Battlefield V.
So, I guess Intel has come a long way with the old 14nm process.
-
I found this very interesting.
So, I know a lot of people (myself included) questioned the testing and practices behind Silicon Lottery's binned processors. But maybe they were actually legitimate, lol. I mean, they were charging like $950 for top-tier CPUs.
Now, I found this 10900K, which is a Silicon Lottery 5.1GHz bin. Coincidentally it is also an SP101. (And this is not even their best-binned 10900K.) This SP101 would fall into roughly the top 20% category.
I know the guy is charging way too much money (it's merely for demonstration). I just thought it was interesting and wanted to share.
https://www.ebay.com/itm/224767340409?epid=18038163465&hash=item34552d6f79:g:AC4AAOSw7Udhzosp -
electrosoft Perpetualist Matrixist
I'm not quite sure how he thinks he's going to get basically what he paid for it (minus fees and all that good stuff) for a two generations old chip. If I was going to pay $900, I would have just bought that seller's SP116 before.
If you look at his sales receipt, he paid an extra $50 for an SP100+ bin on top of the price which means Silicon Lottery didn't bin by SP but if you want to pay extra for a high SP chip that also fits their binning process, have at it.
At a certain point I'm sure there is a correlation between higher SP chips on average obviously binning higher and meeting their criteria but I'm sure some SP90s and high SP80s may have met their criteria too for certain bins.
-
What?!! OMG. I’d hope they’d throw in an SP100+ for $50 extra (that’s like a $1,000 10900K now with tax). A good chip is a good chip in my eyes, now that I’m understanding more about it. My 11900K is nearly identical to Sugi0Luv’s SP89 11900K. You know he also had a really magical SP104 11900K? It was a true monster, better than both his SP89 and even my 11900K. However, the IMC on it was a dud, which explains why he kept the SP89 in its place. I messaged him about a week ago requesting some HWiNFO runs on his SP89 11900K, and the resemblance to my 11900K is interesting.
People are insane for how much they ask for these processors though just for SP100+. There are some good deals available here and there. But not often.
I will admit that (SP116) 10900K was a straight animal though. -
electrosoft Perpetualist Matrixist
I think I mentioned Sugi's 11900k some messages back about having a really good sample but having an ok SP which makes 11th (and now 12th to a degree) so frustrating as you want to get the trifecta. 2 of the 4 12900k's I tested wouldn't do 4000 G1 (they happen to be the two garbage chips in the lot SP81 and SP89). 3 of the (now 6) 11900k's I tested couldn't break 3733.
I'm going to retest my 11900k and put it back in the Asus motherboard this week to tweak the IMC versus the gigabyte which topped out at 3933 but that was with 4x8. I never tested 2x8 in any of the configurations except the 11700k which stalled at 3733 on the MSI Z590-A Pro.
And yeah, that SP116 10900K was a monster, but not an $850 monster. -
Looks like this is the highest 2080 Ti score on their leaderboard for Vulkan @ 1440p.
https://gravitymark.tellusim.com/report/?id=73102844db1f3027c8a8ea400159ea274e3509e0
-
electrosoft Perpetualist Matrixist
I ended up picking up that SL SP 101. He sent me a V/F shot and that lower range is lights out SP103+ 4.8 - 5.1 which means it is going to be a monster chip in my X170SM-G. It starts to falter at 5.2 which is why you can see SL binned it only to 5.1 using their criteria but it will definitely bench much higher and is much better than the binned silicon in there now. I'm glad the holidays put a delay in using the BartX IHS as I would just be removing it and relidding the original lid for resale or reuse elsewhere.
I also figured I might as well, since I'll be returning the X170KM-G. The keyboard has the lighting issue (discussed over in the SM forums); I didn't realize it until I turned the lighting off last night to check the panel's video quality, and 3-4 keys won't go dark no matter what I do in CC. They'll turn to any color, but they're stuck on light green and won't turn off, whether I turn them off manually or the keyboard lighting times out.
More importantly, the Gen4 slot is faulty. I wondered why it was delivered with the Gen4 1TB drive in the Gen3 slot, but I didn't swap it over till yesterday, when I started to really go over everything with a fine-tooth comb (see the panel check above) since the holidays are just about over and I could get back to tech stuff: fine tuning, checking, and working with it.
Now I know why: when I switched it to the proper slot, hardware glitches and problems came out of every crack. Even popping a WD SN850 500GB Gen4 drive into the slot produced the same problems. As soon as it tries to install chipset drivers, it locks up, freezes, shuts down, or reboots. The Gen3 slots work fine.
I won't even get into the four different lengths of mismatched screws used to attach the bottom assembly....
-
You bought the one on eBay? Or from the phone number of that guy I gave you? -
A lot of people gladly pay $2,500+ for a 3090 and $2,000+ for a 10GB 3080, so why not old, overpriced binned CPUs? At least you get some value from the better-binned CPUs.
-
electrosoft Perpetualist Matrixist
The eBay one, the Silicon Lottery SP101. We worked out a very fair price. -
Have you ever noticed Rocket Lake's mandatory AVX-512 offsets?
It happens in Y-Cruncher 2.5B and the Aida FPU stress test.
I can set 5.4GHz in the BIOS with 1.425V and 0 AVX2 / 0 AVX-512 offsets;
however, if I run the Aida CPU/FPU stress test, it forces clocks down from 5.4GHz to 5.2GHz.
Now, if I set 5.4GHz with a -1 AVX-512 offset, the Aida CPU/FPU stress test forces 5.3GHz and runs through just fine.
The only way around this is to run 5.5GHz with a -1 AVX-512 offset, which will then apply 5.4GHz. But I'm having to compensate for 5.5 with more voltage just to get a truly stable 5.4GHz frequency in anything AVX2/AVX-512.
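To put numbers on it, here is a toy model of the behavior described above. This is purely my reading of the observed clocks, not anything Intel documents: with a 0 offset the board silently drops 2 bins under AVX-512 load, while an explicit negative offset is applied as-is.

```python
BIN_MHZ = 100  # one multiplier bin on Rocket Lake

def avx512_clock_mhz(set_mhz, offset_bins):
    """Effective clock under AVX-512 load, per the observed behavior:
    a 0 offset still incurs a hidden 2-bin drop; a nonzero negative
    offset (given here as positive bins) is applied instead."""
    if offset_bins == 0:
        return set_mhz - 2 * BIN_MHZ     # the "mandatory" hidden offset
    return set_mhz - offset_bins * BIN_MHZ

print(avx512_clock_mhz(5400, 0))  # 5200 -> the 5.2GHz Aida forces
print(avx512_clock_mhz(5400, 1))  # 5300 -> 5.3GHz with the -1 offset
print(avx512_clock_mhz(5500, 1))  # 5400 -> the workaround for a true 5.4GHz
```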
Did you ever notice this? -
Maybe they are attempting to offset a lack of QC with a little bit of variety. -
electrosoft Perpetualist Matrixist
Hmmmm, I have not but I'll take a looksy this coming week.
Yeah, I'm not sure who re-certified the unit, but they missed a few things. I'm very OCD about things like matching screws and parts. Whenever I've received systems that are missing or have mismatched screws, I keep 3-4 sets of 10-15 different screws (M2 to M3 usually, and 2 to 12 length) to swap in matching sets if I can't find the exact match in my collection. It may also have been a previous 10th gen spec and they swapped in an 11th gen CPU without adjusting the storage properly, so they didn't know the Gen4 port was flaky? Maybe they never turned off the auto RGB on the keyboard to notice it had some defective LEDs?
Trading Places is a fantastic movie! -
Intel doesn't like that their consumer chips offer AVX-512 features. And they're afraid that you'll fry your chips.
There you have it.
But that's not all, because the remaining AVX2 instruction set is already strictly limited by Intel to a multiplier of x51 at the launch. While in applications without AVX2 such as Cinebench the set clock rate of 5.2 GHz, for example, is continuously available and benchmark high scores can be achieved, the picture looks different in applications such as LinpackXtreme. Even if you set the CPU clock statically in the BIOS, remove all performance limits and ensure sufficient cooling, the CPU throttles itself down to 5.1 GHz when executing AVX2 instructions. It doesn't matter how warm the CPU is, how much power it consumes or what the application is called. In HWInfo, the intervention of the clock limitation can be recognized by "IA: Max Turbo Limit - Yes".
Why Intel is limiting the clock rate for AVX2 has not yet been confirmed. One possible reason could be fear of degradation due to electromigration if the current is too high - a phenomenon in which the CPU slowly damages itself over time. To be fair, it has to be said that due to the generally very high heat density of the new CPUs, only very few users will have cooling good enough to ever run such high clock rates with AVX2. But whoever uses a custom water cooling loop, for example, or even delids the CPU in order to gain the absolute maximum performance from the new CPUs, will sooner or later run into this artificial limit on the otherwise "unlocked" CPUs. Incidentally, this limit was already in place with Rocket Lake CPUs and was probably just not really discussed.
Intel completely deactivates AVX-512 on Alder Lake - questionable interpretation of “efficiency” | News / Editorial igorslab.de
Intel is now completely deactivating "AVX-512" on all Alder Lake CPUs with an upcoming microcode update in new BIOS versions. Mainboard manufacturers were able to make the supposedly disabled instruction set available at launch, which resulted in a significant increase in performance for the P cores of the new CPUs. Now Intel is tightening the noose completely and, according to our sources, has instructed the motherboard manufacturers to completely deactivate the "unsupported" feature.
Just in time for the launch of the new smaller CPU SKUs and mainboard chipsets at CES next week, all previous Z690 mainboards should completely block the AVX-512 instruction set via a BIOS update. So far, one can only speculate about the motives. But it stands to reason that one would like to artificially create a purchase argument for future workstation and server products. Because enterprise applications in particular often benefit most from the acceleration provided by the AVX-512 command set.
If Intel has its way, actually fully capable "consumer" hardware will no longer have a say. To protect our sources, we don't want to name them.
Yep, Intel still hasn't learned anything from their market share loss to AMD. Greed will always beat common sense.
Fixes for the fused-in/implemented... "intended" AVX2 throttling: Fortunately, there are already remedies for both of the AVX hurdles discussed, the throttling of AVX2 and the removal of AVX-512. For example, Asus has implemented a patch in its BIOS versions for mainboards of the "Maximus" series that switches off the AVX2 throttling. The only important thing here is that the clock must already be set in the BIOS at boot time. A subsequent change via in-OS software would otherwise remain stuck in Intel's safety net.
More exotic methods are required to continue using the AVX-512, but nothing impossible. Community members have already managed to inject an older microcode version into new BIOS releases and thus effectively offer a modified BIOS image with AVX-512 support. Of course, there is always a certain risk associated with such unofficial BIOS versions, as an error in the image could, for example, damage the hardware. The use of such BIOS images is therefore always at your own risk! But at least this also shows that the deactivation of AVX-512 is reversible and that write protection has not been set on the microcode version - at least at this point in time.
Yep, they treat you this way so you "feel you don't own" your own hardware. It's not only laptop OEMs that enjoy screwing their own customers with disgusting solutions/features. Greed and stupidity go hand in hand.
-
Wow, just wow. That's an extremely scummy move.
The hardware manufacturer has no right to restrict users on what they can do with the hardware they paid for. For whatever capabilities and features the hardware has, the consumer should be able to use all of them.
It'd be a different story if Alder Lake simply lacked the AVX-512 instruction set altogether, but the hardware is there. The consumer should be able to use it. -
"Unlocked/unlimited" features have lost some of their meaning. Isn't it nice that nearly 1/6 of each core is useless, going by what Intel says? Aka wasted space on the silicon they could have used for more or bigger cores. The 12th gen die is about 35% smaller than the 11th gen... They could easily offer a few more cores if they wanted, with the better arch and an increased die size like the previous gen's chips. They still hold back performance as they did with the 14nm+++++++ chips (they all do; Intel, AMD, and Nvidia are equal here). But they have to offer slightly better performance with coming chips, so why offer it a year before they have to? Just look at the coming 3090 Ti or the mobile 3080 Ti.
In our tests we could already prove that AVX-512 on the Golden Cove P cores is indeed more efficient than AVX2 and even allows more computing power with less power consumption. The only prerequisite for AVX-512 is of course the deactivation of the Gracemont E cores, which simply physically lack the transistors for this instruction set. But we’ve also seen in our tests that the e-cores only provide performance gains in very few individual cases anyway, if not the outright opposite by slowing down the cache/ring and delaying memory accesses. Does the “E” then really still stand for “Efficiency” and not rather “Error” or “E-Waste”? Wouldn’t a CPU with only P-cores and AVX-512 be the far more economic and ecological approach?
Final thoughts
Whether Alder Lake CPUs are more efficient in the standard configuration or only with P cores for AVX-512 depends very much on the use case. And that the hybrid approach with different cores is still relatively new and immature is also no secret. In time, many improvements can be expected, especially through software updates from Intel and Microsoft. But originally the trigger for today’s article was the literally incredibly low power consumption of the P-cores with AVX-512 and the question whether the efficiency is actually higher. And indeed, with the feature enabled, the efficiency of the P-cores is significantly higher in all benchmarks than without. In fact, the results are so clear that the instruction set can and should always be safely activated – if possible, of course.
You can't have your cake and eat it. -
I love this 11900K! Literally an amazing chip. This is what the 11900K should have always been. Intel should have done a better job of making sure the large price premium over the 11700K granted something "SUBSTANTIAL".
What a shame all 11900K’s couldn’t have been like this.
![[IMG]](images/storyImages/81-DB2159-32-EA-4168-855-B-EF33-C0-ABAB2-B.jpg)
I will say this though: my worst 11900K is head and shoulders above any 11700K. It's pretty sad to see the HWiNFO readouts of most 11700Ks consuming around 1.450-1.500V for 4.9GHz-5.0GHz. -
This is why Intel never advertised AVX-512 in the ADL press. Anandtech's Ian clearly highlighted that aspect and mentioned the decision was up to Intel to give a final say. ASUS did it anyway, thanks to deep contacts, for obvious reasons; no wonder the SP rating on ASUS is a niche thing (MSI has started it now). Remember Papu also showed how ADL is posting top-rank HWBot scores with Windows 7? Similar thing: close-contact manufacturers get full access to all the possible BIOSes and microcodes that normal plebeians cannot get.
Now Intel has axed it. Be it segmentation, electromigration CPU degradation, or anything else.
The unfortunate part is that for this new BS E-core garbage they are removing really useful stuff: adding complexity with the dual ring bus and the cache split between E-cores and P-cores, more E-cores in the future on LGA1700, AVX-512 axed, AVX2 restrictions... simply because BGA sells a lot and makes more money than DIY.
Look at this news...AMD is also no saint.
AMD Ryzen 9 6980HX next-gen “6nm Rembrandt” mobile processor pictured
Rembrandt is on 6nm. It's basically Zen 3 refreshed on TSMC 6nm for the craptop Jokebook BGA dumpster market. All the while there's no EPYC 7003 Milan on TSMC 6nm, nor a Threadripper 5000 refresh at all. The new Milan-X 3D V-Cache beast is also on TSMC N7 only, not even N7+ EUV. We don't yet know whether the new Ryzen 6000 / Zen 3D V-Cache processors will be on 6nm or not. Plus don't forget the damn IOD: it has had issues all over the place since the Ryzen 5000 launch, and it's not changing to TSMC N7 at all; it will stay on 12nm GloFo. I'm still waiting to see how the 3D V-Cache behaves with USB / PCIe and DRAM.
BGA gets all the love in this age.
That said...
Happy New Year 2022 to all !!
-
Happy New Year to you too brother.
BGA gets the attention (not love) because the zombie sheeple drink the Kool-Aid and stand still, quietly, while their wool is being shorn. Those of us that refuse to submit are portrayed as racists, domestic terrorists and rabble-rouser protagonists regardless of gender, creed or ethnic origin because their deceptive rhetoric requires no basis in fact or truth. It is because they say so, and God knows a bazillion zombie sheeple can't be wrong. Herd impunity (not a pun) for all who drink the Kool-Aid. -
Yep, and the OEMs find all ways to push the BGA junk. Just add in "world's most powerful" and everything will sell.
Asus confirms a debut for its ROG Flow Z series of gaming 2-in-1s at CES 2022
The sad truth.
Tablets, laptops & PCs sales forecast 2025 | Statista -
-
My daughter studies at the university. She has been using a Jokebook because she needs all-day battery life. The junk had the modern USB-C connector and not a real power input connector. Of course it ended in disaster before this Christmas, exactly as I told her. The weak, flimsy modern power connector broke, and she bought a new BGA disaster for her school work, but with a real barrel connector this time. But I'm sure this junk won't stay alive long enough either for her to finish university. Today's tech is pure junk, especially Jokebooks, regardless of price point. I'm happy I don't have to use or buy this type of tech. Thank God I don't have to be a part of this modern tech future. Sometimes it's nice to be older and well past the halfway point of life on this earth.
-
Amen!
Sent from my LE2123 using Tapatalk -
What disgusting tech. Everything is flimsy and weak. The hardware, or the whole Jokebook, is flimsy, and you have to baby it because it's so weak. But it won't help. It's a ticking bomb that will blow to hell any day. No thanks. Let the kids have it. I'm too old for this.
The nearest I will come is a smartphone.
-
Without getting into my personal opinions about whether human-induced climate change is real or something else, the hypocrisy involved in the mass production of disposable garbage electronics by the same stupid idiots who claim to care about taking care of the planet is truly stupefying.
-
I am a kid (well, technically not, but I just freshly graduated college, so I am a kid compared to a lot of you guys), and I still don't want any part of modern tech garbage.
I really wish BGA wasn't a thing. It's crap. I also wish my friends valued upgradeability, but they just want something that works. Most people seem to go for the cheapest thing that meets their needs, regardless of whether it's good or not, because it's good enough for them.
I can't say I blame them though. There's no reason to spend extra money on something that has extra capabilities you'll never use. Hmm, maybe we should make everyone a tech enthusiast. That should make the market turn back to socketed hardware. Companies will only produce what people want, so we need to try our best to show everyone we know the awesomeness of socketed hardware.
It's a lot easier said than done though.
People not interested in tech will continue to not be interested in tech.
I'm glad I grew up in a family with very high standards. Our standards aren't too high, everyone else's are just too low. There's no reason to not do your best, as that's how you make progress. Stagnation breeds mediocrity, and mediocrity leads to everything turning to crap. -
Gaming Tablet. Jumbo Shrimp. Silent Scream. Working Vacation.
https://www.merriam-webster.com/dictionary/oxymoron
If anyone wants a gaming tablet, I will give one away as a gift to the first person that presents a full-price offer on my ocean beach property on the east side of Phoenix. -
Happy New Year. Maybe 2022 will be better. Hopefully, we won't be desperate enough to look back on this as the "good old days" at some point.
-
Running the CPU KILLER!
“Death Stranding”
Honestly, the 11900K blows right through it! It's actually smoother than my old aging 7980XE was. I'm surprised by this, because the 7980XE did better than a lot of other processors. This little 11900K is chewing this game up well, as the game chews it up too, lol.
5.5GHz core / 4.6GHz cache / 4000MHz CL14 DDR4 (Gear 1): 8 much faster cores with 16 threads seems better. I have a frame cap of 162FPS, and CPU usage steadily stays in the 70-100% range on all 16 threads simultaneously.
Overclocking the 3090 Kingpin is wasted, as my monitor doesn't benefit past 165FPS, so the GPU is running stock. This game would benefit from 1440p@240Hz though.
There is one cut scene where the game pushes all (16) threads to 100% and my frame rate went down to 128FPS. However, it was fluid. The CPU maxed out, and my GPU usage fell a little. This cut scene was not fluid on the 7980XE for some reason, it would always skip on that old beast in this one specific area.
Sony found a way to get a game to really push the CPU without hammering the GPU so hard. They did a great job, I think, because this game runs incredibly well. I think this was Sony's first PC port ever. It can use any CPU no matter how many cores it has, and it uses every core perfectly evenly.
![[IMG]](images/storyImages/DCAD17-BD-911-E-4676-A099-EDD2-B3-FD9905.jpg)
If Intel had made an 11900KS, this is it! -
Bro Fox. Will you sell your 3090 Kingpin for the coming 3090 Ti flagship from EVGA ?
Yep, Nvidia's new yearly release cadence for graphics cards now also includes their flagship models. This was doomed to happen once Nvidia announced the 3080 non-Ti 10GB cards as their new gaming flagship.
They could have given you a full die last year, but greed trumps common sense.
EVGA RTX 3090 Ti Kingpin might cost more than original Kingpin videocardz.com
New details on the next-gen EVGA flagship graphics card.
![[IMG]](images/storyImages/EVGA-RTX3090Ti-rumors-768x507.png)
The rumors alleged that existing Hydro Copper water blocks for RTX 3090 Kingpin will not be compatible with the new model. It should also be noted that the power connector change is not the only major difference with the non-Ti model. The RTX 3090 Ti will also have half of the memory modules, as the new SKU is being upgraded to 16Gb (2GB) 21 Gbps memory.
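The module-count claim checks out arithmetically. A tiny sketch, with chip capacities as stated in the article and the total capacity both cards share:

```python
TOTAL_GB = 24  # both the 3090 and 3090 Ti carry 24 GB of GDDR6X

modules_3090    = TOTAL_GB // 1   # 8Gb (1 GB) chips: 24, split front and back of the PCB
modules_3090_ti = TOTAL_GB // 2   # 16Gb (2 GB) chips: 12, which fit on the front side

print(modules_3090, modules_3090_ti)  # 24 12
```

Halving the chip count is exactly why existing water blocks with actively cooled backplates stop making sense for the Ti.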
The post further suggests that the existing Kingpin model might be discounted, while the new Ti variant could be introduced at a higher MSRP.
But the 3090 didn't have the Ti moniker, so... Nvidia just had to add it for the refresh. Same for the top-dog Ampere mobile cards. -
electrosoft Perpetualist Matrixist
Back in the day, with more, ah... "robust" laptops, the power connector was always a separate module, because it's a known point of failure by its nature. Now? Everything is cheaply soldered directly. Best case, you can resolder a joint. Worst case, the soldered connector breaks a trace deeper in the motherboard or breaks something else, causing other problems.
When my daughter's much beloved Dell Studio 1749 (she used it for 8 years) had two power ports go bad, it was a matter of spending ~12-15USD to replace them and good as new since they were their own module. Same as the power on button. It had its own array and I replaced that too (along with upgrading her to a B+RG 1080p panel a few years after prices went down and she accidentally cracked the display). Only reason she tearfully stopped using it was because her games were no longer supported on the 1 GB ATI Mobility Radeon HD 5650.
She is now using a Clevo P370SM and her fiance is using a Clevo P170SM and doing just fine. When he switched from WoW to FF he said his 870m GPU was too slow. I popped a 970m in there for him to try out and he was good to go.
The idea of grouping so many points of failure together (CPU, GPU, memory for starters), so that if one fails it all fails, still irks me to this day.
You can't do these kind of things with the throw away generation of laptops. -
Do you think Optimus Watercooling is gonna spend another 8 months developing a new waterblock? And maybe they can just throw their current actively cooled backplate KPE waterblock in the bargain bin?
I wonder if they're gonna throw in the towel. All that work and R&D developing an actively cooled backplate trying to make the best waterblock in the world, and here comes EVGA releasing a 3090 KP with no rear memory modules. -
electrosoft Perpetualist Matrixist
Wow, I mean it's a slap in the face to previous KP 3090 users but this is on Nvidia not EVGA for introducing yet another "top TOP tier" card now versus previous generations that when that top model came out that was it and everything else was shoehorned in below it (980ti, 1080ti, 2080ti).
EVGA has to offer a 3090 Ti to compete, and it wouldn't make sense to offer 3090 Tis and NOT a KP 3090 Ti. But as stated before, this is still all on Nvidia for replacing the top tier with a slightly better top tier 16 months into the generation cycle.
I expect this to clock in at no less than $2,499.99 and resell on eBay for $4,000 initially.
Plus, we still don't have any timetable for the 4000 series. Looking at historical data, you would expect it to launch later this year, but with AMD having basically caught up in raw performance and making leaps and bounds with SS and other things, Nvidia is more reactionary than ever, and that might have something to do with the 3090 Ti too (along with a typical pandemic cash grab). -
I'm just saddened that the bar is set so low and that most people just accept it. I generally dislike when someone buys something, complains about it, and then keeps using it and complaining. My thinking is that if I buy a product that doesn't do what I need it to do, I will either get rid of it (returning or selling it) or modify it to my liking. The problem stems from people being either too lazy or lacking knowledge about the particular product (in this case, laptops).
The "too lazy" part is something that can be fixed, but the lack of knowledge can't be entirely fixed. I fully understand someone not having the time or desire to learn and understand a computer, but in this market it seems you need to dig quite deep to find information about how your laptop should perform and the potential issues it may have. A good example is when Nvidia dropped the Max-Q branding: ever since, it has become significantly harder to gauge the performance gap between laptops with the same GPU, since they can carry the same name but have extremely different power limits. I feel the companies are doing this on purpose to make laptops harder to understand. When a person doesn't fully understand something, it's easier to trick and subdue them.
The route the companies have taken (and by companies I mean Intel, Nvidia, AMD, and all the laptop manufacturers) doesn't sit well with me whatsoever. Long gone are the days when buying a laptop was quite simple; now the companies have resorted to deception to pad profit margins as much as possible.
On a side note from this whole rant, I have a new project I'm about to start. All I'm going to say right now is that I'm working to completely unleash the power of my 1070.
-
Nope. They can bend over and plant a wet one on my 59-year-old heinie. Talk about biting the hand that feeds you... Green Goblin bastards. I probably won't give them any money when they launch their 4000-series scam either.
-
What's even stupider is that reviewers are going to compare the 3090 FE vs. the 3090 Ti FE. One has 100 more watts of power, so it's going to portray a performance improvement that isn't even real.
-
The Green Goblin will probably release a cancer driver update that cripples 3090 performance to make the 3090 Ti look like a larger leap than normal. That's how they do things... deception, smoke and mirrors. But I only use older drivers, and I have no plans to accept their DCH driver feces, so it won't be misleading for me like it will be for the sheeple. I suppose when I have to waste money on a newer GPU, I'll explore using NVCleanstall to make old drivers forward-compatible rather than new drivers backward-compatible.
-
I hope they won't do anything like this. I'm not buying a 3090 Ti because, well, I guess I already have one.
So, I think the 3090 Ti FE is supposed to be around $2,999 MSRP. Would that put a 3090 Ti K|NGP|N at about $3,500, and the HC version at maybe $3,799?
I know EVGA can't price them cheaper than the FE model, right? -
They could sell at a smaller margin and still make a profit (just not gouge customers) if they chose to do so. But it wouldn't surprise me if it costs EVGA more to produce them, because they are a smaller company with lower production volume, and they may have less power and influence.
Last edited: Jan 2, 2022
-
I don't see Nvidia adding more than $200 on top of the 3090. Nope, it won't happen. Anything above $1,800 pushes it into $2,000 territory, and that would just show how greedy they are. They are greedy, but not stupid. Remember: same amount of VRAM and less than 3% more cores vs. the 3090. And Nvidia knows its AIB partners will push it well above MSRP. They'll let their partners run the show.
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.
![IMG](images/storyImages/C4991-B28-561-F-4-C5-B-8164-EA6-DF9465-D71.jpg)
![IMG](images/storyImages/6470-D005-3651-4689-AD11-3-C4837-C5-A5-A5.jpg)