Right, I get it: because it's 100% ray traced, it's the RT cores (& tensor cores?) that are getting the workover while the rest of the GPU is chilling in relative terms. Hence you could push your overclock to new heights in this benchmark, hence the high core frequency, also aided by no power limitations because, as you say, it was only drawing 300W.
-
Robbo99999 Notebook Prophet
-
The 3080 will be a short-lived 4K card, AMD-sponsored titles or not
Yeah, Nvidia did a good job making the coming midway refresh more tasty
They have learned their lesson from last year's Super fiasco, LOL
EK Water Blocks May Be Preparing a Thermoelectric CPU Cooler techpowerup.com | Today,
In a recent YouTube video from Linus Tech Tips titled "The Fastest Gaming PC in the World!...For Now!", a new cooler from EK Water Blocks is shown. The cooler is paired with a binned Intel Core i9-10900K running 5.4 GHz and a 360 mm front radiator. The cooler was largely censored, but we can see a number of cables coming off the block, including a PCIe power cable, which supports the suggestion that the cooler is a thermoelectric cooling (TEC) device using the Peltier effect to transfer heat from the CPU. Thermoelectric coolers require significant power to run, with the EK cooler in question being used in a system with a 1600 W power supply. Thermoelectric coolers aren't a new invention but haven't taken off in the PC realm due to power and cost concerns, so it will be interesting to see if EK is able to buck the trend.
-
I searched everywhere for more info on it. I want one too, but for my 7980XE. I seriously want it. I heard that it's heavy, like 4 pounds or something. It looks like a traditional EKWB water block with a pretty large Peltier setup inside of it. But I thought Peltier cooling was proven ineffective? -
Keep your card around 40-45C and your RTX 3080 would run that frequency under any load. Maybe even more.
-
It's more inefficient than anything. It requires craploads of power (hundreds of watts) to achieve temperatures that you could instead get with phase change cooling at a fraction of the power consumption.
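To put rough numbers on that (an illustrative sketch; the 200 W heat load and the 0.5 COP are assumed round figures, not measurements of any particular TEC):

```python
# Rough TEC energy budget: a Peltier pumping q_cold watts of CPU heat
# at a given COP draws q_cold / cop watts of electricity, and the
# radiator then has to dump the pumped heat PLUS that input power.
def tec_budget(q_cold_w, cop):
    p_in = q_cold_w / cop    # electrical input to the TEC
    q_hot = q_cold_w + p_in  # total heat arriving at the hot side
    return p_in, q_hot

# Assumed example: 200 W of CPU heat at a COP of 0.5
p_in, q_hot = tec_budget(200, 0.5)
print(p_in, q_hot)  # 400.0 600.0
```

So pumping 200 W of CPU heat could itself cost ~400 W of electricity, and the loop ends up dealing with ~600 W total, which is why the power-hungry reputation sticks.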
-
My CPU is really inefficient too, running at 4.8 GHz trying to maintain an advantage against modern 16-core chips with faster IPC. But my processor is over 3 years old, and I am willing to sacrifice power consumption for performance. Phase change cooling is extremely loud, and you need to prep the board for condensation too. Not to mention the cost of a phase change unit. I am considering just going to chilled water, because there's more left in my CPU, and I'd love to use it. A water chiller seems to be the easier alternative. If this EKWB Peltier block actually works, the "user friendly" side of it is the most attractive part to me.
Even if the Peltier uses tons of power, doesn't a water chiller or phase change unit consume like 500-1000 watts?
Does Peltier technology actually work? A der8auer video showed it was terrible at cooling anything at all. -
Rumor mill of the day: the 3080 Ti will have the same CUDA core count as the 3090, but 20GB VRAM instead of 24GB.
-
Good deal. Will swap my 3080 for it if they keep the price at $999 or below. The 3090 was overpriced at $1499.
Slightly more memory bandwidth and 4GB isn't worth $500. Maybe Nvidia unlocks Titan drivers for the 3090, though, making it worth it for those with that type of work. Still think Nvidia needs to do $8
Zotac just cut the MSRP on their cards. If other manufacturers follow, then it might signal a price cut from Nvidia, which would be great. Maybe Nvidia planned this all along: launch first and make profits with higher prices, then cut prices when AMD launches their more-expensive-to-manufacture cards. -
I find it amazing that most AIB 3080/3090 cards are usually more expensive than the Founders Edition models.
Now I'm not gonna lie, I am not a fan of air cooling on a GPU. Anything over 60C just isn't really good enough, plain and simple. But I do think Founders Edition cards look the best and are the highest quality. They are heavy, and very robust.
I think Nvidia should undercut AMD and offer a 3080 Ti for $949.99. But I truly believe with 20GB of VRAM it'll be $1,199.
I have a G-Sync monitor, so I won't be buying an AMD GPU. I look forward to the 3080 Ti! Please be $1,049 or less lol. -
I'm generally under 60C in anything that I tend to play, pumping 1.2V since I got it.
-
You play at 4K 60Hz?
-
I agree with everything but the jebaited part. Nvidia got caught with their pants down. It's OK. Most people didn't believe the rumors or that AMD would put out anything competitive. Just like I didn't believe that argument when AMD claimed it with the 5700 XT.
But I do believe Nvidia is trying to find the best way to hit back! And that includes them trying to find the right CUDA core count, memory amount, etc. After the announcement of an AMD title needing 12GB, they said let's increase it. With people figuring out the core count conundrum I spelled out with the possible dies and shader counts, I think they realized to just give it the 3090 die. And, as you mentioned, giving the Titan driver treatment to the 3090 would be easy enough.
But Nvidia will finally put out something with a real jump in performance next gen. That is the amazing news. -
electrosoft Perpetualist Matrixist
Agreed, even saving ~$227 off of retail MSRP from BB, I still think it has a craptastic price/performance ratio. I absolutely love the build quality and look though, with that ginormous heatsink lit up with the logo. When gaming, the GPU is so silent I ended up gutting all my case fans and taking the front, top and side off to make it a vertical open bench. All I hear is the NH-D15 fans when they kick up, and it went from being the quiet part of my previous system to the loudest.
A price cut would be great if they had one on the 3090 (doubt it, but still), if they get one out before the end of January 2021 (my return window). I've been down this road with Best Buy before. As long as you're inside the return window, just take your receipt in (or call) and they credit you. For myself, I applied a 10% coupon. If a markdown comes, they will literally adjust the price to the new price, reapply the 10%, and credit you the difference.
I'm not sure WHAT Nvidia is going to do, but they will respond or potentially bank on customer loyalty and product shortage, throw in a few more models and keep it moving.
-
Is Nvidia actually considering lowering prices? Due to AMD competition?
-
electrosoft Perpetualist Matrixist
I highly doubt it, but you never know. They'll just create custom SKUs to cram in there (Supers, Tis, etc...) to get close enough or beat AMD. The gap between the 3080 and 3090 is just too massive, and AMD dropped the hammer with the 6900 XT so expertly (pricing, leaked performance) that Nvidia has to respond. -
Robbo99999 Notebook Prophet
Over 60 degrees Celsius for a GPU is absolutely fine, no need to chase numbers lower than that unless you want every last fraction of a percentile of performance for benching - but for gaming, air cooling is the best option for its reliability / cost / simplicity & virtually no loss of performance. -
You got a good price and I would go 3090 at that price. Great looking build. I really love the look of the FE card, always have liked their modern and sleek designs.
-
So, here is my post collecting Zen 3 reviews this morning:
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs -
Nice! AMD has finally taken down 5 year old Skylake and 14nm.
-
And Intel hasn't put out anything new in that long, and Rocket Lake won't beat AMD's current offering, while Alder Lake is going against Zen 4. Don't be cheeky without realizing Intel's been bilking with no innovation for half a decade.
Edit: which then raises the question of why you kept buying the newest Skylake without any real innovation, over and over again...
Edit 2: Now, had you just said, hey, AMD took down the most recent chip, that is one thing. Hell, Zen 2 beat the 7700K last year. And was up there with the 8th gen. 9th and 10th did improve with the ++++++++ iterations. But I'm not ****ting on Intel for not being able to get off 14nm, taking way more power to lose to AMD Zen 3, at least not until you wanted to be a snide little sh**.
You kept buying Intel at ever increasing prices as soon as it dropped. You have Sunny Cove adding 18% IPC while dropping 20% of cores for Rocket Lake. Good job Intel. Real innovation there.
We could play the fanboy BS all day. Let us not. Because that DOES NOT MATTER. What matters is performance, regardless of brand. LEARN IT, LIVE IT, LOVE IT!
So do not get started on the crap, please. -
Rocket Lake coming in a few months looks interesting, with rumored double-digit IPC gains; it will be a good gaming chip. But I might wait for DDR5, PCIe 5.0, and Alder Lake big.LITTLE next year. LGA 1700 is supposed to last 3 gens, so you can grow with the platform as a consumer, which will be a nice change.
-
Look it up. You do not even get Willow Cove backported. They did a backport of SUNNY COVE. That is around an 18% IPC gain. Look it up regarding the mobile chips they put out using Sunny Cove, where they skipped comparing it to Cannon Lake because that was such a crap stain. Like Ice Lake using Sunny Cove wasn't. LMAO.
So, you get 18% in March, literally 5 months out. Way to be ahead of the curve, Intel!!! So great you arrive fashionably late!
But, oh, let's talk Alder Lake, which will FINALLY get Intel to 10nm++++++ (LMAO, but seriously, it is already the third iteration of 10nm with their "SUPER FIN" (said like Superman)), and will have DDR5 and PCIe 5.0, going against Zen 4 with ANOTHER jump in IPC, DDR5, and PCIe 5.0. Let's hope Intel doesn't get delayed AGAIN!
And let's ignore the 1200 socket, jump straight to 1700.
See, if I wanted to be ****ty about Intel, I could. I am trying to show you there is no reason to be snide and act like a d**k. Because you saying "just wait" sounds as bad as AMD fanboys used to sound.
Now, Alder Lake is supposedly Golden Cove, so another jump in single-thread performance is expected, but current samples show lower boost clocks than current 14nm, although 4.8GHz is not bad and they may have 5GHz capability on 10nm Super Fin by then. And that will be competing against TSMC 5nm, which comes with further performance increases, possibly continued work on frequency, etc.
So, do you want to talk about this like adults, or play fanboy? -
yrekabakery Notebook Virtuoso
I say bravo to AMD. They've been innovating, while Intel has been iterating, trickling moar cores on an aging process while letting power consumption run wild. The gains in gaming going from an 1800X to a 5800X, both octocores, are absolutely insane.
-
Agreed. It has finally brought competition back, and AMD is hitting on all cylinders. I'm still waiting for Zen 4, instead going to get either an Nvidia 3080/Super or an RX 6800 XT this year, but Zen 3 is compelling.
I do look forward to seeing how Intel stacks up on the 10nm node versus Zen 4. Intel should have 2 gens of IPC increase for single-thread workloads on the node those cores were designed for. So it is the best-case scenario.
Unfortunately, Intel's 7nm for 2022 is already 12 months behind on yields, so they pushed it 6 months. And if that is going against 5nm or even 3nm TSMC, which will have great yields...
But it seems AMD is really reinvesting the money well. Zen 3 is great. Intel basically conceded HEDT for now, with no Ice Lake on it and low availability of 10-series X299 chips. I do not like that, because it can let AMD become complacent for workstations. And then there's the 18-core Intel 10980XE vs the mainstream 5950X....
I am looking forward to seeing where things go in the next year or two. -
yrekabakery Notebook Virtuoso
Using the example of Deus Ex: Mankind Divided, which is a game where Zen used to fall flat on its face.
Zen 3 doubled over Zen+:
This is how Zen+ used to stack up against Coffee Lake:
-
It's really nice to see AMD beating Intel. I've been waiting on Intel to finally release something worth upgrading to, but AMD has done that instead. Looks like my next build will be AMD based.
It'd also be nice if AMD can get their CPUs to be good at overclocking. That's the main reason I preferred Intel's CPUs. I don't like being limited. Regardless, the performance jump is enough to last me a good long while. -
The new feature allows you to not sacrifice the single core boost while doing an all core overclock. And Roman got the 16-core over 5.6GHz in his video this morning (check out his scores).
-
No, 60Hz is not fluid enough for me.
-
https://siliconlottery.com/collections/vermeer
Turns out they are doing at least one run of binning on the AMD Zen 3 chips, with the 5900X and 5950X. So if anyone wants a binned sample of either...
Also, for boost: even though the box on the 5950X says 4.9GHz, the CPU boosts up to 5.025GHz, and the OS reports 5.05GHz, with the 5900X being lower by 100MHz. Just giving a heads up, as these will outperform the 10980XE on many things, along with the 10900K. So ... -
electrosoft Perpetualist Matrixist
Absolutely. AMD has hit it out of the park with the 5000 series, no ifs, ands or buts. I would LOVE to see a Clevo-based 17" laptop with 5000 series support, similar to the X170 series but with a few refinements. Imagine the quiet thermals of a 5600X laptop producing that level of gaming performance. Yes please.
This is what Intel gets for extending Skylake so far and so long; the "moar corez!" method that AMD previously applied is now all Intel has left atm with the 10900K.
Bravo AMD. -
electrosoft Perpetualist Matrixist
True, if you're in no rush to upgrade, just sit back and wait for their response. It's a great time to not need to upgrade right now, but if you do need/want to upgrade in the here and now, AMD is clearly the choice. Last time that happened, I jumped ship to an Athlon over a P4 many moons ago.
Depending on 3rd party reviews and SAM, AMD may be the logical choice to go with right now for CPU and GPU combo upgrade if you need both.
I game now @ 4K, so it will be interesting to see how much the CPU plays into it and whether an upgrade now or next year is worth it. -
electrosoft Perpetualist Matrixist
-
Looks like I spoke too soon. Roman achieved an overclock of 5.8GHz on that chip, which is very impressive. This bodes well for the other SKUs.
There goes Rocket Lake's chance at retaking the single-core performance crown even under the most favorable conditions (if rumors of performance gains are accurate). It'll still lag behind Ryzen 5000. -
Roman was the only one to discuss the feature for single core, but that is likely because he is sponsored by ROG, so he went through the BIOS features of the Dark Hero.
But if that is with quick LN2 overclocking, think of chips binned and selected for going cold.
Also, yeah, Intel's Rocket Lake cutting 2 cores, so 8 cores tops, with around 18% IPC, just brings back the back and forth, but with no real competitor to the 12- and 16-core parts on multi-threaded workloads.
Then, when Intel drops their new chips, AMD does the 5600 non-X and a 5700X, undercuts the pricing of Intel's 4, 6, and 8 core offerings, which still have to be priced relative to AMD's, and maybe even slides the 5800X from $450 to $400 or $420, and Intel will have very little room to play with on price. Basically backing Intel into a corner and limiting sales.
And on top of that, drop a Zen 3 Threadripper when Intel releases mainstream to take all the steam out of their announcement, like how Intel tried to talk about Rocket Lake and was ignored because of the 6000 series GPU announcement.
Intel just won't be able to catch a break until Alder Lake. And even then, that is going to have a new generation of Zen to also deal with. -
You mean Intel will finally manage to have enough chips to offer? Aka no more shortages, and system builders can order all the chips they want and get them yesterday.
A dream situation. Intel would thereby avoid all the whining about chip shortages.
-
Yep, I game at 4K 144Hz. I could have a far lesser CPU and it wouldn't make a difference. Rocket Lake likely won't make a difference for my gaming experience, and Alder Lake might also not make a huge difference. 4K is very GPU dependent, as you know.
-
electrosoft Perpetualist Matrixist
True, I remember watching this 9100F vs 9900K @ 4K comparison a few months ago and seeing how many titles netted roughly the same fps (or close enough):
It will be interesting to see how the 6000 series pairs with Intel vs the 5000 series with SAM, and what it brings to the table @ 4K overall. -
You know what's funny. Stock of the 10980XE is slowly showing up and I was very tempted, but that price has to go down now for me to bite. I'm thinking 800. -
-
Yep. The performance isn't there for that price, even though I do like the memory bandwidth and PCIe lanes.
If the SSD-to-GPU feature supports up to like 9 or 10GB/s, then you need like two SSDs to come close to saturating it, and that is if PCIe 3.0 doesn't bottleneck the feature. But that is 24 lanes right there for high-end gaming. If you do another SSD for a boot drive, there are 28. If you have a 10GbE NIC for the local network, there go another 4 lanes. And that is before any other add-in cards.
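As a sanity check on that tally (a rough sketch; the per-device lane counts are typical values, not anything specific to one board):

```python
# Tallying the CPU lane budget described above. Typical widths assumed:
# x16 for the GPU, x4 per NVMe drive, x4 for a 10GbE NIC.
LANES = {
    "GPU": 16,
    "game NVMe SSDs (2x x4)": 8,  # two drives to feed an SSD-to-GPU pipe
    "boot NVMe SSD": 4,
    "10GbE NIC": 4,
}
total = sum(LANES.values())
print(total)  # 32, well past the 20-24 CPU lanes mainstream sockets expose
```

Even this modest build overshoots what a mainstream CPU provides, which is the whole argument for 32-40 lanes on the next sockets.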
This is why I've argued that if you are rebooting what mainstream is, like Intel's 1700-pin platform for Alder Lake, or AM5 for AMD, it's time to start thinking of 32-40 lanes off the CPU. Possibly even moving the chipset onto the CPU.
Considering motherboards are now priced closer to HEDT boards anyway, we really do deserve more lanes.
But that is my pontificating on how things should be. For what is, I agree, the price needs to come down. -
electrosoft Perpetualist Matrixist
That's the beautiful thing. AMD is a win-win across the board, whether you want AMD or you want prices and/or demand to ease for Intel chips. -
That's what I paid for my 7980XE. I've had it for probably 4 months now. While my toddler-aged CPU doesn't exactly win the race in single thread, it easily holds its own in multithreaded performance even to this day. If you shop around you can find deals.
At 4.8GHz you can beat a 5950X by about 10% in multithreaded. I have nothing for it in single-threaded performance though. -
I completely agree. HEDT as we know it is dead, for better or for worse, depending on which side you are on. AMD killed it with their mainstream Ryzen, and I will bet that we will see upwards of 40 lanes on AM5 (or whatever it is going to be called). Intel, on the other hand, will still retain their artificial product segmentation. Even after all of this butt kicking, they [Intel] won't let AMD lead them into a race to the bottom.
I want to see who will give us four memory channels on their mainstream platform first. -
What kind of GPU do you use? I'm considering adding another 2080 Ti and going for 4K 144Hz.
I played around with two 2080 Tis in NVLink briefly months ago, and it actually worked surprisingly well at 1440p 165Hz. But my 8086K at the time was overwhelmed, and my GPUs were only getting x8/x8.
I am interested in revisiting this with my current CPU providing adequate PCIe lanes, and a 4K high refresh rate. The monitors are still pretty expensive though. -
Falkentyne Notebook Prophet
Shunt mod with MG Chemicals silver paint was successful on my RTX 3090 FE. Couldn't stack shunts because the edges were lower than the black middle package, while shunts on some other AIB cards are flat, and trying to put on too much paint to make a connection for stacking causes unwanted resistance.
Not sure exactly how much I'm drawing, but at 114% TDP in the Heaven benchmark, the clocks hold steady at 2025-2070MHz, even though the PWR and VREL (sometimes VOP, or max operating voltage) flags are going on and off, probably because something is getting very close to the power limit. Assuming this is between 525-560W. (Core clock +135, RAM clock +600.) The 100% power target is somewhere between 466W and 525W. Not sure. I do know Call of Duty: Modern Warfare doesn't get anywhere close to the power limit and reports about 70W less than before the mod.
Six shunts had to be bridged.
-
Looks like you could easily stack and solder on new resistors to me. I would much rather do that than use a metal paint or conductive pen. You can get very inconsistent results due to the paint moving or changing in consistency ("getting hot", "getting cold", or rolling around), or from using too little or too much.
All you do is tin the edges of the new resistors you want to stack with solder, which gives them a higher edge, and then hold them on the old resistor with tweezers and melt the solder one side at a time.
Also, get a nail file or sandpaper and sand the solder contacts on the resistors on the 3090 first. Sometimes there's a polymer coating or film on the solder preventing a good connection, and they simply won't melt together.
That looks very straightforward to me though; you shouldn't have any issues stacking shunts on that card, as the resistors look to be the exact same. The metal edges are usually lower than the black center on all of the 2080 Tis I've done this to.
Here's an old 2080 Ti I did. You can see they don't have to lay perfectly flat against each other.
You should be fine.
I'd scrape off that metal pen coat stuff and go at it again. You've got this @Falkentyne
-
Falkentyne Notebook Prophet
Yes, I scraped them down to the silver so the conformal coating was gone.
The reason I didn't stack the shunts is the same reason Sky3900 didn't stack them either. If the shunts were completely flat, we would have both stacked them. But when you are using conductive paint (I can't solder something this small; I don't want to destroy the card) and filling in the gap, you're still creating unwanted resistance. It could be adding as much as 10 mOhm to the shunt, because the stacked shunt isn't in direct contact; it has to go through the coating. That's the problem. Obviously this won't happen with solder, but you get the point.
Painting seems to have had the effect of a ~15 mOhm shunt in parallel. I got similar results as Sky3900:
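A quick sketch of that estimate (assuming the stock shunts are the common 5 mOhm parts and the paint acts like a ~15 mOhm resistor in parallel, per the numbers above; the 350 W figure is the 3090 FE's stock power target):

```python
# Shunt-mod math: a parallel resistance lowers the effective shunt value,
# so the card senses less voltage per amp and under-reports power by the
# ratio R_eff / R_stock.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

R_STOCK, R_PAINT = 0.005, 0.015       # ohms (assumed values)
r_eff = parallel(R_STOCK, R_PAINT)    # 0.00375 ohm effective
scale = r_eff / R_STOCK               # card now reads 75% of true power
reported = 1.14 * 350                 # 114% of a 350 W power target
print(round(reported / scale))        # ~532 W actual draw
```

That lands right inside the 525-560 W ballpark estimated above, so the ~15 mOhm figure is at least self-consistent.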
https://www.overclock.net/threads/r...orking-shunt-mod.1773780/page-7#post-28663330
https://www.overclock.net/threads/r...orking-shunt-mod.1773780/page-7#post-28663476
https://www.overclock.net/threads/r...orking-shunt-mod.1773780/page-8#post-28663588
https://www.overclock.net/threads/r...orking-shunt-mod.1773780/page-8#post-28664163 -
I know you obtained the full potential from a 2080 Ti, so I'm curious how much faster performance you're getting with the 3090?
Higher benchmark scores aside, is the actual gaming performance improvement worth it compared to your overclocked 2080 Ti? -
Robbo99999 Notebook Prophet
Nice, let us know how you've set it all up (hardware/cooling & overclock settings) and how much extra performance you're getting over a stock Founders Edition 3090, in percentage terms (in relation to the Graphics GPU score)... that would be a really informative way to frame your overclocking progress in a practical perspective.
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.