The 290X was a beast. I remember when it first launched. At the time AMD seriously had a leg up on NVIDIA and there was real competition, because the 290X was seriously fast. Unfortunately, with AMD now it's almost like you have to compromise if you buy a 6800/6900 XT.
The 6800 XT is fast, sure, but there's a huge asterisk involved: no DLSS.
Then take the 2080 Ti into account, which has DLSS 2.0 and, once overclocked, can match a 6800 XT. It all seems kinda lackluster for AMD today.
The GPU market is a mess lol.
-
If you are looking to save money and get something that works well on a super low budget, but don't care about branding...
https://www.ebay.com/itm/Pc-water-c...ans-Screw-kit-Anti-vibration-Kit/274622319170
Those are the parts I used on the second low-budget desktop for CPU binning. The kit includes a ton of stuff, most of which I have no use for. But my temps are great (much better than with the original AIO parts) and it cost less than I would have paid for a radiator and fans. The guy is local, so I picked it up even though shipping was free. He sold me a second radiator, 10 feet of tubing and 3 more fittings for $20. The pump it includes is designed to install in a 5.25" drive bay. I am sure it works fine, but it won't work in any of my setups. I had the XSPC pump and reservoir in my spare parts, so I didn't have to buy one.
Now I have to figure out what to do with the stuff I have no use for. It's all new and I hate to throw it away, but I have too much junk already, LOL. There is a bunch of stuff not shown in the photos or mentioned in the description. All of the items are too inexpensive to sell on eBay and probably not very popular. It even came with a generic GPU water block.
I was really skeptical, figuring these parts might be junk, but then I found this video. And the seller agreed to let me examine everything prior to purchase. I'm not SUPER impressed, but I am not disappointed. It was definitely worth it due to the stupidly low cost. The radiators alone were worth it; they're very nice radiators. The CPU water block is not fancy-looking, but it's doing a really nice job, too.
-
yrekabakery Notebook Virtuoso
I have a hard time justifying the prices for the 6800/6900 tbh. Not only no DLSS and worse RT performance, but they also have a much worse encoder and worse performance at higher resolutions. They should be cheaper imo. -
electrosoft Perpetualist Matrixist
That's awesome! I agree with the fleecing. ~$235 for the 4GB 1650 Super was insane, so I'm glad he opted to go with the 1060 6GB I already had on hand. He also decided to go with the 5600X over the 3600 for his CPU. Total cost for everything was ~$975, so that comes in under his hardline $1k price point. I'll end up throwing in the fans, TIM, ties and anything else, and give him my extra 30" display so he can meet his price point. -
They should be, but they're not because they can get away with it. EVERYTHING is overpriced now to the point of absurdity. Plus, you have some people so consumed by irrational hate for NVIDIA/Intel and infatuation with AMD that they're willing to pay too much to avoid the objects of their hate. It's really nice to see Intel and NVIDIA having serious competition though. Long overdue, and I hope it continues for a very long time. It really sucked not having any options even somewhat worthy of mention for so long.
-
electrosoft Perpetualist Matrixist
Rasterization-wise, they trade blows very well with Nvidia, and if you genuinely need it, they do have 16GB of VRAM. After that, everything is in Nvidia's favor IMHO. When I was coming back to the land of frugality, I weighed the pros and cons of the 6800 vs. the 3070 over and over and settled on the 3060 Ti or 3070, even with only 8GB. -
Absolutely, they are a pretty tough compromise. Maybe if someone sold off their GPU during the whole panic-sell thing and managed to grab one at MSRP, then OK, you've gotta have something.
But I feel like DLSS is really gonna stick with current and future games.
I hope we see an Nvidia Super lineup on TSMC 7nm; then maybe there will be some GPUs worth jumping to. -
I had the MSI Twin Frozr, took the fans off, just planted two 120mm fans on it, and achieved nearly identical results. That being said, I have always wanted an Accelero, but reference cards sold out before I could get one for a 5700 XT, and I never looked into whether the Red Devil could house it. I got a case for 3-slot coolers with no 3-slot cooler, lol.
I guess I can look into that...
Not really, at the time all people cared about was muh power consumption and muh dBs.
I don't really care about new features from Nvidia or AMD at the moment. I really liked PhysX and still do, but no one uses it. It frankly pisses me off that liquids still look like wax or are drunk "off screen". I don't know why, but it's been the one thing that has irked me since the PS2 era.
Rasterization is all I care about at present, but that can change over time, of course. For now I am just happy with trading blows with a 1080 Ti for $300 less. I was given a discount since Micro Center didn't have the card I reserved. -
Well, that does seem like quite a bit. I paid full price for the 8x industrial 3000 RPM Noctua fans that I bought (in the ballpark of what you paid per fan). For the Vardar F-4 fans from EK, I bought their bulk units, which are often pulls from demo machines at events like CES and Computex, or from builds for product shoots, etc. It was like $8-10 each. Those fans were nice!
There is another set of fans I wanted to check out which seem alright and are low cost. I think they are Arctic. https://www.amazon.com/ARCTIC-P12-Pressure-optimised-120-Fan/dp/B07GB16RK7
If you want a plain fan, those look nice, especially with how low the amperage is. You can chain like five of them off one header and not have to worry, which matters if you are not purchasing PWM splitters, etc. I forget what the difference between the plain and the PST CO variants is, but they are something to check out considering the price. Bonus (if you are like me): NO RGB!
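As a rough sanity check on daisy-chaining, here is a minimal sketch of the headroom math, assuming ~0.1 A per fan and a 1 A header rating (both assumptions; check the fan's spec sheet and your board manual for the real numbers):

```python
# Headroom check for daisy-chaining PWM fans on one motherboard header.
# FAN_CURRENT_A and HEADER_LIMIT_A are assumed example values; check the
# fan spec sheet and the motherboard manual for the real ratings.

FAN_CURRENT_A = 0.1   # assumed worst-case draw per fan, in amps
HEADER_LIMIT_A = 1.0  # assumed header rating, in amps

def total_draw(n_fans: int, fan_a: float = FAN_CURRENT_A) -> float:
    """Combined current draw of n daisy-chained fans."""
    return n_fans * fan_a

n = 5
draw = total_draw(n)
print(f"{n} fans draw ~{draw:.2f} A on a {HEADER_LIMIT_A:.1f} A header")
print("within budget" if draw <= 0.8 * HEADER_LIMIT_A else "over budget")
```

With those assumed numbers, five fans sit at roughly half the header's rating, which is why low-amperage fans are so forgiving to chain.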
I agree on the encoder, but disagree on DLSS and RT. There are still only a limited number of titles supporting those features at this point. They will matter more in 2-4 years, but Lovelace has 70% more CUDA cores (which, due to memory constraints, will realistically only give 35% more performance), RDNA 3 will be multi-die, and Hopper will be multi-die. The performance uplift on the latter two will be enough that I do not have faith in the 3000 series having the lasting power the 10 series did.
I'd put my money on the 3000 series being somewhere like the 8 or 9 series, making the 20 series Kepler. Lovelace will fix issues with the 30 series and likely split apart the shared Int/FP shader to have a single Int unit paired with two FP units (that way it is no longer either/or on FP or Int tasks). That would also be an easy way to increase the effective shader count, along with adding more complexes, which they also announced and which goes along with a die shrink.
But my thinking is that Hopper and RDNA 3, with multi-die designs, will quickly make the current cards obsolete. I am betting Hopper will be the next 10 series, both because of the multi-die cards and because of the fire lit under Nvidia's butt by AMD having a product that is even remotely competitive right now, last seen around the 290X, funny enough.
My problem is I don't like that DLSS sharpens things not always planned to be sharpened. For example, if you are trying to make words on a package look rubbed and worn from time and exposure, yet the AI makes it sharp and crisp, you are missing out on the ambience that the artists intended. Doesn't make it bad, just makes it not quite there yet, IMO.
With that said, I do think AI- and RT-driven solutions are what we are heading toward. I just don't think they are the deciding factor. For example, I want a 3080 not due to those factors but purely because it is the best at direct rendering at 4K. AMD cannot do what Nvidia can at 4K. Period. So the resolution still has me with them, even though I like the memory configuration of more RAM on the AMD solutions better. So it is the enc/dec and resolution that affected my decision more than RT and DLSS. Those are just nice extras, but in no way drove my decision.
Red Gaming Tech, Hardware Unboxed, and others that have polled their users also find that RT and DLSS are nice-to-haves, but not the features driving their followers' purchasing decisions. This is with RGT HEAVILY pushing those features as the future, while HU got the Nvidia letter over it (even though they still had dedicated coverage of RT and DLSS and spoke well of the performance).
So, the 6900 XT pricing is high IF and WHEN Nvidia finally puts out a card above the 3080. Even though the alleged 3080 Ti would only bring around a 4% performance increase (they are deciding between a 20GB card that chokes on memory bandwidth and a 12GB card that would perform roughly in line with the 3090, but without its memory capacity), that would make the 6900 XT nowhere near worth the money (it is already past the curve on value/perf in my opinion anyway). The 6800 non-XT has a similar issue, except for RAM capacity, which doesn't matter at the moment; by the time it does, RDNA 3 and Hopper will be out, which will also devalue it a lot.
But those are my thoughts on it. I agree it should be cheaper, but on a different rational basis, one that focuses less on proprietary features and more on the other data mentioned. -
Not going to happen. But, Nvidia is rumored to be putting Lovelace on 5nm Samsung next year.
Evidently, Samsung just wasn't ready to do mass production on their 7nm process yet, so Nvidia went with their 10nm-class node.
https://www.anandtech.com/show/1469...v-plans-6nm-production-in-h2-5nm-4nm-on-track
As you can see, going from 10nm to 5nm at Samsung means jumping from an early variant of 10nm straight to what is effectively the third refinement of their 7nm family. That is a huge density jump and energy reduction for Lovelace, meaning you can pack in tons more shaders, hence the 70% increase mentioned previously. So although TSMC won't be happening, 2022 will bring a large improvement even with Nvidia staying on Samsung, if the rumors hold true.
Edit:
And my oddball theory is that Navi 23, with only 32 CUs, is meant to be the basis for experimenting with tying 3-4 dies together. Dies of 40 CUs would be so huge and draw so much power as to be unreasonable, 80 CUs was the big die for competition, and they are not trying to run Crossfire on a single GPU with two dies. I haven't really heard much about a Navi 23 die, which would be around a 6500 XT, meant mainly to replace the aging 400 and 500 series cards. But tying together dies of around 20-32 CUs would create anywhere from a 40 CU to a 128 CU card, with 128 CUs being a behemoth for processing. Just a thought. Then they would only have to build the dies for the lowest end, plus an I/O die and a large cache (hence the cache changes in RDNA 2), to get ready for a multi-die chip without needing NUMA awareness from the OS.
Edit 2: Seems it may be using TSMC 5nm. Now THAT with Lovelace could be very nice. Higher boost clocks plus the extra shaders. -
yrekabakery Notebook Virtuoso
There are also these people who make me facepalm so hard: "The thing I want is out of stock, so I'll just buy something more than twice as expensive instead."
https://www.tomsguide.com/uk/opinio...-shortage-had-me-panic-buy-an-nvidia-rtx-3090 -
The RTX 2000 series seems to have held up abnormally well. The GPUs were heavily sandbagged with no competition at launch (serious power limits). My 2080 Ti sits right between a 6800 XT and an RTX 3080 in all the synthetic and in-game benchmarks, so that's pretty astounding to me.
For a 2.5-year-old high-end GPU to actually compete at the top with other high-end GPUs over two years later is amazing. Fortunately, Nvidia left a lot of untapped performance in the tank on the 2080 Ti.
I feel like they aren't gonna be doing this anymore now that there's actual competition. -
Healthy choices have been on the decline since everyone is being told to stay inside all the time lately...
Well I would hope for as much with the amount of money you have poured into the card by this point. It would be quite disappointing otherwise. -
The percentage gains between generations have sucked since the 10 series. Even with the Supers, Nvidia narrowed the gap from the 2080 Super to the 2080 Ti to around 10%, roughly the 11% seen from the 3080 to the 3090, except memory bandwidth is a clear issue there. The 2000 series was way overpriced and the 3000 series did not deliver that great a gain. Combined, the 2080 Ti holds up well relative to that, but that isn't the same as saying it will hold up well in absolute terms.
Kepler only looked fine until hindsight kicked in, and AMD had a competitive big die back then. Maxwell is when the turning point started, the last time AMD was competitive, around the 290X (I think the 980 was out but the Ti was not yet). The Maxwell 800 series was hot and cancelled for good reason; talk about something they try to forget. But that 10 series was just a huge jump.
Since the 10 series, Nvidia's gains have sucked. They had no competition and put in no effort. They stopped even using the 102 die on the xx80 series. They were screwing over consumers in a horrible way, so even if the hardware was decent, it was still a big green shafting.
And they won't hold anything back now, because they cannot leave anything on the table, whereas AMD DID leave clocks on the table, possibly due to only having a 3000MHz clock frequency limit. Think about that: AMD left some on the table whereas Nvidia did not. The only reason is that Nvidia got caught with their pants down.
But with Nvidia, when they find themselves in this type of situation, they play dirty AND they also innovate in a huge way.
So let's go back to Kepler. Nvidia sucked on heat and AMD had the good 5000 series chips. AMD then did a node shrink and went for efficiency with the 7000 series, and things went bad. That is when Nvidia capitalized and put out Maxwell, which was so efficient AMD was kind of cornered. AMD did a hooray event for the 290X across the street from an Nvidia event, showing off its performance. Kind of asinine. So Nvidia responded by putting down the 980 Ti, then demolishing them with the 10 series. AMD's top end has been dead since that 10 series.
Here, my point is that even though it looks like it held up, if you look at the gen-to-gen gains, Nvidia hasn't given much of anything since the 10 series. The only analogous time period was the Kepler/Maxwell era. We have an insurgent AMD. We have an Nvidia that is reaching memory bandwidth limits (one cause of limits), along with a hybrid Int/FP shader; when they combine the standalone Int from the 20 series with the standalone FP from the 30 series, except doubled, they will have large potential for improvement. Lovelace is supposed to have a cache rework as well, which could help with the memory bandwidth bottleneck. That would move it from 35% toward 50%, which is why I think of Lovelace as the 900 series and Ampere as the 800 series, even though the Ampere chips are better than the 800 dumpster fire. But with chiplets, efficiency really goes crazy, and that is where I think we get another 10-series-type jump.
So, I take your point. I just disagree due to a couple other things. Also, do not let my comparing them to crappy generations mean anything. While in the thick of a forest, you cannot always see you were in the thick of it; or rather things seem good until you see how good good could be (like getting into a good relationship after having a just alright relationship). -
I bought a $100 waterblock for it. I don't know what you mean. -
What do you mean by that? There isn't much you can do to a GPU to make it work better other than shunt modding and voltage modding, and doing that is extremely inexpensive.
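For anyone curious why a shunt mod is so cheap and effective, here is a minimal sketch of the math, assuming a hypothetical 5 milliohm stock shunt (actual values vary by card): the board infers current from the voltage drop across the shunt, so lowering the effective resistance makes it under-read power.

```python
# Illustrative shunt-mod arithmetic (not a how-to). The card measures the
# voltage drop across a known shunt resistor to infer current; adding a
# resistor in parallel lowers the effective resistance, so the card
# under-reads and the power limit effectively rises. Values are assumed.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def power_limit_multiplier(r_stock: float, r_added: float) -> float:
    """Factor by which the effective power limit rises after the mod."""
    return r_stock / parallel(r_stock, r_added)

r_stock = 0.005  # assumed 5 milliohm stock shunt
r_added = 0.005  # stacking an equal-value resistor on top
print(f"effective power limit: {power_limit_multiplier(r_stock, r_added):.2f}x stock")
```

Stacking an equal-value resistor halves the sensed resistance, so the card believes it is drawing half the current and allows roughly twice the power.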
-
I gold-plated the PCB, @Mr. Fox, and instead of resistors, I soldered diamonds in place. The coolant flowing through the $100 waterblock is special urine from a rare unicorn species. Getting 17,600 Time Spy graphics daily was extremely tough, this was no small feat, and all of these steps were necessary.
lol -
-
Oh yeah they were expensive. I traded some parts for this one though.
-
The 2080 Ti in and of itself is dandy, but adding a water loop makes it comparatively expensive next to reference or aftermarket air.
I feel it's a bit disingenuous to say a 2080 Ti benches well against a new tech release without an asterisk pointing to the investment in the water loop, essentially maxing out its potential power dump in the process. I mean, I really hope it does well with that investment in cooling is all, lol.
Of course, if that asterisk has already been mentioned then please dismiss this; I don't catch all the dialogue here, but I do try to keep up.
Making the trade makes that investment lighter on the wallet, I would hope.
-
Ah, OK that makes sense. Comparing a 2080 Ti with excellent cooling that allows it to run abnormally cool to a 3080 that does not have the same is not an accurate comparison. Any measurements of performance differences will be misleading and inaccurate.
-
Only at face value, yeah, but the water loop is forever. You only need to buy the water block and you're good to go.
Oh, and thanks for spelling my handle correctly; it's probably like 1 of 5 times someone has gone out of their way to spell a word incorrectly. lol -
Absolutely. But at the end of the day, with PC parts the entire value for me is all about maximum performance potential: how fast can we make it go, sustainably and reliably, daily, in games and applications?
If I had an RTX 3080 I wouldn't bother with watercooling it; other than reduced noise levels, nothing substantial can be pulled from one. A 3080's stock performance was what I was striving for in a card I already owned, lol, so that should say something about its stock performance. They're just hard to obtain. -
Unfortunately, you are correct. I wish you were wrong. The adults that are a product of the modern public education system are the most functionally incompetent, socially retarded, philosophically bankrupt people I have ever seen. It started way back when I was in school, but the years have taken their toll on society and it shows today more than ever. Thankfully, there are some shining stars that rose above the status quo in each of their depraved, and progressively more depraved, generations. Spelling words correctly is a low priority for them and they berate themselves as a result of it. Their mindset allows them to make up words and invent new ways of spelling old words, and the rest of us need to just shut the hell up and agree that it's good, LOL. The "educators" that ushered the hordes of dimwits into the late 20th and now 21st century need to be fired from their jobs because they are abject failures and should not be allowed to teach anything to anyone. I have relatives that teach. Some are the rare exceptions. Others contribute to the problem, and as much as I love them, they suck at their job and need to find a new line of work so they don't create any more zombies than they already have.
I am the type of person that cares about things being right. I apologize for typos when texting. I fix spelling, punctuation and grammatical errors in things I posted months or even years ago (if the post can still be edited) when I find them. I embarrass myself and cannot let it go. I am compelled to correct the mistake. Even if nobody else noticed or cared, leaving it uncorrected is not acceptable. Making mistakes is normal. Leaving them, or not caring, is messed up.
I agree 100%
It seems like you contradicted yourself. Stock is NEVER good enough no matter how great it is. If I cannot make it go faster, then there is probably no reason for me to buy it. When manufacturers do things to their products that intentionally interfere with me doing that, it makes me think very poorly of them and I lose respect for them and their products.
If watercooling it provided no benefit, then yeah, waste of time and money. But, it does if the watercooling improves the cooling enough. Modern GPUs self-overclock as temperatures decrease. That has both good and bad implications.
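To make the "self-overclock" point concrete, here is a toy sketch of GPU Boost-style temperature binning, where the clock steps down in small increments as the core warms. The thresholds and the 15 MHz step are assumptions for illustration; real boost tables vary by card, BIOS, and driver.

```python
# Toy model of GPU Boost-style behavior: cooler silicon holds higher bins.
# All thresholds and the 15 MHz step are assumed, illustrative values.

BIN_STEP_MHZ = 15
# (temperature ceiling in C, bins dropped at or below that ceiling)
TEMP_BINS = [(35, 0), (45, 1), (55, 3), (65, 5), (75, 8), (85, 12)]

def sustained_clock(max_clock_mhz: int, core_temp_c: float) -> int:
    """Estimate the clock a card holds at a given core temperature."""
    for ceiling, dropped in TEMP_BINS:
        if core_temp_c <= ceiling:
            return max_clock_mhz - dropped * BIN_STEP_MHZ
    return max_clock_mhz - 15 * BIN_STEP_MHZ  # heavy throttle past 85C

for temp in (20, 40, 60, 80):
    print(f"{temp}C -> ~{sustained_clock(2250, temp)} MHz")
```

The shape matches what chilled-water users report below: a card that holds its top bins ice cold gives up several of them by the time it crosses into the 60s.
-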
electrosoft Perpetualist Matrixist
Based upon your past messages and your ever-present desire to push hardware to the limits, I call BS.
There is NO WAY you would just let the 3080 sit there on air. Just stop it.
-
So I think you might have misunderstood what I meant about not water cooling a 3080 if I had one. Obviously, a stock air-cooled 2080 Ti vs. my performance now is a huge difference. That "potential" was the tipping point to watercool my 2080 Ti. I wanted to squeeze out that 25-30% extra and sustain it in games. The TU102's potential is what really made me want to invest in cooling initially.
Now, if I had gotten an RTX 3080 instead of my current 2080 Ti ("if they had been in stock online"), would I have invested in water cooling the way I am now? Would I watercool that 3080? Probably not, is what I meant.
Now, if I had a 3080 today, would I watercool it? lol, I mean, absolutely! I have all of the supporting components to easily do so. So yeah, I would chase every last MHz; I would certainly need to so I can get my money's worth, haha.
Watercooling is extremely fun. And on top of it all, even if the gains are minimal like @Mr. Fox was saying, there are other advantages, especially lower temperatures. And most of all, the power usage goes way down.
My brother just got a 2080 Ti, the first real higher-end GPU of his life. I am absolutely stunned by how much juice his card sucks down, while at a much lower voltage than mine. -
yrekabakery Notebook Virtuoso
Yup, can confirm. The 290X dropped over 50W easily at stock clocks going from the reference leafblower running at 94C to the Accelero III running in the 40s.
-
I always loved watching JayzTwoCents' video on watercooling the RX 480. One of his best videos.
That card was struggling with heat while pulling 170+ watts, and he got it down below 100 watts of usage on water. -
Good cooling is essential for an efficient system. Since semiconductor leakage current (and electrical resistance) decreases with decreasing temperature, we want our systems to always run as cool as possible, even if it's not absolutely necessary.
This is exactly the reason my dad got a 360mm AIO for his PC. Lower temps can decrease power usage substantially, especially at higher clock speeds.
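A rough sketch of why that happens: static (leakage) power in CMOS grows roughly exponentially with die temperature, so shaving tens of degrees genuinely saves watts. The constants below are invented for illustration, not measurements from any real GPU:

```python
# Toy model: CMOS leakage power grows roughly exponentially with die
# temperature. All constants are assumed, illustrative values only.

P_LEAK_REF_W = 60.0  # assumed leakage at the reference temperature
T_REF_C = 90.0       # assumed reference die temperature
DOUBLING_C = 25.0    # assumed: leakage roughly doubles every ~25C

def leakage_power(temp_c: float) -> float:
    """Estimated leakage power at a given die temperature."""
    return P_LEAK_REF_W * 2 ** ((temp_c - T_REF_C) / DOUBLING_C)

for temp in (94, 75, 55, 45):
    print(f"{temp}C -> ~{leakage_power(temp):.0f} W of leakage")
```

With these made-up constants, going from 94C on a reference blower to the mid-40s drops leakage by roughly 50W, the same order as the 290X result above.
-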
Just imagine what kind of juice my 2080 Ti would be slurping up if not for the chilled water. I was pulling 1000W from the wall in Time Spy and 3DMark 11 on chilled water. The caveat is clock speeds don't hold up worth a damn if you get the GPU warmer than 40-45°C and you need to go another 25°C colder to get it to hold 2235-2250, so it actually ends up not pulling more watts on ambient water cooling because the clock speeds suck. But, it does use more power in a clock-to-watt comparison using ambient water cooling. And, it's also less stable with ambient water cooling.
-
Yeah, my 2080 Ti will manage 2,115 MHz sustained in Cyberpunk 2077 at temps of up to 46C. This is running a virtually silent PC (minimal airflow and zero noise), and 46C is pretty darn toasty. But with the door off, sitting just a foot away from the system, you cannot hear any noise at all.
But if I go to a maximum-airflow setup, with high-speed push/pull on all three 360mm radiators and the D5 pump at full speed, I can literally squeak my 2080 Ti down to 35C and it'll manage 2,145 MHz. Any hotter than that is totally unstable, and the trade-off isn't worth it.
More power doesn't help, more voltage doesn't help. It just needs to run cool to be stable. Temperature greatly controls stability with silicon, for sure! -
Okay, I've been having terrible problems for weeks ... look ... I thought the power supply was there ... but it's not ... so I don't know ... what's going on ...
-
Hey brother @makina69 I cannot see the images posted. Please edit the posts and try re-adding the images.
-
Falkentyne Notebook Prophet
@Mr. Fox Here's your new video card.
I call it the "Out of Stock Snow White Edition".
https://videocardz.com/newz/galax-geforce-rtx-3090-hof-pcb-pictured-with-monstrous-vrm-design -
With hardware as old as a 3770K and DDR3, I won't get many points in 3D benches.
So it has to be as it is.
https://hwbot.org/submission/4665875_papusan_gpupi_v3.3___1b_geforce_gtx_1070_14sec_827ms
-
Still a good CPU and a good GPU even today! Especially at only 8C temps, haha.
-
2C colder, and down from 8 to 6.
But it's still too hot when the bench finishes. It would be easier if the bench were done faster, as with the newer cards; they finish within 3 seconds, aka a bigger chance to keep temps down.
https://hwbot.org/submission/4666043_papusan_gpupi_v3.3___1b_geforce_gtx_1070_14sec_689ms
And here comes nr. 5.
Nr. 4 is probably within reach, but it will be a stretch (2189.50 MHz @ 42C is still too hot). Maybe another driver can offset it, haha.
https://hwbot.org/submission/4666076_papusan_gpupi_v3.3___1b_geforce_gtx_1070_14sec_604ms
Btw, today's normal continues from Futuremark/UL:
New big update of Futuremark 3dmark scoring lower and crashing constantly on stable clocks that i used before in older 3dmark suite
By chispy, November 27, 2020
-
Yeah, I have found the last two releases of 3DMark to be very buggy. I finally gave up trying to bench their rubbish a week ago. I was not getting lower scores and even getting my highest scores ever, but at the end of each run their crappy SystemInfo scanner would stop working and show the CPU and GPU as unknown with the score being invalid. 3DMark has become rubbish like most other things. Companies everywhere are doing sloppy, half-assed work and think nobody is going to notice or care. When the garbage is free, there is not as much to complain about. When you have paid for it, you tend to expect a little more.
-
Never got time to test-run 3DMark this year. Any minor OC was an instant crash, but games used to run fine with a minor OC; I thought my BGA firmware was the issue, haha. I remember getting 200-300 more points with the paid version, and CPU load was minimal with it. Now I will not pay for it. It says a 980M can't play newer games, but I'm able to play most new games at FHD High settings without any stutter (newer DCH Nvidia drivers can cause stutter even at bone stock). -
-
It seems to be a psychological assault designed to gradually convince people they need new hardware when they actually don't. If 3DMark says you can't play newer games on a 980m, that's true if you are targeting 120 fps, but the program should specify that because 60 fps at 1080p is definitely achievable in newer games on a 980m.
My framerate target is 120 fps, so yeah newer games aren't playable on my 1060
-
electrosoft Perpetualist Matrixist
My EVGA queue notification popped for the Kingpin 3090, and it comes up with the new price ($2,069) with a $70 discount applied since I've been queued up since 12/16.
-
I, too, got my notification and I'm wondering if it is worth getting it now. I know for a fact I'm only going to bench it somewhat and it will then sit. With all these delays and price increases, I have become jaded.
Should I get it or not, that is the question. -
Brother @jclausius hit the nail on the head in his post here today.... "Over the past 5-7 years, the Tech industry has made a shift to getting it out NOW; thereby, rushing out buggy, buggy software and hardware before it is even ready! I think it is like this in a lot of different consumer markets, spurred by the "disposable" attitude of today's consumer"
And this sums it up easily... https://community.hwbot.org/topic/203354-error-trying-to-submit-results-of-pcmark10/
They go after the phone jockeys nowadays.
No time for proper coding.
UL Benchmarks: "The score for my phone in the app is too low..." -
How much do you guys think the Galax HOF 3090 will be? I wish they’d release it with a factory waterblock and not some silly air cooler.
-
electrosoft Perpetualist Matrixist
What are you running now? What are your goals? Are you a "next best thing" kind of hardware owner, or do you buy and retain for 2-3 years?
I ordered one for my boy. He got his first one in last week (or the week before). He's happy with it. No coil whine versus the FTW3 3090 he just sold yesterday. His long-term goal is to block and link them. They're not binned, but he did 2200 out of the box with some minor adjustments. He doesn't do LN2 or anything crazy.
Price-wise, with the recent markups across the board on almost all 3090s, outside of the FE they all seem to cost relatively the same now for decent third-party models. I was just on Newegg and even CDW, and they are all around $2k or higher. Even PNY's 3090 is like $2,200. -
I find myself in a constant state of bouncing between loving the 3090 K|NGP|N and questioning what kind of stupid decision it was. The only reason I still have it is I'd probably lose money selling it, and I hate selling anything at a loss. It's an incredibly awesome GPU at an incredibly stupid price, and I feel super-stupid for buying it because of that. If I could turn back the clock and have a do-over, I think I would choose to skip it.
Probably another example of an incredible product at a horrible price, probably with not-so-great warranty and customer service. I expect it to cost more than the K|NGP|N and be more difficult to acquire. HOF GPUs have always been kind of like unicorns: never available when you want one. And I don't think I have ever seen one with a waterblock. They are all air-cooled except when an LN2 pot is strapped on. At least, that is what it seems like to me. They are such limited production that there is no incentive for anyone to manufacture a waterblock for a Galax HOF GPU. -
electrosoft Perpetualist Matrixist
So I went ahead and ordered the 3090 KP for delivery tomorrow. I'll get to unbox it and play around with it, although I have NO IDEA how I'm going to mount and use it in my DG-77, as the front is taken up by the CLC 280 and the top would only work if I pop the top off in some weird hybrid (fans outside, radiator inside) build.
More pressing: since WoW is my mainstay, I've noticed some VRAM-related issues with how special effects are rendered.
I noticed that even with the exact same settings (Ultra 10, 4K, RT MAX), a lot of the extra glow and detail (especially in Bastion) is missing on my 3070 8GB compared to when I had my FE 3090 24GB.
This really came into focus with my wife's system. She runs a 9100F (such powah!) with 16GB, and was running a 1060 3GB @ 2560x1600. I sold the 1060 3GB and replaced it with my spare 1060 6GB (after realizing it just wasn't quite fitting my bro's SFF like I wanted during a few mock-ups), and with the EXACT SAME SETTINGS, suddenly instead of distant items drawing in as you approach, even at max view distance, they now just appear. Special effects (weapon glow, chests glowing, AoE effects) are now showing up. FPS also went up noticeably, even though both cards test almost identical in Time Spy. I ended up unboxing the sold card before shipping to swap them back and forth, and yep, at the exact same settings the 6GB shows much more graphical detail, including much better distance draw. Afterburner shows real-time use taking almost all of the memory (a bit under the static dynamic allocation), using as much as possible on both.
WoW has a new function where you can select a target FPS, and it will dynamically adjust to try to stay within your FPS window. I have it turned off. I don't know whether, even with it off, this crosses over into real-time rendering adjustments based on VRAM regardless of settings.
When I get the KP 3090 in tomorrow, I'll run the same tests to see how much VRAM WoW @ 4K Ultra 10, RT max, actually allocates versus actually uses on the 3070 8GB vs. the 3090 24GB. I'll also run some tests with targeted FPS on and off. A quick logging sketch is below.
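For that kind of comparison, something like this can log allocated VRAM once per second while the game runs. nvidia-smi's query flags here are real, but note it reports memory allocated, not memory actively touched, so it is only a rough proxy for Afterburner's readouts:

```python
import csv
import subprocess
import time

# Log GPU memory allocation over time via nvidia-smi's CSV query mode.
# This captures allocation, not active use, so treat it as a rough proxy.
QUERY = [
    "nvidia-smi",
    "--query-gpu=timestamp,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

def sample():
    """One reading: [timestamp, used MiB, total MiB]."""
    out = subprocess.check_output(QUERY, text=True).strip()
    return [field.strip() for field in out.split(",")]

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "used_mib", "total_mib"])
    for _ in range(300):  # ~5 minutes at one sample per second
        writer.writerow(sample())
        time.sleep(1)
```

Run it once per card at identical settings and diff the two logs.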
@yrekabakery I know you monitor and follow VRAM use cases pretty meticulously. -
You could mount the radiator to the outside of the rear panel; they make brackets specifically for that. Except you have a closed-loop AIO, which makes that idea a no-go without some modding. You'd need to detach the hoses from the radiator and put them back together using clamps. Easy enough to do, and not a big deal if you plan to keep the GPU.