So how high can y'all OC the 7980XE/9980XE/10980XE series for 1, 2, 4 core loads?
I was hoping to do something like this:
1-Core: 4.9GHz
2-Core: 4.8GHz
3/4-Core: 4.7GHz
5-8 Core: 4.6GHz
9-18 Core: 4.3GHz
Is this something that can be done? My older CPUs don't like their single-core and multi-core clocks being too far apart, but I heard these newer CPUs are different.
-
Robbo99999 Notebook Prophet
Death Stranding surprisingly has low CPU requirements according to the published required specs: https://www.systemrequirementslab.com/cyri/requirements/death-stranding/13762
It also performs well on low-core-count CPUs, and by the looks of it doesn't show much performance benefit beyond 6c/12T CPUs:
https://www.overclock3d.net/reviews...c_performance_review_and_optimisation_guide/4
And this one showing no benefit beyond 6c/6T CPUs:
https://www.dsogaming.com/pc-performance-analyses/death-stranding-pc-performance-analysis/#:~:text=Death Stranding also performs incredibly,and an average of 105fps.&text=It's also worth noting that,require a high-end GPU.
It could be that your fast, low-latency RAM is synergising with the CPU, letting it be pushed to higher framerates and thereby increasing CPU usage (you've got 200fps at some points, which is not the case in the reviews), but I'm not really sure about that. I find it surprising it's using so much of your CPU given these reviews I've linked. Strange! -
Hey, you can overclock each individual core and set the voltage for each core; there's no need to drop the frequency just because more cores are active. So you can run each core at its own voltage and frequency.
Run a heavy load, take note of each core's temperature, and overclock from there.
So you can run your hot cores slower, the cold cores faster, and the best cores the fastest.
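The "hot cores slower, cold cores faster" idea can be sketched as a simple binning rule. This is purely illustrative; the temperatures, thresholds, and clock bins below are made-up example numbers, and the real tuning happens in the BIOS per-core ratio/voltage settings:

```python
# Illustrative sketch only: pick a per-core clock target from the
# temperature each core shows under a heavy load. Hotter cores get
# lower clocks; the coolest (best) cores get pushed hardest.
# All numbers are hypothetical examples, not tuning advice.

def pick_clock_mhz(core_temp_c: float) -> int:
    if core_temp_c >= 85:    # hottest cores: back off
        return 4500
    elif core_temp_c >= 75:  # average cores
        return 4700
    else:                    # coolest cores: push harder
        return 4800

# core index -> observed temperature (degC) under load (made-up values)
temps = {0: 72, 1: 88, 2: 79, 3: 69}
targets = {core: pick_clock_mhz(t) for core, t in temps.items()}
print(targets)  # {0: 4800, 1: 4500, 2: 4700, 3: 4800}
```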
I run 4.8GHz on my 7980XE fairly easily on all (18) cores.
Running 4.6GHz is like running it stock lol.
The 10980XE is better silicon, but it'll run a little warmer than a delidded 7980XE.
I run 4.6GHz at 1.203V or 4.8GHz at 1.286V.
^ You should do better than that with a 10980XE. -
That is very interesting. My performance does seem pretty good. I run no DLSS, and I do not use the other scaling options either.
Performance is adequate for a 240Hz panel at 1080P with no DLSS at all, using a 2080Ti on maximum detail settings. -
Looking over that review you posted again, my performance at 1440P looks better than their performance at just 1080P.
My 2080Ti FE is slightly overclocked, but it's still air cooled and on the stock BIOS, so nothing crazy by any means. It usually sustains clocks in the very low 1,900s (MHz).
I am still waiting on my waterblock to arrive so I can run a steady 2.1GHz+.
I have double checked my graphics settings. DLSS is indeed off, it’s just not needed.
Also, their Task Manager readout says 67% usage, but all (16) threads of the 9900K look maxed out at 100%. I dunno how accurate that is though. You mentioned something about Task Manager reporting wrongly. So maybe? -
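As an aside, the mismatch isn't necessarily a reporting bug: Task Manager's headline CPU figure is (roughly) the average across all logical processors, so it can read well below 100% while several threads sit pegged. A toy example with made-up per-thread numbers:

```python
# 16 logical processors (9900K-style), hypothetical utilisation values:
# ten threads pegged at 100%, the rest only partially loaded.
per_thread = [100] * 10 + [12, 15, 20, 25, 30, 35]

# The "overall" figure is approximately the mean across all threads.
overall = sum(per_thread) / len(per_thread)
print(round(overall))  # 71 -- reads ~71% despite ten pegged threads
```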
RIPPPPPP, EVGA cancelled the order. They said the in-stock listing was an error.
Might just return the CPU and take a hit at this point. Can't find any mATX boards for cheap -
I am seeing some on eBay for $100-$130.
-
@Robbo99999
So I think the game does use a lot of CPU.
I found this video of exactly what I was looking for. He runs 4K/1440P/1080P with and without DLSS. At 1440P and 1080P his system is literally micro-stuttering, and GPU usage drops into the high-70s percent range.
That's with a 3900X, but it starts to waste GPU power at 1440P, and even worse at 1080P, with just a 2080 Super.
-
I reserved one, and I am debating going and grabbing it. $809 for a brand new retail-boxed 10980XE is a really, really good deal. But I don't think I will, because I enjoy my 10900K and would rather wait for the next-gen X series at this point. DDR5 and hopefully a next-gen HEDT platform are coming in 2021.
-
Idk if I want a used eBay board for a brand-new $1,000 CPU I was planning to keep for a while.
The EVGA Micro2 was $109.99 new with a 3-year warranty. Waiting to see if the company charges any fees for returns. Really wanted an X299 set-up, but it was not meant to be. -
EVGA honours warranties on used products; second-hand owners still have a warranty.
-
Good to know, guess I can ask the sellers, although idk if any of them will have much left lol
-
Yep, I registered my Dark; it has like 670 days left.
This is why used non-working EVGA 2080Ti's with warranty stickers on them fetch a premium: you can get them fixed by EVGA. Just make sure to get exact photos and serial numbers for future reference.
EVGA is great, and my next expensive GPU will more than likely be an EVGA, whenever I eventually decide to upgrade from my 2080Ti FE.
My first 2080Ti was a Gigabyte Windforce. I received the card and it was artifacting like a beast (this was a few months ago; the "space invader" artifacting was a common issue with early 2080Ti's).
Gigabyte pretty much told me to suck it. Very upsetting news... and it took them weeks just to tell me that lol. -
The platform has become very viable for gaming in 2020, much more so than when it released. The performance I see out of (18) cores in Death Stranding is just ridiculous; it uses every thread to 50+% capacity, and even higher in some cases.
I am, however, looking forward to a next-gen HEDT platform. Hopefully they release one.
X99 came at the beginning of DDR4, so I hope Intel does it again. -
Robbo99999 Notebook Prophet
I don't know man, either way you've got enough CPU power for any game! -
Robbo99999 Notebook Prophet
RTX 3070 reviews out, as fast as 2080ti:
https://www.guru3d.com/articles-pages/geforce-rtx-3070-founder-review,1.html
RTX 3080 is 30% faster than the RTX 3070. Here in the UK the 3070 is going for about £540 at the cheapest, whilst I bought my TUF 3080 for £700 (yet to arrive), which is 30% more expensive than the 3070. So: 30% faster and 30% more expensive, but with an extra 2GB of VRAM over the 3070... didn't make such a bad choice choosing a 3080! -
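The price/performance claim above checks out with simple arithmetic (UK prices as quoted: ~£540 for a 3070, £700 for the TUF 3080, with the 3080 assumed ~30% faster per the review figures):

```python
# Rough price/performance check for the 3070 vs 3080 figures above.
price_3070, price_3080 = 540.0, 700.0
perf_3070, perf_3080 = 1.00, 1.30  # relative performance (assumed)

price_premium = price_3080 / price_3070 - 1
print(f"3080 price premium: {price_premium:.0%}")  # 30%

# Performance per pound, normalised to the 3070:
ppp_3070 = perf_3070 / price_3070
ppp_3080 = perf_3080 / price_3080
print(f"3080 perf-per-pound vs 3070: {ppp_3080 / ppp_3070:.2f}x")  # 1.00x
```

In other words, the two cards come out essentially level on perf-per-pound, and the extra 2GB of VRAM tips it toward the 3080.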
Choosing the 3080 is the way to go if you can get your hands on one for sure.
But these numbers from the 3070 review are really, really low. Their 2080Ti performance is bottom of the barrel, like a NON-A low-power 250-260 watt variant.
I just hit reset-to-default in MSI AB on my 2080Ti FE, ran Time Spy, and my 2080Ti achieved this (see the attached score), bone stock, using only the default MSI AB fan curve.
They got 13,600 for their own 2080Ti; I mean, what the hell is that lol? And similar numbers for the RTX 3070. It is very unrealistic.
Their power consumption chart tells it all: their 2080Ti was pulling 260 watts, while the RTX 3080 pulls 338 watts.
I really like the RTX 3070, and I wish I had one so I could compare the (2) GPUs head to head. I wish he had used a Founders Edition 2080Ti for comparison though.
-
Robbo99999 Notebook Prophet
He did use a Founders Edition 2080ti, that's where the figure comes from:
https://www.guru3d.com/articles_pages/geforce_rtx_2080_ti_founders_review,33.html
Generally I think Guru3D is pretty solid for performance information. I'd trust somewhere like GamersNexus more though, because they're more technical; Hilbert is a bit looser with his science.
His Guru3D results for the 3070 do seem to tally with NVidia's press-release claims of 2080ti-level performance, so I don't think there's a problem with his figures. His Time Spy graphics score for the 3070 also closely matches a few other 3070 reviews I've just looked at. I think his information is good for 2080ti / 3070 / 3080 performance comparisons. -
I watched the Jayz2Cents RTX 3070 review, and he did a head-to-head comparison. His numbers seemed a little more in line with what to expect. Guru3D just copy-pasted a Time Spy run from 2 years prior, and it was low.
Anyways, Jayz2Cents achieved a little more than 14,500 graphics score in Time Spy with an FE 2080Ti. That made more sense to me and is in line with what to expect.
One surprising thing about the 3070 was how cool the card ran through testing on its stock FE cooler. Nvidia did a good job with that.
Watercooling an RTX 3070 would be worth it, along with soldering on shunt resistors, or possibly BIOS flashing if you can. I imagine you could squeeze some serious performance from that $500 GPU.
Obviously 8GB is still a limiting factor. I dunno why Nvidia didn't just go all out; they certainly could.
The RTX 3080 is blazing fast, and so is the 3070, but they were left with drastically smaller memory capacities compared to the 3090.
I think the 3080 should have had 12-16GB of VRAM, and the 3070 10-14GB. Price them $50 more, wait another 1-2 months to manufacture as many as possible, and then sell them all! -
See, I'm waiting for tomorrow to decide anything, because supposedly the 6800 XT can out-rasterize the Nvidia cards and sits between the 2080 Ti and 3080 in RT, meaning price is what decides it.
Also, there are supposedly a 3070 Ti and a 3080 Ti on the 102 die coming. The 3080 Ti will be something like 9984 cores out of the ~10752-core full die. The 3090 is cut down around 2%; the 3080 Ti would be closer to 7%; the 3080 is around 19% cut down.
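For what it's worth, those cut-down percentages follow directly from the CUDA core counts, assuming a full GA102 die of 10752 cores (3090 = 10496, rumoured 3080 Ti = 9984, 3080 = 8704):

```python
# Quick check of the GA102 cut-down percentages discussed above,
# assuming the full die is 10752 CUDA cores.
FULL_GA102 = 10752

def cut_down_pct(cores: int) -> float:
    """Percentage of the full die that has been disabled."""
    return round((1 - cores / FULL_GA102) * 100, 1)

print(cut_down_pct(10496))  # 3090: 2.4 (% cut)
print(cut_down_pct(9984))   # rumoured 3080 Ti: 7.1
print(cut_down_pct(8704))   # 3080: 19.0
```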
Either way, we'll know more tomorrow! -
Why should they, when they still haven't pushed out the Ti models? Nvidia ain't stupid. Of course they won't go big before Big Navi is out; that would be stupid, both on price point and performance.
-
They can't beat the 3090 without going to the 100 die, because the 3090 is only cut down 2%. Sure, they can go into that 11% difference, but that is it!
-
There is no point pushing out faster, more expensive cards before they are forced to do so... https://videocardz.com/newz/nvidia-preparing-geforce-rtx-3080-ti-with-9984-cuda-cores#disqus_thread
+ it's easier to put a correct price point on them after AMD's cards are out. -
@Papusan
I have something weird going on with windows. Maybe you’ve seen something similar.
So, I can power on the PC and all is well: 0% CPU usage, 8% RAM usage, benchmark performance is good, and 28-31C idle core temps inside a closed case.
I can walk away for a few minutes, come back, and see (1 or 2) of my (18) cores sitting at 40-50% usage. And those same exact threads will stay like that until the end of time, even during normal usage of the PC.
I can hit CTRL+ALT+DEL, open Task Manager, and close Task Manager without doing a thing, and those (1 or 2) cores fall back to 0% usage immediately. It never fails. So I can't find what's using these threads.
After a few more minutes of just staring at the desktop, they silently start working again, until I hit CTRL+ALT+DEL.
I noticed this because my motherboard said my idle temps were 44-47C, which I know is wrong. But my motherboard reports the hottest core temp on the readout; this is how I came to find this phantom core usage.
It is always the same cores: core 9 or 10, or both.
Anyways, it is strange, and I figured I'd ask you what's up.
Nothing is running in the background, only MSI AB and RivaTuner.
Rinse and repeat; it never fails.
-
This looks more like Windows maintenance tasks kicking in. You can check with Microsoft's Process Explorer and google the results. Or look into the link in the posts above.
-
Robbo99999 Notebook Prophet
Either way, the 3070 is basically 2080ti performance, regardless of whether you're arguing 1000 Time Spy points here or there. I don't see why the 3070 would be any more worth watercooling than the 3080 or 3090; in fact I'd imagine there's less point with the 3070 if it's easier to cool. I don't think we've seen much performance improvement with Ampere from watercooling or even shunt modding. I've not looked into it in great detail though; I just remember Der8auer didn't get much improvement from an unlimited power mod?
Yeah, I think 8GB is a bit skimpy for the 3070. OK for now, mostly, but for how long is less certain. -
It’s worth watercooling and power modding every single one of them.
The firmware that these modern day video cards run is straight cancer. My CPU would let me cook it to death at whatever frequency I want at 110C if I wanted to do that.
Unfortunately we don’t have the same luxury with video cards.
If you watercool a 2080Ti and give it more power, then running 2,100MHz or higher is sustainable under any load for any amount of time, and your graphics score lands right at RTX 3080-level performance. The same can be said for an RTX 3080: watercool it and give it more power and it'll sustain 2.1GHz too.
My last NON-A 2080Ti was 27% faster than the 2080Ti in Guru3D's RTX 3070 review.
Der8auer had good results power modding the 3080; he could sustain 2.1GHz, and I think around 2,070MHz with the 3090. Based on how a stock 3080 runs, it clocks similarly to a 2080Ti, so I would expect similar gains of 20-30%, more or less around 25%. -
Robbo99999 Notebook Prophet
I think you must be wrong about 20-30% performance gains for a watercooled & power-modded 3080. You say this means it can achieve 2100MHz, and you say that's a 20-30% increase over stock. Do the maths: assuming a 25% increase, that would imply the 3080 only runs a stock boost clock of 2100/1.25 = 1680MHz, and that's false; the 3080 runs a higher boost clock than that. In fact the 3080 at stock runs about 1815MHz under heavy load according to the sensor graphs in Guru3d's 3080 review: https://www.guru3d.com/articles-pages/geforce-rtx-3080-founder-review,8.html (I think they use Firestrike Extreme Graphics Test 1 for that power test, but I could be wrong). So a sustained 2100MHz from a power mod & watercooling would give a theoretical increase over stock (aircooled & not overclocked) of 2100/1815*100 = 15.7%. But overclocking an aircooled 3080 already nets about 5% over stock according to Guru3d ( https://www.guru3d.com/articles_pages/geforce_rtx_3080_founder_review,31.html), which leaves only about a 10% increase attributable to power modding & watercooling, assuming 2100MHz is held consistently. But this is just me interpolating what you're saying; link me a test showing your 20-30% performance increase if you like. That would be simpler than me theorising, but I like theorising. -
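The arithmetic in that post can be run directly (clock figures as quoted: ~1815MHz stock boost under heavy load, 2100MHz modded target; this assumes performance scales linearly with core clock, which is optimistic):

```python
# Implied stock clock if 2100MHz really were a 25% uplift:
implied_stock = 2100 / 1.25
print(round(implied_stock))  # 1680 (MHz), below the 3080's real boost

# Theoretical uplift from the observed ~1815MHz stock boost to 2100MHz:
uplift_vs_stock = (2100 / 1815 - 1) * 100
print(round(uplift_vs_stock, 1))  # 15.7 (%)

# Subtracting the ~5% an air-cooled overclock already delivers:
uplift_vs_air_oc = uplift_vs_stock - 5
print(round(uplift_vs_air_oc, 1))  # 10.7 (%)
```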
So AMD is back, teasing the 6900XT trading blows with the 3090, but at $999.
The 6800XT is about 3080 performance at $649.
The 6800 beats the 2080 Ti for $580, with 16GB of RAM. Clear wins for the 6800 over the 2080 Ti; they said 18%. -
The slides apparently had AMD's DLSS equivalent enabled, and the GPUs were overclocked ("Rage Mode" on), with their CPU/GPU cache feature enabled. So I would wait for reviews to see the real numbers.
Some people are saying the Nvidia cards had no DLSS on in the slides, but AMD had their version of DLSS on, and they said they had Rage Mode on with GPU caching too.
I am curious to see some real numbers. -
GrandesBollas Notebook Evangelist
I'm glad to see real pressure put on NVIDIA. But AMD's history with driver support makes me extremely wary.
-
electrosoft Perpetualist Matrixist
AMD 6000 series reveal:
Pros:
6800xt ~= 3080 @649 w/ 16GB ($50 less)
6900xt close enough 3090 @ $999 ($500 less)
Cons: no tests were done with RT enabled, and testing was done with a 5900x (the extent of the synergy between the 6000-series GPUs and that CPU is unknown).
Presentations show products in their best light, so I expect a reality correction when reviewers get hold of them.
I can't see wanting to save only $80 and go with the 6800. It would be the 6800xt or 6900xt.
Would have liked to see the 6800 come in at $549; it looks to beat the 3070 soundly (RT unknown).
Having played WoW with RT enabled vs off, there is no scenario in my future gaming where RT is not an option. If the 6900xt can somehow provide better RT performance than the 3090, I'm all over it.
Overall... impressive, so impressive, and it really puts Nvidia's moves into proper perspective. I'm glad AMD seems to be doing to Nvidia almost what they did to Intel, and that's just great for consumers. -
OK, this is the best I can do. Earlier when you posted the RTX 3070 review, their 2080Ti scored a 13,600 graphics score. Well, a stock 2080Ti runs around 1,800MHz too.
This is a picture of my last 2080Ti at 2.1GHz. It is 27% faster than the same 2080Ti in the Guru3D review, and this is not some suicide run; this performance translates to gaming, because the clock rate is sustained long term. All of these RTX cards heat up, clock down, and performance goes with them. I have no doubt that if you watercool and power-mod an RTX 3080, the same thing can be achieved: at least 20% faster performance.
The RTX 2080Ti shows huge gains from increasing the core clock to just 2.1GHz. The RTX 3080 has so many CUDA cores I can't imagine how well it would do; the card would probably yield 22-23K in Time Spy graphics.
Also, the card below was a NON-A, so I was out of juice yet again at around 416 watts of power consumption. If it were an actual FE it would be even better.
This is why we watercool; it puts the graphics card in another tier.
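As a rough cross-check, here is a naive linear clock-scaling estimate. The 17,500 stock-3080 graphics-score baseline is my own assumption for illustration, and real scaling is sub-linear (memory clocks don't move with the core), so treat these as optimistic ceilings rather than predictions:

```python
# Naive model: Time Spy graphics score scales linearly with core clock.
# Baselines from the thread: Guru3D's 2080Ti ~13600 at ~1800MHz;
# a stock 3080 assumed ~17500 graphics score at ~1815MHz (my assumption).
def scaled_score(base_score: float, base_mhz: float, target_mhz: float) -> int:
    return round(base_score * target_mhz / base_mhz)

print(scaled_score(13600, 1800, 2100))  # 2080Ti at 2.1GHz: 15867
print(scaled_score(17500, 1815, 2100))  # 3080 at 2.1GHz: 20248
```

By this naive model, core clock alone doesn't quite account for a 27% 2080Ti gain or a 22-23K 3080 score; power limits and memory overclocks would have to make up the rest.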
-
Also, the 6800 seems more like a 1440P card, with a claimed 18% lead over the 2080 Ti at that resolution. It also has 16GB, so there's no issue like the 3070 has with settings hitting performance.
That makes it seem to be placed for a 3070 Ti future product.
Edit: to be clear, if the leaked benchmarks are true, we could wind up where you need AMD for Fire Strike Extreme and Nvidia for Port Royal, and we don't know who will take Time Spy yet. -
It's an interesting presser, to be sure. I'll likely wait for Nvidia and AMD to hash things out; once the dust settles I will likely pull the trigger on some new components, provided things are more or less on the up and up.
-
Yeah, Nvidia waited for this, and we will soon see 3090 performance from a 3080 Ti at a much lower price point.
And the 3070 Ti will be on GA102.
+ the 10GB Nvidia cards. Where will those end up?
At a dead end before arrival! And who wants an 8GB 3070 card now?
-
electrosoft Perpetualist Matrixist
That's the smartest approach definitely.... -
electrosoft Perpetualist Matrixist
Near 3090 performance, but sure.
There's no scenario where a 3080ti doesn't make an appearance @ $999.99, with the 6900xt looking very capable; but that has been covered already on videocardz.
For me, it's all about RT performance from this point forward as my main focus. WoW is just so damn pretty with it enabled @ 4k. -
I am starting to come around and enjoy ray tracing. I would always leave it disabled in every game I played that supported it. But when I played Metro Exodus I enabled RTX on Ultra just for fun, and I found myself unable to live without it; I eventually beat the game with it on. I was only getting about 60-80FPS vs. 100-120FPS, but it looked amazing, and after I got used to it I just couldn't turn it off. -
1st-party benchmarks are always inflated. They tested the 6900XT with both Smart Access Memory and Rage Mode (read: overclocked) enabled. No mention of RT performance is telling, and there's no answer to DLSS 2.0.
Still, it's good to see AMD be competitive with Nvidia, even if it appears they won't be in ray-traced tests. Nvidia cheaped out with 8nm and 10GB/8GB cards. They wanted to keep their margins nice and fat, and it might just blow up in their faces, which is a win for consumers. Nvidia will have to move quickly to 7nm and increase the memory capacity of their cards. If the rumors are true we're simply getting a 3080 Ti and 3070 Ti, which sort of sucks since they're still going to be 8nm cards. Unfortunately it seems Nvidia can compete on 8nm and will save 7nm for next-gen cards to protect their margins.
Edit:
Some additional thoughts: this new "Smart Access Memory" seems somewhat anti-consumer, at least if we apply the same standards that have been applied to Nvidia's proprietary hardware in the past. It's closed to AMD CPUs, proprietary, and the performance uplift is contingent on buying only AMD hardware. Reviewers are going to need to test with it on and off, as it requires a new 5000-series AMD CPU AND a 500-series motherboard. Going forward, it would seem Intel and Nvidia will need to team up and create their own solution, which just adds more closed ecosystems and more anti-consumer gates. -
I wonder how many chips Nvidia has put aside for the Ti cards.
Of course there will be a shortage of current Ampere cards, when we, as well as Nvidia, knew what would come from AMD.
-
My desktop still maxes everything I play, so I don't lack for performance, but admittedly I would love a bit more beef.
I still have to make sure I can finance the P750ZM mods that have been underway for a few months now; hopefully over the next week it will be wrapped up, so my attention is there, and it will scratch that "upgrade" itch. It will also let me use that as my main for a bit while I sell the 2700X / 5700XT, and maybe even the RAM if I really want to overhaul the current system. By then we may have some 3rd parties enter the fold, as I would prefer a 2.5/3-slot card on air. I can just move my current 15mm (iirc?) bottom fans outside the case if it's 3-slot.
I may finally have a reason to get an NVMe drive for games; currently I still have no real need of one, so that would be cool. A 2TB NVMe for primary games and a 2TB SATA SSD for secondary would be quite the luxury.
I still don't really care about ray tracing; I'm still bummed that other technologies never got a real foothold, like PhysX and water PhysX (or whatever it would've been called). We're in 2020, soon to be 2021, and liquids are still either wax or off camera, and I find that appalling personally. It's been my pet peeve for basically a decade now.
I think you guys might like the P750ZM that has been in development when it's done. I don't want to rob the modder of his thunder, so I'll let him take care of the details when he's ready to... -
Either way, the AMD 6000 series will more than likely take a huge amount of demand off the RTX 3000-series GPUs.
-
electrosoft Perpetualist Matrixist
I've played WoW since the Vanilla launch, so I've been through all the engine upgrades, changes, and enhancements. Subtle but nice. I've been playing the latest xpac for a few years, so I've gotten used to seeing certain graphic elements. I even spent a week or two playing @ 4K on my new display with a 5700XT (which quickly let me know I'm gonna need something with more punch, even without RT). So when I got my 3090 and toggled RT on and started to play, the shadows and the depth and detail thereof took me aback. I found myself turning it off and on and rerunning certain raids and instances just to note the differences, and realized RT is a must for me from now on; any GPU consideration will be judged with it on only.
If the 6900xt brings the high heat with RT, I have no problem selling or returning my 3090 (the return window is till January 21st @ BB) and pocketing the difference. I wouldn't return it for a 3080ti just to save ~$350.00 (assuming a $999 launch price).
I look forward to third party reviewers.
I just watched Steve's GN breakdown of the presentation; I didn't realize AMD had Rage Mode and Smart Access Memory on when comparing the 6900xt to the 3090, and again there was no real mention of RT. -
electrosoft Perpetualist Matrixist
If it is giving you what you need, there's no need to spend in the here and now.
I enjoyed your Alienware 17 mod thread. I look forward to seeing what you do with the P750ZM. -
Does it even matter? Cheaper, good-enough performance steals sales. Nvidia's current line of cards ain't good enough (3080/3070). Who wants 8 and 10GB cards now, only because of Nvidia's extra features over Big Navi?
Edit: it may boil down to who can offer more for less at around equal performance, or who can actually deliver cards. But do 8/10GB high-performance cards even have a life after this?
-
Provided I finally get a working board, it shouldn't be too long now.
3x dead boards and logistics over the pond tend to drag things out.
I personally think it will be the pinnacle of DDR3-era hardware in terms of gaming, for maybe 2017-and-earlier titles. Naturally it may not be the greatest for things being released today, due to games finally taking advantage of additional threads/cores, but I didn't build it for that purpose. This mostly started as a way to re-purpose old hardware; I think you may have participated in that thread as well... Anyways, Mass Effect 2/3 felt off on my desktop and that bothered me. So here we are, four months later lol.
It will also be my gaming medium while I am using my stationary bike, as streaming doesn't seem to work too well when everything is wireless and the router is no longer in my room... -
Will we need 8-core, 16-thread CPUs for gaming soon?
https://www.pcworld.com/article/358...hread-cpus-for-gaming-soon-ask-an-expert.html
Doesn't suit me; my desktop will likely always be ahead of any baseline, but my P750ZM will be a strong secondary machine since it almost rivals my desktop as it is now lol.
I guess I should really get in gear selling off my 13 R3 and maybe the 17 R1 parts. I dread making repeated stops at the post office, but oh well. -
Robbo99999 Notebook Prophet
(Guru3d's lowish 2080ti score might be because of drivers, i.e. old drivers from launch.)
I'm talking about the 3080, not the 2080ti anyway, so I asked you for an example of that. I'm pretty darn sure there's not a 20-30% increase in performance left on the table when watercooled & power modded, because you'd need a core clock at least 25% higher than the standard stock air-cooled boost clock. As we can see in Guru3d's 3080 review (one of my previous links), the boost clock under heavy load is 1815MHz, so 1.25 x 1815 = 2269MHz; you'd need at least 2269MHz to claim 20-30% faster than stock. That's not gonna happen. Link me the water-cooled and power-modded 3080 results if you like; I've been lazy & not googled it yet. I guess I have a problem when people make unsubstantiated and likely wild claims, for instance that a water-cooled & power-modded 3080 is worth a 20-30% increase in performance. It's for sure not 20-30% extra over an overclocked air-cooled card, and I'm also pretty certain it's not even gonna be 20-30% over a completely stock card. Like I showed before with the maths, the most I'd expect is 10% extra over an overclocked air-cooled card, and that's assuming a stable 2100MHz on the core... and in reality I'm betting it would only be 5-10%, because unmodified air-cooled overclocked cards already easily hit over 1900MHz.
*Official* NBR Desktop Overclocker's Lounge [laptop owners welcome, too]
Discussion in 'Desktop Hardware' started by Mr. Fox, Nov 5, 2017.