My son plays Ark: Survival Evolved quite a bit. Even though the game is from 2016, it goes over 9GB of VRAM in some areas at only 1440P. It's still a demanding title.
Even a stock GTX 1080 at 1080P can only provide about 35-50 FPS maxed out.
-
electrosoft Perpetualist Matrixist
Throw in Nvidia allegedly canceling their higher memory versions of the 3070 and 3080.....
https://videocardz.com/newz/nvidia-allegedly-cancels-geforce-rtx-3080-20gb-and-rtx-3070-16gb -
This won't help Nvidia steal sales from its competitor if AMD comes close in performance and is slightly cheaper. The only saviour for Nvidia may be that both "sell everything they are able to produce". But it's not good for Nvidia's market share, LOOL... Btw, you forgot to add in this one
-
electrosoft Perpetualist Matrixist
-
Robbo99999 Notebook Prophet
"Memory Used" shown in GPUz isn't reflective of the minimum amount of VRAM that you could get away with using, because games often dump a load of stuff in VRAM that's not strictly required, kinda like using it as a large cache - if the VRAM capacity is there then the game will allocate more than it really needs. So with your Dues Ex game and it's showing 10GB VRAM used in GPUz, you'd likely get the same game performance using an 8GB VRAM card or maybe even 6GB (or less?). Such points have been made by Steve Burke over on Gamers Nexus for instance (or am I thinking it was Guru3d I saw that, probably both I think), and it's widely accepted that Memory Used as shown in GPUz does not equate to "Memory Required".
That's interesting, so this will show the true minimum amount of VRAM you could get away with without degrading performance? So if you monitored that on an 8GB card and it was showing 6GB being used, then you wouldn't want less than a 6GB card? Do these new versions of the program show you an easy-to-see single number for overall VRAM usage, or do you have to drill down into some settings to look at what your specific game process is consuming? -
electrosoft Perpetualist Matrixist
Interesting...hmmmm. I'll have to see if I can find the GN article or YT (hopefully semi recent).
Here is a nice comparison of games at 1080 vs 1440 vs 2160:
In some of the games, you can see the entire (or close to it) VRAM is allocated regardless of resolution.
Others show a sizeable difference depending on resolution and do not attempt to allocate as much overall.
WoW Ultra @ 4k is ~6GB
In DX:MD, as you lower detail levels, memory usage drops accordingly.
4k @ lowest settings = ~4GB
As you start to push upward, it warns that you will need a GPU with more than 4GB of memory (and more as you increase the details).
The higher the detail levels and AA, the higher the memory use climbs, until at Ultra max everything @ 4K we hit ~10.5GB.
How much is inefficient coding and/or using it otherwise? Dunno, but it is being allocated and used.
On the other hand, game companies will continue to write to the lowest common denominator, so there will probably never be a scenario where you can't run current and future games @ 4K on a 3080.
You may just reach a point where dialing up everything to ultimate max pew pew pew zomg levels exceeds VRAM, spills into DDR4/5 and takes a potential performance hit, or the game just tells you your card's VRAM doesn't support those settings. -
Robbo99999 Notebook Prophet
Here's a recent Gamers Nexus article where he mentions that "Memory Used" as seen in GPUz is not the same as Memory Required. I've cut & pasted the relevant paragraph, but I'm sure he's mentioned it many times:
https://www.gamersnexus.net/hwrevie...review-benchmarks#!/ccomment-comment=10013844
--------------------------------------------------------------------------------------------------------
"We’d like to quickly address one point before proceeding: A lot of commenters have expressed concern about the 10GB framebuffer on the RTX 3080, remarking that 10 is lower than 11, and therefore somehow worse than the 1080 Ti or 2080 Ti. A few reminders: First, when you use software (like GPU-Z -- which is great software and which we highly recommend), it will give you the “allocated VRAM,” but not necessarily the engaged or utilized VRAM. If you plug in a 12GB FB card and play a game, that game might tell you that it’s “using” 11GB,” and could cause you to suspect that this is close to capacity. In reality, it’s requesting that memory, but not actually utilizing it. Further still, memory capacity is very rarely the issue for gaming -- it’s more often that you’re running into memory bandwidth issues first, or maybe even some other part, like ROPs limitations."
---------------------------------------------------------------------------------------------------------
Your last point where you say "but it is being allocated and used" - that's not the case, as we can see from the above (& it's common knowledge now).
Yeah, and about the Lowest Common Denominator, that's right: as cards get more VRAM as standard, game developers are gonna start using more of it, so it's certainly possible that some of these cards are gonna run out of VRAM in the future....I do think that will eventually become the case with the 10GB 3080 for instance, but perhaps only at 4K and maybe 1440p. -
yrekabakery Notebook Virtuoso
You completely missed the point of what I was saying, but I see that @Robbo99999 has addressed it. -
electrosoft Perpetualist Matrixist
Thanks for the link
True, I meant to say allocated, not allocated and used. That was a slip on my part, seeing as a central theme of my responses is "allocation is allocation" - the main issue is how much is actually needed, versus how much the coders deemed necessary to allocate even if it isn't being used optimally.
I posted that video showing the various mindsets of devs: some just grab the max frame buffer (or close to it) regardless of resolution/settings, while others allocate based on settings and need, which DX:MD appears to do - it scales accordingly, with the program giving you fair warning that you will need more VRAM as your settings increase (duh).
In the end, we're trying to second-guess the hows and whys of why particular devs requested various amounts of VRAM and how much is actually needed, but when all is said and done, it is what it is: that is what they requested to allocate for their games and other graphics-based programs. Either the card has the VRAM they requested or it doesn't.
This was my point (6 of one, half a dozen of another), but I take it @yrekabakery didn't get it....it's ok.
All I can do is answer their MSI AB response with a screen shot and keep it moving....
-
yrekabakery Notebook Virtuoso
Your MSI AB screenshot still only showed allocation, not actual usage. To show actual usage, you need to use the latest beta and check the GPU Dedicated Memory Usage \ Process sensor. -
EVGA put out that 450W "XOC" vBIOS, which isn't really XOC, but rather a marketing ploy. Still, the 450W does help in benching if you are going for your highest score. I still see the GPU bouncing off that 450W limit (and exceeding it at times) under heavy benching. Even games left unchecked at 4K will draw up to 400W+. With this vBIOS you are essentially removing the power limit for gaming, which keeps the GPU clock rates nice and stable. Now you just need to keep it cool. The XC3 is a turd from EVGA, far too power limited. A 3080 FTW3 with the XOC vBIOS matches or beats some of the cheap $1499 3090 cards that are power limited. It won't ever match up in VRAM, but for pure gaming the difference between the two is insignificant. The extra $700 is a complete waste. I went back and forth on the 3080 vs 3090, and ultimately decided that I see Nvidia moving to a 7nm refresh next year, and those cards are going to be significantly faster than this 8nm heater.
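As a side note, if you want to see that power-limit behaviour as numbers rather than an overlay, here's a minimal sketch using NVML via pynvml (assuming the bindings are installed); how faithfully brief excursions past the limit show up depends on the driver's own sampling, so take the readings as indicative:
Code:
# Minimal sketch: log GPU board power against the enforced power limit once per
# second while a game or benchmark runs. NVML reports both values in milliwatts.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
print(f"Enforced power limit: {limit_w:.0f} W")

for _ in range(60):  # one minute of samples
    draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
    print(f"{draw_w:6.1f} W  ({draw_w / limit_w * 100:5.1f}% of limit)")
    time.sleep(1)

pynvml.nvmlShutdown()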
I'm surprised you enjoyed DLSS 1.0 that much. While they did improve that game's DLSS, it's really crap compared to DLSS 2.0 in recent games, where you can't see a difference and at times DLSS 2.0 actually looks better in terms of image quality than TAA.
DLSS 2.0 works on both 20 and 30 series RTX cards. -
I have a feeling that some here besides myself are going to end up with an AMD GPU. On most of the forums and websites I visit, I see a lot of Nvidia fans pulling an AMD: "Wait until Hopper." They seem a bit nervous. On the other hand, AMD fans have been down this road many times - the AMD hype train doesn't exactly get them to the right station on time. The truth of the matter is that AMD is largely always going to be behind Nvidia. Unlike Intel in the past, Nvidia actually views AMD as competition.
I am waiting patiently for the 3090 KP and the 6900XTX (if the rumors are right), but I do need another GPU (or two). This is the first time in a long time that I only have a single GPU shared amongst my desktop rigs. -
Falkentyne Notebook Prophet
What? Where is that?
I'm on the latest beta and there's no such thing as a "GPU Dedicated Memory Usage \ Process" sensor. What is this? ELI5 please?
-
electrosoft Perpetualist Matrixist
All will be revealed soon, but the 6900XTX definitely has me curious. ~3080-level performance + 16GB of VRAM, hopefully competitively priced? Yes please. But I rode this hype train last year with the 5700 XT; I used it for over a year and found it to be a very competent card. If I had stuck with my 30" 2560x1600 display, I would still be using it.
It is an AMD exclusive, but so was the 5700 XT AE, and that never seemed to go out of stock; 5700 XTs were readily available at or near launch and remained that way. Here's hoping the same holds true for the 6000 series.
A 3090 KP is WAY out of my price range. I turned down a new 3090 from a buddy when it was $1600+, and I only bought the 3090 FE because of a 10% coupon code and rewards points. I sold the free game code and GeForce NOW for $30, bringing my total cost to ~$1343.00, with a holiday return window until January 21st at BB.
electrosoft Perpetualist Matrixist
Nudged it over 21k TS GPU....
-
yrekabakery Notebook Virtuoso
You need to activate the GPU.dll plugin (three dots next to "active hardware monitoring graphs" in the monitoring settings) before those sensors will show up in the list.
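For reference, the per-process dedicated-VRAM figure that sensor exposes appears to be the same data Windows 10 publishes through its "GPU Process Memory" performance counters (an assumption on my part - Afterburner doesn't document its source), so you can cross-check it without Afterburner at all with a rough sketch like this:
Code:
# Hedged sketch: read Windows 10's "GPU Process Memory \ Dedicated Usage"
# performance counters (the per-process dedicated VRAM Task Manager shows)
# via the built-in typeperf tool. Counter names assume an English Windows install.
import csv
import subprocess

out = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
    capture_output=True, text=True,
).stdout

# typeperf prints CSV (a header row of counter paths, then one sample row) plus
# some status text; keep only the quoted CSV lines.
rows = list(csv.reader(ln for ln in out.splitlines() if ln.startswith('"')))
header, sample = rows[0], rows[1]
for path, value in zip(header[1:], sample[1:]):  # column 0 is the timestamp
    if value.strip():
        mib = float(value) / 2**20
        if mib > 256:  # skip processes holding only a trivial amount of VRAM
            print(f"{path}: {mib:.0f} MiB dedicated")
-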
electrosoft Perpetualist Matrixist
Hitting power limits past this point during runs.....
-
electrosoft Perpetualist Matrixist
Thanks for the tip, man. I like being able to show that and share it. The more data the better.
-
It looks like MSI's desktop MB engineering team finally got some help from the notebook department. It would be better if it had been the opposite.
Z490, B460 and H410: MSI optionally links CPU limits to the CPU cooler computerbase.de
The manufacturer calls this approach "CPU Cooler Tuning". Yeah, a much-needed feature. The buyers need it spoon-fed. -
If that were my 3090 I would have already taken the soldering iron to it. And watercooling too. You won’t break it, and you can easily desolder these components for warranty purposes.
It’ll probably do 23K+ and hold that performance in games.
Unfortunately, with air cooling these high Time Spy graphics runs can't translate to real-world performance, because it heats up and the clocks go down. -
electrosoft Perpetualist Matrixist
Oh yeah, I know you would have already ripped it apart by now and shunted and blocked it.
I wanted an FE for the aesthetics, either 3080 or 3090, although I will say I really like the super thick slab look of the 3090.
If I wanted to go power mod + block, I would have held out for a 3080 FTW3 (or swapped like @Talon did from his XC) or another 3x8-pin card.
On another note, I put the fans to max as a temp check between runs, and this thing at 100% is on par with, if not louder than, a P870TM1 @ 100%. -
Robbo99999 Notebook Prophet
Yeah, cool, my point was that you can get away with less VRAM than is allocated in GPUz for most games without any negative impact on fps or experience - that's quite an important thing to know if you're stressing about VRAM quantities.
So is that actually showing 10GB in use rather than just allocated? @yrekabakery, you're familiar with this new setting - is that screenshot indeed showing 10GB VRAM actually in use & "required"? (Or is it showing 255MB in use? That can't be right, though?!) -
electrosoft Perpetualist Matrixist
6800XT is looking pretty decent:
https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/ -
Wonder what the 1080p numbers look like (1080p peasant reporting in)
-
yrekabakery Notebook Virtuoso
Nope, he’s showing the wrong sensors (again).
-
Looks to me like AMD is going to price this card the same as the 3080. The question now is: how well will these cards overclock?
-
Need to add in this
A return of GeForce RTX 3070 Ti? videocardz.de
AMD's $299 5600X Spotted Flying Past All Intel CPUs in Single-Threaded Performance tomshardware.com | today
The only thing left now is for Intel to lose in overclockability too; then it's complete for mainstream chips. Intel needs to get back on track. But an 8-core Rocket Lake on 14nm++++, only to gain more gaming performance, is too little. They can perhaps claw back the lost single-threaded performance and win back the gaming crown.
Aka better than nothing.
Intel's Xeon Scalable 'Ice Lake-SP' Volume Ramp Delayed to Q1 2021 tomshardware.com | Today
Multiple delays of Intel's 10 nm process technology have affected the company's roadmap in many ways. While the company seems to be on track with its client 10nm CPUs, server processors are a whole different matter as they have different qualification and production cycles. In its Q3 FY2020 conference call, Intel announced that it would have to delay initial shipments of its 3rd Generation Xeon Scalable 'Ice Lake-SP' CPUs to Q1 2021. -
electrosoft Perpetualist Matrixist
Feel free to spell out the steps instead of being CrypticCarl.
Here are the steps I took with the beta 463b2 (which is all I've been running since upgrading):
Here ya go:
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/ -
yrekabakery Notebook Virtuoso
-
Robbo99999 Notebook Prophet
Ah, have another go if you like with yrekabakery's instructions below - it would be interesting to see just how different it is from the 10GB usage you're seeing at the moment.
-
electrosoft Perpetualist Matrixist
Figured out what was wrong.
The beta available directly from the MSI page, which is 463b2, differs from the one from Guru3D, which is also labeled 463b2.
I tried the direct-from-MSI beta on three systems and did not have access to the process selection, only the screen caps from above.
I even individually downloaded it on all three systems (my desktop, my laptop, my wife's current laptop) directly from MSI:
http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup463Beta2.zip
I then downloaded the beta directly from Guru3D and the options appeared:
https://www.guru3d.com/files-details/msi-afterburner-beta-download.html
-
Falkentyne Notebook Prophet
-
electrosoft Perpetualist Matrixist
The link from MSI I posted was from a few hours ago and was b2, so they need to update it if b3 is out.
That was from the main page too. I just went there thinking the main page link to b2 was up to date.
The beta I downloaded from Guru3D also said b2.
But if b3 is out, I'll certainly grab it. -
Falkentyne Notebook Prophet
Go to the Guru3D forums, MSI Afterburner section, "RTSS 6.7.0 beta 1" thread, oddly enough. That's where all the discussion is. Unwinder linked it there.
It was a workaround for fans not reporting the correct RPM, and some other stuff. -
Ordered an EVGA X299 Micro2, only $109.99.
Also got a 10980XE...now I just need a high-end GPU lol -
Awesome!! I was running a 1660Ti on mine for a little while, pretty rough lol. Just traded another 7980XE I had for a 2080Ti FE. So, I’m good to go now!
I have a beast again! -
electrosoft Perpetualist Matrixist
@yrekabakery @Robbo99999
Not all but most.....
-
Robbo99999 Notebook Prophet
Nice one, thanks for testing. Yeah, so that screenshot is showing it really only needs 1.2GB less than the 10GB figure you were posting before. Does the amount change much from the 9.4GB as you play different parts of the game? For instance, does GPUz mostly show a pretty stable 10GB while the real amount of VRAM required in your new MSI Afterburner fluctuates up & down around the 9.4GB in your screenshot, and by how much (roughly)? -
Did you find a deal on your 10980XE?
-
Just started playing Death Stranding. The game is very easy to run GPU-wise and extremely well optimized. It uses all 18 cores and all 36 threads at 60-100% on average.
For some reason, MSI Afterburner will not update their app to support more than 16 cores/32 threads. Why? I don't know... The 7980XE has only been out for over 3 years lol.
Utilization ranging from 63-100% on every single thread! That is amazing. I have never seen a game use a CPU like this.
The game runs like butter, and I just can't believe it uses my 7980XE at 4.8GHz as well as it does! Absolutely amazing. -
Anyone know how to fix MSI AB to see all 18 cores? Or am I just stuck using HWiNFO running in the background with RivaTuner?
-
Anyways, for whoever hasn't tried it: Death Stranding is pretty interesting. Be ready for your CPU to get bent over backwards though, even at 2560x1440. Probably the most CPU-intensive game I've ever seen.
I didn't realize it was even CPU-intensive until I glanced at my utilization: 36 threads hitting up to 100%.
Wow, who says a 7980XE isn’t a great gaming CPU? -
Robbo99999 Notebook Prophet
If you're using Windows Task Manager to view CPU usage, I've noticed it over-reports actual CPU usage. A program like HWInfo will show you the proper utilisation per core, but the graph in Windows Task Manager overestimates the usage by quite some amount. Just in case you're using Windows Task Manager, because that does seem like high CPU usage for a CPU with that many threads.
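If you want a quick cross-check outside of Task Manager or HWiNFO, a small sketch with the psutil package (assuming it's installed) will print per-logical-CPU utilisation sampled over one second:
Code:
# Small sketch: sample per-logical-CPU utilisation over one second with psutil.
import time
import psutil

psutil.cpu_percent(percpu=True)      # prime the counters; the first call is meaningless
time.sleep(1.0)
per_cpu = psutil.cpu_percent(percpu=True)

print(f"Logical CPUs: {len(per_cpu)}")                      # 36 on a 7980XE
print(f"Average load: {sum(per_cpu) / len(per_cpu):.0f}%")
for i, pct in enumerate(per_cpu):
    print(f"  CPU {i:2d}: {pct:5.1f}%")
-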
$989.58 plus tax. Brand new in box.
Kinda thinking I should just cancel both and get a 5950X + ASRock X570M instead, along with a 980 Pro SSD lol
$1099.57 + tax for X299 Micro2/10980XE
$1,132.97 + tax for X570M/5950X/980 Pro
Or just stick with my 4930K until Q4 2021 lol
Wow, it was only $809.99 at MC.... https://www.microcenter.com/product...ga-2066-boxed-processor_Hatchfeed?storeid=101
It uses every single thread. This is the only game where I've ever seen my 7980XE actually use every thread available! I'm blown away. It knows how to use a CPU!
MSI AB only shows 32 threads, but I have since configured HWinfo to show all 36 threads, and it's the same thing.
Usage is usually 60-100% walking around the open world. I've never seen anything like it. But hey, what a well-optimized game. It's butter smooth.
GPU usage is always 98-99% and I’m averaging around 120-160 FPS at 1440P maxed out.
I minimized the game, so it took the load off, but if you look at the maximum usage, just about every core has hit 100% lol.
It is stupid that I have to use HWinfo to see all 36 threads. I'm just flabbergasted that MSI AB can't take two seconds to add these CPUs to their own application. Why the maximum is 32 threads is beyond me.
-
Robbo99999 Notebook Prophet
Well, to me that looks like an average of about 45% usage per thread (in the more jagged part of the graph, the first half before you paused the game) rather than the 60-100% that you say. That's still good CPU usage though, and impressive in the number of cores it can use. -
It is an awesome CPU-intensive game. I've read of people with 9700Ks getting GPU usage drops, with all cores hammered at 80-100%.
When I had an 8086K it would hit 100% on all 12 threads at times during intensive multiplayer, and I would see GPU usage fall about 5-10%.
I will get some better screenshots when I am walking around the open world. That was indoors, and they play quite a few mini cutscenes back to back with all of that lower usage. And apparently the cutscenes are pre-recorded at 60FPS on PC, so usage would drop quite a bit. -
This is such a fun game. This is 1080P 240Hz with my 2080Ti and 7980XE at 4.80GHz, DDR4 4000MHz CL15. Working all 36 threads like a boss!
Also, this is just walking around. For some reason, when MULEs start chasing you, or if you're building something, even more CPU usage happens. No action is really going on in these screenshots. I am just getting started in the game.
-
So, at 1080P my 7980XE averages 64% usage per thread across all 36 of them. At 1440P this drops to a 51% average across the 36 threads.
So, I am curious how this works. Am I right in saying that if you only had an 8-core/16-thread processor, you would literally be slamming it to almost full capacity? And is this the only game that could actually do that?
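A crude back-of-the-envelope way to think about it (assuming the game's total CPU work stays the same and just redistributes across however many threads exist, which real scheduling won't do perfectly):
Code:
# 36 threads at ~64% average is roughly 23 "thread-equivalents" of work; spread
# over fewer threads, that would pin a 16-thread or 12-thread CPU.
threads_7980xe, avg_load = 36, 0.64
work = threads_7980xe * avg_load                  # ~23 thread-equivalents

for threads in (36, 16, 12):
    load = min(work / threads, 1.0)
    note = "  (saturated)" if work >= threads else ""
    print(f"{threads:2d} threads -> ~{load * 100:.0f}% average load{note}")
Under that rough assumption, yes, an 8C/16T chip would be pinned at or near 100%, with whatever it can't keep up with showing as lower GPU usage instead - which lines up with the 8086K behaviour mentioned earlier in the thread.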