Watch Dogs v1.06 actually finally fixed the texture streaming issues and made the game A LOT smoother... if you weren't running SLI, because v1.06 completely and utterly broke SLI scaling (I'm talking 20% or less). Thankfully they tested using a single GPU so this isn't an issue.
4K results don't need much explanation, but I'll give my take on the 1080p results.
If you take a closer look at the 1080p results, you'll see that the 980 is actually using 3.9+GB of vram, while the 970 is still stuck below 3.5GB. Maybe the driver detects the game is running "only" at 1080p and thus refuses to let the GPU tap into the 500MB slow segment? In any case, you still see a lot more spikes on the 970 than on the 980, and the kicker here is that the lowest spike on the 970 is only marginally less than the highest spike on the 980. So I interpret the 1080p result as showing what happens when the driver forcefully caps vram usage to 3.5GB -- even though the slow segment isn't accessed, performance still sucks, because as you said it's likely thrashing textures in and out of system ram.
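If anyone wants to watch this happen on their own card, here's a minimal polling sketch (my own suggestion, not something from the review -- it assumes the pynvml package from nvidia-ml-py and an NVML-capable driver). Run it alongside the game and see whether the reported usage plateaus around 3.5GB on the 970 while the 980 climbs past it:

```python
# Minimal VRAM polling sketch (assumes the pynvml package, i.e. nvidia-ml-py,
# and an NVML-capable NVIDIA driver). Prints used/total VRAM once a second.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**2:7.0f} MiB / {mem.total / 1024**2:7.0f} MiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```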
-
I thought certain software could only detect the first 3.5GB, and thus only read out 3.5GB. But those graphs show stuttering dips; it could be UBI crap's game that is having the problem. That's a weird theory, though, since Windows manages the vRAM.
-
The 970 vram readout for 4K was fine (3.9+GB), so I doubt that's the issue.
Yes, we all know about Ubicrap's Watch Dogs optimization, but again, v1.06 finally fixed the texture streaming issue and actually did make gameplay a lot smoother if you weren't running SLI. Also, why does the 970 experience more spikes, AND with more severity, than the 980, especially when FPS is practically the same (31.8 vs 30.0) in the 1080p testing? -
edit --
NVIDIA has responded. They are working on a driver fix. -
-
-
This is the Reddit post D2 is referring to btw. tl;dr version below
-
https://forums.geforce.com/default/...tx-970-3-5gb-vram-issue/post/4438090/#4438090
No idea how they will fix it. It's too technical for me. -
-
I would love nothing more than to switch to AMD as nVIDIA is a disgusting company.
But alas...
https://www.youtube.com/watch?v=69fsTLJw4oQ -
-
As a notebook user, there's no choice where pure power is concerned, unfortunately. Maybe that is subject to change. As for desktops, especially single GPU, AMD is a pretty solid choice.
-
Like I said a month or two ago, if one is going for a single desktop GPU setup right now, the R9 290 would be my recommendation. Even more so after this 970 disaster.
-
If all I ever did was click-n-run gaming like a console jockey riding a PC donkey, as happy as a 'possum eating dukey with something as simple as 60 FPS gameplay, I'd probably switch to AMD and hope the GPUs last at least 2 years before they crap out.
-
From HardwareCanucks:
Battlefield 4:
Less than 3.5GB VRAM: GTX 970 is 15% slower than GTX 980
More than 3.5GB VRAM: GTX 970 is 21% slower than GTX 980
Result: 6% additional loss going over 3.5GB VRAM usage
Dragon Age:
Less than 3.5GB VRAM: GTX 970 is 12% slower than GTX 980
More than 3.5GB VRAM: GTX 970 is 17% slower than GTX 980
Result: 5% additional loss going over 3.5GB VRAM usage
Hitman:
Less than 3.5GB VRAM: GTX 970 is 15% slower than GTX 980
More than 3.5GB VRAM: GTX 970 is 17% slower than GTX 980
Result: 2% additional loss going over 3.5GB VRAM usage
Middle Earth: Shadow of Mordor:
Less than 3.5GB VRAM: GTX 970 is 11% slower than GTX 980
More than 3.5GB VRAM: GTX 970 is 17% slower than GTX 980
Result: 6% additional loss going over 3.5GB VRAM usage
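If you want to sanity-check how those "Result" numbers fall out of the two percentages, here's a quick sketch of the arithmetic (my own, not HardwareCanucks' methodology): the simple difference is what's quoted above, while the ratio form expresses the 970's own extra slowdown once it crosses 3.5GB.

```python
# Quick arithmetic check of the numbers above. slower_below / slower_above are
# how much slower the 970 is than the 980 under / over 3.5GB of VRAM usage.
def extra_loss(slower_below, slower_above):
    simple = slower_above - slower_below                 # the quoted "Result" figure
    ratio = 1 - (1 - slower_above) / (1 - slower_below)  # 970's own relative drop
    return simple, ratio

games = [("Battlefield 4", 0.15, 0.21),
         ("Dragon Age", 0.12, 0.17),
         ("Hitman", 0.15, 0.17),
         ("Shadow of Mordor", 0.11, 0.17)]

for name, below, above in games:
    simple, ratio = extra_loss(below, above)
    print(f"{name:18s} simple difference: {simple:.0%}   relative loss: {ratio:.1%}")
```
-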
Will it matter more or less when DX12 is using tiled resources? Another nvidia fail...
-
-
-
I have no idea what Hardware Canucks did.
-
-
All I know is it seems all other GM204-based GPUs are fine. Lesson here: if you're concerned, don't buy a GTX 970 desktop GPU. They should have just made them 8GB like the mobile counterpart, and this likely would not have been an issue.
-
I even have 2GB more VRAM on my 970M than 970, so suck on it
-
Yeah well, I play BF4 on Ultra and I barely use 2.2GB vRAM... Honestly I can't see the point of mobile cards having 6-8GB vRAM when desktop cards, which you use for higher resolutions (2K/4K), use more vRAM yet have less... Well, at least it's good to know that my 970M didn't get screwed like all the 970 owners...
-
-
-
Maybe there are some differences between games in how aggressively they need to read/write VRAM and how much they just cache in VRAM since it's free and available to the game engine?
I also think it's difficult to see a difference unless you have used the higher-capacity version of the same card in the same game. Load times in games, stuttering, smoothness, etc. Many things can come into play, I guess. -
-
Don't tell me you actually think 3GB is better than 6GB lol. -
nVIDIA always allocates vRAM AFAIK, even if a game isn't programmed that way. Check BF4 video memory usage and you'll see it go over 2GB with some cards that have more than 2GB. Yet the 680/770 never had stuttering with maxed details using the same settings. It's not always possible to get an actual fix on the amount of vRAM usage until stuttering occurs.
-
OC = Overclock of GPU limited by vBIOS to +135MHz to 1059MHz with boost to 1194MHz, vRAM at 6000MHz
-
See? The 980 and 970 aren't using max VRAM, yet the 980 allocates more. I bet a 4GB card is fine for ultra textures.
-
More user testing from OCN
Oh and remember PeterS's promise to do his best to help if someone wanted to return the cards? Yeah about that:
-
Wow. "Disgusting company" sounds like a compliment for nvidia. There are no words...
-
-
-
My concern right now is this: what if the major hardware sites simply don't bother to do any frame time testing, or they run them but present the results in such a way that obfuscates the issue at hand, and allows plausible deniability on nVidia's part? We've already seen nVidia's official press release regarding average FPS numbers, which to me was pretty well useless.
And if you want a conspiracy theory spin on this, consider that nVidia released the FCAT tool basically to highlight AMD's frame pacing issue in CF and publicly shame them, when AMD's defense was "oh but look, we get very high average FPS numbers so there's no problem!". But now nVidia is doing the exact same damn thing of only releasing average FPS, which does jack for determining whether there would be stutter in this context. It is quite curious, then, that a tool nVidia designed specifically to detect this problem is not being used by nVidia themselves in the one situation where it would be most appropriate. Perhaps the truth really hurts?
[/tinfoil hat]
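For what it's worth, here is roughly what I mean by frame times telling a different story than average FPS. This is a sketch under my own assumptions (a plain text log with one frame time in milliseconds per line, FRAPS/FCAT-dump style); the point is that the average can look fine while the 99th percentile and worst-case frame times expose the stutter:

```python
# Sketch: average FPS vs. frame time percentiles. Assumes a plain text log
# with one frame time in milliseconds per line (FRAPS/FCAT-style dump).
import statistics
import sys

def summarize(path):
    with open(path) as f:
        frame_times_ms = [float(line) for line in f if line.strip()]
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99 = statistics.quantiles(frame_times_ms, n=100)[98]  # 99th percentile
    worst = max(frame_times_ms)
    print(f"{path}: avg {avg_fps:.1f} FPS, 99th percentile {p99:.1f} ms, worst {worst:.1f} ms")

if __name__ == "__main__":
    for log in sys.argv[1:]:
        summarize(log)
```

Two cards can post nearly identical average FPS while one has a far worse 99th percentile, which is exactly the 970 vs 980 situation being described here.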
-
-
Before anyone tells me to go get a refund if I'm so unhappy... I spent $350 on watercooling these cards, and there's no way in hell anyone is ever going to refund me a dime for those, so I'm pretty much screwed either way.
-
The point is this: if Nvidia had marketed this as 3.5GB, would you not have bought it? Was specifically having exactly 4GB, not 3.5GB, of full-speed VRAM a major factor in your buying decision?
If the answer is no, SHUT THE HELL UP -
If they can't get simple math like this straight, no wonder AMD is losing a ton of money.
If they want to take all the data from different measurements across a whole test using FCAT, run the test 5+ times, and bake it into an average FPS, that is just as valid.
You don't like it because it presented the card in a relatively positive manner, so you couldn't continue your pathetic anti-Nvidia campaign.
You didn't like pure FPS numbers from reviews because you couldn't see details.
Now we have a tool like FCAT that measures frame rates across the whole test and gives a complete picture, and suddenly it's a conspiracy where the reviewers and the program try to cover it all up.
Ok, derp.
There is a growing number of reviews that found nothing to very little using FCAT while testing various games over 3.5GB usage. -
I think that AMD poster was aimed at their own cards: when AMD says the card has 4GB then it's REALLY 4GB and not 3.5+0.5.
-
-
Ya holy! That is one monster 970m
-
First of all, sometimes it's the PRINCIPLE that matters: I paid for something I didn't get, no matter how trivial it may seem, but I'm sure I'll get burned for this. And second, actually yes, if nVidia had properly advertised the 970 as a 3.5GB 224-bit card, I would've actually gotten the 980 instead.
Yes I know HardwareCanucks ran FCAT tests, but until they post the frame time results, it still doesn't mean much. It's possible to get high FPS while still stuttering, due to uneven frame output, which is exactly what happened with AMD before they finally implemented frame pacing in their drivers. AMD's defense all along was "the average FPS numbers are very high, how can there be stuttering?". So nVidia released FCAT to prove there was stuttering. Yet now the same thing is happening to nVidia themselves, and they are using the exact same defense.
The problem with FCAT testing with a single card is that people could say (and have been saying) "yeah ok, so there's some differences, but you're getting like 25 FPS anyway so you're not going to be playing that game". Which is true. BUT, the key issue here is for multi-GPU setups. Even when you have enough GPU core power to push 60 frames with both the 970 and 980, the 970 would still crap out prematurely due to the vram wall, even though the 980 would still keep trucking on. I know it may sound silly to argue over 500MB, but as we've seen already, the trend is for PC ports to get crappier optimization and for vram requirements to balloon like crazy. Especially when running at high resolutions (or using lots of SSAA), vram will start getting chewed up like crazy, and the last thing I want is for my performance to tank because of vram rather than because I ran out of GPU power. If you need an example, think 780 Ti and Watch Dogs. (Yeah ok, not the best example, but that's a case of the GPU hitting the vram wall and killing performance.)
So while this may or may not be as big of a deal for those running single GPUs, for people running multiple cards it's a dealbreaker, and this problem will unfortunately become more and more common as newer games (and older games when you run 4x DSR) start hitting that 3.5GB vram wall.
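To put rough numbers on the DSR point (back-of-the-envelope only; I'm assuming 4 bytes per pixel per render target and a handful of full-resolution buffers, and real engines vary wildly):

```python
# Back-of-the-envelope framebuffer math for the DSR point above. Assumes
# 4 bytes per pixel per render target and 6 full-resolution buffers
# (G-buffer, depth, post-processing); illustrative numbers only.
def render_target_mb(width, height, targets=6, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * targets / 1024**2

cases = [("1080p", 1920, 1080),
         ("4K (also 1080p with 4x DSR)", 3840, 2160),
         ("4K with 4x DSR (7680x4320)", 7680, 4320)]

for label, w, h in cases:
    print(f"{label:30s} ~{render_target_mb(w, h):5.0f} MB just for render targets")
```

Render targets alone roughly quadruple going from 1080p to 4K, and that's before textures, which is how you drift into that last 500MB without the GPU core being anywhere near its limit.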
tl;dr Essentially breaks 4K/high DSR gaming for the 970, especially for future titles. -
My results (Gigabyte P35X, 980M 8GB):
https://www.paste.to/2pFFhF7g
Seems the 980M 8GB can allocate all of its VRAM without issues.
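If anyone else wants to poke at their card the same way, here's a rough sketch of the idea behind these allocation tests. It's my own CuPy-based take, not the actual tool used for that paste: allocate VRAM in 128 MiB blocks, then time how fast each block can be touched. On a 970 the expectation would be that the last ~500MB worth of blocks comes in much slower, while on an 8GB 980M everything should look flat.

```python
# Rough VRAM allocation + bandwidth sketch (assumes CuPy and a CUDA-capable
# GPU). Allocates 128 MiB blocks until allocation fails, then measures how
# fast each block can be read and written in place.
import time
import cupy as cp

BLOCK_MB = 128
blocks = []
try:
    while len(blocks) * BLOCK_MB < 8192:  # safety cap at 8 GiB
        blocks.append(cp.zeros(BLOCK_MB * 1024 * 1024 // 4, dtype=cp.float32))
except cp.cuda.memory.OutOfMemoryError:
    pass

print(f"Allocated {len(blocks) * BLOCK_MB} MiB in {len(blocks)} blocks")

for i, block in enumerate(blocks):
    cp.cuda.Device().synchronize()
    start = time.perf_counter()
    for _ in range(20):
        block += 1.0  # in-place read-modify-write over the whole block
    cp.cuda.Device().synchronize()
    elapsed = time.perf_counter() - start
    gbps = (2 * block.nbytes * 20) / elapsed / 1e9  # read + write per pass
    print(f"block {i:3d} (~{(i + 1) * BLOCK_MB:5d} MiB cumulative): {gbps:6.1f} GB/s")
```
-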
Guru3D:
Middle Earth: Shadow of Mordor: No problems with the 970
PCLab:
Assassin's Creed 4: No problems. Fewer framerate "problems" than the 980
Shadow of Mordor: No problems
Far Cry 4: No problems
HardwareCanucks:
BF4
Dragon Age
Hitman
Middle Earth
Used FCAT and got an average FPS drop of 2-6% with over 3.5GB VRAM usage compared to the 980. -
Oh man, this is turning into a PR cluster____ for nVidia. So remember how PeterS promised to help everyone with returns? Yeah, he backtracked and recanted his statement:
You dun goofed, nVidia, you dun goofed.