In the end, it doesn't really matter. 970 sales went through the roof for a good reason; the fact that it has two memory partitions doesn't negate its good price/performance ratio...
-
It's still dishonest advertising at best. I'm not as outraged as I should be because I never intended for my 970s to be a long-term, "future-proof" solution. I always viewed them as stop-gap cards, and they still are to me. But I can see how someone who picked up a 970 hoping to get a few years out of it would be pretty peeved.
-
Regardless of price, I was never particularly impressed by how heavily the 970 was cut down compared to the 980, relative to previous x70/x80 pairs on the same ASIC (770 vs. 780 doesn't count, since that was GK104 vs. GK110). This is just further validation of that.
-
And now we know why the 970 is so cheap: no reference design, a coil whine lottery, and the memory partition. Talk about a mess.
-
It reminds me of this article: Video Card Failure Rates by Generation - Puget Custom Computers
Now, that's only from one AIB (ASUS) and mostly DirectCU cards, but perhaps the reason AMD cards fail more often is that they're cheaper: the AIB partners cheap out on PCB and component design to meet margins. That's why it's generally recommended to get AMD reference boards instead of non-reference ones.
-
Meaker@Sager Company Representative
Well, the good news is that the ROPs on the mobile cards are fully intact (where applicable), so this issue does not impact us.
-
Meaker@Sager Company Representative
AnandTech | GeForce GTX 970: Correcting The Specs & Exploring Memory Allocation
GTX 980: 1 segment (4x8 MC)
GTX 970: 2 segments (4x7 MC)
GTX 980M: 1 segment (4x8 MC)
GTX 970M: 1 segment (3x6 MC)
GTX 965M: 1 segment (2x4 MC)
-
Wrong page, but I see it.
AnandTech | GeForce GTX 970: Correcting The Specs & Exploring Memory Allocation
-
Meaker@Sager Company Representative
I linked the start of the article so people could read the whole thing, then posted the important information in the table.
-
thegreatsquare Notebook Deity
-
Meaker@Sager Company Representative
Yes, the TMUs are connected to the shader units, so they get disabled along with the shaders. This is how it has always been stated; GPU-Z just hasn't been able to handle it properly.
-
Myself, I'm actually more p!ssed about this than about the vRAM wall, because unless I've misunderstood the situation, this is simply outright deception. There's just no reasonable explanation for why nVidia never bothered to correct the specs while the 970 was selling like hotcakes and sold out everywhere, yet suddenly, four months later when the feces have hit the fan, they come out and correct them. -
Tell that to the duped 880M owners. It's been, what, 10 months and counting? 2014 was not kind to Nvidia.
-
As for the deception thing, it was supposedly a misunderstanding. Seeing as their bloody website apparently lists the GTX 470's specs when you click on the GTX 480, and they have all their boost clocks set wrong, I'd be inclined to believe that. But either way, people were not marketed a card with 3.5GB of fast vRAM and 512MB of slow vRAM.

Also, due to the way the card is wired, a game can get pushed onto the slow vRAM with only ~3GB in use by the game itself, because Windows and windowed mode/borderless windowed/etc. take their cut too. Hell, I'm sitting here using 631MB of vRAM with basically nothing open: Chrome, Steam, TeamSpeak, music, and TweetDeck. If I had a 970, that means I couldn't use Insane textures if I launched a game like Titanfall, because it'd switch over to the slow memory. Imagine, my mobile GPU outperforming a big bad 970 because nVidia wired it like Frankenstein's monster.
nVidia should offer something back to people who bought it expecting 4GB of fast memory, and should make sure people know about its issues in the future.
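For anyone curious how much their desktop alone is already eating, here's a minimal sketch (my own illustration, not anything official; it assumes an NVIDIA driver with NVML available and linking against -lnvidia-ml):

```cpp
// Minimal sketch: print how much vRAM the desktop and background apps
// are already using before a game even launches. Assumes an NVIDIA
// driver with NVML; build with g++ or nvcc and link -lnvidia-ml.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);      // first GPU in the system

    nvmlMemory_t mem;
    nvmlDeviceGetMemoryInfo(dev, &mem);       // values are in bytes
    std::printf("vRAM in use before launching anything: %llu MiB of %llu MiB\n",
                (unsigned long long)(mem.used >> 20),
                (unsigned long long)(mem.total >> 20));

    nvmlShutdown();
    return 0;
}
```

Every MiB that number shows is a MiB a game can't use before spilling toward the slow segment. -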
They should have just sold the 970 as a 3.5GB card and left it at that; it would have been another distinguishing factor between the 970 and 980. But honestly, why not 8GB of vRAM, since their mobile counterparts have that much?
-
moviemarketing Milk Drinker
-
In any case, I am seriously p!ssed, because I am very much a principles guy. I don't game at 4K and have never run into this vRAM wall, but I'm just as upset as those who are affected, because the product has been very clearly (and intentionally) misrepresented by nVidia. That is absolutely unacceptable, and it really grinds my gears. -
But I do understand your frustration and your inability to give them the benefit of the doubt. Things should have been quadruple-checked before launch. Either way, they should have to pay for their mistake, whether it was genuine or not. -
Isn't vRAM simply mirrored in SLI, so you end up with the same bandwidth? If anything, I'd think those running multiple cards would be the most likely to notice this issue, because with a single 970 you'll run out of GPU power long before you hit the vRAM wall.
-
Texture size (not render resolution) is the main hit to vRAM in games. Remember, I can maintain near 60fps in Titanfall on pretty much max settings on a single 780M while using nearly 4000MB of vRAM.
-
Even with double bandwidth on the slow segment, it's still about 1/4 of the fast segment, so I imagine it should cause some degree of hitching/stuttering/magical rainbows.
-
I don't think Watch Dogs' stuttering is necessarily due to VRAM when you have 3GB+, but due to its poorly coded texture streaming. If you recall, that patch near the end of last year fixed pretty much all stuttering on single GPU systems, but completely broke SLI scaling.
-
Oh, and we all know Ubitoad is responsible for Watch Dogs' stuttering. Maldo's mod, which re-packs the "ultra" textures into the "high" setting, eliminated LARGE amounts of stuttering for most people; I'm fairly sure the game was coded to be terrible on purpose. -
Yeah, I remember us having that discussion about whether it was malice or sheer incompetence.
-
Been through your guides already.
-
Meaker@Sager Company Representative
-
NVIDIA might like to claim there's no performance impact, but I'm relieved that the 980M (and 970M, 965M) aren't structured in this way.
It's quite interesting to see that the 980M has more ROPs active and more L2 cache than a high-end desktop 970, despite having fewer SMMs enabled. -
Meaker@Sager Company Representative
It's not really surprising though. Look at that spread of core configurations: they can use practically every chip they make. Chips with faulty cache and/or shader units go into the 970; chips with faulty shaders but still good efficiency go into the 970M and 980M; and decently power-efficient chips with something really wrong around the core go into the 965M.
-
Aaaaaaaaaaand there we have it: user testing proves the 970 is, without a doubt, a 3.5GB card.
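For reference, here's roughly the shape of that user test: a minimal CUDA sketch in the spirit of the community benchmark (the chunk size, kernel, and launch configuration are my own illustrative choices, not the actual tool). It carves vRAM into 128 MiB chunks and times a read kernel on each one:

```cuda
// Minimal sketch of a per-chunk vRAM bandwidth probe. Illustrative
// values throughout; build with nvcc.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void readChunk(const float4* src, float4* sink, size_t n) {
    size_t i = blockIdx.x * blockDim.x + threadIdx.x;
    float4 acc = make_float4(0.f, 0.f, 0.f, 0.f);
    for (size_t j = i; j < n; j += (size_t)gridDim.x * blockDim.x) {
        float4 v = src[j];
        acc.x += v.x; acc.y += v.y; acc.z += v.z; acc.w += v.w;
    }
    if (i == 0) *sink = acc;   // keep the loads from being optimized away
}

int main() {
    const size_t chunkBytes = 128ull << 20;            // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float4);

    float4* sink = nullptr;
    cudaMalloc((void**)&sink, sizeof(float4));

    // Grab as many chunks as the driver will hand out.
    std::vector<float4*> chunks;
    for (;;) {
        float4* p = nullptr;
        if (cudaMalloc((void**)&p, chunkBytes) != cudaSuccess) break;
        chunks.push_back(p);
    }

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    for (size_t c = 0; c < chunks.size(); ++c) {
        cudaEventRecord(start);
        readChunk<<<256, 256>>>(chunks[c], sink, n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("chunk %2zu: %7.1f GB/s\n", c, (chunkBytes / 1e9) / (ms / 1e3));
    }
    return 0;
}
```

On a healthy 4GB card every chunk should report roughly the same bandwidth; on a 970, the chunks that land in the last 512MB fall off a cliff.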
gg nVidia -
This is shaping up to be extremely funny. I wonder what nVidia has to say about this.
-
Well, I'm pissed, yes, but I do find comic value in nVidia's PR doublespeak. I've also got some popcorn ready to see what happens next.
Most likely nothing. People will forget about this in a month or two, and nVidia will continue to outsell AMD 3 to 1, even though their cards cost more and their drivers are just as **** now. -
I've been gone for a while, and it's too much hassle to go many pages back. What have we learned these last few days? Anything new?
n=1, pissed? Why? Doesn't the 970 perform like it should in games? -
-
This whole thing is a mess.
I've seen posts and tests showing close to 4GB usage on the 970 before.
The link that was posted earlier shows 4GB usage with Watch Dogs, and they didn't get any performance hit:
Investigating the 970 VRAM Issue : pcmasterrace -
Yes, I've experienced the same. But as someone pointed out, the issue only happens when the game actually NEEDS >3.5GB of vRAM, not when it's merely allocating it. Basically, allocated vRAM =/= actively in use, which is why there are no issues when Watch Dogs gobbles up vRAM like mad.
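To make the allocated-vs-used distinction concrete, here's a minimal sketch (sizes are my own illustrative choices, and it assumes the driver maps the buffer roughly linearly):

```cuda
// Minimal sketch: reserve ~3.8GB of vRAM but only ever touch the first
// ~3GB. A monitoring tool would report near-4GB "usage", yet the slow
// segment is never actually exercised.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void touch(char* p, size_t n) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] += 1;   // genuinely read and write the byte
}

int main() {
    const size_t total  = 3800ull << 20;   // "allocated": ~3.8GB
    const size_t active = 3000ull << 20;   // "in use":    ~3.0GB

    char* buf = nullptr;
    if (cudaMalloc((void**)&buf, total) != cudaSuccess) {
        std::puts("allocation failed (the desktop may already be using vRAM)");
        return 1;
    }
    touch<<<(unsigned)((active + 255) / 256), 256>>>(buf, active);
    cudaDeviceSynchronize();
    std::puts("near-4GB allocated, but the slow segment was never touched");
    return 0;
}
```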
-
The horse is probably dead, but I'm gonna beat it again anyway, just in case.
So we now have definitive proof that the 970 suffers from stuttering once the vRAM requirement goes beyond 3.5GB. Have a look at the frametime graphs here:
Look at the frametimes for Watch Dogs @ 4K (bottom right). See how all those spikes occur in close succession? Yeah, that's going to manifest as stuttering. I mean, look at that graph: it's pretty much a block of spikes at 150ms or worse all the way through, Jesus Christ.
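If you'd rather quantify that than eyeball the graph, here's a minimal sketch (the file name and format are hypothetical; use whatever your capture tool exports):

```cpp
// Minimal sketch for quantifying stutter from a frametime log
// (hypothetical frametimes.txt: one frametime in milliseconds per line).
// Counts spikes over a threshold and how tightly they cluster;
// clustered spikes are exactly what reads as stutter.
#include <cstdio>
#include <fstream>
#include <vector>

int main() {
    std::ifstream in("frametimes.txt");    // hypothetical log file
    std::vector<double> ft;
    for (double ms; in >> ms; ) ft.push_back(ms);
    if (ft.empty()) return 1;

    const double threshold = 100.0;        // the graphs above sit at 150ms+
    int spikes = 0, run = 0, worstRun = 0;
    for (double ms : ft) {
        if (ms > threshold) {
            ++spikes; ++run;
            if (run > worstRun) worstRun = run;
        } else {
            run = 0;
        }
    }
    std::printf("%d of %zu frames over %.0f ms; worst cluster: %d in a row\n",
                spikes, ft.size(), threshold, worstRun);
    return 0;
}
```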
Google Translate of the last two paragraphs: (emphasis mine)
-
I still don't see how that's definitive proof. For one, they're using the poorly coded Watch Dogs.
They run Ultra at 1080p to get 3.5GB
Then they run High at 4k to get 4.0GB
Who's to say it isn't due to thrashing textures in and out of RAM? The 980 shows spikes to 150 or so as well, just not as frequently. Maybe the 970 simply can't handle the 4K load as well. Are the video cards the same brand, and is it consistent from card to card? -