The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GTX 970 VRAM (update: 900M GPUs not affected by 'ramgate')

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cakefish, Jan 23, 2015.

  1. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Watch Dogs v1.06 actually finally fixed the texture streaming issues and made the game A LOT smoother... if you weren't running SLI, because v1.06 completely and utterly broke SLI scaling (I'm talking 20% or less :mad:). Thankfully they tested using a single GPU so this isn't an issue.

    4K results don't need much explanation, but I'll give my take on the 1080p results.

    If you take a closer look at the 1080p results, you'll see that the 980 is actually using 3.9+GB of vram, while the 970 is still stuck below 3.5GB. Maybe the driver detects the game is running "only" at 1080p and thus refuses to let the GPU tap into the 500MB slow segment? In any case, you still see a lot more spikes than on the 980, and the kicker here is that the lowest spike on the 970 is only marginally less than the highest spike on the 980. So I interpret the 1080p result as showing what happens when the driver forcefully caps vram usage at 3.5GB -- even though the slow segment isn't accessed, performance still sucks, because, as you said, it's likely thrashing textures in and out of system ram.
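
    (Side note for anyone who wants to watch for that 3.5GB plateau themselves: a once-per-second poll of nvidia-smi is enough to see whether usage stalls just under 3.5GB at 1080p but climbs past it at 4K. Rough Python sketch below; it assumes nvidia-smi is on the PATH, and it only reports allocated VRAM, not how hard the slow segment is being hit.)

```python
# Minimal VRAM poller: prints used/total dedicated memory per GPU once a second.
# Assumes nvidia-smi is on the PATH; values are allocations in MiB, so a card
# pinned at ~3500 MiB while another climbs toward ~4000 MiB is the pattern to look for.
import subprocess
import time

def vram_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    # One "used, total" line per GPU
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

if __name__ == "__main__":
    while True:
        print("  ".join(f"GPU{i}: {used}/{total} MiB"
                        for i, (used, total) in enumerate(vram_mib())))
        time.sleep(1)
```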
     
  2. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I thought certain software could only detect the first 3.5GB, and thus only read out 3.5GB. But those graphs show stuttering dips; it could be UBI crap's game that is having the problem. That's a weird theory, since Windows manages the vRAM.
     
  3. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The 970 vram readout for 4K was fine (3.9+GB), so I doubt that's the issue.

    Yes, we all know about Ubicrap's Watch Dogs optimization, but again, v1.06 finally fixed the texture streaming issue and actually did make gameplay a lot smoother if you weren't running SLI. Also, why does the 970 experience more spikes AND with greater severity than the 980, especially when FPS is practically the same (31.8 vs 30.0) in the 1080p testing?
     
  4. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Yeah, I dunno. I see a class action lawsuit inc...

    edit --

    NVIDIA has responded. They are working on a driver fix.
     
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is indeed something that needs testing, I'll say. But I agree with Watch Dogs being a bad choice; however, users on Reddit have used better-running games like Shadow of Mordor at 1080p/1440p with the "ultra" textures to push vRAM usage over the 4GB mark, and the 980s cope better than the 970s. But this is something that needs a whole lot of pure testing. The preliminary results, however, undeniably show that using 4GB in a game is a problem for 970s where it is not for 980s. It may not be 100% of 970s, but it does happen in SOME of them, which is not what people bought them knowing, so... yeah.
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Do you have a link?
     
  7. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    This is the Reddit post D2 is referring to btw. tl;dr version below

    And how do you fix a hardware problem with a driver?
     
  8. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    octiceps likes this.
  9. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I wonder what'll happen if I post this over at the nV forums... Instant infraction, temp ban, perma ban? And no I won't actually bother, because first of all it'll just get lost in the stream of posts, and more likely than not some overzealous mod will just delete it anyway. And with that I need to blow off some steam...
     
    octiceps and TomJGX like this.
  10. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Mr. Fox likes this.
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I see this as more reason to switch from Nvidia, as they intentionally screw over gamers not using their products.

    They'll "improve" it with drivers. They won't fix it.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    As a notebook user there's no choice where pure power is concerned, unfortunately. Maybe that is subject to change. As for desktops, especially single GPU, AMD is a pretty solid choice.
     
    Mr. Fox likes this.
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Like I said a month or two ago, if one is going for a single desktop GPU setup right now, the R9 290 would be my recommendation. Even more so after this 970 disaster.
     
    Mr. Fox likes this.
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,341
    Likes Received:
    70,661
    Trophy Points:
    931
    If all I ever did was click-n-run gaming like a console jockey riding a PC donkey, as happy as a 'possum eating dukey with something as simple as 60 FPS gameplay, I'd probably switch to AMD and hope the GPUs last at least 2 years before they crap out.
     
    Zero989 likes this.
  15. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    From HardwareCanucks:

    Battlefield 4:

    Less than 3.5GB VRAM:
    GTX 970 is 15% slower than GTX 980

    More than 3.5GB VRAM:
    GTX 970 is 21% slower than GTX 980

    Result: 6% loss going over 3.5GB VRAM usage


    Dragon Age:

    Less than 3.5GB VRAM:
    GTX 970 is 12% slower than GTX 980

    More than 3.5GB VRAM:
    GTX 970 is 17% slower than GTX 980

    Result: 5% loss going over 3.5GB VRAM usage


    Hitman:

    Less than 3.5GB VRAM:
    GTX 970 is 15% slower than GTX 980

    More than 3.5GB VRAM:
    GTX 970 is 17% slower than GTX 980

    Result: 2% loss going over 3.5GB VRAM usage


    Middle Earth: Shadow of Mordor:

    Less than 3.5GB VRAM:
    GTX 970 is 11% slower than GTX 980

    More than 3.5GB VRAM:
    GTX 970 is 17% slower than GTX 980

    Result: 6% loss going over 3.5GB VRAM usage
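
    (The "loss" figures above are just the difference between the 970's deficit below and above 3.5GB; computed as an exact relative drop it comes out only slightly larger. Quick sketch using the Battlefield 4 numbers as the example:)

```python
# Sketch of how the "loss" figures above are derived (BF4 numbers as the example).
under, over = 0.15, 0.21                     # 970 deficit vs 980: <3.5GB and >3.5GB
simple_loss = over - under                   # 0.06 -> the "6% loss" quoted above
exact_loss = 1 - (1 - over) / (1 - under)    # ~0.071: exact extra relative drop for the 970
print(f"simple difference: {simple_loss:.1%}, exact relative drop: {exact_loss:.1%}")
```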
     
  16. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Will it matter more or less when DX12 is using tiled resources? Another Nvidia fail...
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Again, FPS is meaningless in this regard. Frame times are where it's at.
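
    (To make that concrete: two runs can post nearly the same average FPS while one of them stutters badly, and only the tail of the frame-time distribution exposes it. Rough sketch with made-up frame-time traces, in milliseconds:)

```python
# Same average FPS, very different experience: the 99th-percentile frame time
# exposes the hitches that the average hides. Traces are invented for illustration.
import statistics

def summarize(frame_times_ms):
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]
    return avg_fps, p99

smooth  = [16.7] * 100               # steady ~60 FPS
stutter = [12.0] * 95 + [110.0] * 5  # nearly the same average, periodic 110 ms hitches

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    fps, p99 = summarize(trace)
    print(f"{name}: avg {fps:.1f} FPS, 99th percentile frame time {p99:.1f} ms")
```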
     
  18. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Read the article. It's a pretty good test and a good indication.
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I have no idea what Hardware Canucks did.
     
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Of course. Anything that goes against your hope for major drama against Nvidia is quickly shrugged away. :rolleyes:
     
  21. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    All I know is it seems all other GM204-based GPUs are fine. Lesson here: if you're concerned, don't buy a GTX 970 desktop GPU. They should have just made them 8GB like the mobile counterpart and this likely would not have been an issue.
     
  22. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    According to Anandtech, only the GTX 970, and no mobile graphics card, has the two separate memory banks, so I don't even know why we should care :p

    I even have 2GB more VRAM on my 970M than 970, so suck on it :p

    [attached image]
     
    RMXO likes this.
  23. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Yeah well, I play BF4 on Ultra and I barely use 2.2GB vRAM... Honestly can't see the point of mobile cards having 6-8GB vRAM when desktop cards, which you use for higher resolutions (2K/4K), use more vRAM yet have less of it... Well, at least it's good to know that my 970M didn't get screwed like all the 970 owners...
     
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Because the alternative is 3GB and that won't cut it in a growing number of games ;)
     
  25. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Except I have 3GB vRAM in my 970m and so far so good. We'll see how it goes over the next year or so.
     
  26. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Have you played the games that use over 3GB?
    Maybe there are some differences between games in how aggressively they need to write to/read from VRAM and how much they just cache in VRAM since it's free and available to the game engine?

    I also think it's difficult to see a difference unless you have used the higher-capacity version of the same card with the same game. Load times in games, stuttering, smoothness, etc. Many things can come into play, I guess.
     
  27. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yes, I have, Shadow of Mordor in particular, and it did better than the MSI GS60 with 6GB vRAM using the "High" detail textures that supposedly require 6GB vRAM.
     
  28. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That could be one game that caches a lot. Who knows.
    Don't tell me you actually think 3GB is better than 6GB lol.
     
  29. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    nVIDIA always allocates vRAM AFAIK, even if a game isn't programmed that way. Check BF4 vid mem usage and you'll see it go over 2GB on cards that have more than 2GB. Yet the 680/770 never had stuttering with maxed details using the same settings. It's not always possible to get an actual fix on how much vRAM is really needed until stuttering occurs.
     
    Cloudfire likes this.
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I meant to do a comparison of the GS60 vs P650SE with 6GB vs 3GB vRAM but never got around with it. In any case here's Mordor:

    OC = Overclock of GPU limited by vBIOS to +135MHz to 1059MHz with boost to 1194MHz, vRAM at 6000MHz

    [4 attached benchmark charts]
     
    Last edited: Jan 29, 2015
    Cloudfire and LoneSyndal like this.
  31. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    See? The 980 and 970 aren't using max vRAM, yet the 980 allocates more. I bet a 4GB card is fine for ultra textures.
     
  32. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    More user testing from OCN

    Oh and remember PeterS's promise to do his best to help if someone wanted to return the cards? Yeah about that:

     
    Zero989 likes this.
  33. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Wow. "Disgusting company" sounds like a compliment for Nvidia. There are no words...
     
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nice try, Nvidia aficionado, but I was simply pointing out that Hardware Canucks reported their FCAT data incorrectly/not in the proper format, which makes it very hard to deduce what their results were. I expected to see frame time and frame variance by percentile line graphs, not a rounded average FPS bar chart.
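
    (For reference, that kind of breakdown is straightforward once you have per-frame times: report several percentiles instead of one rounded average and the stutter tail shows up immediately. Rough sketch below; the two traces are made up purely for illustration, not measured 970/980 data.)

```python
# Frame-time percentile breakdown, the kind of presentation expected from FCAT data:
# the median can look identical while the tail (p99/p99.9) tells the real story.
def percentile(frame_times_ms, p):
    s = sorted(frame_times_ms)
    return s[min(len(s) - 1, int(p / 100 * len(s)))]

def report(name, frame_times_ms):
    cols = (50, 90, 95, 99, 99.9)
    print(name + ": " + "  ".join(
        f"p{p}={percentile(frame_times_ms, p):.1f}ms" for p in cols))

# Hypothetical traces purely for illustration:
report("card A", [14.0] * 990 + [20.0] * 10)
report("card B", [13.0] * 950 + [45.0] * 50)
```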
     
    Last edited by a moderator: Jan 29, 2015
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Do you have a 120/144 Hz screen?
     
  36. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    My concern right now is this: what if the major hardware sites simply don't bother to do any frame time testing, or they run it but present the results in a way that obfuscates the issue at hand and allows plausible deniability on nVidia's part? We've already seen nVidia's official press release with average FPS numbers, which to me was pretty much useless.

    And if you want a conspiracy-theory spin on this, consider that nVidia released the FCAT tool basically to highlight AMD's frame pacing issue in CF and publicly shame them, when AMD's defense was "oh but look, we get very high average FPS numbers, so there's no problem!". But now nVidia is doing the exact same damn thing of only releasing average FPS, which does jack for determining whether there would be stutter in this context. It is quite curious, then, that a tool nVidia designed specifically to detect this problem is not being used by nVidia themselves in the one situation where it would be most appropriate. Perhaps the truth really hurts?

    [/tinfoil hat]
     
    octiceps likes this.
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You really should give my vRAM guide a read-through if you think bumping the game's resolution is the largest factor affecting vRAM usage. Also, downsampling/supersampling requires extra vRAM as if you were running the higher resolution natively.

    It's more difficult if you have external monitors plugged in, as they can use a lot of vRAM while gaming (especially on Windows 8.1). If you're fullscreening your games, your vRAM will stretch a lot further. I also have no idea what to think about Shadow of Mordor and its vRAM hoggage. Seems like it barely needs the thing.
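
    (To put rough numbers on the resolution point: render-target memory scales with pixel count, so 4x DSR from 1080p costs about the same framebuffer memory as native 4K. Back-of-envelope sketch below; it assumes 4 bytes per pixel and a handful of full-resolution buffers, real engines vary a lot, and textures usually dominate the total anyway.)

```python
# Back-of-envelope only: framebuffer-style memory scales with pixel count, so
# 4x DSR from 1080p lands in the same ballpark as native 4K. Real engines use
# many more (and differently sized) render targets, and textures usually
# dominate total vRAM, so treat these as rough lower bounds.
def render_target_mib(width, height, buffers=4, bytes_per_pixel=4):
    return width * height * buffers * bytes_per_pixel / 2**20

for label, (w, h) in {
    "1080p native": (1920, 1080),
    "1440p native": (2560, 1440),
    "4x DSR from 1080p / native 4K": (3840, 2160),
}.items():
    print(f"{label}: ~{render_target_mib(w, h):.0f} MiB in render targets")
```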
     
  38. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Exactly. Nvidia lied and is still lying to try to cover up their previous lies. I don't believe for one second that they didn't know about this during development and testing. They created FCAT, for crying out loud. You can't tell me they don't use their very own tools.
     
  39. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Before anyone tells me to go get a refund if I'm so unhappy... I spent $350 on watercooling these cards, and there's no way in hell anyone is ever going to refund me a dime for those, so I'm pretty much screwed either way.
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  41. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    The point is this: if Nvidia had marketed this as 3.5GB, would you not have bought it? Was specifically having exactly 4GB, not 3.5GB, of full-speed VRAM a major factor in your buying decision?

    If the answer is no, SHUT THE HELL UP
     
  42. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So neither you nor AMD can count? 3.5GB + 0.5GB = 4GB.
    If they can't get simple math like this straight, no wonder AMD is losing a ton of money.


    So you get to decide how a reviewer presents the facts? No.
    If they want to take all the data from different measurements across a whole test using FCAT, run the test 5+ times, and bake it into an average FPS, that is just as valid.

    You don't like it because it presented the card in a relatively positive manner. So you couldn't continue your pathetic anti-Nvidia campaign.


    So let me get this straight.
    You didn't like pure FPS numbers from reviews because you couldn't see details.
    Now we have a tool like FCAT that measures frame rates across the whole test and gives a complete picture, and suddenly it's a conspiracy where reviewers and the program try to cover it all up.

    Ok, derp. :confused:

    But why are you unhappy? Have you noticed anything when gaming?

    Most wouldn't have, because the GTX 970 is still a really good card for performance/dollar. The gaming performance is still at the top no matter what happens.

    There is a growing number of reviews that found little to nothing using FCAT while testing various games over 3.5GB usage.
     
    Last edited: Jan 29, 2015
  43. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    I think that AMD poster was aimed at their own cards: when AMD says the card has 4GB then it's REALLY 4GB and not 3.5+0.5 :D
     
  44. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
  45. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Yo HT, what are the OC clocks? For the P650SE and MSI GS60? Just wondering because of the massive increase in performance with the OC..
     
  46. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Ya holy! That is one monster 970m
     
  47. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    To use a very bad car analogy, you'd be ok with buying a boosted V6 that runs like a V8, but isn't actually a V8.

    First of all, sometimes it's the PRINCIPLE that matters: I paid for something I didn't get, no matter how trivial it may seem, but I'm sure I'll get burned for this. And second, actually yes, if nVidia had properly advertised the 970 as a 3.5GB 224-bit card, I would've actually gotten the 980 instead.


    No, average FPS DOES NOT yield any useful information in this regard. Cloud please, this issue is exactly like the AMD frame pacing issue I mentioned a while back. This is not something that an AVERAGE FPS number will tell you anything about. Why do you think nVidia's official press release only showed AVERAGE FPS numbers and nothing else?

    Yes I know HardwareCanucks ran FCAT tests, but until they post the frame time results, it still doesn't mean much. It's possible to get high FPS while still stuttering, due to uneven frame output, which is exactly what happened with AMD before they finally implemented frame pacing in their drivers. AMD's defense all along was "the average FPS numbers are very high, how can there be stuttering?". So nVidia released FCAT to prove there was stuttering. Yet now the same thing is happening to nVidia themselves, and they are using the exact same defense.

    There's a couple of reasons here. The non-technical ones aside, I usually put on as much eye candy as I can and crank up that DSR (effectively SSAA), so vram is of absolute importance.

    The problem with FCAT testing on a single card is that people could (and have been) simply saying "yeah ok so there are some differences, but you're getting like 25 FPS anyway so you're not going to be playing that game". Which is true. BUT the key issue here is for multi-GPU setups. Even when you have enough GPU core power to push 60 frames with both the 970 and 980, the 970 would still crap out prematurely due to the vram wall, even though the 980 would still keep trucking on. I know it may sound silly to argue over 500MB, but as we've seen already, the trend is for PC ports to get crappier optimization and for vram requirements to balloon like crazy. Especially when running at high resolutions (or using lots of SSAA), vram will start getting chewed up like crazy, and the last thing I want is for my performance to tank because of vram and not because I ran out of GPU power. If you need an example, think 780 Ti and Watch Dogs. (Yeah ok, not the best example, but that's a case of the GPU hitting the vram wall and killing performance.)

    So while this may or may not be as big of a deal for those running single GPUs, for people running with multiple cards it's a dealbreaker, and this problem will unfortunately become more and more common as newer games (and older games when you run 4x DSR) start hitting that 3.5GB vram wall.

    tl;dr Essentially breaks 4K/high DSR gaming for 970, especially for future titles.
     
    octiceps likes this.
  48. Qyyz

    Qyyz Notebook Enthusiast

    Reputations:
    10
    Messages:
    49
    Likes Received:
    29
    Trophy Points:
    26
  49. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Guru3D
    Middle Earth: Shadow of Mordor: No problems with the 970

    PCLab
    Assassin's Creed 4: No problem. Fewer framerate "problems" than the 980
    Shadow of Mordor: No problems
    Far Cry 4: No problems

    HardwareCanucks
    BF4
    Dragon Age
    Hitman
    Middle Earth
    Used FCAT and got an average FPS drop of 2-6% with over 3.5GB VRAM usage compared to the 980.
     
  50. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Oh man this is turning into a P-R cluster____ for nVidia. So remember how PeterS promised to help everyone with returns? Yeah he backtracked and recanted his statement:

    Luckily I have a copy of the original right here in this thread, let me pull it out:

    Notice how the offer to help with returns/exchanges has been rescinded? :rolleyes:

    You dun goofed nVidia, you dun goofed.
     
    octiceps likes this.