The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GTX 970 VRAM (update: 900M GPUs not affected by 'ramgate')

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cakefish, Jan 23, 2015.

  1. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    2nd update: 900M GPUs confirmed to be unaffected. Only the desktop GTX 970 has the split memory pools of 3.5GB+0.5GB.

    Update: http://www.pcper.com/news/Graphics-Cards/NVIDIA-Responds-GTX-970-35GB-Memory-Issue

    http://www.anandtech.com/show/8931/nvidia-publishes-statement-on-geforce-gtx-970-memory-allocation

    ----------------------------------------------------------------

    Some of you may have heard already, but basically there's a massive storm brewing over potentially faulty GTX 970 desktop cards, whereby the GPU appears to be unable to utilise the entirety of its 4GB of VRAM without slowing down tremendously (it is not currently known whether this is a hardware or software issue, but the GTX 980 seems to be unaffected).

    More information about this topic can be found here:

    GTX 970 memory bug reportedly cripples performance in memory intensive scenarios
    GeForce GTX 970 Design Flaw Caps Video Memory Usage to 3.3 GB: Report
    GeForce GTX 970 Users Reporting Problems With 4GB VRAM Usage On High End Card
    Nvidia “looking into” Vram problems with the GTX 970
    NVIDIA GeForce GTX 970 Can't Use All 4 GB of Memory
    Current GTX 970 Can’t Use all 4GB of VRAM; Nvidia Investigating
    Nvidia’s GTX970 has a rather serious memory allocation bug
    NVIDIA GTX 970 Owners Report Unusual VRAM Behavior, Unable To Efficiently Allocate More Than 3.5GB
    GTX 970 3.5GB Vram Issue

    I feel we need a thorough investigation into our own 900M series GPUs to see if any may be affected by this bug (hopefully not!).

    All you need to do is download this benchmark and this DLL file (just plopping them both into your downloads folder is fine). Note: it may be flagged as malware by Chrome, but it is safe to run - I have already done so with no issues.
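    (For anyone curious what the tool is actually doing: as far as I can tell it's a small CUDA program that grabs VRAM in fixed-size chunks and then measures read bandwidth on each chunk in turn. Below is a minimal sketch of that idea, just to show the shape of the test; the chunk size, kernel and timing here are my assumptions, not the actual benchmark's code.)

    // vram_sweep.cu -- illustrative sketch only, NOT the actual benchmark
    // Grabs VRAM in 128 MiB chunks, then measures read bandwidth on each chunk in turn.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    __global__ void readChunk(const float* src, float* sink, size_t n) {
        float acc = 0.0f;
        for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < n; i += gridDim.x * blockDim.x)
            acc += src[i];                     // stream-read the whole chunk
        if (acc == 123.456f) *sink = acc;      // keeps the read from being optimised away
    }

    int main() {
        const size_t chunkBytes = 128ull << 20;            // 128 MiB per chunk (assumed step size)
        float* sink = nullptr;
        cudaMalloc(&sink, sizeof(float));

        std::vector<float*> chunks;                        // grab chunks until the driver says no
        while (true) {
            float* p = nullptr;
            if (cudaMalloc(&p, chunkBytes) != cudaSuccess) break;
            chunks.push_back(p);
        }
        cudaGetLastError();                                // clear the expected out-of-memory error

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        for (size_t c = 0; c < chunks.size(); ++c) {
            cudaEventRecord(t0);
            readChunk<<<512, 256>>>(chunks[c], sink, chunkBytes / sizeof(float));
            cudaEventRecord(t1);
            cudaEventSynchronize(t1);
            float ms = 0.0f;
            cudaEventElapsedTime(&ms, t0, t1);
            printf("chunk %zu (%zu MiB in): %.1f GB/s\n",
                   c + 1, (c + 1) * 128, (chunkBytes / 1e9) / (ms / 1e3));
        }
        for (float* p : chunks) cudaFree(p);
        cudaFree(sink);
        return 0;
    }

    The per-chunk number should stay roughly flat from start to finish; the whole controversy is about the last few chunks suddenly reporting a fraction of the earlier figure.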

    Here are my results at stock clocks/344.75 driver (it seems that the 980M may be safe - as you can see the throughput remains constant throughout the entirety of the test):

    [​IMG]
    This is an example of one of the potentially faulty GTX 970's:

    [​IMG]
    [​IMG]
    I really hope that none of the 900M GPUs are impacted by this potential problem.
     
    Last edited: Feb 5, 2015
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is interesting... it claims mine has the error; however it shows up after 3.5GB vRAM (when my OS uses 400MB normally). So my "issue" is that windows is accessing my last 400MB when I try to run this bench. I know I've used 4000MB already without performance loss. My heightened vRAM usage without gaming is from Windows 8.1 + having two monitors installed.

    If people are noticing performance loss here though, that's an issue that needs fixing.

    Edit: in fact, I ran it again with GPU-Z open. My display drivers crashed (without a black screen) apparently at the end of the test. But look at the used memory in GPU-Z then take into account that windows is using memory.

    I wonder if people are simply using differing windows settings? If someone on Windows 7 with a single monitor tests that, they WILL have it run flawlessly... but if someone else on Win 8 or 8.1 with a second monitor tries, they'll get this result below:
    [​IMG]
     
  3. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Ah yes, that is to be expected I suppose as you have SLI and not Optimus.

    And yeah, the forum threads I've been reading (they've started popping up everywhere) have all had 970 owners who have noticed reduced performance when the GPU is using its full 4GB of memory.
     
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I think a whole lotta people who don't know what the hell vRAM is and how their OS and monitors affect it are getting blown up though. It's one thing if you grab a 970 and a 980, shove them into the same PC, and test them one after the other, and one gives problems while the other does not; that's a clear engineering fault. But random people online? LOTS of chance for issues.
     
  5. Cammac66

    Cammac66 Notebook Guru

    Reputations:
    79
    Messages:
    54
    Likes Received:
    15
    Trophy Points:
    16
    Just ran the benchmark on the 980M. Got some interesting results using 344.75:

    Overclock:

    [​IMG]

    Stock:

    [​IMG]

    Not really sure what to make of the results but I kept getting that hiccup at 3584MB with the OC and at stock.
     
  6. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Is your card the 4GB or 8GB version of 980M?
     
  7. Cammac66

    Cammac66 Notebook Guru

    Reputations:
    79
    Messages:
    54
    Likes Received:
    15
    Trophy Points:
    16
    It's the 4GB card from ASUS, as it's soldered onto the motherboard.
     
  8. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Do you have an external plugged in to that machine and are you using Windows 8?

    ASUS only use 4GB cards.

    It shouldn't be soldered... it should be replaceable, but only with another ASUS card meant for that same machine.
     
  9. Cammac66

    Cammac66 Notebook Guru

    Reputations:
    79
    Messages:
    54
    Likes Received:
    15
    Trophy Points:
    16
    No, nothing is plugged in. And I'm using Windows 8.1.

    Sadly, on the G751 the GPU is soldered directly onto the motherboard along with the CPU.

    [​IMG]
     
    reborn2003 likes this.
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Damn, ASUS went full BGA this go-around.
     
  11. Cammac66

    Cammac66 Notebook Guru

    Reputations:
    79
    Messages:
    54
    Likes Received:
    15
    Trophy Points:
    16
    Oh yeah they sure did. You can't even get the battery out this time around.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah... I'm never buying an ASUS.

    That being said, Windows 8.1 should be using up vRAM just by having it on, but I don't get why your later vRAM checks work. Whatever, #nVidia.
     
  13. Cammac66

    Cammac66 Notebook Guru

    Reputations:
    79
    Messages:
    54
    Likes Received:
    15
    Trophy Points:
    16
    Mind you I still really like the G751. I know for many it's an instant red flag but I do intend to get a new machine every 2 or 3 years and it fits all my criteria.
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Are people complaining because they "only" get to use 3800MB instead of 4000MB? What about the OS interfering and using some of the remaining VRAM? Or any software running in the background?

    And they're threatening a recall? Is the whole internet community bored?
     
  15. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I've no doubt it performs quite well and all, but seeing the recent things with HQ chips, I've come to the realization that no HQ chip on the planet is worth it for me. They're too power limited, even if you adjust with XTU. And then I hate having to mess with a machine that's hell to open up XD.
     
    jaybee83 and Ashtrix like this.
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    There's a massive performance drop after 3.5GB of VRAM usage. Read up on it before you start making assumptions. :rolleyes:

    VRAM usage is reported as a total number. So anything that uses VRAM, such as Windows, is included in that number.
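    (If you want to see that for yourself, a couple of lines of CUDA will show it; the "free" figure the runtime reports is already minus whatever Windows/DWM and other apps have grabbed. Just a quick sketch, nothing to do with the benchmark itself.)

    // free_vs_total.cu -- how much VRAM is already spoken for before you run anything
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);   // free / total device memory in bytes
        printf("total: %zu MiB, free: %zu MiB, already in use (OS, desktop, apps): %zu MiB\n",
               totalB >> 20, freeB >> 20, (totalB - freeB) >> 20);
        return 0;
    }

    On a Windows 8 box with a second monitor, that last number can easily be several hundred MB before a single game is launched, which is exactly the point being made above.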
     
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's actually as low as 3200MB for many users. I saw a lot of the <S>idiots</S> fine upstanding users at Linus Tech Tips showing their cards getting huge bandwidth drops after 3100MB of usage etc.
     
  18. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Oh I see. So from the OP, when usage passed 3.2GB the memory bandwidth plummeted to 22GB/s.
    Why does the 970 benchmark report 150GB/s when it should have 224GB/s in the early stages?
    EDIT: Also the L2 cache, which doesn't belong to the VRAM, goes down from 417 to 75GB/s at the same time as the VRAM.

    Not sure I trust that program. Do people even know what it does other than "bench the VRAM"?
     
    Last edited: Jan 23, 2015
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    ^ nope XD
    10char
     
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    lol OK. So nobody really knows what they are looking at.

    Let me ask this:
    Has anyone replicated the problem in any real-world scenario, i.e. gaming, where they get horrible performance with these "broken" cards? Because using 3GB+ should be easy, and 22GB/s of bandwidth will undoubtedly be easy to spot in-game.
     
  21. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I do think the people on other forums and comment sections declaring immediate AMD victory, NVIDIA's imminent demise, class-action lawsuits and free upgrades to 980s for all 970 owners are living on another planet.

    Still, it will be interesting to see how this story develops... it's really up in the air at the moment. We know nothing for sure until there's an official response from NVIDIA (beyond the standard 'we are currently looking into this').
     
    Last edited: Jan 23, 2015
    Cloudfire likes this.
  22. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  23. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Like I'm gonna read through all those forum posts. Nope, not today.

    If the 970's performance were really so bad, I think it's extremely fishy that no reviewers stumbled upon this issue when testing the cards at high resolutions that use a lot of VRAM. Or any GTX 970 owner, before people started this rumor and went looking for it.

    Yeah, I don't believe this at all until some respectable reviewer shows it.
     
  24. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Yeah we need Anandtech and similar reliable sites to dig into this! Or clarification from NVIDIA themselves.
     
  25. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maybe the older drivers they tested didn't suffer from this issue.
     
  26. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    So I'm looking for a constant data speed in the test?

    Sounds like if you have more than 4GB of VRAM it should pass with no issues when it comes to the 900M cards.
     
    Last edited: Jan 23, 2015
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I can safely say I've seen my vRAM max out and sit there, without loss, and have no problems whatsoever. I can also safely say that Windows uses a whole lot of vRAM, especially Windows 8. If people tested this and their vRAM only got screwed around 3500MB, but the vRAM was maxed out in GPU-Z, they should have checked what their default vRAM usage was before attempting this. Crap like this is why I wrote my bloody vRAM guide, so people would know that Windows itself can be a vRAM hog. But if someone is getting problems from 3200MB and up, and Windows is NOT using 800MB of vRAM on its own, then fine. Awesome. You have a faulty card.

    People, however, are incredibly stupid and bandwagonist. Unfortunately, people who test things extremely thoroughly like I do are rare, and big sites like Anandtech are pretty slow to respond, so widespread panic ensues a lot.

    I've seen users claim no performance loss at full 4GB vRAM usage under Windows 7, and I've seen other users claim that games stop using vRAM at 3.5GB. Then retards (not going to strikethrough it this time) post two screenshots of Watch Dogs (a vRAM hog) with vRAM capping at 3.5GB... except the other pic had lower texture quality and other effects lowered... obviously less vRAM would be used. And only a couple of people even thought to point that out.

    What's happened is that somebody got excited about it and is going on and on about it. I'm not saying that the cards cannot be broken... don't get that wrong. But a LOT of things increase vRAM usage, and I have in the past booted Windows with it using 800MB of vRAM for some unknown reason. It just is what it is. Windows 8 WILL not use less than 256MB of vRAM on a single 1080p screen. The last vRAM tick that will run without a hitch is ~3712 or so, and that's ONLY assuming a single 1080p monitor is plugged in and no pictures or any other vRAM-eating programs are running. And for all we know, most 980 users are on Windows 7 with a single monitor plugged in, resulting in a simple 128MB of vRAM being used; not enough to knock off the 3840MB run. I think a lot of users showing the 3500MB+ runs being broken are mistaken, and the ones showing 3200MB+ being broken are not.

    But we won't know for quite a while.
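    (To put rough numbers on the "last clean tick" point above: if the bench allocates in 128MB steps, the highest step that can complete is just the largest multiple of 128MB that fits strictly under total VRAM minus whatever the desktop already holds. Throwaway sketch; the 128MB step size and the overhead figures are just the ones floated in this thread.)

    // last_good_chunk.cpp -- which 128MB "tick" can still complete, given desktop vRAM overhead
    #include <cstdio>

    int main() {
        const int totalMB = 4096;                          // 4GB card
        const int chunkMB = 128;                           // assumed benchmark step size
        const int overheadsMB[] = {128, 256, 400, 800};    // figures floated in this thread

        for (int os : overheadsMB) {
            // largest 128MB multiple that still fits strictly under (total - overhead)
            int lastTick = ((totalMB - os - 1) / chunkMB) * chunkMB;
            printf("desktop using %3d MB -> last clean tick around %d MB\n", os, lastTick);
        }
        return 0;
    }

    That lands on ~3712MB for a Win8 single-screen setup and ~3200MB once desktop usage creeps toward 800MB, which is roughly the spread of "broken at 3200-3712" screenshots going around.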
     
    Cloudfire and Tonrac like this.
  28. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    What the heck are we supposed to see with this program? It all looks the same for a 970M or 980M in the screenshots I've seen.
     
  29. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Look at the memory speeds on the right. If they stay the same, +/- a few, then it's OK, but if there's a big drop then there's a problem... or so they say!
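    (In code terms, the rule of thumb being applied is roughly: take the first few chunks as the baseline and flag anything that falls to a fraction of it. Purely illustrative, with made-up numbers shaped like the posted 970 results.)

    // drop_check.cpp -- flag a sudden drop-off in per-chunk bandwidth readings
    #include <cstdio>

    int main() {
        // per-chunk read bandwidth in GB/s -- made-up numbers shaped like the posted 970 results
        const double gbps[] = {150, 151, 150, 149, 150, 150, 148, 150, 150, 150, 150, 150, 150,
                               150, 149, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 22, 21, 20};
        const int n = sizeof(gbps) / sizeof(gbps[0]);
        const double baseline = (gbps[0] + gbps[1] + gbps[2]) / 3.0;   // average of the first few chunks

        for (int i = 0; i < n; ++i)
            if (gbps[i] < 0.5 * baseline)                              // "big drop" = under half of baseline
                printf("chunk %2d (~%d MB in): %.0f GB/s vs ~%.0f GB/s baseline\n",
                       i + 1, (i + 1) * 128, gbps[i], baseline);
        return 0;
    }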
     
  30. sasuke256

    sasuke256 Notebook Deity

    Reputations:
    495
    Messages:
    1,440
    Likes Received:
    449
    Trophy Points:
    101
    Capture.PNG
    No optimus, Dual Screen (Internal + VGA)
     
  31. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Looks like it was a fluke that the DRAM managed to get 1792MB at full speed; the L2 cache appears to be correct.
     
  32. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    LOL :p
    ALL of Nvidia's cards are broken if we're to interpret the wonderful benchmark.
    Time to sue Nvidia for all they've got and bankrupt them. No need to prove anything in court. Just pull up screenshots from this benchmark.


    Judge: "What are we looking at here?"
    "Uhm, it's a VRAM benchmark. It does something something with the VRAM or the bus."
    "How does it work? How does it test the VRAM? Can it overload the bus? Is the way it tests the VRAM similar to what people will stumble upon in games, or is it purely artificial? Please explain."
    "Uhhhhhhhhmmmm, you know Mr Judge, I actually only read about it on a forum. So I downloaded the benchmark and tested it out. I really have no idea how it works. I was just bored. So I jumped on the Nvidia hate bandwagon."


    GTX 770
    [Lazygamer] Nvidia’s GTX970 has a rather serious memory allocation bug - Page 13

    GTX Titan
    https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4430863/#4430863

    GTX 765M from sasuke above
    http://forum.notebookreview.com/attachments/gaming-software-graphics-cards/120223d1422091775t-check-whether-your-card-can-use-all-its-vram-capture.png

    GTX 980
    http://shintai.ambition.cz/rec2.jpg

    Even GTX 970M is affected
    https://i.imgur.com/lwjmFm0.png?1
     
    HTWingNut likes this.
  33. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    Best thing would be to run a benchmark that's not an unknown in its operational details, and let it run in DOS so all of the vRAM would be unused and free for the bench. Either that, or maybe a live Linux distribution :p
     
  34. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    LOL. Exactly.

    I was trying to wrap my head around what the heck this thing is doing. Make it open source and then maybe we can determine if there's an issue at all. Some random program downloaded from Mega is telling some people in a DOS window that their RAM is moving too slow. Alrighty then. Let's all take a step back and, if you're really concerned, try to verify this through other means. It could really be a non-issue.

    Should I freak out because my vRAM is showing 102GB/sec when it technically should be 120GB/sec?

    [​IMG]
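    (The "should be" figure is just effective memory transfer rate times bus width. Back-of-envelope below; the 5000MT/s and 192-bit figures are assumed for a 970M-class card, so check your own card's specs.)

    // peak_bw.cpp -- theoretical VRAM bandwidth = effective transfer rate x bus width
    #include <cstdio>

    int main() {
        const double transfersPerSec = 5.0e9;   // 5000 MT/s effective GDDR5 (assumed)
        const double busBytes = 192.0 / 8.0;    // 192-bit bus (assumed, 970M-style)
        printf("theoretical peak: %.0f GB/s\n", transfersPerSec * busBytes / 1e9);   // -> 120 GB/s
        return 0;
    }

    Measuring ~102GB/s against a ~120GB/s theoretical peak in a crude streaming test is normal; nothing ever hits the paper number.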
     
    Cloudfire likes this.
  35. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    MUCH ADO ABOUT NOTHING

    Investigating the 970 VRAM Issue : pcmasterrace

    Some other comments on Lazygamer:

    And if you care for my personal experience, I've seen vram usage peak around 3.8GB in Watch Dogs, but there was no extra stuttering due to vram swap, all the stutter came bundled with the game itself. :rolleyes:
     
    Last edited: Jan 24, 2015
    Cloudfire likes this.
  36. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Exactly the same as other benchmarks show. The program is reporting less bandwidth on the cards than they should really have.
    You should go ballistic. Nvidia is intentionally selling broken graphics cards.
    My math skillz tell me you're at 102/120, so you have lost 15%!!!! of your performance because of this.

    Out with the pitchforks!

    Thanks. I bet this is not the last detailed information/review we will see about this.
    I bet Nvidia probably already knows it's very little to worry about, and just needs to figure everything out before giving an official statement.
     
    Last edited: Jan 24, 2015
    HTWingNut likes this.
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    --deleted--
     
  38. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I have no idea what's going on. I agree, the benchmark results are all over the place and probably can't be trusted (I saw a 980M 8GB user get wildly different results from my own). But in any case, it's already gone viral:

     
    Last edited by a moderator: May 12, 2015
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Thanks. Starts at 42:51
     
    Cakefish likes this.
  40. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I specifically added a time-stamp onto the link - why are all YouTube timestamps broken nowadays? They never work anymore. Been like this for many months now.
     
  41. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Doesn't work with the video tag, but the iframe embed works using ?start=2571 (it has to be in seconds unfortunately, i.e. 42:51 is 42×60 + 51 = 2571)

    <iframe width="560" height="315" src="//www.youtube.com/embed/e85aRCFH8gM?start=2571" frameborder="0" allowfullscreen></iframe>
     
    Last edited by a moderator: May 6, 2015
  42. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  43. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I'm happy to say that I called this, and called Windows' vRAM usage being the actual underlying issue, right from my first post; and more than anything else, I am VERY happy that I understand vRAM well enough that I saw through this retardation a mile away =D.
     
  45. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I was just about to post this, cheers. I'm glad it's cleared up now.

    Panic over, I guess?

    As long as performance holds up on our 980Ms/970Ms/965Ms.

    I don't understand why this segmentation of memory was necessary on the 970 but not the mobile cards - is it due to the odd number of SMMs it has, vs mobile GPUs which all have even numbers of SMMs?

    Sent from my Nexus 5
     
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I dunno, sounds like Nvidia is trying to cover up a hardware flaw. Defective by design, if you will. Their carefully worded statement reminds me oh so much of Bumpgate. We won't know for certain until PCPer releases some FCAT frame time data of 970 vs. 980 both using >3.5GB in the same benchmark. FPS doesn't tell the whole story.

    And about that benchmark, it's not the reported GB/s that's important. You're looking for a sudden drop-off after a certain point. Plus I doubt very many of you ran it correctly, esp. since most of you are probably on Windows 8 where it is impossible to disable Aero/DWM/desktop composition.

    GTX 970s can only use 3.5GB of 4GB VRAM issue - Page 18

     
  47. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Nvidia tested two cards, the GTX 970 and the GTX 980.

    The GTX 980 got 179GB/s at full VRAM usage with the benchmark.
    The GTX 970 dropped down to 22GB/s after around 3300MB of VRAM usage with the benchmark.

    Nvidia designed the GTX 970 to have two VRAM banks, one of 3500MB and a second of 500MB. Games and software are able to use all 4000MB, but the 3500MB bank has higher priority for the GPU, which means games below 3500MB will only have access to that bank. If the game requires more, the GPU gets access to the last 500MB for a total of 4000MB.

    The cause is that the benchmark can only access around 3.5GB of VRAM and is not able to reach the last 500MB. So the benchmark instead ends up testing system RAM indirectly through VRAM flushing etc., which is most likely 1600MHz dual-channel RAM, because it has a bandwidth of around 24GB/s.
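    (On that 24GB/s figure: dual-channel DDR3-1600 works out to about 25.6GB/s on paper, so ~22GB/s measured does look a lot like system RAM rather than GDDR5. Rough numbers, assuming DDR3-1600 and two channels.)

    // ddr3_bw.cpp -- theoretical dual-channel DDR3-1600 bandwidth
    #include <cstdio>

    int main() {
        const double transfersPerSec = 1600e6;  // DDR3-1600 = 1600 MT/s (assumed)
        const double bytesPerChannel = 8.0;     // 64-bit channel
        const double channels = 2.0;            // dual channel
        printf("theoretical peak: %.1f GB/s\n",
               transfersPerSec * bytesPerChannel * channels / 1e9);   // -> 25.6 GB/s
        return 0;
    }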

    In gaming, the 970 lost about 3% more FPS relative to the 980 in games that used more than 3500MB of VRAM than in games that used less. Within margin of error.

    So we have both testing and an explanation from Nvidia showing that the GTX 970 has no issues.
     
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    What are you talking about? If the benchmark can only access 3.5GB, then why doesn't the 980 have the same problem? And why does the L2 cache drop at the same time on the 970?

    image.jpg

    BTW the benchmark results are from end users, not Nvidia.

    And like I said, FPS doesn't tell the whole story. 970 owners were reporting huge spikes in frame times (microstuttering) after 3.5GB was allocated. An average number such as FPS won't show that.
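    (A quick illustration of why the average hides it: if every 30th frame takes 100ms instead of ~16.7ms, the average still comes out around 51 FPS, but the 99th-percentile frame time shows the hitching. Made-up numbers.)

    // frametimes.cpp -- why average FPS hides microstutter that frame-time percentiles reveal
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        // 300 frames: every 30th frame spikes to 100 ms, the rest run at ~16.7 ms (made-up numbers)
        std::vector<double> ms;
        for (int i = 0; i < 300; ++i)
            ms.push_back((i % 30 == 29) ? 100.0 : 16.7);

        double total = 0.0;
        for (double ft : ms) total += ft;
        double avgFps = 1000.0 * ms.size() / total;         // ~51 FPS, looks merely "a bit slower"

        std::sort(ms.begin(), ms.end());
        double p99 = ms[(size_t)(0.99 * ms.size())];        // 99th-percentile frame time: 100 ms

        printf("average FPS: %.1f\n", avgFps);
        printf("99th percentile frame time: %.0f ms\n", p99);
        return 0;
    }

    Two cards could post near-identical average FPS in a run like that and still feel completely different to play.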
     
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Someone needs to check if the reference cards have the same memory bank issue. The 970 had no reference version for quite a while.

    Also, looking at the benchmarks they used... why did they pick Advanced Warfare? Advanced Warfare doubles up until it uses all your vRAM anyway. It's a poor choice. Shadow of Mordor is the best bet, but the correct way would be to use "high" textures with the lighting etc. off (to keep vRAM usage consistent), then use "ultra" textures to force ~5GB of vRAM usage, and see how badly the framerate tanks. With the memory clocked the same and the same memory bus width, the bandwidth is the same for both cards; if the system is the same, then one card shouldn't, say, stutter at 10fps while the other chugs along at 40, if memory swapping is the issue.

    Either way, I'm still not convinced many people tested correctly, or that the test is functioning correctly, but at the end of the day you can always check what GPU-Z says. If GPU-Z says 4096MB of vRAM but can never show your card using more than 3584MB, then nVidia really did lock off the extra 512MB from being detectable by software. If, however, people can get 4096MB to show up in software, then we CAN check it correctly.

    I'm sitting on the "MOST people don't have any issues and just think they do" side of the fence until I see some deep, conclusive tests.
     
 Next page →