The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    GTX 970 VRAM (update: 900M GPUs not affected by 'ramgate')

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cakefish, Jan 23, 2015.

  1. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    in the end, it doesnt really matter. 970 sales went through the roof for a good reason ;) the fact that it has two memory partitions doesnt negate its good price/performance ratio...
     
  2. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    It's still dishonest advertising at best. I'm not as outraged as I should be because I never intended for my 970s to be a long term, "futureproof" solution. I always viewed them as stop-gap cards and they still are to me. But I can see how someone who picked up the 970 hoping to get a few years out of it would be pretty peeved.
     
  3. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Regardless of the price, I was never particularly impressed at how much 970 was cut down compared to 980 when compared to previous trends of x70 and x80 GPUs on the same ASIC (770 and 780 don't count because GK104 vs. GK110). This is just further validation of that.
     
  4. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    And now we know why the 970 is so cheap -- no reference design, coil whine lottery, and memory partition. Talk about a big pile of mess.
     
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It reminds me of this article: Video Card Failure Rates by Generation - Puget Custom Computers

    Now that's only from one AIB--ASUS--and mostly DirectCU cards, but perhaps the reason AMD cards fail more often is because they're cheaper. The AIB partners cheap out on PCB and component design to meet margins. That's why it's generally recommended to get AMD reference boards instead of non-reference ones.
     
    jaybee83 and moviemarketing like this.
  6. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Also did people not catch on to the incorrect specs that are only being corrected now, 4 months after release?
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    And the 980M/970M are even further cut down from that. That's why *I* was pissed at it. But seeing as how maxwell is a power drinker beyond stock, a full 980 in a 120W envelope would have been incapable of OCing with current PSU tech. And it'd have been a laughingstock among enthusiasts (no matter how strong it was) if the cards were locked at stock.

    Because GPU-Z and other hardware checking programs read the specs as what nVidia published, with 64 ROPs and 4GB installed vRAM, etc. Unless someone was actively monitoring vRAM usage, they'd not have known, and since most desktop users know about as much about vRAM as they do about mobile CPUs, nobody worth their salt probably looked. Or anyone worth their salt probably went for 980s or stuck with Titans/780Tis/etc. I can guarantee you that if I had those cards in a system I would have noticed from day 1 with my overlays.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Well the good news is that the ROPs on the mobile cards are fully intact (where applicable), so this issue does not impact us :)
     
    Cakefish likes this.
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    How do you know?
     
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Cakefish likes this.
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  12. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    I listed the start of the article so people could read the whole thing ;) I then posted the important information in the table.
     
  13. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Oh, they could if they have time. Not everyone does, though, or cares enough to get into the nitty-gritty.
     
  14. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    So is GPU-z 8.1 right? A before and after w/8.0:

    [attached screenshot: GPU compare.PNG]
     
  15. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Yes, the TMUs are connected to the shader units, so they get disabled as the shaders are. This is how it has always been stated; GPU-Z just hasn't been able to handle it properly.
     
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's TMUs, not ROPs. Not related to the discussion here, but 96 TMUs is correct for 980M. 128 is for desktop 980.
     
  17. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    nono you misunderstand, I'm saying I'm surprised people aren't (or aren't as) outraged at nVidia passing around incorrect spec sheets to everyone for the past 4 months, and is only revealing the 970's true specs now after things have blown up in their face.

    Myself I'm actually more p!ssed about this than the vram wall, because unless I understood the situation wrong, this is simply outright deception. There's just no reasonable explanation for why nVidia never bothered to correct the specs when the 970 was selling like hotcakes and sold out everywhere, but suddenly when feces have hit the fan 4 months later, they come out and correct the specs.
     
  18. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Tell that to the duped 880M owners. It's been, what, 10 months and counting? 2014 was not kind to Nvidia.
     
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, according to Anandtech's report, they said the engineering team told them and the marketing team didn't actually update it and nobody complained so nobody looked. Or something. That's pretty much the most concise way I can explain it.

    As for the deception thing, it was supposedly a misunderstanding. Seeing as how their bloody website apparently lists the GTX 470's specs when you click on the GTX 480 and they have all their boost clocks set wrong? I would be inclined to believe that. But either way, people were indeed not marketed a card with 3.5GB fast vRAM and 512MB of slow vRAM. Also, due to the way the card is wired, it means that the game can switch to slow vRAM at 3GB being used by a game, because of windows and windowed mode/borderless windowed/etc. Hell, I'm sitting here using 631MB vRAM. I have chrome open and steam and teamspeak, listening to music and tweetdeck. I've got basically nothing open here. If I had a 970 this means if I launched a game like Titanfall I couldn't use insane textures, because it'd switch over to slow memory. Imagine, my mobile GPU outperforming a big bad 970 because nVidia wired it like frankenstein.

    nVidia should offer something back to people who bought it expecting 4GB of fast memory. And should make sure people know about its issues in the future.
     
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    They should have just sold the 970 as having 3.5GB vRAM and left it at that. It would be another distinguishing factor between 970 and 980. But honestly why not 8GB vRAM since their mobile counterparts have that much?
     
  21. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    My guess is they were planning to wait 6-12 months before dropping the 8GB versions, why not get the same people to spend some more money to upgrade again?
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Probably because 3.5GB vRAM is a weird number nobody cares for, and it does technically have 4GB vRAM attached, and 256-bit memory buses are not supposed to have that multiple of vRAM attached (128MB, 256MB, 512MB, 1GB, 2GB, 4GB, 8GB, etc). It was a problem with the GTX 660Ti; a 192-bit memory bus is supposed to have 1.5GB or 3GB vRAM attached (why the 870M had 3GB and 6GB variants), but they added an extra memory chip that doesn't run at the same speed as the first 1.5GB on a single memory controller. Most people didn't know or care about it though. It's been done in the past, but I've never seen the amount of memory be "neutered" rather than "bolstered"... the 970 remains as I've called it a frankenstein-wired card.
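    The "natural" capacities D2 Ultima lists fall straight out of the bus width. As a rough sketch (assuming one GDDR5 chip per 32-bit memory controller, which is the usual layout):

```python
# Sketch: why a GPU's "natural" VRAM sizes track its bus width.
# Each GDDR5 chip presents a 32-bit interface, so a 256-bit bus pairs
# with 8 chips and a 192-bit bus with 6; capacities come in multiples
# of the chip count. Chip sizes here are common GDDR5 densities.

def natural_vram_sizes(bus_width_bits, chip_sizes_mb=(256, 512, 1024)):
    """VRAM capacities from one chip per 32-bit controller."""
    n_chips = bus_width_bits // 32
    return [n_chips * size for size in chip_sizes_mb]

print(natural_vram_sizes(256))  # 256-bit bus -> [2048, 4096, 8192] MB
print(natural_vram_sizes(192))  # 192-bit bus -> [1536, 3072, 6144] MB
```

    This is why the 660 Ti's 2GB on a 192-bit bus (and the 970's 3.5GB+0.5GB split) needed asymmetric wiring: the totals don't divide evenly across the controllers.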
     
  23. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Misunderstanding and deception are a fine line apart; it all boils down to intentions. After the 880M fiasco, I'm no longer willing to give nVidia the benefit of the doubt.

    In any case I am seriously p!ssed because I am very much a principles guy. I don't game at 4K and have never run into this vram wall, but I'm just as upset as those who are affected because the product has been very clearly (intentionally) misrepresented by nVidia, and that is absolutely unacceptable, and really grinds my gears.
     
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Even if you ran into it, your vRAM bandwidth is still doubled via SLI, so you would likely have noticed it far less. Then add that you're probably not playing the games that are so vRAM heavy, and even more so if you're on a single monitor so that your vRAM is snapped off when you fullscreen, you'd actually likely need to pass 3.5GB of in-game vRAM demand before you found an issue. So... Evolve would probably do it for ya =D. =D =D.

    But I indeed understand your frustration and inability to give them the benefit of the doubt. Things should have been quadruple-checked before launches. Either way, they should have to pay for their mistake, whether it was genuine or not.
     
  25. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Isn't vram simply mirrored in SLI so you end up with the same bandwidth? If anything I'd think those running multiple cards would be most likely to notice this issue, because for a single 970 you'll run out of GPU power long before you hit the vram wall.
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    In AFR, VRAM is mirrored but bandwidth is doubled.
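    The AFR resource math octiceps describes can be sketched in a couple of lines (numbers here are illustrative, using the 970's nominal 224 GB/s):

```python
# Toy model of AFR (alternate frame rendering) SLI: each GPU holds a
# full mirrored copy of the VRAM contents, so usable capacity does not
# grow, but each GPU renders its own frames from its own memory, so
# aggregate bandwidth scales with GPU count.

def afr_effective(n_gpus, vram_gb_per_gpu, bandwidth_gbs_per_gpu):
    return {
        "usable_vram_gb": vram_gb_per_gpu,               # mirrored, not pooled
        "aggregate_bandwidth_gbs": n_gpus * bandwidth_gbs_per_gpu,
    }

print(afr_effective(2, 4, 224))
# -> {'usable_vram_gb': 4, 'aggregate_bandwidth_gbs': 448}
```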
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You don't have to run out of it. Quite simple: set Shadows of Mordor to ultra with the ultra textures at 1080p on a single 970. You'll run out of vRAM *WAY* before you run out of GPU power, as that game is fairly easy to run.

    Texture size (not render resolution) is the main hit for vRAM in games. Remember, I can maintain near 60fps in Titanfall on pretty much max settings on a single 780M using near 4000MB of vRAM:
    [Titanfall overlay screenshot not archived]
     
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well at least the slow segment will be 56GB/s instead of 28GB/s lol

    See I've always wondered, does SoM actually NEED/REQUIRE that much vram, or does it simply pull a Watch Dogs aka allocate but not actually use it? I ask because I've seen Watch Dogs hog 3.8GB vram at 1440p downsampled to 1080p (1.78x DSR), but never experienced this vram stuttering (outside of stuttering that came bundled with the game anyway :rolleyes:). Even with double bandwidth on the slow segment, it's still about 1/4 of the fast segment, so I imagine it should cause some degree of hitching/stuttering/magical rainbows.
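    For reference, the per-segment figures being thrown around follow from per-controller arithmetic. A back-of-envelope sketch (assuming the 970's 7 Gbps effective GDDR5 data rate and 32-bit controllers):

```python
# Bandwidth arithmetic behind the segment numbers: a 32-bit GDDR5
# controller at 7 Gbps per pin moves 7e9 * 32 / 8 bytes/s = 28 GB/s.

PIN_RATE_GBPS = 7  # GTX 970 effective memory data rate

def controller_bw_gbs(pin_rate_gbps, width_bits=32):
    return pin_rate_gbps * width_bits / 8

per_ctrl = controller_bw_gbs(PIN_RATE_GBPS)
print(per_ctrl * 8)  # full 256-bit bus: 224.0 GB/s
print(per_ctrl * 7)  # 3.5GB fast segment (7 controllers): 196.0 GB/s
print(per_ctrl * 1)  # 0.5GB slow segment (1 controller): 28.0 GB/s
```

    On these assumptions the slow segment is closer to 1/7 of the fast segment's peak than 1/4, though either way the gap is large enough to matter.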
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I don't think Watch Dogs' stuttering is necessarily due to VRAM when you have 3GB+, but due to its poorly coded texture streaming. If you recall, that patch near the end of last year fixed pretty much all stuttering on single GPU systems, but completely broke SLI scaling.
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Honestly, I don't know. I know for sure that people with 3GB etc cards that tried maxing the game got real stuttering issues, and I believe people've determined that on full ultra at 1080p it uses ~5.4GB vRAM. I don't own it myself, mainly because as of december I am a NEET and broke, but either way it seems like they just... uprezzed like everything.

    Oh and we all know Ubitoad is responsible for Watch Dogs' stuttering. Maldo's mod where he re-packs the "ultra" textures into the "high" setting for Watch Dogs eliminated LARGE amounts of stuttering for most people; I'm fairly sure that the game was coded to be terrible for people.
     
  31. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah, I remember us having that discussion about whether it was malice or sheer incompetence.
     
  32. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    wow, thanks for that link, extremely interesting site with tons of articles on my ToRead-list now :D
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Add my guides to the list and grab a plate of food. Prepare for teh knowledge incoming.
     
  34. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    been through ur guides already ;)
     
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    XD. I update them a lot though. Check them back every few months XD
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Part of the issue is that while the GPU is accessing that 512MB of RAM, it can't access the main partition.
     
  37. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    PHEW!

    NVIDIA might like to claim no performance impact but I am relieved that the 980M (and 970M, 965M) aren't structured in this way.

    It's quite interesting to see that the 980M has more ROPs active and more L2 cache than a high-end desktop 970, despite fewer SMM cores enabled.
     
  38. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    It's not really surprising though. Look at that spread of core configurations: they can use practically every chip they make. Faulty cache and/or shader units go into the 970; faulty shaders but still power-efficient chips into the 970M and 980M; and decently power-efficient chips with something really wrong around the core into the 965M.
     
  39. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Aaaaaaaaaaand there we have it, user testing proves the 970 is, without a doubt, a 3.5GB card.

    gg nVidia
     
    D2 Ultima and octiceps like this.
  40. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is shaping out to be extremely funny. I wonder what nVidia has to say about this.
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Too soon, man. I think n=1 is still pissed.
     
  42. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well I'm pissed yes, but I do find comic value in nVidia's P-R doublespeak. Also have some popcorn ready to see what happens next.

    Most likely nothing, people will forget about this in a month or two, and nVidia will continue to outsell AMD 3 to 1 even though their cards cost more and their drivers are just as **** now :rolleyes:
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I've been gone for a while. Too much hassle going many pages back. What have we learned these last days? Anything new?
    n=1 pissed? Why? Doesn't the 970 perform like it should in games?
     
  44. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    And this is why people shouldn't take P.R. numbers at face value. The average framerates above 3.5gb vram usage don't necessarily change that much (as Nvidia claimed), but the minimum framerates tank. Would be nice to have some FCAT numbers from some review sites as well just to be sure.
     
  45. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    This whole thing is a mess.

    I`ve seen posts and tests showing close to 4GB usage with 970 earlier.
    The link that was posted earlier show 4GB usage with Watch Dogs and they didnt get any performance hit

    Investigating the 970 VRAM Issue : pcmasterrace
     
  46. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yes I've experienced the same. But as someone pointed out, the issue only happens when the game actually NEEDS >3.5GB vram, and not just simply allocating it. Basically allocated vram =/= actively in use, thus why no issues when Watch Dogs gobbles up vram like mad.
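    The allocated-versus-used distinction n=1 draws can be made concrete with a toy model (all thresholds and numbers here are illustrative, not measured):

```python
# Toy model of the point above: a game may allocate a large VRAM pool
# but only touch part of it each frame. Stutter risk on the 970 tracks
# the hot working set, not the raw allocation, because cold data can
# sit in the slow 512MB partition without hurting frame times.

FAST_SEGMENT_MB = 3584  # GTX 970 fast partition (3.5 GB)

def stutter_risk(allocated_mb, working_set_mb):
    """Classify risk from allocation vs. per-frame working set."""
    if working_set_mb > FAST_SEGMENT_MB:
        return "likely"    # hot data spills into the slow partition
    if allocated_mb > FAST_SEGMENT_MB:
        return "unlikely"  # only cold allocations overflow
    return "none"

print(stutter_risk(3800, 2900))  # Watch Dogs-style over-allocation -> unlikely
print(stutter_risk(3800, 3700))  # true >3.5GB working set -> likely
```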
     
  47. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The horse is probably dead, but I'm gonna beat it again anyway just in case :D

    So we now have definitive proof that the 970 suffers from stuttering once vram requirement goes beyond 3.5GB. Have a look at the frametime graphs here:

    [four frametime graph images not archived]

    Look at the frametimes for Watch Dogs @ 4K (bottom right). See how all those spikes occur in close succession? Yeah that's going to manifest itself as stuttering. I mean look at that graph, it's pretty much a block of spikes at 150ms or worse all the way through, Jesus Christ.
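    The reason spike-filled frametime graphs read as stutter can be sketched numerically. A minimal example (threshold and sample data are made up for illustration, not taken from the review):

```python
# Sketch: turning a frametime log into a stutter count by flagging
# frames that take far longer than the steady-state frame time.

def count_spikes(frametimes_ms, threshold_ms=50):
    """Count frames slower than the threshold (candidate stutters)."""
    return sum(1 for t in frametimes_ms if t > threshold_ms)

smooth = [16.7] * 100                        # steady ~60 fps
spiky = [16.7, 150, 16.7, 160, 16.7] * 20    # repeated 150ms+ spikes

print(count_spikes(smooth))  # 0
print(count_spikes(spiky))   # 40
```

    A run of closely spaced 150ms frames is exactly what a "block of spikes" in the graph means: several frames per second each lasting a tenth of a second or more, which the eye perceives as continuous stutter.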

    Google Translate of the last two paragraphs: (emphasis mine)

     
  48. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    /thread

    /Nvidia
     
    D2 Ultima likes this.
  49. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I still don't see how that's definitive proof. For one they use a poorly coded Watch Dogs game.

    They run Ultra at 1080p to get 3.5GB

    Then they run High at 4k to get 4.0GB

    Who's to say that it isn't due to thrashing textures in and out of RAM? 980 shows spikes to 150 or so, just not as frequent. Maybe the 970 can't handle the 4k load as well. Are the video cards the same brand and is it consistent from video card to video card?
     
    be77solo likes this.
  50. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Because it doesn't happen on the 980?
     