The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    How much vRAM does one really need?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Templesa, Oct 6, 2014.

  1. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    All the vRAM.


    Jokes aside, since this debate will continue to rage, I figure we could use a thread dedicated to it.

    Confused about just how much you might need? Ask away, and someone far more knowledgeable than me might answer!
     
    ryzeki likes this.
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
Okay. If this is the case, I am going to openly link it. I wrote a guide on how vRAM works, how much you need, and what it does and does not bottleneck.

    This is a full, comprehensive guide here. It should contain all you need to know and more, and if there is anything you wish to ask, feel free to do so.

    The Video RAM Information Guide
     
    Ningyo and m1ko like this.
  3. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
It depends on the sort of games you play and the way the engines handle the environment. As a rough rule of thumb, open-world games chew up massive amounts of VRAM, since a good chunk of the environment is cached to allow seamless transitions.
A good example of why this caching is used is BioShock Infinite: the game uses relatively little VRAM despite the world being fairly large, but you get bits of microstutter on occasion when the game engine streams parts of the world on demand from the HDD.

Other times, the game is simply badly coded. Wolfenstein demands copious amounts of VRAM, presumably for seamless area transitions, yet you still get loads of annoying texture pop-in.
     
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
No, Wolfenstein: TNO uses the id Tech 5 engine, which has "megatextures" or something to that effect. Basically, their textures are super large and need lots of memory to load them all in correctly. It doesn't help texture popping, it doesn't use SLI or CrossFireX for increased memory bandwidth to PREVENT the pop-ins, and most importantly, the textures aren't even drawn all that well half the time.

    It at least runs fairly well though.
     
  5. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    The answer to this is 8GB if you want to play console ports, end of story.

    Sent from my HTC One_M8 using Tapatalk
     
  6. Dufus

    Dufus .

    Reputations:
    1,194
    Messages:
    1,336
    Likes Received:
    548
    Trophy Points:
    131
Nice write-up. You might also be interested in this: http://forum.notebookreview.com/gaming-software-graphics-cards/752160-how-much-vram-enough.html Also, some info on switched graphics where the iGD is used for rendering might be useful, at least for muxless laptops.
     
  7. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
Sad, but it's proving to be true. We're getting emulator-like efficiency.
     
  8. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
It's not really that. Most console ports look similar between PC and console on high settings. The newer consoles are using, let's say, ~4-5GB of their unified memory to pull that off. It stands to reason that a direct port is going to need the same vRAM to get the same quality, and ultra is simply going to require more.

    Is it lazy coding? Absolutely. But exactly when did console ports have great optimization when brought to the PC? I can't think of such a time.

    The reality is that while the consoles are woefully underpowered compared to a modern midrange gaming PC, they have that damn unified GDDR5 so we are at a distinct disadvantage until we make up for that by increasing our vram or we wait for the developers to start actually porting properly. Which solution sounds more viable? I think Ubisoft has shown the answer to that with Watch Dogs.

    Sent from my HTC One_M8 using Tapatalk
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
LOL it's not that bad. It takes orders of magnitude more powerful hardware to successfully emulate a console. That's why Xbox/PS2 and 360/PS3 emulation has never really taken off.
     
  10. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Ubisoft are saints compared to WB's Shadow of Mordor.
     
  11. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
At least SoM doesn't appear to stutter like Watch Dogs did. That was unplayable... I've seen the SoM benches and things look just fine on the 780 Ti, whereas Watch Dogs is useless on it with ultra textures.



Ummm, PS2 emulation is damn near flawless thanks to PCSX2, but yeah, Xbox not so much. PS3 will take a long time thanks to PowerPC... if ever. Look at PearPC and how slow that was with even the slowest of PPC Macs...
    Sent from my HTC One_M8 using Tapatalk
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Really? Aside from the VRAM demand, SoM seems to perform much better than WD.
     
  13. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
Well, Watch Dogs launched with a lot of problems, but they did get it right after patch 1.03 (I think it was that one) and it ran like a hot knife through butter.
Yep, nothing appears to be wrong with SoM apart from it being a total VRAM hog.
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    MegaTexture isn't to blame for any game's high memory consumption.
     
  15. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
I am looking at the Asus G751 but I'm put off by it having lower VRAM than the competition. I'm not tech savvy about VRAM, but is Maxwell 4GB "better" than Kepler 4GB?
     
  16. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    .......yes
     
  17. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
HT, hold me, I'm scared this Asus will have problems if games keep going down this more-than-4GB path in the future. :(
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
Well, there's no difference in the RAM itself, just that the Maxwell GPU architecture is much more efficient than Kepler. 4GB should be fine for the next couple of years, I'd say, unless you want to load up some of the "ultra" textures games seem to want to include, and unoptimized ones at that.
     
    J.Dre likes this.
  19. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
I've been good with 4GB for a while now, and would say that 4GB is all one needs, as HTWingNut suggests. ;) As for what one may want to "future proof" their system (a phrase we gamers love to throw around :D), I think 6GB would be ideal. But if you plan to upgrade within two years, there's no point.

    P.S. As 1080p will be the standard for mobile gaming for at least one more year, I am speaking in reference to 1080p. Higher resolutions obviously require more VRAM.
     
  20. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
I've never seen dynamic RAM get used for the GPU before, even on my old 280M. Interesting indeed. Also, I do not have access to my iGPU, so I can't add anything from that, unfortunately. You have a good bit of information there though. May I link your thread in my own vRAM thread for the dynamic RAM usage and Optimus-laptop iGPU information?
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
That doesn't mean much. The difference between 1080p and 4K, for example, is about 400MB of vRAM for most games. The only game I've seen where this is not the case is Hitman: Absolution (if I remember correctly), which somehow shot up from just over 1GB to about 3.3GB used with that resolution difference. Otherwise, it really is not a problem to simply increase resolution.
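That ~400MB figure roughly matches a back-of-the-envelope calculation: textures are resolution-independent, so only the render targets (framebuffers, G-buffers, post-effect buffers) scale with resolution. A minimal sketch, assuming RGBA8 targets (4 bytes per pixel); the count of 12 targets below is purely an illustrative assumption, not measured from any game:

```python
# Why 1080p -> 4K adds only a few hundred MB of vRAM: only the
# render targets scale with resolution, not the texture pool.

def target_bytes(width, height, bytes_per_pixel=4):
    """Memory for one RGBA8 render target at a given resolution."""
    return width * height * bytes_per_pixel

MIB = 1024 * 1024

per_target_1080p = target_bytes(1920, 1080)  # 8,294,400 bytes (~7.9 MiB)
per_target_4k = target_bytes(3840, 2160)     # 33,177,600 bytes (~31.6 MiB)

# Assumed, illustrative number of full-resolution targets a
# deferred renderer might keep around.
num_targets = 12
delta = (per_target_4k - per_target_1080p) * num_targets
print(f"extra vRAM for render targets: {delta / MIB:.0f} MiB")
# -> extra vRAM for render targets: 285 MiB
```

A few hundred MiB for the render targets lines up with the ~400MB observed in practice; a game like Hitman: Absolution blowing past that suggests it scales other buffers with resolution too.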
     
  22. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I know it's not a huge difference, but it is a difference. That's all I was saying.
     
  23. Shinra358

    Shinra358 Notebook Consultant

    Reputations:
    37
    Messages:
    229
    Likes Received:
    13
    Trophy Points:
    31
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yes, this is true. It's just whatever GPU memory you have now, if it works on 1080p without issues, it'll likely work at 4K without issues (assuming you have the power). vRAM would not be much of an issue in that regard.

    LOL? The definitive edition for PC? Doesn't the PC already HAVE high res textures and advanced graphics? They claim the game will have running optimizations... but I don't believe it at all. Sleeping Dogs already runs great on most anybody's hardware. I bet it's gonna run worse and look just about the same. (I know I'm fairly pessimistic, I am sorry. I have no trust for devs :()
     
  25. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It seems the Definitive Edition has higher population density. It also has lower system requirements, so optimization could've been improved. I'll probably pick it up unless reviews stink as it's only $15 for me, which is cheaper than getting all the DLC for the original.

    LOL it's funny--Square Enix and its studios are just about the only companies I trust right now. Their track record over the last few years has been really good. DX:HR, Sleeping Dogs, Hitman: Absolution, and Tomb Raider have all been very solid, well-optimized games.
     
  26. Shinra358

    Shinra358 Notebook Consultant

    Reputations:
    37
    Messages:
    229
    Likes Received:
    13
    Trophy Points:
    31
Yeah, it still kinda sucks when running the max AA settings though, and even with them there are still jaggies. Downsampling also didn't work with it. And Wei looked like Wol Smoth sometimes, so I think they fixed his facial proportions too. I'm sure reducing the high-res texture sizes is in there somewhere; that helps performance. In DX:HR they added and improved shadows, so I'm sure that made it in too. The reduced price with all DLC included is also a plus. They may not have fixed the 3D Vision issues though, unfortunately, because they didn't in DX:HR (no 3D options in the in-game menu so people can control and view it more easily, just the 3D hotkeys).
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That is a good point indeed... but if I had $15 I'd buy Binding of Isaac rebirth (for $10.04) and Shadow Ops DLC for Beat Hazard (for $5.00)

    The game had SSAA enabled, so downsampling + using its in-game SSAA would have probably either not worked or ripped a GPU to shreds.
     
  28. Shinra358

    Shinra358 Notebook Consultant

    Reputations:
    37
    Messages:
    229
    Likes Received:
    13
    Trophy Points:
    31
When you downsample, you're supposed to turn AA off, unless you mean through profiles. But I think that using DS on it would actually get rid of jaggies, unlike its AA.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
LOL what's the point? Downsampling is OGSSAA, and Sleeping Dogs' in-game supersampling is also OGSSAA, so it's the same thing. GeDoSaTo doesn't work with DX11 games, so you can't use the Lanczos sharpening filter. Hopefully DSR comes out for Kepler at the end of the month like it's supposed to (and isn't locked to GTX GPUs only), as its only notable feature is the adjustable 13-step Gaussian filter. Normal downsampling and upsampling in Windows uses a bilinear filter, which blurs/softens the image.
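For anyone wondering what the resolve step the two methods share actually does: render at a multiple of the target resolution, then collapse each block of subpixel samples to one output pixel. A minimal NumPy sketch of the plain box-filter version (function name and example values are illustrative; a Lanczos or Gaussian filter would weight the samples instead of taking an unweighted mean, trading some softness for sharpness):

```python
import numpy as np

def ogssaa_resolve(frame, factor):
    """Box-filter resolve of ordered-grid supersampling: average each
    factor x factor block of samples down to one output pixel."""
    h, w, c = frame.shape
    assert h % factor == 0 and w % factor == 0
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# A hard black/white edge rendered at 2x the target resolution:
# the resolve produces intermediate grey where the edge crosses a
# sample block, which is exactly what smooths the jaggies.
hi_res = np.zeros((4, 4, 1))
hi_res[:, 1:] = 255.0            # edge between sample columns 0 and 1
out = ogssaa_resolve(hi_res, 2)  # 2x2 output
# out[..., 0] -> [[127.5, 255.], [127.5, 255.]]
```

The blur octiceps mentions comes from the filter choice at this step, not from supersampling itself.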
     
  30. Shinra358

    Shinra358 Notebook Consultant

    Reputations:
    37
    Messages:
    229
    Likes Received:
    13
    Trophy Points:
    31
You have multiple downsampling settings with GeDoSaTo; you don't have that many with SD alone. Downsampling is also faster than the in-game AA setting, from what I see in some games I already have. GFE has a very small list of games it works with, since GFE doesn't detect every game; then you have to wait on Nvidia to add support, which happens at a terrible pace, as you can see from their other software. GeDoSaTo works with anything that isn't higher than DX9, but the author is currently working on DX11 support. Even if GeDoSaTo were DX11-compatible right now, though, I think the way SD does its settings may still keep it from working. But we'll see. Is Nvidia's DSR already working for SD?
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
DSR is only available on the desktop GTX 980 and 970 cards right now. Support for older and mobile GPUs remains to be seen.
     
  32. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
Just made an angry post on the Asus ROG forums threatening that I'd buy from MSI instead if they didn't release 6/8GB versions of their G751. D:
     
    octiceps and ZerockX like this.
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Attaboy, JinKizuite. Now you should post on game forums threatening to take your business elsewhere if these devs don't optimize properly. :)
     
    Ningyo likes this.
  34. Arise

    Arise Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    2
    Trophy Points:
    6
The G751 is just fine at only 4GB; less memory means better overclocking. :)
     
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No it doesn't.
     
  36. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
The PS4 has 8GB of GDDR5, but not the full 8GB is available for gaming. I read an article saying the PS4 and Xbox can use 5-5.5GB of VRAM for gaming. I'm just interested in how much VRAM GTA 5 will consume.
     
  37. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
The PS4 has 6GB for gaming, with only 2GB left over: 1GB is actually reserved for the OS (which currently uses less than 1GB), and 1GB is left untouched for potential features to be added later via firmware updates. It might get unlocked for gaming if Sony shrinks the OS once they've added everything they wanted, in the PS4's last year or two, when developers demand the extra VRAM.

Thing is, developers can use 5.5GB as VRAM and 512MB as RAM for the game (PS4 memory is unified/shared), and consoles are not expected to get "ultra settings". Sure, developers could require only a 3GB VRAM card, but then people would complain that the textures don't look any better than the consoles', and "lazy developers not taking advantage of the PC" gets thrown around everywhere.
     
  38. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    4GB minimum to survive through this console generation.
     
  39. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
Notebookcheck seems to have benched Shadow of Mordor with a 3GB 970M @ ultra at 51 fps. Has the ultra texture pack been released yet, or is this still just high? If so, what's up with that? Hardly seems like it would stutter from lack of VRAM at that frame rate.
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Average FPS won't indicate stuttering. Minimum FPS is what you're looking for.
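To illustrate octiceps' point, here's a small sketch (with made-up frame times) showing how average FPS can look fine while minimum FPS exposes the hitches; the 1% low computed alongside is another common stutter metric:

```python
# Average FPS hides stutter: 95 smooth frames plus 5 big hitches
# still averages out to a respectable number.

frame_times_ms = [16.7] * 95 + [100.0] * 5  # made-up per-frame times

def fps_stats(frame_times_ms):
    """Return (average FPS, minimum FPS, 1% low FPS)."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)  # total frames / total time
    min_fps = 1000.0 / max(frame_times_ms)      # worst single frame
    ordered = sorted(frame_times_ms)
    one_pct_low = 1000.0 / ordered[int(n * 0.99)]  # 99th-percentile frame time
    return avg_fps, min_fps, one_pct_low

avg_fps, min_fps, one_pct_low = fps_stats(frame_times_ms)
print(f"avg {avg_fps:.0f} FPS, min {min_fps:.0f} FPS")
# -> avg 48 FPS, min 10 FPS
```

An "average 48 FPS" benchmark bar would look playable; the 10 FPS minimum is the stutter you actually feel.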
     
  41. sa7ina

    sa7ina Notebook Consultant

    Reputations:
    543
    Messages:
    272
    Likes Received:
    203
    Trophy Points:
    56
As much as I can!
It's fun to plan a whole scene in 3ds Max, import it part by part into the game engine, and give those dumb objects life through scripts.
That's from the video game developer's point of view.

From the gamer's point of view, an 8GB standard opens up (for the developers) a lot of options, both for texture sizes and variety plus scene complexity and size = a more realistic look. (Someone mentioned 4K?)
In other words, it allows developers to sacrifice less of the art for performance... they can "get wild on the canvas".
That's in the gamers' interest too.

And I'm not talking about bad console ports... pure PC gaming.
     
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'm fine with a game requiring lots of memory if it frees up the artists to create tons of varied, unique assets with amazing level of detail. Totally understandable. But so far, we have not seen that with these newer titles that use up ridiculous amounts of VRAM without a corresponding generational leap in fidelity, which is why they seem poorly optimized.

If Witcher 2 can look so good with <1GB, if RAGE can render its entire world without a single repeated texture using ~1.5GB, if Crysis 3 can fit state-of-the-art technology and visuals within 2GB, then what excuse do these devs have?
     
    sa7ina and Ningyo like this.
  43. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
Yeah, right on, octiceps. I think it's going to take a heavy hitter like Star Citizen to show the world what a game requiring huge resources should look like.
Of course it's a PC exclusive, but I'm pinning some hopes on it jump-starting another renaissance period in PC gaming.
     
    octiceps, sa7ina and flamy like this.
  44. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
:O I see. Well, Notebookcheck also just released a benchmark article discussing SoM's VRAM usage, which I'm reading now. Oh ASUS, why you do this to me. :(
     
  45. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
vRAM size, or even storing textures and static information in vRAM rather than in main memory, has absolutely nothing to do with framerate.

Xbox One: 8GB (shared) "vRAM". PS4: 8GB (shared) "vRAM".

Bad developers use bad programming practices, and continue to use bad practices when making "PC ports". What happens here is that, because of the way the current console architecture is set up, suddenly the holy grail of visual fidelity has to do with vRAM hacks. Hacks that frequently use the CPU, so all the information has to be resubmitted anyway after the calculation is done. But hey, all the assets are stored in what is defined as video RAM. And this is so convenient that they're now bringing this convention over to the "PC" as well.

That this actually gives you a performance hit is of no concern, of course. What's really going on is that video RAM is becoming fast enough that you can store and transfer information to and from it quickly enough to not really need dedicated main system RAM.

So everything is slower. And suddenly you need more dedicated video-card RAM. Because reasons.

    If you're wondering why some of us have retired our hobby more or less permanently this generation, wonder no more.
     
    D2 Ultima, TBoneSan and Ningyo like this.
  46. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
Read the article: Middle-earth: Shadow of Mordor Benchmarked - NotebookCheck.net Reviews

    you'll face stuttering and slowdowns, which have a big, negative effect on the gaming experience. Previous to the launch of the GeForce 900 generation, only the top models of the GeForce GTX 870M (6 GB) and the GeForce GTX 880M (8 GB) could meet the high memory requirements in the notebook range. This gets especially apparent in the minimum fps. Instead of 26 fps, the predecessor of the GTX 870M only reached 11 fps (GTX 770M @ 3 GB) at ultra settings. Likewise, the GTX 780M with 4 GB VRAM stuttered more than its counterpart of the 800 series. While the performance was comparable otherwise, the minimum fps were more than 50% lower (13 vs. 30 fps).
     
    octiceps likes this.
  47. sa7ina

    sa7ina Notebook Consultant

    Reputations:
    543
    Messages:
    272
    Likes Received:
    203
    Trophy Points:
    56
It's called evolution.
With each generation of GPU we get more bandwidth to the vRAM... we need somewhere to put all this power to work in real time.
All the stutters and hold-ups during runtime are caused by instantly killing a large number of objects in code and then loading a new set in the single frame where the trigger condition fires... a lot of work for a fraction of a second. That's why they invented occlusion culling: loading the game objects into vRAM part by part by player position.

CryEngine is awesome.
Think of its evolution. :thumbsup:

I use the Unity 4.6 engine... because of its versatility.
Already bought version 5.x for less than half the price in pre-order.
Tons of new features... state of the art, can't wait.
     
  48. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
...just in case anyone was interested: occlusion culling means that if you have two objects and the closer object is obscuring the one behind it, then you don't render the one you don't see.

Sounds great, obviously. So now you just put everything, including textures and resources, for everything that /may/ possibly be seen in the nearest game-world galaxy into RAM, and then quickly fetch an object whenever it's visible. Great efficiency! It all fits! All logical!

Sadly, no. What actually happens - and this is of course something like 99% of the time, unless you for whatever reason decide to create objects that are never visible (by whatever standard "visible" means through clipping scenery, and so on) - is that the object behind the first object is only /partially/ obscured. Now suddenly you realize that the real reason occlusion culling is even mentioned when it comes to the "need" for impossible amounts of RAM is that the actual engine simply has no resource management.

Because unless the object behind the first object is invisible all of the time, what actually happens? This version of occlusion culling has no impact, but the existence of the algorithm requires all the resources in the nearby galaxy to be placed in vRAM.

What sort of negative impact this has on performance, no one in AAA cares about. That better variants and implementations of partial occlusion detection exist and have previously been employed with massive success, that per-object occlusion is a study of its own, and that infinitely better implementations out there have been scrapped - even on current architectures - is simply of no concern to anyone.

Because this generation, increased RAM means increased framerate. Because reasons.
     
    D2 Ultima, sa7ina and Ningyo like this.
  49. sa7ina

    sa7ina Notebook Consultant

    Reputations:
    543
    Messages:
    272
    Likes Received:
    203
    Trophy Points:
    56
You are right, but:
the detection, as you said, is not calculated in real time but pre-calculated. When the scene is done, you give the algorithm the XYZ extents of a number of occlusion areas across the scene... the pre-calculation algorithm checks each corner of every occlusion area one by one, looking 360° for all visible objects from that intersection (and it takes a LOOOT of time)... dynamic objects in the scene do not "occlude".
In memory at runtime you have a struct of occlusion areas (cubes) and the list of static objects visible in each cube... when you move from cube to cube, you load/unload or enable/disable the game objects not relevant to that cube.
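The runtime side described here can be sketched in a few lines; all names and the tiny baked visibility table below are illustrative inventions, not from any real engine:

```python
# Runtime side of precomputed (cell-based) occlusion culling:
# visibility is baked offline per occlusion area, and at runtime we
# only swap object sets when the player crosses into a new cell.

# Baked offline (expensive): for each occlusion cell, the set of
# static object ids visible from anywhere inside that cell.
PVS = {
    "cell_a": {"rock", "tree", "house"},
    "cell_b": {"house", "bridge"},
}

def on_player_moved(old_cell, new_cell, enable, disable):
    """Enable objects newly visible in the new cell and disable
    those that are no longer visible; shared objects are untouched."""
    if old_cell == new_cell:
        return
    old_vis = PVS.get(old_cell, set())
    new_vis = PVS.get(new_cell, set())
    for obj in new_vis - old_vis:
        enable(obj)
    for obj in old_vis - new_vis:
        disable(obj)
```

Crossing from `cell_a` to `cell_b` enables only `bridge` and disables `rock` and `tree`; `house` stays loaded, which is what makes the per-frame cost small even though the offline bake is slow.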
     
  50. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ...so it happens at build time anyway. Why not just build a heap or something and create a cache prediction from that?
     