The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Virtual memory question

    Discussion in 'Windows OS and Software' started by KillWonder, Jul 13, 2014.

  1. KillWonder

    KillWonder Notebook Evangelist

    Reputations:
    155
    Messages:
    432
    Likes Received:
    17
    Trophy Points:
    31
    I'm using Win7 64-bit with 8 GB RAM and game a lot. What is a good MB setting to put in for the initial and maximum values?
     
  2. JamesBl

    JamesBl Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    0
    Trophy Points:
    15
    The OS picks the best virtual memory size for your system by default. I wouldn't touch it. If your computer is slow, it's the hardware (processor or GPU) that you have and not the virtual memory.
     
  3. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    I use System Managed Size. Assuming you aren't nearly out of hard drive space, that will likely work as well as trying to set it manually.

    If you do try setting it manually, you risk setting it too low to handle the odd peak in memory usage. That is, if you set it to something low like 512 MB and one day you run three times as many things as usual for some reason, you may run out of virtual memory, and things will grind to a halt as all of a sudden your programs can't get the memory they need. If you follow the old "two times your physical memory" recommendation and set it to 16 GB, you're just wasting space: by the time you hit 14 GB or so of virtual memory in use (including physical), your computer will be slower than molasses in January because it will be reading virtual memory from the hard drive so often. System managed tends to be the best option, as it will keep it reasonably low most of the time, but if there is the odd day that you need more for whatever reason, it will increase it. It may be a bit slow when you have high virtual memory usage, but at least programs won't crash due to not being able to get memory (which can happen if you don't have enough virtual memory).
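The tradeoff described above can be sketched as a simple sizing heuristic. This is purely illustrative: the function name, the 25% headroom figure, and the `peak_commit_mb` parameter are my own assumptions, not anything Windows itself uses.

```python
def suggest_pagefile_mb(ram_mb, peak_commit_mb):
    """Illustrative heuristic: size the page file to cover the highest
    commit charge (total physical + paged memory) you've ever observed,
    plus some headroom, rather than a blanket rule like '2x RAM'.

    peak_commit_mb would come from something like Task Manager's commit
    charge. All names and figures here are made up for illustration;
    Windows' own system-managed algorithm is different.
    """
    headroom = peak_commit_mb // 4          # 25% safety margin for odd peaks
    needed = peak_commit_mb + headroom - ram_mb
    return max(needed, 1024)                # never go below 1 GB

# A machine with 8 GB RAM whose commit charge has peaked at 10 GB:
print(suggest_pagefile_mb(8192, 10240))     # 4608
```

Note how the heuristic degenerates to a small fixed floor when RAM comfortably exceeds peak usage, which matches the "keep it reasonably low most of the time" behavior described above.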

    Having played around with this some over the years, the only time I'd set it manually nowadays is if I were quite low on disk space and wanted to make sure I'd always have a certain amount free so that, for example, programs could save files properly and not fail to save because all the remaining free space had been dedicated to virtual memory. But at that point, it's really time to be buying a larger hard drive.

    JamesBl isn't correct in saying that it's always the processor or GPU that's the cause of slowdowns, however. An inadequate amount of memory is another common cause, and there are others as well. I routinely run into slowdowns on my work computer due to not really having enough memory. The solution to that is more physical, not virtual, memory. I doubt you'll run into problems due to memory amounts with 8 GB if gaming is your primary use of the machine, however.
     
  4. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Leave it at system managed, there is a lot more to virtual memory, memory allocation, etc than meets the eye in Windows. There's a blog on technet called Pushing the Limits of Windows by Mark Russinovich that covers how memory works in Windows. It is a bit of a long read (for a blog), but it's worth reading.
     
  5. ajkula66

    ajkula66 Courage and Consequence

    Reputations:
    3,018
    Messages:
    3,198
    Likes Received:
    2,318
    Trophy Points:
    231
    The question I'd ask is whether you have an HDD or an SSD.

    If it's the former, follow the advice given above and enjoy.

    With an SSD, I'd add more physical RAM and kill the paging file altogether.
     
  6. Kuu

    Kuu That Quiet Person

    Reputations:
    765
    Messages:
    968
    Likes Received:
    18
    Trophy Points:
    31
    I personally just set it to a specific size depending on the physical RAM the system has and leave it alone: 4 GB gets a 4 GB page file, 8 GB gets a 1 GB page file (or 2 GB), and with more than 12 GB I'll disable it or set it to use 1 GB max. On my own desktop and laptop (16 GB and 8 GB respectively) I have it disabled to save space on their 128 GB and 256 GB SSDs, but they have workloads that I know won't exceed the physical RAM limits there.

    But, from what I've read, Windows will make a paging file anyway in some odd cases regardless of what you set it at. If that happens to me, so be it.
     
  7. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    I always move my pagefile to my data drive (D:\) and set it to double the amount of RAM I have installed. System managed size for some reason never sets the page file larger than the amount of RAM installed. I like to have twice that amount available in case something catastrophic happens... (like I open a 16000x12000 vector image in Photoshop, again :rolleyes:)
     
    alexhawker likes this.
  8. KillWonder

    KillWonder Notebook Evangelist

    Reputations:
    155
    Messages:
    432
    Likes Received:
    17
    Trophy Points:
    31
    I have set it at 6142 MB for both, and I'm using an SSD. I've read that this is the most stable for me.
     
  9. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,256
    Likes Received:
    11,609
    Trophy Points:
    681
    That's a bad thing to do. Some programs might still want to use the page file, even if you have enough RAM.
    Better to reduce it to 1-2 GB than kill it altogether.
     
  10. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Try hitting over 20 GB of page file due to a simulation that you didn't set up properly! :p
    Some programs also freak out if you don't have a page file, older versions of Adobe CS and Dawn of War II are notable examples.
     
  11. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    So at a 20GB page file running did the simulation ever end, or could you ever [ESCAPE] from it? :)

    Edit: the debate on a PF on an SSD has been around for a while. Yes, having one will use the SSD and thereby cause wear. I've had mine set to none for well over 4 years on 2 different systems. This is not for everyone though.

    My simple rule of thumb is: do not play with it. If you have to, at least set it to 1 GB. If you think your knowledge of any prevailing issues is good enough, then you could disable it.

    Even if it's not related, though, whenever you ask for help on any issue, disclose this and any other system customizations. You would be surprised how often a seemingly unrelated setting can cause an issue.
     
  12. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    I cancelled it as soon as I saw what was happening. Anyway, even with the PF on an SSD, it slowed down to a crawl, so there was no point in letting it run. I have the PF on my SSDs on my M6700 and the SSDs are still reporting as alive and kicking at 100% "Health" after close to two years (it will be two in October). I have about 1 TB and 1.35 TB written on each drive respectively.

    Back to the virtual memory question, I just saw a post on the Ars Technica forums in a thread about memory management in Windows 7, and one person replied with a comment that the memory management chapter of the Windows Internals book is 200 pages long, so memory management in Windows is pretty complex. :p I used to want to tweak everything when I was younger, but then again, I figured the folks at MS have probably already tweaked the page file settings to something appropriate. Maybe not the optimal settings for a given workload, but close enough imo.
     
  13. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Without the page file on my 480 GB Mushkin I am at 100% good with 20.66 TB read and 11.35 TB written. I hear that when they go, though, it is a rapid descent. This drive has been in use since September 2012, with 16,142 hours. Originally I was going to use it in the P79xx from Gateway, but it ended up in this machine.
     
  14. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    Well, when I opened that massive vector image, Photoshop crashed very quickly. I saw the page file usage in CPU Meter go from 1% to 100%, as well as my RAM. How Windows didn't halt and catch fire is beyond me. But you and KCETech have no excuses!!!! Calculating UFO trajectories and simulating supernovas takes more than a large page file... :D
     
  15. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
  16. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Hi,
    The speedup he does not address is the HDD. Defraggers, such as Disk Perfect, will move the page file to the middle of the platters on a disk. If you tend to fill the HDD, this could cause data to be written either at the end of the pagefile's area on the platter or beyond it, or cause fragmentation of the written file or free space. You can optimize a disk with partitions to avoid this issue.

    With an SSD it is a matter of free space and/or unneeded writes establishing and maintaining the page file. Since an SSD's access and read/write speeds are consistent across all sectors, page file location is irrelevant. With proper firmware over-provisioning and wear leveling, the stagnant writes to the pagefile area should not cause uneven NAND wear.
     
  17. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..sometimes what actually happens is that a program will fill the page file with cache before using RAM. Sony Vegas, for example, does that.

    And if you're lucky with the free space you have left on the disk, the page-file will increase, then decrease, then increase again, etc., every time you make any changes to a project. And that takes a very long time.

    So generally speaking - since, as people mention, some programs require a page file - the advice is to set it to some fixed size. That's going to save you issues with massive loading times, or random lockups and hangs (never mind the constant fragmentation issues).

    But if you have enough RAM and don't use specific programs that have these extremely curious ways of dealing with memory mapping, you could just as well turn it off.

    I mean, I haven't read the "second coming" treatises on why page files aren't really all that hot after all. But there is no function in a system with 4 GB+ RAM that should rely on virtual addressing.
     
  18. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    I see where you're going with avoiding unnecessary wear on the SSD, as well as preserving possibly limited space. On the other hand, if you do actually need the paging file due to too much virtual memory usage, it's going to be a lot slower on an HDD. I've run systems that didn't have adequate memory for the job and had an HDD paging file. It could be painful. A 512 MB RAM Pentium 4 with 1.25 GB of virtual memory in use - not fun. It took 10 minutes just to exit the program that was using 1 GB of virtual memory. An 8 GB Win7 laptop with 10 GB of virtual memory in use by half a dozen programs that you're switching between - also not very fun or responsive. My reckoning would be that the point at which you'd reach "really annoyingly slow" would come later if you had the paging file on an SSD instead of an HDD. Although considering that my desktop rarely goes above 3-4 GB of RAM use, I haven't really tested that theory.

    My SSD is from late 2011, has 11,965 hours on it, 10.44 TB of writes (and 2.01 TB of reads), and about 95% health according to Intel's estimate. That's with a page file. The write:read ratio is a lot higher than on either TANWare's or tijo's drives, quite possibly due to the page file. Since I almost never hit 100% RAM usage anyway, perhaps I'll switch my page file over to my Toshiba HDD for a bit and see how much that affects the ratio in 6 months or so.

    Finances allowing, I do agree that having a small (or auto-managed) page file on the HDD and plenty of physical memory really doesn't have any downside.

    There's a lot that's incorrect in that statement. From a technical standpoint, every program you run outside of some pieces of the kernel and BIOS code relies on virtual addressing. Real physical memory addressing hasn't been commonly used (outside of perhaps some niche embedded applications) in decades. The major issue with real (non-virtual) addressing is that it's quite easy for programs to overwrite each others' memory, causing all sorts of problems.

    You probably meant more along the lines of relying on the page file for virtual memory, as opposed to being able to fit everything into physical RAM. And while for the average user it's still uncommon for one program to use 4 GB+ of RAM (although in industrial uses, that's becoming less and less true every day), it's not so uncommon for the sum of all programs the average user is running to require more than 4 GB of RAM.
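The distinction drawn above between virtual addressing (which every program uses) and actually hitting the page file can be illustrated with a toy page-table lookup. This is a deliberately simplified sketch: real MMUs use multi-level tables, TLBs, and hardware fault handling, and all the names here are invented for illustration.

```python
PAGE_SIZE = 4096  # 4 KiB pages, as on x86

# Toy per-process page table: virtual page number -> physical frame number.
# A missing entry means the page is not resident in RAM (e.g. paged out),
# which is the only case where the page file actually gets read.
page_table = {0: 7, 1: 3, 2: 11}

def translate(vaddr):
    """Translate a virtual address to a physical one, or raise a
    toy 'page fault' if the page isn't resident."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    if vpn not in page_table:
        raise LookupError(f"page fault: virtual page {vpn} not resident")
    return page_table[vpn] * PAGE_SIZE + offset

print(hex(translate(0x1234)))   # virtual page 1, offset 0x234 -> 0x3234
```

The point of the sketch: translation happens on every access regardless of the page file, while the `LookupError` path (a page fault on a non-resident page) is the only one where disk gets involved.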
     
  19. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Yup, and while those programs may be paging stuff, that doesn't mean it will slow the system to a crawl; that will depend on how much you need the stuff that is paged, or whether you need to use all those programs at all. I have the page file on my SSDs, so I'll check the read:write ratios, but they should be in line with what most users have. Of note, though, I have 32 GB of RAM on that laptop, so it's not like I'll hit the page file very often outside of what is paged because it won't be needed - say, Windows indexing, which will resume when I'm at idle, etc.

    Also, Apollo is spot on about the difference between virtual addressing vs actually using the page file.
     
  20. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    Yeah, bad (incorrect) choice of words. All memory addressing is of course virtual outside assembly... and not even always direct there either.

    ...What I'm really talking about is "demand paging", I think? With the specific Windows implementation being some sort of artful compromise that manages to get the worst disadvantages both of having a page cache and of using demand paging..

    Anyway - I think that what happens when some programs fail to run without a page file is that the DLL functions have to be copied to a memory area the program has ownership over. The program isn't allowed to duplicate it in RAM, but only via the virtual memory manager. ...it's a very curious way to do things, imo.
     
  21. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    To follow up on the read:write ratios on my SSDs, given that I have a paging file on them:
    Intel 520 (windows 7 drive): 1.80 TB : 983 GB, power on hours: 2013
    Mushkin Chronos Deluxe (Win8 drive): 1.88 TB : 1.39 TB, power on hours: 2555

    I tend to use Windows 8 more than Windows 7 on that laptop by the way.
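For anyone comparing against their own drives, the figures above reduce to reads-per-write ratios with some quick arithmetic (an illustrative back-of-the-envelope calculation; `rw_ratio` is a made-up helper, and 1 TB is taken as 1024 GB):

```python
def rw_ratio(read_gb, write_gb):
    """Reads per write, as a rough indicator of how write-heavy a
    workload (e.g. a page file) has been. Hypothetical helper for
    illustration only."""
    return round(read_gb / write_gb, 2)

# tijo's figures above, with TB converted at 1024 GB/TB:
print(rw_ratio(1.80 * 1024, 983))          # Intel 520:  1.88
print(rw_ratio(1.88 * 1024, 1.39 * 1024))  # Mushkin:    1.35
```

A ratio near or below 1 (like Apollo13's drive, with more writes than reads) would suggest a notably write-heavy workload.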
     
  22. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    While my R/W is much higher, so are the total hours: 16,523 right now.