The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Defragging on Vista

    Discussion in 'HP' started by flipfire, Oct 23, 2007.

  1. flipfire

    flipfire Moderately Boss

    Reputations:
    6,156
    Messages:
    11,214
    Likes Received:
    68
    Trophy Points:
    466
    Anyone else notice how Vista's disk defragger is so uninformative? The XP defragger actually had an informative graph of its progress, while Vista's is just one button?!

    I left mine on overnight to defrag, and when I woke up it didn't even look like I'd used the defrag button, because it didn't return any results and the Defragment Now button was clickable again. Makes me wonder if it actually did anything to the drive.

    I disabled the daily auto-defragging for gaming purposes, so I manually do it once in a while.
     
  2. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    Microsoft minimized Vista's defrag system so that it runs on lowest priority. It's not really intended to be used by the end-user. Instead, it should run in the background and constantly shuffle files around when there's a free CPU cycle. If you want to manually defrag your system, get Auslogic's Defragmenter.
     
  3. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    Vista's built-in defrag program is crap. Try Auslogic Disk Defrag (free & fast).
     
  4. times

    times Notebook Evangelist

    Reputations:
    0
    Messages:
    316
    Likes Received:
    0
    Trophy Points:
    30
    I use Diskeeper, but you have to pay for it lol
     
  5. flipfire

    flipfire Moderately Boss

    Reputations:
    6,156
    Messages:
    11,214
    Likes Received:
    68
    Trophy Points:
    466
    I downloaded Auslogic and was satisfied with it. I ran it on all my drives, even the external one, and it looks like the Vista defragger didn't do its job right.

    Thanks,
     
  6. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    Don't disable the defragger and don't use third party tools. Third party tools can actually cause more damage. It might be nice to see all of those files lined up in a row, but that actually causes MORE problems down the road. The best disk setup is where the files are scattered around the disk, so they can be appended to without creating more fragments.

    Re-enable the auto defrag, which runs with low priority (Vista also has low-priority disk access, so it should not affect anything else), and be happy you have one less thing to worry about when maintaining your system.
     
  7. 05GT

    05GT Notebook Enthusiast

    Reputations:
    7
    Messages:
    40
    Likes Received:
    3
    Trophy Points:
    16
    I used IOBit Defrag... It's freeware and really useful.
     
  8. flipfire

    flipfire Moderately Boss

    Reputations:
    6,156
    Messages:
    11,214
    Likes Received:
    68
    Trophy Points:
    466
    Sorry, I'm confused :confused:. Are you saying I shouldn't defrag because it's better to have a fragmented drive? What are the downsides and risks?

    I usually defrag once every month or two. The Auslogic program worked well.
     
  9. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    Basically, YES. For a long time the software companies have been perpetuating the problem of fragmentation, which is not a huge problem anymore. It's made even worse by the graphics everyone wants, without which everyone says the software is "crap".

    The practice of cramming all of your files onto one end of the disk is exactly the thing that caused fragmentation in the past with FAT. Think about it. You have a file, and then you defrag and cram it all together with the file right next to it. Now you want to add something to the end of that file. Where does that added data go? It can't go with the rest of the file because there's another file already there. So it needs to make a fragment with that new part somewhere else. That's CAUSING fragmentation, not fixing it.

    The alternative is to have files randomly scattered around the disk, so when they are changed they are more likely to have space on the end, so the file doesn't become fragmented. This is the way unix/linux file systems work, and you never have to defrag those.

    Windows already does optimization of the files -- it analyzes which ones are needed for a fast bootup, for example, and moves those around to be close together. Any more than that, and the software is causing more problems than it fixes.

    Of course, you won't hear about that from the companies themselves, and most other people are just parroting the "common wisdom" they heard from others, most of whom are not really thinking about it.

    Just let the background defragger do its thing, and feel better because you have one less thing to worry about.

    I should add a caveat here, and that's for large files downloaded from p2p or bittorrent. Those are hugely fragmented because of the way they download, and defragging *just those files* can make a huge difference in reducing stuttering when you're watching a movie.
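
    The packed-vs-scattered argument above can be sketched with a toy block-allocation model (an illustration only; the growth rule is invented and is not how NTFS actually places files):

```python
# Toy model of the argument above: a "disk" is a list of blocks, and a file
# fragments when it grows but the block right after its last extent is taken.
# Illustration only -- not how NTFS actually allocates.

def count_fragments(blocks, name):
    """Count runs of contiguous blocks owned by `name`."""
    runs, prev = 0, False
    for b in blocks:
        cur = (b == name)
        if cur and not prev:
            runs += 1
        prev = cur
    return runs

def append_block(blocks, name):
    """Grow `name` by one block: use the block after its last extent if free,
    otherwise take the first free block anywhere (creating a fragment)."""
    last = max(i for i, b in enumerate(blocks) if b == name)
    if last + 1 < len(blocks) and blocks[last + 1] is None:
        blocks[last + 1] = name
    else:
        blocks[blocks.index(None)] = name

DISK = 12

# Packed layout: files A and B crammed together at the front of the disk.
packed = ["A", "A", "B", "B"] + [None] * (DISK - 4)
append_block(packed, "A")          # B sits right behind A -> A must fragment

# Scattered layout: the same files, each with free space after it.
scattered = ["A", "A", None, None, "B", "B"] + [None] * (DISK - 6)
append_block(scattered, "A")       # grows in place, stays contiguous

print(count_fragments(packed, "A"))     # 2 fragments after the append
print(count_fragments(scattered, "A"))  # still 1 fragment
```

    The packed file ends up in two pieces after a single append, while the scattered one stays whole, which is the trade-off being described.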
     
  10. Andromeda

    Andromeda Notebook Consultant

    Reputations:
    25
    Messages:
    144
    Likes Received:
    0
    Trophy Points:
    30
    You mean excessive free space consolidation is not useful. Don't confuse defragmenting files with consolidating free space or file placement. Defragmenting is never a bad thing; it's only that free space consolidation, i.e. shoving all the files into one large block, is not as useful as it might appear at first glance. Well, actually, it's useful until those files start getting fragmented lol. Defragmenting has more benefits than drawbacks, especially for slow laptop drives that run on batteries (faster file access and lower battery consumption).

    That said, I like auto background defrag, vista or third party, I think it's easier to let the defragmenter handle the process instead of running defrags manually/scheduled and sitting around staring at the screen waiting for it to finish.

    BTW, IIRC, there are options in utorrent to preallocate the space for a download, so that it does not fragment, subject to the free space availability condition.
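
    The preallocation idea mentioned above can be sketched in Python (the file name and size here are made up; note that plain `truncate` may create a sparse file on many filesystems, which is why clients use calls like `posix_fallocate` where available to really reserve the blocks):

```python
# Sketch of download-space preallocation (hypothetical file name and size).
# Reserving the full size before writing gives the filesystem a chance to
# hand out contiguous space instead of growing the file piece by piece.
import os
import tempfile

size = 16 * 1024 * 1024  # pretend this is the download's final size

path = os.path.join(tempfile.mkdtemp(), "download.part")
with open(path, "wb") as f:
    if hasattr(os, "posix_fallocate"):       # Linux: actually reserve blocks
        os.posix_fallocate(f.fileno(), 0, size)
    else:                                    # fallback: just extend the file
        f.truncate(size)                     # may be sparse on some FSes

print(os.path.getsize(path) == size)  # the file reports its full size up front
```

    Either way the download is then written into an already-sized file, subject, as noted, to enough free space being available.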
     
  11. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    Yes, I agree. However, I believe most people are complaining that the Vista defrag "sucks" because it doesn't give them a pretty graphic with all the files smooshed into one spot when it's done. I'm also trying to get people to stop tweaking everything and just let the system do its thing. Generally it all works pretty well and doesn't need tweaking or third-party tools.

    uTorrent is one client, and azureus seems to do this pretty well too. Others though, like shareaza, don't do it very well at all. I have seen files with over 10,000 fragments from shareaza. I think it's good to make people aware of it.
     
  12. gridtalker

    gridtalker Notebook Virtuoso

    Reputations:
    18
    Messages:
    2,976
    Likes Received:
    0
    Trophy Points:
    0
    Vista's defrag system is horrible.
     
  13. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    Care to elaborate on this or are you just trying to get a higher post count?
     
  14. DMANbluesfreak

    DMANbluesfreak Notebook Consultant

    Reputations:
    17
    Messages:
    153
    Likes Received:
    0
    Trophy Points:
    30
    Yeah that post was pretty worthless, as we already said that Vista Defrag isn't bad, despite not showing a progress bar.
     
  15. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    If you look at the other posts this guy has made, almost all of them are equally useless. I have a suspicion it's a spammer who is smart enough to make posts just relevant enough to not be considered spam.
     
  16. DMANbluesfreak

    DMANbluesfreak Notebook Consultant

    Reputations:
    17
    Messages:
    153
    Likes Received:
    0
    Trophy Points:
    30
    Hrm... what's the point in spamming anyway? Does it help him in any way?
     
  17. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    The link in his sig is to a celebrity gossip site.
     
  18. flipfire

    flipfire Moderately Boss

    Reputations:
    6,156
    Messages:
    11,214
    Likes Received:
    68
    Trophy Points:
    466
    Sorry, a little off-topic question.

    Anyone know how to disable mobsync.exe? It keeps opening every time I stick my USB flash drive in, and an annoying wizard keeps popping up.

    I've tried the control panel sync settings and media player settings with no luck.
     
  19. JoeCHecht

    JoeCHecht Notebook Consultant

    Reputations:
    62
    Messages:
    157
    Likes Received:
    0
    Trophy Points:
    30
    Defragging can make a big difference for large files that do not change. For example, something like a pre-allocated hard disk file for a virtual machine, or perhaps videos and the like. Free space consolidation can be very important for Windows programs that use mapped memory, as I believe the largest size that can be mapped is equal to the largest block of contiguous disk space. Of course, correct me if I am wrong, as I am writing this "before coffee". If this is true, a 32-bit program doing memory mapping could require up to 4GB of mapped memory per allocation, and a 64-bit program... well, much much more, meaning that free space consolidation might be very important.

    FWIW, I use PerfectDisk, and I like it. I also do a lot of very large memory mappings in some of the programs that I write.

    Joe
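
    For what it's worth, a minimal memory-mapping sketch (hypothetical file and sizes). A file mapping goes through the filesystem's page cache, so a fragmented file can still be mapped in full; on-disk contiguity affects paging speed rather than whether the map succeeds:

```python
# Minimal memory-mapping sketch (hypothetical file; sizes made up).
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)              # a one-page file to map

with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 0) as m:  # length 0 = map the whole file
        m[0:5] = b"hello"                # writes go through to the file

with open(path, "rb") as f:
    print(f.read(5))                     # b'hello'
```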
     
  20. orev

    orev Notebook Virtuoso

    Reputations:
    809
    Messages:
    2,829
    Likes Received:
    1
    Trophy Points:
    56
    I'm not a programmer, so I cannot speak to memory mapping, but I would suspect that the virtual memory system should take care of that?

    As for free space consolidation, the problem is that you have smooshed all your files to one end, and now you need to append some data to one of them. The appended data can't physically go next to the rest of the file, so it must go into the free space as a fragment. If the files are randomly distributed around the disk, then you avoid this situation. That's the tactic that linux/unix takes, and you never have to defrag them.

    Consolidating things like program files might give a little help to startup times, but for data files it can be very bad. Once you do it once, you have to do it all the time.
     
  21. JoeCHecht

    JoeCHecht Notebook Consultant

    Reputations:
    62
    Messages:
    157
    Likes Received:
    0
    Trophy Points:
    30
    Memory mapping is probably what the virtual memory system uses for the system swapfile, so, no. I believe it must be contiguous. I reserve the right to be wrong, since I am once again posting before the coffee has brewed.

    You are mostly correct in your assertion that packing a disk will cause fragmentation. The rule would be that a file may become fragmented if it was (1) written to, (2) enlarged, and (3) enlarged past the size of the disk's allocation unit. For example, if the disk allocation size is 4096 and the file size is 8000 bytes, it would have to be enlarged by 193 bytes before it might become fragmented. I say "might become fragmented" since it is possible that (a) the next disk unit (or n units) is (or has) become available, or (b) the original file is deleted and rewritten, possibly somewhere else.
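
    The cluster arithmetic above checks out (4096-byte allocation units assumed, as in the example):

```python
# Check of the allocation-unit arithmetic above (4096-byte clusters assumed).
import math

cluster = 4096
file_size = 8000

allocated = math.ceil(file_size / cluster) * cluster  # space actually reserved
slack = allocated - file_size                         # room left in the last cluster

print(allocated)   # 8192 bytes across two clusters
print(slack + 1)   # 193: the smallest growth that needs a new cluster
```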

    When packing with the smart placement strategy, the defragger takes into account (a) files that do not change in size and (b) files that are rarely used, thus minimizing the possibility of future fragmentation while still allowing for the benefits of free space compaction.

    This strategy is not for everyone; however, it can pay big dividends. For example, I have very few files that actually change in size, but I commonly allocate huge amounts of disk space on a temporary basis that I either need very fast access to, or it is mandatory that the file be made of contiguous disk space. For me, packing the free space works very well, and at the same time, defrags happen very quickly, since very few files need defragging.

    Joe [going to get coffee]
     
  22. daedal

    daedal Notebook Enthusiast

    Reputations:
    1
    Messages:
    40
    Likes Received:
    0
    Trophy Points:
    15
    Interesting read, thanks for the input. So in a nutshell, what would be your recommendation then?
     
  23. JoeCHecht

    JoeCHecht Notebook Consultant

    Reputations:
    62
    Messages:
    157
    Likes Received:
    0
    Trophy Points:
    30
    It depends on how you use your computer.

    To help you decide, remember: defragmenting your disk just defrags the files. Packing the disk (AKA free space consolidation) places "movable files" (not the system files that cannot, or should not, be moved) at the beginning of the disk, thus allowing the unused portion of the disk to be available with minimum fragmentation. An intelligent packer will place files that do not change in size first, then files that rarely change, then files that rarely change in size, and finally files that commonly change in size. This strategy will provide maximum system performance and the maximum unfragmented free space. It also means there is a good chance that new files, and files that grow in size, will get fragmented; you run that risk regardless, but the chances are much higher when you pack the disk.
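
    The placement ordering described above amounts to sorting files by how likely they are to grow (a sketch; the category names and sample files are invented for illustration):

```python
# Sketch of the "smart placement" ordering described above: files least likely
# to change go first, files most likely to grow go last, next to the free
# space. Categories and sample files are invented.

ORDER = {"fixed-size": 0, "rarely-used": 1, "rarely-resized": 2, "often-resized": 3}

files = [
    ("pagefile.sys", "fixed-size"),
    ("app.log",      "often-resized"),
    ("setup.exe",    "rarely-used"),
    ("notes.txt",    "rarely-resized"),
]

placement = sorted(files, key=lambda f: ORDER[f[1]])
print([name for name, _ in placement])
# ['pagefile.sys', 'setup.exe', 'notes.txt', 'app.log']
```

    Files that will never grow end up packed at the front, while the volatile ones sit next to the free space where they can extend without fragmenting.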

    If maximum file transfer throughput is what you want, and you do not mind re-packing the disk every so often (this can take a few minutes), this is the way to go. FWIW, I do the defrag/pack operation once a month, at night, and for me it takes about 5 minutes or less. Your mileage may vary significantly. I use PerfectDisk. It can also defrag the system's memory files.

    If overall disk performance is not an issue, then stick with the Windows defrag, or skip it altogether. I will say that a seriously fragmented disk will seriously affect performance. That fact cannot be disputed. Also, remember that the disk is the slowest member of your computer, i.e. it is the bottleneck when it comes to system performance. For example, if you transfer a large file over your gigabit (1000 Mbps) network connection, you may only get about 340 Mbps, as that's about as fast as the disk is going to run (it is the bottleneck holding back the throughput). Fragmented free space will slow that down even more.

    Joe