Anyone else notice how uninformative Vista's disk defragger is? The XP defragger at least had an informative graph of the progress; in Vista it's just one button?!
I left mine on overnight to defrag, and when I woke up it didn't even look like I'd used the defrag button: it didn't return any results, and the Defragment Now button was clickable again. Makes me wonder if it actually did anything to the drive.
I disabled the daily auto-defragging for gaming purposes, so I do it manually once in a while.
-
Microsoft minimized Vista's defrag UI and made it run at the lowest priority. It's not really intended to be used by the end user; instead, it runs in the background and shuffles files around whenever there are free CPU cycles. If you want to manually defrag your system, get Auslogic's Defragmenter.
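That said, if you miss the XP-style report, Vista still ships a command-line defragmenter (defrag.exe): if I remember the switches right, -a does an analysis-only pass and -v prints a verbose report. A minimal Python sketch of calling it (my own wrapper, nothing official; run it from an elevated prompt):

import subprocess

# Ask Vista's defrag.exe for an analysis-only (-a), verbose (-v) report
# on C: -- this prints the fragmentation stats the one-button GUI hides.
result = subprocess.run(["defrag", "C:", "-a", "-v"],
                        capture_output=True, text=True)
print(result.stdout or result.stderr)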
-
Vista's built-in defrag program is crap. Try Auslogic Disk Defrag (free and fast)
-
I use Diskeeper, but you have to pay for it lol
-
I downloaded Auslogic and was satisfied with it. I ran it on all my drives, even the external one, and it looks like the Vista defragger didn't do its job right.
Thanks, -
Don't disable the defragger and don't use third-party tools; they can actually cause more damage. It might be nice to see all of those files lined up in a row, but that actually causes MORE problems down the road. The best disk layout is one where files are scattered around the disk, so they can be appended to without creating more fragments.
Re-enable the auto defrag, which runs at low priority (Vista also has low-priority disk access, so it should not affect anything else), and be happy you have one less thing to worry about when maintaining your system. -
I used IOBit Defrag... It's freeware and really useful.
-
Are you saying I shouldn't defrag because it's better to have a fragmented drive? What are the downsides and risks?
I usually defrag once every month or two. The Auslogic program worked well -
The practice of cramming all of your files onto one end of the disk is exactly the thing that caused fragmentation in the past with FAT. Think about it. You have a file, and then you defrag and cram it all together with the file right next to it. Now you want to add something to the end of that file. Where does that added data go? It can't go with the rest of the file because there's another file already there. So it needs to make a fragment with that new part somewhere else. That's CAUSING fragmentation, not fixing it.
The alternative is to have files scattered around the disk, so when they grow there is more likely to be free space at the end and the file doesn't become fragmented. This is how unix/linux file systems work, and it's why you practically never have to defrag them.
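To make that concrete, here's a toy cluster-allocation model (purely illustrative Python, nothing like real NTFS internals) showing that a packed file fragments on its first append, while a file with slack after it grows in place:

def append_cluster(disk, file_id):
    # Grow file_id by one cluster; return True if it stayed contiguous.
    last = max(i for i, owner in enumerate(disk) if owner == file_id)
    if last + 1 < len(disk) and disk[last + 1] is None:
        disk[last + 1] = file_id        # free cluster right after the file
        return True
    disk[disk.index(None)] = file_id    # forced into a new fragment elsewhere
    return False

packed    = ["A", "A", "B", "B", None, None]   # files crammed together
scattered = ["A", "A", None, "B", "B", None]   # slack left after each file

print(append_cluster(packed, "A"))      # False -> A just fragmented
print(append_cluster(scattered, "A"))   # True  -> A grew contiguously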
Windows already does optimization of the files -- it analyzes which ones are needed for a fast bootup, for example, and moves those around to be close together. Any more than that, and the software is causing more problems than it fixes.
Of course, you won't hear about that from the companies themselves, and most other people are just parroting the "common wisdom" they heard from others, most of whom are not really thinking about it.
Just let the background defragger do its thing, and feel better because you have one less thing to worry about.
I should add a caveat here, and that's for large files downloaded from p2p or bittorrent. Those are hugely fragmented because of the way they download, and defragging *just those files* can make a huge difference in reducing stuttering when you're watching a movie. -
You mean excessive free space consolidation is not useful. Don't confuse defragmenting files with consolidating free space or file placement. Defragmenting is never a bad thing; it's only free space consolidation, i.e. shoving all the files into one large block, that is not as useful as it might appear at first glance. Well, actually, it's useful until those files start getting fragmented lol. Defragmenting has more benefits than drawbacks, especially for slow laptop drives running on battery (faster file access and lower battery consumption).
That said, I like auto background defrag, whether Vista's or third-party; I think it's easier to let the defragmenter handle the process than to run defrags manually or on a schedule and sit around staring at the screen waiting for it to finish.
BTW, IIRC, there are options in utorrent to preallocate the space for a download so that it does not fragment, subject to enough free space being available. -
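The idea behind preallocation is just to set the file to its final size before any pieces arrive, so the filesystem can hand out one big run of clusters. A rough Python sketch of the technique (a hypothetical helper, not uTorrent's code; note that on some filesystems truncate() only sets the length without reserving clusters, which is why real clients use platform-specific allocation calls):

def preallocate(path, size):
    # Reserve the final size up front so out-of-order torrent pieces
    # land inside one already-sized file instead of growing it piecemeal.
    with open(path, "wb") as f:
        f.truncate(size)

preallocate("movie.part", 700 * 1024 * 1024)   # e.g. a 700 MB download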
uTorrent is one client, and azureus seems to do this pretty well too. Others though, like shareaza, don't do it very well at all. I have seen files with over 10,000 fragments from shareaza. I think it's good to make people aware of it. -
Vista's defrag system is horrible -
-
Yeah, that post was pretty worthless; as we already said, Vista's defrag isn't bad, despite not showing a progress bar.
-
Hrm... what's the point in spamming anyway? Does it help him in any way?
-
The link in his sig is to a celebrity gossip site.
-
Sorry, a little off-topic question.
Anyone know how to disable mobsync.exe? It keeps opening every time I stick my USB flash drive in, and an annoying wizard keeps popping up.
I've tried the Control Panel sync settings and the Media Player settings with no luck -
FWIW, I use PerfectDisk, and I like it. I also do a lot of very large memory mappings in some of the programs that I write.
Joe -
As for free space consolidation, the problem is that you have smooshed all your files to one end, and now you need to append some data to one of them. The appended data can't physically go next to the rest of the file, so it must go into the free space as a fragment. If the files are distributed around the disk, you avoid this situation. That's the tactic linux/unix takes, and it's why you practically never have to defrag them.
Consolidating things like program files might help startup times a little, but for data files it can be very bad. Once you do it once, you have to keep doing it. -
You are mostly correct in your assertion that packing a disk will cause fragmentation. The rule would be that a file may become fragmented only if it is (1) written to, (2) enlarged, and (3) enlarged past the end of its last allocation unit. For example, if the disk's allocation unit size is 4096 bytes and the file size is 8000 bytes, it occupies two units (8192 bytes), so it would have to grow by at least 193 bytes before it might become fragmented. I say "might become fragmented" because it is possible that (a) the next disk unit (or the next n units) is, or has become, available,
or (b) the original file is deleted and rewritten, possibly somewhere else.
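The 193-byte figure is straight cluster arithmetic; a quick Python sketch of the calculation:

import math

def min_growth_to_new_cluster(file_size, cluster_size=4096):
    # Smallest growth that spills past the file's last allocation unit,
    # i.e. the point at which the file *might* gain a fragment.
    allocated = math.ceil(file_size / cluster_size) * cluster_size
    return allocated - file_size + 1

print(min_growth_to_new_cluster(8000))   # 193, matching the example above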
When packing with the smart placement strategy, the defragger takes into account (a) files that do not change in size and (b) files that are rarely used, thus minimizing the chance of future fragmentation while still providing the benefits of free space compaction.
This strategy is not for everyone; however, it can pay big dividends. For example, I have very few files that actually change in size, but I commonly allocate huge amounts of disk space on a temporary basis that I either need very fast access to or that must be made of contiguous disk space. For me, packing the free space works very well, and at the same time defrags finish very quickly, since very few files need defragging.
Joe [going to get coffee] -
Interesting read, thanks for the input. So in a nutshell, what would be your recommendation then?
-
To help you decide, remember: defragmenting your disk just defrags the files. Packing the disk (AKA free space consolidation) moves the "movable" files (not the system files that cannot, or should not, be moved) to the beginning of the disk, leaving the unused portion of the disk available with minimum fragmentation. An intelligent packer will place files that never change first, then files that rarely change, then files that rarely change in size, and finally files that commonly change in size. This strategy provides maximum system performance and the maximum unfragmented free space. It also means there is a good chance that new files, and files that grow, will get fragmented; you run that risk regardless, but the odds are higher when you pack the disk.
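In other words, an intelligent packer sorts by how volatile a file is. A toy sketch of that ordering (made-up file names and scores, just to illustrate the idea, not how any particular product actually ranks files):

files = [
    ("app.dll",   0),   # never changes -> front of the disk
    ("photo.jpg", 1),   # rarely touched
    ("notes.txt", 2),   # rarely grows
    ("mail.pst",  3),   # grows constantly -> next to the free space
]
layout = sorted(files, key=lambda f: f[1])
print([name for name, _ in layout])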
If maximum file transfer throughput is what you want, and you do not mind re-packing the disk every so often (this can take as little as a few minutes), this is the way to go. FWIW, I do the defrag/pack operation once a month, at night, and for me it takes about 5 minutes or less. Your mileage may vary significantly. I use PerfectDisk; it can also defrag the system files, like the pagefile.
If overall disk performance is not an issue, then stick with the Windows defrag, or skip it altogether. I will say that a seriously fragmented disk will seriously affect performance; that fact cannot be disputed. Also, remember that the disk is the slowest component in your computer, i.e. the bottleneck for system performance. For example, if you transfer a large file over your gigabit (1000 Mbps) network connection, you may only get about 340 Mbps, because that's about as fast as the disk will run; it is the bottleneck holding back the throughput. Fragmented free space will slow that down even more.
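That bottleneck point is just min() arithmetic; a trivial illustration using the example numbers above (the 340 Mbps is a ballpark figure for the drive, not a measurement):

def effective_throughput(link_mbps, disk_mbps):
    # A transfer runs only as fast as its slowest component.
    return min(link_mbps, disk_mbps)

print(effective_throughput(1000, 340))   # -> 340: the disk caps the rate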
Joe