Would you install your antivirus on the Vista partition? Also, what about Office and other similar programs, or would you place them in another partition?
-
jackluo923 Notebook Virtuoso
I install my OS and all of my programs on one partition, the only partition on the OS drive. I have a separate 500GB HDD for my data.
-
Why would you place them on another partition? If the OS gets hosed, you can't use those programs anyway.
-
jackluo923 Notebook Virtuoso
-
Ok, still confused.
So, my new laptop came with Vista installed and it is in a separate partition. Should I install the antivirus and other programs in this partition or the other? I am going to add another partition for Linux as well.
-
What is it, two partitions of the same size, one for the OS and the other for DATA, right?
Probably an Acer or Toshiba.
I'd install them to the OS partition. -
No, MSI GT725. Vista is installed in a small partition on the 320GB drive. I think it's 40GB, not sure though. I will have to look again. I want to create another partition for Linux Mint.
-
Sredni Vashtar Notebook Evangelist
I always create many partitions:
One for the OS and the more 'integrated' programs, like the antivirus, firewall, browser, text and hex editors, plus an image and PDF viewer and a CD burning utility. That way, should I have problems with the other partitions, I still have the basic tools working.
The other programs (and also the bigger ones, like OpenOffice) go to a "PROG" partition.
OS data (temp, browser cache, email folders, the desktop, and the built-in documents folders, which I use only for temporary storage of working docs) goes on another partition.
Then I have one or more partitions for the data I produce and want to keep: a 'small' "myDOCS" partition that can be copied fast. My letters, my programs, my spreadsheets, my pictures all go there.
Big data (partition images, downloaded freeware, drivers, manuals, audio and video) goes on another partition altogether: that I do not need to back up as often as the vital data.
When something goes haywire in the FS, it's usually one partition that goes to data heaven. I can afford losing one partition better than losing everything. Moreover, it's easier to clone and work on a smaller partition (with a data recovery tool). Even defragging is faster. -
jackluo923 Notebook Virtuoso
If you don't like Linux, then you can install a small, optimized Windows Vista or XP at the end of the drive. When your main OS is unusable, you can use your backup OS. -
The only place I still do multiple partitions like what you're talking about is in servers. -
Sredni Vashtar Notebook Evangelist
And yes, I also have my set of live CDs: knoppix, kanotix, ubuntu...
Recovery is much better done on "small" partitions than on the whole disk. Besides, if one data partition 'dies' I can still use the PC and not lose an hour of work while I think about how to recover it.
And if it's the OS partition, well, who cares? I just restore the latest image (a few hundred MB only, there's no need to occupy half an external disk) and do some minor tweaking. This has happened before and it saved me a lot of time with respect to restoring GB of data... -
Sredni Vashtar Notebook Evangelist
Backing up the OS(es), the OS data, the programs, the big files and all my documents would take an awful lot of time, stress the system, use unnecessary power and waste disk real estate I can use for other purposes. -
A separate partition is a good idea if you plan on regularly wiping your OS installation. Otherwise, it's more work than necessary IMO.
-
jackluo923 Notebook Virtuoso
Anyways... creating dozens of partitions on a single HDD will cripple it by raising the access time and, in some cases, considerably lowering the throughput.
The best option for both speed and reliability is to back up frequently to a different drive and leave the HDD as one big partition.
If you need to experiment with OSes a lot and install different OSes very often, use a virtual machine. You'll save space, processing power, electricity, and other resources.
If the files you want to back up are small, consider online storage, e.g. Live Mesh. It automatically syncs, backs up, and restores your data to the cloud and/or any computer on earth with an internet connection. The process is 100% automatic. -
davepermen Notebook Nobel Laureate
I create no partitions. I have windows home server, and thus no problem restoring my disk when something gets "hosed". not that that happens at all, normally.
all in one partition == never having to think about where to place something, never having to think "oh crap, that partition is too small, i have to move things around or resize a partition", or whatever.
one partition per disk is enough. i normally have one disk per system. if i have more than one, i use raid0 (with ssd's, reliability is a bit different, so it's not that dangerous). -
I have a RAID 1 for my OS and all programs, and I have an external hard drive for all of "my docs" that gets backed up (incremental) offsite each night. Either way, no disk space or power is wasted, and I have full data security. -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
Even if there was a "difference" I don't think it is in any way significant enough to NOT have a separate DATA partition. There are real benefits to having the data separated. I do NOT think it is a good idea to put programs and the OS in different partitions. There is no benefit in doing that at all.
Gary -
davepermen Notebook Nobel Laureate
there is no "if that would happen, then". take care for the worst. it saves the thought about all others.
i don't have to think about performance.
i don't have to think about how big each partition has to be, or if i have to resize as my partition needs changed.
i don't have the false belief in better safety.
partitions are oldschool thinking that you can get rid of. at least in vista, the os handles itself well enough to not need any partitioning. xp wasn't that good (but it's still doable) -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
First, I don't buy the performance issue at all unless someone is running an overly optimized defrag. Come on, who REALLY places all the .DOC files right next to the WORD.EXE file? And unless you ARE doing that, the added overhead for a second partition is of no real consequence. The difference in performance is in milliseconds at best. "Much higher performance"??? The difference between 10 milliseconds and 50 milliseconds is much higher, you are right, it's 500%. But can I measure the difference in how much slower my Word file opens with a stopwatch? If not, the difference does not matter. Yes, the cumulative difference could amount to a second or two, but only after I open 50-100 Word files. And will I NOTICE that difference? Nope.
Second, I have a very good backup regimen that utilizes the fact that my data is separate from my OS and apps. I only back up the OS and apps when needed, i.e. when patches are applied or new apps are added. The data gets backed up daily to an external drive, using synchronization software. If the hard drive dies, I swap in a new one, recover the OS/APPS partition from the most recent image and the data partition from the external drive. Worst case scenario covered. Recovery of the OS by itself takes 18-20 minutes tops. As a developer I guard my OS/apps like precious jewels. If I am about to install some piece of software I don't particularly trust, I make an image of my C: partition first. (I don't have to worry about whether System Restore will be able to undo any damage.) It takes so little time I often do it even before I install software I do trust. If I had everything in one partition, I could not afford the time it would take to do that.
I have had to resize my partitions once in three years, so that is a non-issue, and even then it is a trivial task given the right tools.
Not sure where this came from: "i don't have the false belief in better safety." I never claimed it to be BETTER safety, only more efficient/convenient.
Gary -
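For illustration, a minimal sketch of the kind of one-way data-partition-to-external-drive sync described above, assuming hypothetical folder paths; it is not the actual synchronization software in use, just the idea (copy anything missing or newer, delete nothing):

import shutil
from pathlib import Path

SRC = Path(r"D:\MyDocs")         # data partition (hypothetical path)
DST = Path(r"E:\Backup\MyDocs")  # external drive (hypothetical path)

for src_file in SRC.rglob("*"):
    if not src_file.is_file():
        continue
    dst_file = DST / src_file.relative_to(SRC)
    # copy only files that are missing or newer at the source; never delete
    if not dst_file.exists() or src_file.stat().st_mtime > dst_file.stat().st_mtime:
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)  # copy2 keeps timestamps

Run nightly from a scheduler, this gives the split Gary is after: data mirrored daily, the OS/apps partition imaged only when it actually changes.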
-
davepermen Notebook Nobel Laureate
you don't believe in its safety (neither do i).
it does make you manually manage target folders => it's more hassle.
and for your coding stuff you use versioning anyways, i'd guess.
so, what is the gain of arbitrarily splitting your system into bounds? there is simply none. you may not agree but in your heart you know it's true. -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
You said two partitions decrease performance. I agree that they do, but the difference is unmeasurable in any real-world situation.
I said it is no more or any less safe. Partitions are not about safety, I never claimed that they were. There is no difference at all in terms of safety.
As for managing target folders: yes, I had to manage three target folders, the My Documents folder, the My Pictures folder and the Favorites folder. I had to manage them one time and only one time, when I used Vista's built-in feature for moving them to a different location. After that I never have to give it a second thought. I store 100% of my data in the My Documents folder, period.
Yes, for my code I use a source control library on a server. But even while the code is resident on my machine, it is in a subdirectory of the My Documents folder.
And to the remaining question: "so, what is the gain of arbitrarily splitting your system into bounds? there is simply none. you may not agree but in your heart you know it's true." Sorry, in my heart I know there is a reason. One which, despite my detailing it for you, you seem to have ignored or dismissed. It is one of convenience, making my backup regimen easier and faster, and my restore process, should the need arise, even faster and simpler. With a single partition, how quickly can you make a recoverable backup prior to performing something like, say, installing a service pack? I can do so in 18 minutes.
As I said before, we should probably just agree to disagree on the merits of two partitions. We are not going to convince each other at this point.
Gary -
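For those who would rather script the folder relocation than use the Vista dialog Gary mentions, a hedged, read-only sketch of where those redirects live in the registry; the key path and the "Personal"/"Favorites" value names are my assumptions about the relevant per-user settings, so verify them on your own machine before changing anything:

import os
import winreg

# Per-user shell folder redirects live here (assumed key/value names; verify first).
KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    documents, _type = winreg.QueryValueEx(key, "Personal")   # "Personal" = Documents
    favorites, _type = winreg.QueryValueEx(key, "Favorites")

print("Documents ->", os.path.expandvars(documents))
print("Favorites ->", os.path.expandvars(favorites))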
You use partitions instead of folders? -
-
davepermen Notebook Nobel Laureate
It takes 2-5 minutes here to make that backup. Windows Home Server only backs up the changes. Your convenience and speed argument is moot, then.
-
davepermen Notebook Nobel Laureate
oh, and normally, i don't even need to do that pre-backup, as i do daily backups automatically anyways => if something goes wrong, i already have an up-to-date backup from a few hours, max a day, ago.
-
davepermen Notebook Nobel Laureate
yep, and done so when switching to win7. i copied back my personal data from my backup.
-
ScuderiaConchiglia NBR Vaio Team Curmudgeon
The ONLY reason I ever started using partitions in the first place was to make it so I could use Norton Ghost to create images of my systems (desktop and laptop). Before I started using two partitions, it took a very long time to create or restore an image. As soon as I created two partitions, image creation and restore dropped to 18 minutes. I have always kept my data synced to my desktop machine, so I already had a redundant backup of it and no real need to have it occupy my image files, which were only needed to keep the OS and programs backed up in the first place.
I am not suggesting that this is the solution for everyone, but I am suggesting that the notion that there is NEVER a need to keep data on a separate partition is just plain wrong. For some folks, there are very good reasons for doing so. You may have a different backup scheme, as davepermen does; his utilizes resources I don't have available. And that's great. But to suggest that my scheme is wrong doesn't make any sense to me. It works well for my needs given the hardware available to me (namely a laptop and a desktop, both with DVD burners) and is the fastest way to achieve my ends. It has also worked very well for several of my clients. If you have a better way that is as fast for this hardware/software configuration, I am all ears.
Gary -
davepermen Notebook Nobel Laureate
you just use the wrong backup tools. good backup and restore tools don't take long with a single partition at all.
you say your reasons are "very good". i don't see them as very good.
it is an _okay_ config. but not "very good".
best is to suggest to your clients a solution like winhomeserver. as most clients are not alone at home, they get backups of up to 10 machines, fully automatic or manual (whatever you prefer), and networked (safe-through-auto-duplication) shared folders for the family data they want to share. and a full win2003 server for your own fun (i have sql and svn on it, f.e.).
it's not _that_ expensive. especially as there are often existing hw components (like disks) around to reuse for it.
(you paid for norton ghost, and your external backup disk.. the price difference to a cheap hw config for winhomeserver is not far, and the gain is huge) -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
But you DO have my curiosity whetted. Let's say that the hard drive on your laptop fails. What is the process for restoring laptop functionality using the home server? How do you get the most recent state of the OS and applications back on the laptop?
Gary -
Anyway. I agree that we just need to agree to disagree. I am saying there is no benefit to doing it your way and I think it's true - likewise, there is little benefit to doing it my way. It seems like you have a reason, even if it's not 100% valid, the process is convenient and works for you. My methods work fine for me, and they both accomplish the same thing. Therefore, I think both are valid methods.
Regardless, I don't think the OP of this topic would benefit very much from partitioning his hard drive. We all can probably agree on that, since it's unlikely he is going to be doing anything as advanced in his backup solutions as we are describing. -
davepermen Notebook Nobel Laureate
1) the simple way: boot the client from the home server restore cd, connect to the server, choose your system, restore (that's not always enough on its own, you'll see in 2) why).
2) the a-bit-less-simple way: after you have installed your system successfully and have backed it up the first time, open your backup from the home-server to grab the "Windows Drivers for Restore" folder (or however that's called), and copy it to the usb stick.
then, when restoring, you get the additional option to load additional drivers so you can connect to the net, access your raid, etc. otherwise, the bootdisk may not be able to access your disk / server.
if you have only one system, you have to do this before the system fails. if you have more than one (more likely), you can get those drivers from another system, copy them to disk as needed, and restore.
it's a very simple restore process. i've done it some days ago when i installed my newest ssd in my laptop.
so the best thing: create your amazing-home-server-stick (and i think microsoft should give one with each licence, with some funky branding.. would be cool.. and some plug-it-into-home-server-and-load-all-drivers-onto-it-automatically-feature would rock, too)
maybe in the next homeserver version -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
Daveperman,
I think I followed that. Let me spit it back to make sure:
For some client machines you can use a "Restore from HomeServer Disk". (I assume this is something that comes with HomeServer or that you make ON THE SERVER, not on the client machine.)
For some client machines you need the above AND the contents of a "Windows Drivers for Restore" folder from the server. I assume this is something that is created on the server for each client machine when you back up a client machine for the first time.
Did I get that right?
Gary -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
My test was to image the C: partition using Acronis and then immediately do the same thing with Ghost. I was, and am, at a loss to explain such a huge performance hit. But...
Gary -
davepermen Notebook Nobel Laureate
i'm thinking of creating some usb-auto-driver-update-tool for home-server as a plugin. that would be something useful to program for me.. hrr hrr -
jackluo923 Notebook Virtuoso
When you're accessing programs and data at the same time and they involve a lot of small files (a very typical situation), you can expect your HDD throughput to drop to less than 500KB/s. If you optimize your HDD through 3rd-party software, or even the defrag tool that comes bundled with the OS, accessing those same small files might yield more than 20MB/s throughput. That's about a 4000% performance improvement. With lots of concurrent access to 2 or more partitions on one HDD, you have a very high chance of lagging and the HDD's performance will drop a lot.
Here's why:
I'll describe the worst-case scenario here. You have your OS on the 1st partition, programs on the 2nd partition, data on the 3rd partition, and your page file on the 4th partition. When you open a program, the drive has to access the program partition, then the OS partition, back and forth a couple hundred times, then transfer data to the page file partition a couple hundred times, then save a cache file to your data partition a couple hundred times. All this jumping around can increase your latency by an enormous amount. If your HDD is capable of 100MB/s sequential writes, then with multiple partitions and small files being read/written from different partitions at the same time it will probably crawl along at around 500KB/s, with more than 10x the seek time of an HDD with a well-optimized file system. This is the worst case, so in this case you'd notice at least a couple-thousand-percent performance difference. Luckily, your HDD isn't doing this all the time, so the degradation isn't usually that large, but it's still a lot compared to a non-partitioned HDD. -
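A back-of-the-envelope check of those numbers, with assumed (not measured) seek and latency figures for a typical 7200rpm drive, shows how seek-dominated small-file access collapses throughput:

# Illustrative figures only, not measurements.
seek_ms = 10.0         # assumed average seek between far-apart partitions
rotational_ms = 4.2    # roughly half a rotation at 7200 rpm
transfer_mb_s = 100.0  # sequential throughput the drive is capable of
file_kb = 8.0          # size of a typical small file

per_access_s = (seek_ms + rotational_ms) / 1000 + (file_kb / 1024) / transfer_mb_s
effective_kb_s = file_kb / per_access_s
print(f"effective: {effective_kb_s:.0f} KB/s vs {transfer_mb_s * 1024:.0f} KB/s sequential")
# prints roughly 560 KB/s, i.e. the ~500 KB/s ballpark described above

Whether real workloads ever hit this worst case is exactly what the thread goes on to test.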
davepermen Notebook Nobel Laureate
In my case, I had xp installed like that: c: 10gb, data 60gb, slowstuff 10gb.
i put files of no use into the slowstuff region.
performance loss was noticeable. the snappiness got lost whenever a slowstuff file got accessed (not often, but when it did, it was noticeable).
well.. i've often heard how it enhances performance (especially back in the 98 days). that at least isn't true. and even after sleeping on it, it doesn't really make sense to have data in a different place -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
I just ran some tests to be sure I was not deluding myself. I have a large Word document that is 132 pages long. I put a copy of it on my C: partition and created a shortcut on my desktop pointing to the document. I rebooted Vista and waited for the hard drive to go quiet. I right-clicked on the desktop shortcut and then timed how long it took from the time I clicked Open for the entire 132 pages to open in MS Word. (I watched the "page 1 of 1" change to "1 of 132" in Word to stop the stopwatch.) Total time: 18 seconds.
I then moved the file to the D: partition, updated the desktop shortcut, rebooted Vista, and again allowed the hard drive to go quiet. Right-clicked on the shortcut, and timed how long it took from the time I clicked Open till I saw page 1 of 132 in Word. Total time... (drum roll please)... 18 seconds.
Ok where is my performance loss gentlemen? If there is any (and I do not for one second deny that there probably is actually some loss), it cannot be measured with a stopwatch, as I have said all along.
If there is some better test to show this performance loss I am supposed to be seeing, I am willing to try another test.
Gary -
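A rough way to script this sort of comparison instead of using a stopwatch; the paths are hypothetical, only the raw read is timed (not Word's rendering), and the file cache must be cleared between runs (a reboot, as Gary did) or both reads will come from RAM:

import time

def time_read(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        data = f.read()
    print(f"{path}: {len(data)} bytes in {(time.perf_counter() - start) * 1000:.1f} ms")

time_read(r"C:\test\large.doc")  # copy on the OS partition (hypothetical path)
time_read(r"D:\test\large.doc")  # copy on the data partition (hypothetical path)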
davepermen Notebook Nobel Laureate
do it again with xp and you could see it. vista has word preloaded anyways (and word starts in below 1 sec). so no matter what, all it does is load the document for 18 secs.
(18 secs?!)
to test it really at the extreme: take a 2tb disk, install vista on the first 20gb partition you create, and the doc on the last 1gb partition you create at the other end of the disk. then you should be able to measure a difference.
not that it matters.
but i've heard a lot of "uhh it gets faster as you have short stroking then". yes, it does, as long as the tiny c: drive is the ONLY partition on the disk.
again, 18 seconds?!!?? how big is that file? is it really disk dependent at all? -
ScuderiaConchiglia NBR Vaio Team Curmudgeon
The Word file is 528 KB and has a lot of formatting, comments, etc. I am pretty sure not all of that time is spent reading the file; some of it is also the display formatting. I'll look around for a larger file and run another set of comparisons. Or if you can think of some other test I am willing to try it. I really do want to get to the bottom of this, just to see if there really is a stopwatch-measurable difference.
Gary -
davepermen Notebook Nobel Laureate
if it's 528kb, then it got read with 2 disk accesses max, that would mean 16ms access time maximum on an ordinary disk with 8ms latency at worst.
it would mean 1ms or so on my ssd..(i love them, i really do)
one thing i'm thinking about right now is something that needs tempfiles. like trying to winrar a gb of data or so, so that it reads from d:, goes into the c: temp, and back to d:. i think the windows-integrated zip works like this?
something that triggers a back-and-forth-and-back-again behaviour over big random data (so converting a movie would not be a great example. i think zipping could work, i can even test that right here on the xp machine). -
davepermen Notebook Nobel Laureate
well, zipping a 23mb file on D: took twice as long as it took to zip it on C: (and a quarter as long to copy it over to c: first.. so yes, copy-to-c + zipping-on-c + copy-to-d is about as long as zipping-on-d).
this with the xp-builtin zipper. i thought that one goes to c: all the time for temp?
edit: no, it doesn't. it creates a tempfile on D:. i now see it with a bigger project. anyways, having userdata in the slower disk parts slows down zipping a lot. -
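A hedged sketch of the same zip comparison done programmatically: compress an identical file once per partition and compare wall-clock times. Paths are hypothetical, and results will depend on caching, fragmentation and where the temp file lands:

import time
import zipfile
from pathlib import Path

def zip_in_place(src):
    src = Path(src)
    start = time.perf_counter()
    with zipfile.ZipFile(src.with_suffix(".zip"), "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(src, src.name)
    print(f"{src}: {time.perf_counter() - start:.2f} s")

zip_in_place(r"C:\test\sample.bin")  # copy on the first partition (hypothetical path)
zip_in_place(r"D:\test\sample.bin")  # same file on the second partition (hypothetical path)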
Sredni Vashtar Notebook Evangelist
It happened to me twice (with an old FS, I reckon, like FAT32). The file allocation table was gone, and so was the backup copy that is stored somewhere else in the partition. I dare you to find a folder to recover files from in that condition.
I cloned the partition with dd, moved the image to a Linux partition and mounted it there for a slow, file-by-file recovery. Had I had a single giant partition, I wouldn't have been able to do that because the image file would have been too big.
Moreover, consider a virus that overwrites the first sectors of your hard disk. This would make your whole disk unusable, but only my first partition unusable (I keep a copy of the partition table and MBR on paper).
What if the virus, or a badly written driver (why is it that I am thinking about Roxio now?), destroys the file allocation table of the partition it's in? You'll have to restore everything, and do you really have a backup up to the last second? Hardly. In my case it will destroy the OS partition only. I would not even bother to restore the Programs partition, just the last image of the OS one: three minutes and I'm back to work.
I have to agree that partitioning won't save you from a hardware failure. In fact, on my desktop I keep data partitions on a different hard disk. On a laptop with only one disk, partitioning can help you avoid putting all your eggs in one basket, but in case of an earthquake, you're doomed.
For example: there is a device named RAM that could help a bit in avoiding excessive disk access. Moreover: are you really sure that your disk's head is not travelling farther than mine? Consider that all my programs reside at most 20 GB from where the OS is. What about the programs you install after you have a half-full disk?
(on the Web there is an article by Stallman explaining why the Cloud could lose its silver lining) -
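For reference, a sketch of the dd-based recovery Sredni describes, wrapped in Python subprocess calls only to keep a single language in this write-up; device names, image path and mount point are hypothetical, and it needs to run as root from a live CD with the source device double-checked first:

import os
import subprocess

# clone the damaged partition to an image, skipping unreadable sectors
subprocess.run(
    ["dd", "if=/dev/sda2", "of=/mnt/rescue/sda2.img", "bs=4M", "conv=noerror,sync"],
    check=True,
)

# mount the image read-only and copy files out at leisure
os.makedirs("/mnt/recover", exist_ok=True)
subprocess.run(
    ["mount", "-o", "loop,ro", "/mnt/rescue/sda2.img", "/mnt/recover"],
    check=True,
)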
ScuderiaConchiglia NBR Vaio Team Curmudgeon
Update: I just tried another test on an XP machine. This one has a MUCH slower CPU than my Vista laptop. But I compressed another 77-megabyte file on the C: partition and the D: partition. And guess what, it took exactly the same amount of time on both. Not sure why you are seeing degradation, but in all the years I have been creating separate partitions for data (to support my backup methodology), I have never seen any such degradation. This includes Win98, Win2000, WinXP and now Vista.
As I have said numerous times, I accept the fact that there is probably a difference in speed. What I do not accept is that this difference is perceptible to a user in their day-to-day use of the machine. I am going to try one more test I just thought of. I'll report back in a few minutes.
Ok the last test was to run the compression on a FAT32 partition on the XP box. Again the time to compress the 77 megabyte file was exactly the same.
Gary
Vista partition?
Discussion in 'Windows OS and Software' started by ScubaSteveO, Apr 21, 2009.