Any estimates?
4TB 2.5" portable hard drives are now so common in the consumer market that they sell for approximately 120 US dollars.
But when will 8TB portables arrive? Any estimates?
Does anyone remember when the first 2TB portable hard drive launched? And when the first 4TB portable did?
Portable = 2.5"
-
Honestly, I believe there will be 8TB SSDs before there are any 8TB 2.5-inch hard drives. I wouldn't buy one even if it existed, as I would be very concerned about the reliability. It's already been shown that 3.5-inch drives over 3TB are less reliable than smaller capacities. -
I've not come across any data for 2.5 inch drives though. Generally your best bet is to go for the higher quality manufacturers. Western Digital, Toshiba and HGST for example. -
From what I've heard (though I don't have data for this), 2.5" HDDs are supposedly a bit less reliable than 3.5" HDDs. Whether or not it's the form factor or the typical usage (laptops), I wouldn't know.
But IIRC, (3.5") HDD failure rates have been in the ~3%-or-better region for a while now. -
Same here. The only drives I've had fail on me ever were a used (not refurb, just used) 3.5" HDD and the eMMC in my 2-in-1 computer (well, it's failing, and I have since stopped using it). Otherwise, I still have a bunch of 2.5" HDDs, a few 3.5" HDDs (including an IDE drive), and a few SSDs (including one from 2011) that still work.
-
Starlight5:
I would also love an 8TB 2.5" drive, or maybe 4TB-5TB in a 9.5mm form factor instead of the current 15mm. On a side note, 3.5" drives are actually much more fragile than 2.5" - just compare their shock tolerances. While mechanical fragility doesn't necessarily reflect reliability, it is something to be wary of.
-
Of course, there are higher-capacity SSDs; 4TB is available, so I could go to 8TB (and I guess 10-12TB counting the M.2s). But $1300 or so for 4TB is just plain a lot more money than the rotating rust. -
But as for mobility of that data, I don't agree -- think of photographers or videographers who need fast access to their libraries in the field. Cloud storage simply isn't a good option unless there's fast and reliable access everywhere. -
Some companies, incidentally, run internal clouds to host various services, because it often provides more flexibility for deployment.
But you really do have to know what you're doing with consumer cloud services. Look at CrashPlan discontinuing its personal backup service, for example. -
Oh no, I know the pros and cons of it, and for some things I'm fine with it (I also use cloud email, though I keep local copies). I just wouldn't heavily depend on cloud storage, since it can basically go away at any moment (through closures, policy changes, etc.) and because of the possibility of slow or no internet access (as you pointed out already). As far as I use cloud storage, it's simply another backup/sync point for me; it'll never be primary storage, either at home or on the go.
CrashPlan shutting down their non-business plans was a massive blow to the people on r/DataHoarders :/. And even their business plans are clamping down. I'm still looking for a decent online backup service because of that. -
Starlight5:
I personally have a NextCloud server running on an RPi3, for backing up and syncing the most important data between my smartphone and laptops.
Next, I have a WiGig docking station with 8TB of total storage, accessible wirelessly across the room where I usually work. If I need some of the stored data on the go, I prefer to copy it to the laptop SSD rather than take one of the portable drives with me, unless that's completely unavoidable. It's fast, and it avoids the annoyance of connecting and disconnecting external HDDs or a wired docking station numerous times a day on a convertible laptop that doubles as a tablet and thus is not by any means stationary. -
I currently have a Synology NAS that I can access outside my LAN (I just set it up recently, so nothing much is going on with it yet), a home-built NAS (that I eventually plan to put a VM on for external access), Gmail for mail, and Amazon Photos for photo backup. Eventually I'll get around to finding a service that I can use for backups (Backblaze or Amazon Glacier, maybe?). But that's about it for me. No cloud music, video, storage, etc. for me.
-
Starlight5:
Regarding NAS - three things about it have always concerned me:
1. Latency
2. Price (especially if low latency is critical)
3. Size&weight (I often move, so generally avoid heavy electronics that occupy unreasonable space in favor of compact stuff) -
Eh, I've moved fairly often recently, and the weight of a NAS is nothing compared to the weight of other typical stuff (moving my dining room table was a b***).
But yeah, a home NAS really depends on what sort of internet connection you have. I built mine when I had a 60/5 connection (wasn't doing anything outside my LAN then) and now I have 1000/1000. The prices for prebuilt NASes are pretty high (though you're paying for the software and the "I don't want to deal with it" factor), but a DIY NAS is not all that expensive compared to a normal desktop (depending on how many drives you put in). Either way, it's higher up-front costs but no monthly fees, so figuring out whether it makes sense for your use case comes down to finding the break-even point and deciding how long you want the storage. -
If you need increased local (non-cloud) storage capacity, try DiskZIP.
I've been using it for a couple of years now with no problems at all.
It accelerates disk read speeds, even on SSD hardware, and especially on HDDs.
Underlying this is compression, which reduces the amount of physical data to be read.
Your mileage will vary, but all program files are highly compressible - so they guarantee about 5 GB free space on a brand-new PC, just from Windows alone.
That nicely matches the free storage tiers offered by all cloud service providers, and that's the minimum space savings you'll be enjoying.
Anything on top of that depends on the types of data you store on your disks, but it doesn't hurt, and all of your data stays local, away from the potentially prying eyes of any service provider. -
How well this works will depend very heavily on your mix of storage. If you're storing typical image files, videos, audio, and the like, you won't get much if any compression, since those files are already compressed and it takes rather more work to compress them further. Executables and text files will compress much more effectively. So maybe you'll get your guaranteed 5GB but not much more.
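You can see that asymmetry with nothing more than Python's zlib module. A minimal sketch (the payloads are illustrative stand-ins, and this says nothing about DiskZIP's particular algorithm):

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """Original size divided by zlib-compressed size."""
    return len(data) / len(zlib.compress(data, 6))

# Repetitive, text-like data stands in for source code and program files.
text_like = b"if (error) { log(error); return -1; }\n" * 20_000

# Random bytes stand in for already-compressed media (JPEG, MP3, MP4, ...).
media_like = os.urandom(len(text_like))

print(f"text-like:  {ratio(text_like):.1f}:1")   # typically far better than 5:1
print(f"media-like: {ratio(media_like):.2f}:1")  # ~1:1, sometimes slightly worse
```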
In the case of third party compression (or encryption) software, you also need to worry about long-term access to the data -- OS changes that break the compression software, or the vendor going out of business or stopping support, might leave your data inaccessible.
Finally, there are compression algorithms (not great ones, but they'll work fine on text or executable code) that can run at SATA SSD speed. How well they'll work if you have NVMe/PCIe storage is another question you need to look at; they may not keep up and thereby slow your system down some. Probably not critically, but that NVMe SSD might not be all it was advertised as if you have data compression in front of it. -
Also, your last paragraph implies some sort of encryption, and compression != encryption. -
To clarify, my last paragraph does not refer to encryption at all. When you upload something to the cloud, it's potentially in the hands of hackers, criminals, thieves, and ill-intentioned corporations. Think of the latest Facebook scandal - need I say more?
If you're running short of space, the last thing you need to do is upload to the cloud. You can boost your local storage, have uninterrupted access to your data, and rest assured that your data remains yours - safe - from prying eyes and hands and ears... -
Compression doesn't improve the performance of your storage drives; it merely allows you to store more data (a little or a lot more, depending on what data you're storing), but at the cost of an increased load on the CPU (because you need to compress and decompress the files you're using). So actually, you're running a bit (or a lot, again depending on the situation) slower when you're using compression, not faster. In some cases it can *appear* that your storage I/O is performing better, for example when decompressing a highly-compressed file (since you're not transferring as many bytes from the drive as you would with that data stored uncompressed), but again at the cost of CPU performance being used up.
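To make the CPU cost concrete, here's a minimal sketch using Python's zlib as a stand-in for whatever algorithm a given product actually uses: stronger settings buy a better ratio at the price of more CPU time.

```python
import time
import zlib

# A repetitive, highly compressible payload (~2 MB); real data will differ.
payload = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 40_000

for level in (1, 6, 9):
    start = time.process_time()               # CPU seconds, not wall-clock time
    compressed = zlib.compress(payload, level)
    cpu_seconds = time.process_time() - start
    print(f"level {level}: ratio {len(payload) / len(compressed):.2f}:1, "
          f"CPU time {cpu_seconds * 1000:.0f} ms")
```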
Again, compression isn't a security tool. Your local, compressed data is just as secure as your local, uncompressed data. If you, for example, manage to get spyware installed on your computer, you're just as boned either way. So, assuming security of your data is a concern, you need to look towards encryption as a solution (you can use both compression and encryption, by the way; they aren't mutually exclusive). As for cloud solutions, you always have the option to encrypt your data stored online, be it with the cloud provider's encryption tools (assuming you trust them) or your own tools before upload. -
I'll give you some real-world examples from my production build system. The underlying hardware is 3x 2TB Samsung 960 Pro `gumstick` SSDs in a RAID 0 configuration.
Let's start with the boot drive. This drive has tons of applications (compilers, databases, office and design apps), games (many A-list titles such as GTA5, Assassin's Creed, X-Plane, Prepar3D), and gigs upon gigs of data installed. In short, it's a line-of-business partition, with many games added on top.
Boot drive uncompressed capacity: 2,310.37 GB
Boot drive compressed capacity: 3.38 TB
Boot drive compression ratio: 1.5:1
Then, the VM partition. The compression ratio here is an astounding 3.1:1!
Uncompressed size: 1,725 GB
Compressed size: 555 GB
Of course, VMs are always going to compress remarkably well, since each VM contains almost the same files for the Windows operating system.
Finally, I've got the backup partition, which contains mostly incompressible data such as videos and music, with a sprinkling of application installers used for reinstalls.
The compression ratio is still better than I'd expect at 1.2:1.
Uncompressed size: 3,388 GB
Compressed size: 2,771 GB
I leave it for you to decide whether having all this extra free space - plus disk read speed acceleration - is a `bad` thing. -
Did you check the benchmarks on the DiskZIP site? They are in line with my findings locally. In short, the time it takes for your disk to deliver compact data and then your CPU to decompress that data is less than the time it takes to read uncompressed data:
Decompression Time(CPU) + Read Time(Compact Data) < Read Time (Uncompressed Data)
This is especially true for HDD and SATA connected SSDs, but it even applies to NVMe SSDs, at the fastest compression settings. Of course, for HDDs, you're going to see the most benefit at the strongest compression settings.
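For what it's worth, the arithmetic behind that inequality is easy to sanity-check. A sketch with assumed numbers (none of these figures come from DiskZIP's benchmarks):

```python
# Assumptions: a 150 MB/s HDD, a 1.5:1 compression ratio, and a CPU that
# decompresses at 800 MB/s of output.
data_mb     = 1_000   # logical data to read, in MB
disk_mb_s   = 150     # sequential read speed of the drive
ratio       = 1.5     # compression ratio
decomp_mb_s = 800     # decompression throughput (output MB/s)

uncompressed_read = data_mb / disk_mb_s                                  # ~6.7 s
compressed_read = (data_mb / ratio) / disk_mb_s + data_mb / decomp_mb_s  # ~5.7 s
print(f"uncompressed: {uncompressed_read:.1f} s, compressed: {compressed_read:.1f} s")

# Swap in a 3,500 MB/s NVMe drive and the uncompressed read wins easily,
# which is exactly the NVMe caveat raised earlier in the thread.
```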
Thank you for bringing up the malware example. While I realize compression is not a security tool, I've been saved from more than one malware infection using DiskZIP's built-in System Refresh functionality. It offers one-click reset to the last time you had compressed your disk. Your data of that time is always impregnable in the compressed disk file.
I also thought encryption and compression cannot be combined (at least, you cannot compress encrypted data and expect any space savings). So I fully agree that compression is not a security play on its own. However, it helps me store local and skip the cloud - which is the security I'm specifically referring to here (in addition to the one-layer malware protection). -
Of course you don't have full control over data online; those thinking an online storage solution or a social media website are private are laughably ignorant. However, you do have control over who can read your data if you encrypt it before uploading it and keep the keys to yourself. That way, even if the data is taken somehow, it will still be safe, assuming you're using a current encryption algorithm with a decent key length (AES-256 would take approximately 3x10^51 years to brute force). Really, the only thing to worry about wrt online storage is availability; for example, the service could shut down in the future. As for Facebook et al., they only have whatever information you choose to provide them (including your friend network and their information on you).
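Encrypt-before-upload is only a few lines in practice. A minimal sketch using the third-party Python `cryptography` package (pip install cryptography); Fernet uses AES-128 under the hood rather than AES-256, but the principle is the same, and the file names are hypothetical:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # keep this local; never upload it
cipher = Fernet(key)

with open("backup.tar", "rb") as f:      # hypothetical local archive
    ciphertext = cipher.encrypt(f.read())

with open("backup.tar.enc", "wb") as f:  # this is what goes to the cloud
    f.write(ciphertext)

# Later, on any machine that holds the key:
# plaintext = Fernet(key).decrypt(ciphertext)
```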
Anyway, as for the benchmarks, I would take with a grain of salt any benchmarks of a product that are provided by the people trying to convince customers to use said product. That said, you didn't provide CPU performance with your benchmarks.
My whole point in the previous post is that the storage gain from using compression isn't free. Whether or not the CPU performance penalty is worth it to you is your own judgement call. I was merely trying to correct some incorrect implications. -
Even 30 years ago, when we had single-core CPUs running at a meager tens of MHz, disk compression sped up systems - for example, Windows 3.1 booted in 21 seconds instead of 23, due to the same formula I've shared above. And the formula still holds.
Raw CPU performance is of course unaffected by disk compression, since it's got nothing to do with disk I/O whatsoever.
Could you clarify what kind of CPU performance you have in mind exactly?
You might be far better at testing this stuff than me, so if you don't have faith in my real-world reporting, I'd definitely encourage you to run your own benchmarks with DiskZIP and share them with us here.
I'd definitely be very interested in some third-party formal testing results, not least because I'm keeping an open mind! -
FYI - the author of DiskZIP posts in this thread at Encode.ru, and elsewhere; in his words:
https://encode.ru/threads/2763-DiskZIP?p=52780&viewfull=1#post52780
https://encode.ru/threads/2763-DiskZIP
Posted 30th May 2017 by user diskzip (Member, joined May 2017, Australia):
"As the developer of DiskZIP, I would like to personally present the product in this forum, and ask for product feedback.
The product is commercial, I hope this is not a problem.
It has two components.
File Compression: Plug-in based. Currently ships with two plug-ins. One is based on the 7-Zip stack (version 16.04) with custom handling of RAR archives for improved UNRAR handling compared to stock 7-Zip. Another is based on the Bricolsoft ZIP stack for WinZip compatible ZIPX (de)compression, including JPEG (de)compression. Ticket to fame is the shell namespace extension, which to my knowledge is the only stable implementation for Windows. Actions such as copy/paste, drag/drop, double-click seamlessly compress, extract, view, and even optionally update archives. Advanced actions such as Check-Out are available directly from the shell namespace extension as well. Of course, full right-click integration in File Explorer is also provided. Tasks are started in parallel when underlying SSDs are detected for peak performance. Outlook Add-Ins and Preview Handlers are also available in the base package. Stand-alone archive apps are also present, but these would not stand out in any way.
Disk Compression: Patent-pending. Online and offline modes available. Offline mode includes built in dedup, malware protection (system state as it was last compressed is impregnable to malware, reducing attack surface to changes made since the last compression pass only), and is shown to increase disk read speeds, even on already very fast SSD hardware. An additional benefit discovered after product went to market is disk lifespan increase - in a customer test on identical hardware/software setup, the DiskZIP compressed disk was used 20 times less over a 30 day testing period, according to S.M.A.R.T. results. Offline disk compression can compress and recompress disks on-the-fly, without requiring any external storage. For 100% data protection in the event of unexpected power loss or hardware errors, a Backup Disk may be used. Compressed disks may also be uncompressed as long as space is sufficient. The offline mode can also be used as a generic disk imaging solution, including applications for cloning PC's from one set of hardware to another.
While the software is commercial, gamification offers discount coupons up to 70% - reducing the software cost very substantially. When all benefits are included, the software is a terrific bargain, even without any gamification discounts.
A free trial is available at www.diskzip.com. The trial never expires, is completely unlimited for file compression, but has some limitations for disk compression (one-time offline compression only, and for online compression, ends processing after 1 GB free space is created - although repeatable indefinitely).
I would appreciate feedback on the product, most particularly with the shell namespace extension based file compressor, and the offline disk compressor.
Looking forward to it!
Thank you."
There are many more posts by diskzip, current through Jan 28, 2018, in that thread, and additional discussions in other threads; here's one:
https://encode.ru/threads/130-PowerArchiver?p=52811&viewfull=1#post52811 -
When you run compression software on your computer, the code in said software is executed by the CPU. The CPU not only runs the software itself, but also performs mathematical calculations on the data you are compressing in order to transform it into compressed data (and vice versa). The basic chain looks something like this:
Read raw data from storage > RAM > CPU reads raw data > CPU outputs results > RAM > Results are stored in storage
It's a rather simplified explanation, though if you'd like to dive into the details of how compression works, the Wikipedia articles on various popular compression algorithms are fairly readable to non-CS people, I think(?). One notable thing to take home is that different algorithms make different trade-offs in terms of compression ratio and CPU performance / algorithm speed; RAM is also affected, which is something I forgot to mention (it's especially important if you fiddle with dictionary sizes).
There are some situations where compression can happen without involving other computer hardware like the CPU. Some hard drives and SSDs have built-in compression that they handle themselves (you’re not aware of it as a user, and software you run has no control over it). In addition, the majority of media that you own is already fairly compressed (JPEG, MP3, MP4, etc).
There's also the issue of lossy and lossless compression (the formats I mentioned previously are lossy). Lossless compression does not change the quality of the data you're compressing; lossy compression, however, will degrade the quality. A fun little game to see this in action is to repeatedly compress an image in a lossy format like JPEG until it becomes a blurred mess.
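If anyone wants to play that game, it takes only a few lines with the third-party Pillow library (pip install Pillow); "input.jpg" is a hypothetical file on disk:

```python
import io
from PIL import Image

img = Image.open("input.jpg").convert("RGB")
for _ in range(50):
    buffer = io.BytesIO()
    img.save(buffer, format="JPEG", quality=60)  # lossy re-encode on every pass
    buffer.seek(0)
    img = Image.open(buffer).convert("RGB")

img.save("generation_50.jpg")  # visibly blurrier and blockier than input.jpg
```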
—————-
My two cents on compression are as follows:
For the average user, it's probably not worth the hassle. Most data being used is either not very compressible (as seen in your benchmarks) or is already compressed.
Compression is especially useful if most of the data is highly compressible and there is enough of it to vastly outweigh the cons of compression: a large bank of text files, source code for multiple programming projects, executables, etc. For example, I used to play a game called World of Tanks and would store the replay files for my matches. A weird quirk of the game is that replay files only worked with the executable for the game version they were created with (9.0 replays only worked with the 9.0 client, for example), so I had to keep each version of the game each time a new version came out. By the time I had quit, I had ended up with about 250GB-300GB of clients/replays. I ended up compressing this data, choosing the lrzip program (optimized for large files and large RAM capacity) with the bzip2 algorithm and a large dictionary size (forgot how much at the moment), and ended up with a total of ~120-150GB of compressed data. This took the better part of a day to run on my computer, pegged my CPU at 100%, and used up 13GB out of the 16GB of RAM I had. -
The official DiskZIP benchmarks - and my personal experience - show that the storage subsystem runs measurably faster with disk compression enabled than without. DiskZIP works exactly as designed in this regard, accelerating disk read speeds.
You've referenced "CPU benchmarks" which doesn't make sense, as a pure CPU benchmark (say wPrime) would be 100% unaffected by disk compression. I've asked you to clarify what other kind of CPU benchmark you've had in mind, to which you responded with the platitudes above.
What would be helpful to other users (and myself) is if you actually dug in and compiled the CPU benchmarks you actually have in mind (if any) and post the results here.
Referencing completely unrelated *file* compression with lrzip just doesn't serve any purpose other than to obfuscate the conversation. -
But to try to make it a bit more clear: the performance properties of your storage drive haven't changed. It still performs as its spec sheet says, give or take. For example, if your sequential reads/writes are 500MB/s and 500MB/s, using compression to store the data doesn't suddenly make your drive's sequential writes faster than 500MB/s (and vice versa).
You are correct in that a CPU benchmark, running alone, would not be affected by how data is stored on the disk. However, I invite you to run wPrime (or whatever your favorite tester is) when you're not decompressing from disk / compressing to disk, and compare that to a CPU benchmark that is run while you are performing said decompression/compression. I'd be very surprised if you do not see a difference between the two benchmarks. Unfortunately, I'm not able to run anything at the moment since I'm at work and posting from mobile.
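A rough sketch of that experiment in Python (not wPrime, but the same idea): time a CPU-bound task alone, then again while a compression loop runs in another thread. zlib releases the GIL while compressing, so the two genuinely compete for cores; on a many-core machine sitting idle, the gap may be small.

```python
import os
import threading
import time
import zlib

def cpu_task() -> float:
    # Stand-in "benchmark": a few million square roots.
    return sum(i ** 0.5 for i in range(5_000_000))

def compress_loop(stop: threading.Event) -> None:
    blob = os.urandom(8 * 1024 * 1024)   # 8 MB of incompressible data
    while not stop.is_set():
        zlib.compress(blob, 6)

t0 = time.perf_counter()
cpu_task()
baseline = time.perf_counter() - t0

stop = threading.Event()
worker = threading.Thread(target=compress_loop, args=(stop,))
worker.start()
t0 = time.perf_counter()
cpu_task()
loaded = time.perf_counter() - t0
stop.set()
worker.join()

print(f"baseline: {baseline:.2f} s, under compression load: {loaded:.2f} s")
```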
I'm not sure what you mean by the lrzip comment. All compression that you are performing on your computer is file compression (which is what DiskZIP is doing, by the way), unless you're talking about filesystem compression (such as what is built into NTFS). I simply gave an example of where I found compression useful and shared my experiences with the given software/algorithms. Generally I don't compress, since most of my data is already compressed and it's simply easier (for me) to just buy another drive or upgrade existing drives. -
Anyway, reading through what little there is to read on DiskZIP's website, and the previous posts, I think you're being sucked a bit too deep into the marketing.
That said, I'm curious why they're asking for an email address and claiming this to be a malware prevention solution... -
Your claim that it does file compression just proves that you already presume to know what it is, without actually having that knowledge. What's your rush?
As for your `revised` benchmark methodology: I would be very interested to find out about the results, because unlike you, I try and keep an open mind.
When you are away from `mobile`, make sure to run the same worker payload on the same device for accurate results.
For example, if you're copying a file in the background while running wPrime in the foreground, ensure to copy the exact same file. -
"Your email is optional and helps us serve you better."Last edited: Mar 28, 2018 -
Well, if you'd like to explain how the software works, I'm all ears. You are correct in that I'm assuming what the tool does, since the website for it is shockingly lacking in detail and I have a decent idea of how compression schemes work (it being part of my coursework back in uni...). However, from what I can gather from the posts here, it is not 1) a filesystem with built-in compression (please correct me if I'm wrong, but it appears you're running on top of NTFS, the default Windows filesystem) and it is not 2) disk-level compression (which would be transparent to you and the OS). That leaves file-level compression.
And I absolutely will run benchmarks once I return home, though to be up front I’m not paying money for software which I can get freely elsewhere.
It’s currently 2:22pm eastern. You’re going to have to wait for a few hours.
——————
As I've stated previously, compression is beneficial in some cases and not so beneficial in others. It's yet another tool that one can use to achieve whatever goals they are going after, assuming said tool helps with those goals.
The only issue I have is that it seems like you’re trying a hard sell on a particular, paid compression tool. Call me cynical, but my impression at this point is that you might be a company representative trying to advertise it here (and other threads where you’ve posted). -
This is certainly #2 - disk compression - transparent to the user and the OS.
Since I am not the author, I am not privy to implementation details. Possibly when their patent has been issued, you can look it up in detail at the USPTO.
Their trial will compress your disk at the fastest compression setting without payment. That should be sufficient for benchmarking. You can investigate more upon running the trial. -
Not sure how I feel about compressing my disk with a trial version, using non-optimized settings without being able to tune it. Given that his posts get very specific about the particular options he prefers in other products, I'd buy it right from the start so I had the best shot at evaluating it.
At one point I was doing that with lots of disk compression software, in the mood so to speak, and in that situation I would purchase a few and evaluate them to pick the best.
BTW, Stacker was the compression software/hardware combo; the diskzip user's posts mentioned Stacker, and it came back to me. -
Hmm, so if that’s the case, then it sounds like to me that it’s either using the (free and bundled with the OS) NTFS compressor or is using something else to duplicate that functionality. *Maybe* it’s using a different algorithm than what NTFS uses (from what I understand, a modified LZ77)?
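Incidentally, whatever DiskZIP is doing internally, you can at least check from Python whether plain NTFS compression is applied to a given file (Windows-only sketch; the path is just an example):

```python
import os
import stat

def is_ntfs_compressed(path: str) -> bool:
    # NTFS exposes compression through the file attributes that
    # os.stat() surfaces on Windows as st_file_attributes.
    attrs = os.stat(path).st_file_attributes
    return bool(attrs & stat.FILE_ATTRIBUTE_COMPRESSED)

print(is_ntfs_compressed(r"C:\Windows\System32\notepad.exe"))
```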
-
Well, currently running a decompression on a 7zip archive of CSV files I have (27GB compressed, 185GB decompressed) using the bog-standard settings in the 7zip software. Forgot that Passmark is paid software, so I just grabbed something decent enough for the task at hand (Novabench, since it contains a CPU benchmark).
Posted screen captures of my results so far here: https://imgur.com/gallery/mQviV
Will update with a compression test once this has finished decompressing and I run those tests.
EDIT: Compression tests: https://imgur.com/gallery/C3HYZ
Note that the storage drive scores are not accurate, since the benchmark is measuring my NVMe drive and I have the files going to/from the HDD on this computer. Since I'm testing the effects of compressing/decompressing things on the other parts of the system, I think this is okay.
If for whatever reason the above link doesn't work, I'll summarize below (with updates for compressing once I do those tests):
- CPU utilization during the decompression hovered between 15% and 25% on an i7-7700HQ in Windows' "Best Performance" energy setting, no overclocks, and nothing else running in the background other than 7zip and the Task Manager.
- RAM usage increases somewhat compared to rest, sitting at about 0.1GB-0.5GB extra RAM used for 7zip's default settings.
- Interestingly enough, the GPU was also affected, though I suspect only indirectly as a consequence of the CPU being affected. Not something I was testing, but an interesting note nevertheless.
- At rest, my CPU utilization hovers around 1%-10%, averaging somewhere in between. RAM usage hovers at around 5GB (out of 16GB).
- There is a dip in performance in the CPU benchmark by 15 points. RAM benchmark suffered slightly as well.
- CPU utilization during the compression was far higher than either resting or during decompression, sitting at around 90% usage.
- For reference, I've included the hardware utilization just after quitting Cities: Skylines (CPU intensive game) in a moderately-sized city. It was sitting at about 65% usage, higher than decompression but lower than compression.
- RAM usage stayed roughly similar to resting usage, at about 4.4GB-4.5GB (resting being at around 4.4GB just before this run).
- The CPU benchmark result comes in at about 110 points lower than resting.
--------------------------------
Whether we're compressing a whole partition or a collection of files, there will be a performance hit on the system. When setting up whole-partition compression, the entire partition needs to be compressed at that time, and data is then compressed and decompressed as it is added to and read from the partition. In contrast, compressing/decompressing by file only happens whenever you explicitly perform those actions on that file, and will not run automatically like a partition-level compression scheme would.
I would hypothesize that I'd see similar levels of CPU activity if I were to set up partition-level compression on this computer, as well as whenever I add/view files on said partition, assuming a compression algorithm similar to 7zip's default settings. Conversely, I'd expect to see a drop in CPU utilization back to something like my resting usage whenever I'm not touching the files on a system with partition-level compression, or if I remove that partition-level compression and return to a more normal storage setup. -
You have been claiming that DiskZIP's patent-pending acceleration has the exact opposite effect, namely that it slows down a PC.
You did not test for that at all. I'll help clarify exactly what you need to do:
1) First run whatever CPU benchmark you have in mind.
2) Next, install DiskZIP. Yes, you didn't even install DiskZIP - I have no idea what you are testing.
3) Run DiskZIP Offline, and compress your boot disk. Just to be clear, this happens offline and not on-the-fly or on a per-file basis as you have been assuming throughout this conversation.
4) Then run the exact same benchmark you ran in step #1 above in fully identical testing conditions.
Anything else is completely unrelated and does not help anything other than further obfuscate the `conversation`. -
Oh boy, alright. I tested my claims of “running compression will affect CPU performance” just fine. I grabbed a benchmark, I grabbed a compression tool, and I ran them. The results were my benchmark scores for the CPU were lower when running compression/decompression than when I wasn’t running them, plain and simple. I even showed CPU usage in the task manager to again hammer that point home.
My claim was that because CPU resources were required to run compression, it would affect CPU performance. I showed that just fine by running the same test under the same conditions, with the only variable being if I was running a compression/decompression.
Sorry that you have a problem with that. If you feel my testing did not accurately show CPU performance during compression, perhaps you can run your own test to show that I'm wrong?
——-
Just an FYI, but your proposed testing procedure doesn't have a control to compare against, so how do you propose verifying the CPU performance claim? You need to add a "benchmark without any compression running, before installing the compression software" step. -
With so many cores/threads, and much higher-clocked CPUs than when I did full-disk compression regularly, I find it hard to believe that CPU usage is anything but completely inconsequential.
Full compression of large files is going to take "time" on top of the raw throughput drained from the device, but the I/O is so incredibly slow compared to decompression that, for the most part, it takes less "wall time" on average to run a compressed volume than an uncompressed one.
I don't think CPU usage is going to be a big drawback in current technology high performance computers.
Right now I don't want to take the time to get "involved", given other things going on and not having any particular need for compressing files to recover space.
If you are at the limits of storage, and have a modern CPU with 4c/8t or more, and have a good backup regime in place so you can quickly restore and not lose data, I'd give it a shot - although I'd spend some time looking at the whole current market of available solutions before jumping into the deep end and doing a complete disk, partition, system compression project.
It's all for fun, right? -
Thank you hmscott, that’s the whole point I’m trying to drive home: the act of compressing data is not a “free” operation, there are performance implications.
-
There may in fact be times when the CPU is fully engaged with computing the results being written, so that there is an actual wall-time cost to the compression, which is why hardware-assisted storage controller "compression" is helpful.
This happened long ago and far away with Usenet data: very compressible, but also super high volume when you are a major node. The solution at the time was multiple spindles spreading the categories across volumes, instead of the high overhead of kernel software spanning. The cost of that overhead was high enough to slow down the feeds.
Knowing the application and directing how it spread data across controllers was the common-sense solution; removing the software overhead from a service already at 100% CPU usage solved the resource limitations.
For the most part, user data work on a personal workstation/laptop is exactly the right application for this kind of compression storage stretching, especially if your data is highly compressible, since the CPU utilization cost per volume is likely very low.
It's a good point to be on the lookout for the overhead cost of software resource usage against the utilization benefits. For the most part there are enough CPU cycles free to make it "free" most of the time, just not always. -
The problem is entirely yours when you don't understand even the most basic thing about DiskZIP, and pretend to have proven your point testing software that has nothing to do whatsoever with DiskZIP.
I'd also like to ask you why, despite my having pointed out several times already, you would insist on maintaining your incorrect assumptions about DiskZIP? They are incorrect assumptions, leading to your incorrect conclusions. I understand if you don't want to install the software, but why pretend to understand something you clearly don't? -
Pertaining to your last point (hmscott), I do agree, and I even said so way earlier in the thread. The cost/benefit analysis is different for different use cases, and while compression (or any other computation) is never "free", whether the performance hit is acceptable depends on the user's tolerance for it in exchange for the benefits.
-
So there is certainly the *hit* of time it takes to compress your disk offline, during which time you cannot do anything else on your PC.
However after that when you boot back into Windows, you would have seen that everything runs faster than before, because of this formula:
Decompression Time(CPU) + Read Time(Compact Data) < Read Time (Uncompressed Data)
I'm sure you have a virtual machine or two you can safely experiment on, even if you don't trust the software to begin with. It would have been faster to just try. -
A couple of comments:
1) Perhaps I'm showing how long in the tooth I've gotten, but I feel like I'm back discussing DoubleSpace and Stac's Stacker with my peers. The good ole days... that was back before I used backup software, when I lost entire drives (multiple) due to compression.
2) @msintle, does your virtualization use snapshots? Have you benched/compared times moving back and forth among 30 or 40 different snapshots inside a single VM? There's a lot of disk i/o going on during those operations. Curious if the large load of seeking and decompressing among those snapshot deltas is slowed by the compressed implementation.
3) Also, have you benched / compared when using the DiskZip stuff on an SSD? If the decompression routine will make use of temp space or place that data somewhere else on the drive, it is placing extra wear-n-tear on the SSD.
4) Finally on the SSD front, if the drive is using something like the SandForce (SF) controller, which already has built-in compression, my guess is there is a performance hit as the already-compressed data is compressed again by the SF controller when it is stored. Again, this is a guess, but it would be something to keep an eye on. -
2-3-4) I have a feeling these questions are best suited for the software author. You can probably reach out to him directly at that Russian forum referenced on this thread.
I am just a very fond user of this software. In my experience (and yes, my VM's have snapshots, though not in the tens of snapshots per VM), this software makes my PC faster and bigger - with an elementary line of defence against malware, and a very convenient clone/backup mechanism. Your own mileage may vary of course. I will never stop singing its praise though, maybe out of nostalgia for those olden/golden days, and maybe out of the sheer and time-tested convenience of the thing, which at least in my experience, has been constant.