Compress large files

Compress large files and historical archives on a Linux file server

I have a Samba server on Linux that acts as a file server for the entire network. It holds at least 500 GB of data, so I had to find a way to compress large files.

Keeping everything under control is a real challenge. To do it I rely on a search strategy and a handful of command-line utilities.

Strategy

What strategy do I follow? Let's start by hunting down the largest directories. For this I run the command:

du . --max-depth=1 -h

Start from the most important directory to check, such as the NFS or Samba shares, and keep repeating the command inside the subdirectories that take up the most space.
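To rank the subdirectories by size at a glance, the same du output can be piped into sort. A minimal sketch, assuming your share lives under /srv/samba (a placeholder path, adapt it to your server):

du --max-depth=1 -h /srv/samba | sort -rh | head -n 10

This lists the ten largest entries first, so you immediately know where to descend next.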

If we then locate a set of files of the same type, with an easily identifiable name pattern, we can put gzip compression to work in our favor. Take it easy and keep an eye on the disk and CPU load.

Example: do we store backups of Microsoft Outlook clients? These are sometimes enormous files that sit untouched for years. Then:

find . -type f -name "*.pst" -exec gzip {} \;
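A more cautious variant, just as a sketch: restrict compression to files that really are old and large, and let gzip report what it does. The one-year and 100 MB thresholds below are arbitrary examples, not values from any standard:

find . -type f -name "*.pst" -mtime +365 -size +100M -exec gzip -v {} \;

Keep in mind that gzip replaces each file with a .gz version, so users will need gunzip (or a tool that reads gzip archives) before restoring a backup.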

You might even schedule this command to run regularly for the directories and file types you want to compress.
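For instance, a crontab entry along these lines would run the compression every Sunday at 3 AM; the path /srv/samba/backups is an assumed example, swap in your own shares:

0 3 * * 0 find /srv/samba/backups -type f -name "*.pst" -mtime +365 -exec gzip {} \;

Running it at night keeps the disk and CPU load away from working hours.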
