Compress large files

Compress large files and historical archives on a Linux file server

I have a Samba server on Linux that acts as a file server for the entire network. It holds at least 500 GB of data, so I had to look for a way to compress large files.

Keeping everything under control is a difficult undertaking. To do it I follow a search strategy and rely on a handful of command-line utilities.

Strategy

What strategy do I follow? Let's start by hunting for the largest directories. To do that I run the command:

du . --max-depth=1 -h

Start from the directory that matters most to keep under control, such as the NFS or Samba share, and keep repeating the command inside the subdirectories that turn out to be the most demanding.
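To spot the biggest entries at a glance, the output can be piped through sort; a minimal sketch, assuming a hypothetical share path /srv/samba:

du /srv/samba --max-depth=1 -h | sort -h

The -h flag of sort orders human-readable sizes (K, M, G) correctly, so the heaviest directories end up at the bottom of the list.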

If we then locate a set of files of one type, with a readily identifiable name pattern, we can put gzip (GNU zip) compression to work in our favor. Take it easy and keep an eye on the disk and CPU load.
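If load is a concern, the compression can be run at low priority; a minimal sketch, assuming a hypothetical file name old-backup.pst:

ionice -c 3 nice -n 19 gzip old-backup.pst

Here ionice -c 3 puts the process in the idle I/O class and nice -n 19 gives it the lowest CPU priority, so regular file-server traffic is served first.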

An example: do we store backups from the Microsoft Outlook client? Those .pst files are sometimes enormous and remain untouched for years. Then:

find . -type f -name "*.pst" -exec gzip {} \;
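Before letting gzip loose, it may be worth doing a dry run and limiting the sweep to files that really have not been touched in a long time; a sketch using find's -mtime test, where the one-year threshold is my own assumption:

find . -type f -name "*.pst" -mtime +365 -print

Once the listed files look right, swap -print for the -exec gzip {} \; action. The +365 test matches files unmodified for more than a year; pick whatever threshold fits your retention policy.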

You might even think of scheduling this command to repeat regularly for the directories and file types you want to compress.
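For instance, a crontab entry could run the sweep once a week; a minimal sketch, again assuming the hypothetical share path /srv/samba:

0 3 * * 0 find /srv/samba -type f -name "*.pst" -mtime +365 -exec gzip {} \;

This runs every Sunday at 03:00. A convenient detail: gzip skips files that already end in .gz, so repeated runs will not compress anything twice.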

