Archiving and Compression

This excerpt is from Chapter 8 of Scott Granneman's new book "Linux Phrasebook," the pocket guide every Linux user needs. Linux Phrasebook offers a concise reference that, like a language phrasebook, can be used "in the street." The book goes straight to practical Linux uses, providing immediate solutions for day-to-day tasks.
Archive and Compress Files Recursively Using gzip

     -r

If you want to use gzip on several files in a directory, just use a wildcard. You might not end up gzipping everything you think you will, however, as this example shows.

   $ ls -F
   bible/ moby-dick.txt paradise_lost.txt
   $ ls -l *
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 508925 paradise_lost.txt

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt
   $ gzip *
   gzip: bible is a directory -- ignored
   $ ls -l *
   -rw-r--r-- scott scott 489609 moby-dick.txt.gz
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt

Notice that the wildcard didn't do anything for the files inside the bible directory because gzip by default doesn't walk down into subdirectories. To get that behavior, you need to use the -r (or --recursive) option along with your wildcard.

   $ ls -F
   bible/ moby-dick.txt paradise_lost.txt
   $ ls -l *
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 508925 paradise_lost.txt

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt
   $ gzip -r *
   $ ls -l *
   -rw-r--r-- scott scott 489609 moby-dick.txt.gz
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

   bible:
   -rw-r--r-- scott scott 62114 genesis.txt.gz
   -rw-r--r-- scott scott 35984 job.txt.gz

This time, every file, even those in subdirectories, was gzipped. Note, however, that each file is individually gzipped; gzip cannot combine them all into one big file, as the zip command can. To do that, you need to incorporate tar, as you'll see in "Archive and Compress Files with tar and gzip."
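As a quick preview of where that section is headed, here's a minimal sketch, assuming GNU tar and the bible directory from the example above: -c creates an archive, -z runs it through gzip, and -f names the resulting file.

   $ tar -czf bible.tar.gz bible/
   $ tar -tzf bible.tar.gz
   bible/
   bible/genesis.txt
   bible/job.txt

Unlike gzip -r, this produces a single compressed archive rather than one .gz file per input.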

Get the Best Compression Possible with gzip

     -[1-9]

Just as with zip, it's possible to adjust the level of compression that gzip uses when it does its job. The gzip command uses a scale from 1 to 9, in which 1 means "do the job quickly, but don't bother compressing very much," and 9 means "compress the heck out of the files, and I don't mind waiting a bit longer to get the job done." (Unlike zip, gzip has no 0 level; if you want no compression at all, use tar by itself, as you'll see later.) The default is 6, but modern computers are fast enough that it's probably just fine to use 9 all the time.

   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   $ gzip -c -1 moby-dick.txt > moby-dick.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 571005 moby-dick.txt.gz
   $ gzip -c -9 moby-dick.txt > moby-dick.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 487585 moby-dick.txt.gz

Remember to use the -c option and redirect the output into the actual .gz file due to the way gzip works, as discussed in "Archive and Compress Files Using gzip."

Note - If you want to be clever, define an alias in your .bashrc file that looks like this:

alias gzip='gzip -9'

That way, you'll always use -9 and won't have to think about it.
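If you add that alias, note that it takes effect only in new shells (or after re-reading .bashrc); assuming you use bash, you can verify it with the type builtin:

   $ source ~/.bashrc
   $ type gzip
   gzip is aliased to `gzip -9'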

Uncompress Files Compressed with gzip

     gunzip

Getting files out of a gzipped archive is easy with the gunzip command.

   $ ls -l
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz
   $ gunzip paradise_lost.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 508925 paradise_lost.txt

In the same way that gzip removes the original file, leaving you solely with the gzipped result, gunzip removes the .gz file, leaving you with the final gunzipped result. If you want to ensure that you have both, you need to use the -c option (or --stdout or --to-stdout) and redirect the results to the file you want to create.

   $ ls -l
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz
   $ gunzip -c paradise_lost.txt.gz > paradise_lost.txt
   $ ls -l
   -rw-r--r-- scott scott 508925 paradise_lost.txt
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

It's probably a good idea to use -c, especially if you plan to keep the .gz file around or pass it along to someone else. Sure, you could always run gzip again and re-create the archive, but why go to the extra work?

Note - If you don't like the gunzip command, you can also use gzip -d (or --decompress or --uncompress).
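For instance, either of these commands, using the file from the example above, does the same job as the gunzip invocations shown earlier (the first removes the .gz file; the second, with -c, keeps it):

   $ gzip -d paradise_lost.txt.gz
   $ gzip -dc paradise_lost.txt.gz > paradise_lost.txt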

______________________

Comments


Unzipping Password Protected Zips

Anonymous's picture

You left out how to unzip ZIP files that are password protected in Linux. I'm searching for this elusive bit of information on the internet right now...

Adding files to a password-protected zip from PHP

Farrukh Shahzad's picture

I couldn't find anything on the internet about adding files to a password-protected zip from PHP code when I was searching for it... then I came across your article, and it gave me the idea: why not have PHP issue a system command to add files to a zip and even protect them with a password? ;)

RAR

Amelia's picture

RAR is good and free too. It supports passwords and can make SFX archives.

No mention of lzma?

Brian Cain's picture

How about rzip or lzma? I recall an article in the print edition within the last ten or eleven issues that compared the CPU overhead of each compression method against compression ratios (and possibly other parameters). Anyway, rzip is memory- and CPU-intensive, IIRC, but has the potential to make enormous savings. I think it's the same as Burrows-Wheeler over larger data sets, possibly. Worthwhile for stuff that won't be frequently decompressed, IMO.

rzip

Anonymous's picture

Actually, rzip's levels set the search buffer size:

-0 = 100MB
-1 = 100MB
-x = x00MB for x > 0 and x <= 9

CPU intensive? Well, it depends. I hacked the bzip2 compression hooks out of rzip, and it's one of the fastest pre-archiving filters, with the best compression ratio, for a mysql dump of a dbmail database.

Yup, I found a bug, but only in the decompression algorithm, not the data itself. And yes, I got Andrew to fix it.

Correction to wording

DAKH's picture

Scott,

In the section "Archive Files with tar", paragraph 3, you state that tar is "designed to compress entire directory structures". I think this should read "designed to archive...", since this section deals only with tar's standalone use as an archival tool and since this article/chapter is intended to highlight the difference between archiving and compressing. Other than that, this is a very handy primer on archiving and compressing in *nix.

bzip2 -9

Chris Thompson's picture

The article states that the default block size for bzip2 is -6. The man page for my system (Ubuntu 6.06) states that -9 is the default, and I am unaware of any system where -6 is the default.


Making -9 the default

Craig Buchek's picture

An easier way to default to the best (-9) compression level would be to export GZIP='-9' and ZIPOPTS='-9' into your environment.
