Archiving and Compression

This article is excerpted from Chapter 8 of Scott Granneman's new book "Linux Phrasebook", the pocket guide every Linux user needs. Linux Phrasebook offers a concise reference that, like a language phrasebook, can be used "in the street." The book goes straight to practical Linux uses, providing immediate solutions for day-to-day tasks.

Archive and Compress Files Recursively Using gzip

     -r

If you want to use gzip on several files in a directory, just use a wildcard. You might not end up gzipping everything you think you will, however, as this example shows.

   $ ls -F
   bible/ moby-dick.txt paradise_lost.txt
   $ ls -l *
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 508925 paradise_lost.txt

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt
   $ gzip *
   gzip: bible is a directory -- ignored
   $ ls -l *
   -rw-r--r-- scott scott 489609 moby-dick.txt.gz
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt

Notice that the wildcard didn't do anything for the files inside the bible directory because gzip by default doesn't walk down into subdirectories. To get that behavior, you need to use the -r (or --recursive) option along with your wildcard.

   $ ls -F
   bible/ moby-dick.txt paradise_lost.txt
   $ ls -l *
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 508925 paradise_lost.txt

   bible:
   -rw-r--r-- scott scott 207254 genesis.txt
   -rw-r--r-- scott scott 102519 job.txt
   $ gzip -r *
   $ ls -l *
   -rw-r--r-- scott scott 489609 moby-dick.txt.gz
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

   bible:
   -rw-r--r-- scott scott 62114 genesis.txt.gz
   -rw-r--r-- scott scott 35984 job.txt.gz

This time, every file, even those in subdirectories, was gzipped. Note, however, that each file is compressed individually. Unlike the zip command, gzip cannot combine all the files into one big archive. To do that, you need to incorporate tar, as you'll see in "Archive and Compress Files with tar and gzip."
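
As a quick preview of what's coming, a single tar command can both archive and compress (the options are explained in detail in that section):

   $ tar -czf bible.tar.gz bible/

That one command rolls the entire bible directory into a single compressed bible.tar.gz file.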

Get the Best Compression Possible with gzip

     -[1-9]

Just as with zip, it's possible to adjust the level of compression that gzip uses when it does its job. The gzip command uses a scale from 1 to 9, in which 1 means "do the job quickly, but don't bother compressing very much," and 9 means "compress the heck out of the files, and I don't mind waiting a bit longer to get the job done." (If you want no compression at all, that's what tar by itself gives you, as you'll see later.) The default is 6, but modern computers are fast enough that it's probably just fine to use 9 all the time.

   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   $ gzip -c -1 moby-dick.txt > moby-dick.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 571005 moby-dick.txt.gz
   $ gzip -c -9 moby-dick.txt > moby-dick.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   -rw-r--r-- scott scott 487585 moby-dick.txt.gz

Remember to use the -c option and redirect the output into the actual .gz file due to the way gzip works, as discussed in "Archive and Compress Files Using gzip."
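
To see why, here's what happens if you leave out -c (starting over in a directory containing just the original moby-dick.txt): gzip compresses the file in place and removes the original.

   $ ls -l
   -rw-r--r-- scott scott 1236574 moby-dick.txt
   $ gzip -9 moby-dick.txt
   $ ls -l
   -rw-r--r-- scott scott 487585 moby-dick.txt.gz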

Note - If you want to be clever, define an alias in your .bashrc file that looks like this:

alias gzip='gzip -9'

That way, you'll always use -9 and won't have to think about it.
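
For example, assuming your shell is bash, you could append the alias to your .bashrc and reload it like this:

   $ echo "alias gzip='gzip -9'" >> ~/.bashrc
   $ source ~/.bashrc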

Uncompress Files Compressed with gzip

     gunzip

Getting files out of a gzipped archive is easy with the gunzip command.

   $ ls -l
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz
   $ gunzip paradise_lost.txt.gz
   $ ls -l
   -rw-r--r-- scott scott 508925 paradise_lost.txt

In the same way that gzip removes the original file, leaving you solely with the gzipped result, gunzip removes the .gz file, leaving you with the final gunzipped result. If you want to ensure that you have both, you need to use the -c option (or --stdout or --to-stdout) and redirect the results to the file you want to create.

   $ ls -l
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz
   $ gunzip -c paradise_lost.txt.gz > paradise_lost.txt
   $ ls -l
   -rw-r--r-- scott scott 508925 paradise_lost.txt
   -rw-r--r-- scott scott 224425 paradise_lost.txt.gz

It's probably a good idea to use -c, especially if you plan to keep the .gz file around or pass it along to someone else. Sure, you could always run gzip again to re-create the archive, but why go to the extra work?

Note - If you don't like the gunzip command, you can also use gzip -d (or --decompress or --uncompress).
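
For instance, this command is equivalent to the gunzip invocation shown earlier:

   $ gzip -d paradise_lost.txt.gz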

______________________

Comments


Unzipping Password-Protected Zips

Anonymous

You left out how to unzip ZIP files that are password protected in Linux. I'm searching for this elusive bit of information on the internet right now...

Adding password-protected files to a zip via PHP

Farrukh Shahzad

I couldn't find anything on the Internet about adding password-protected files to a zip via PHP code when I was searching for it... so when I came across your article, it gave me the idea: why not have PHP issue a system command to add files to a zip and even protect the files with a password ;)

RAR

Amelia

RAR is good and free too. It supports passwords and can make SFX archives.

No mention of lzma?

Brian Cain

How about rzip or lzma? I recall an article in the print edition within the last ten or eleven issues that compared the CPU overhead of each compression method against compression ratios (and possibly other parameters). Anyway, rzip is memory- and CPU-intensive, IIRC, but has the potential to make enormous savings. I think it's the same as Burrows-Wheeler over larger data sets, possibly. Worthwhile for stuff that won't be frequently decompressed, IMO.

rzip

Anonymous

Actually, rzip levels are search buffer sizes:

-0 = 100MB
-1 = 100MB
-x = x00MB for x > 0 and x <= 9

CPU intensive? Well, it depends. I hacked the bzip2 compression hooks out of rzip, and it's one of the fastest pre-archiving filters, with the best compression ratio for a mysql dump of a dbmail database.

Yup, I found a bug, but only in the decompression algorithm, not in the data itself. And yes, I got Andrew to fix it.

Correction to wording

DAKH

Scott,

In the section "Archive Files with tar", paragraph 3, you state that tar is "designed to compress entire directory structures". I think this should read "designed to archive...", since this section deals only with tar's standalone use as an archival tool and since this article/chapter is intended to highlight the difference between archiving and compressing. Other than that, this is a very handy primer on archiving and compressing in *nix.

bzip2 -9

Chris Thompson

The article states that the default block size for bzip2 is -6. The man page for my system (Ubuntu 6.06) states that -9 is the default, and I am unaware of any system where -6 is the default.

Making -9 the default

Craig Buchek

An easier way to default to the best (-9) compression level would be to export GZIP='-9' and ZIPOPTS='-9' into your environment.
