An Automated Reliable Backup Solution

Creating an unattended, encrypted, redundant, network backup solution using Linux, Duplicity and COTS hardware.
GnuPG Key Setup

After setting up SSH key authentication, I created a GnuPG key that Duplicity would use to sign and encrypt the backups. I created the key as my normal user on the client machine. Because the key belongs to a normal user account, backups are limited to files that user can read, which prevents backing up the entire filesystem. If I decided at some point that I wanted to back up the entire filesystem, I would simply create a GnuPG key as the root user on the client machine. To generate a GPG key, I used the following command:

$ gpg --gen-key
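
Duplicity refers to this key by its ID, which appears as AA43E426 in the commands later in this article. As a quick sketch, the ID can be read from the key listing, and the key pair can be exported to files for the backup CD described in the next section (the filenames here are only illustrative):

$ gpg --list-keys
$ gpg --armor --export AA43E426 > gnupg-public.asc
$ gpg --armor --export-secret-keys AA43E426 > gnupg-secret.asc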
Keychain

Once both the GnuPG and SSH keys were created, the first thing I did was make a CD containing copies of both my SSH and GnuPG keys. Then I installed and set up Keychain. Keychain is an application that manages long-lived instances of ssh-agent and gpg-agent to provide a mechanism that eliminates the need for password entry for every command that requires either the GnuPG or SSH keys. On a Debian client machine, I first had to install the keychain and ssh-askpass packages. Then I edited the /etc/X11/Xsession.options file and commented out the use-ssh-agent line so that the ssh-agent was not started every time I logged in with an Xsession. Then I added the following lines to my .bashrc file to start up Keychain properly:

/usr/bin/keychain ~/.ssh/id_dsa 2> /dev/null
source ~/.keychain/`hostname`-sh
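
Keychain can also manage the GnuPG key itself, so that gpg-agent caches its passphrase the same way ssh-agent caches the SSH key's. A hedged variant of the lines above, assuming a Keychain release that supports the --agents option and using the key ID from earlier:

/usr/bin/keychain --agents ssh,gpg ~/.ssh/id_dsa AA43E426 2> /dev/null
source ~/.keychain/`hostname`-sh
source ~/.keychain/`hostname`-sh-gpg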

After that, I added an xterm to my gnome-session; the xterm starts an instance of bash, which reads the .bashrc file and runs Keychain. When Keychain runs, it checks whether the keys are already cached; if they are not, it prompts me for my key passphrases. In practice, that means I enter the passphrases once each time I start my computer and log in.

Using Duplicity

Once Keychain was installed and configured, I was able to make unattended backups of directories simply by configuring cron to execute Duplicity. I backed up my home directory with the following command:

$ duplicity --encrypt-key AA43E426 \
--sign-key AA43E426 /home/username \
scp://user@backup_serv/backup/home
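
To make this truly unattended, the cron job needs the Keychain environment before it calls Duplicity. The following crontab entry is only a sketch: the 2am schedule and log path are illustrative, the second source line assumes the gpg-managed Keychain variant shown earlier, and gpg must be set up to use the agent (use-agent in ~/.gnupg/gpg.conf):

0 2 * * * . $HOME/.keychain/`hostname`-sh; . $HOME/.keychain/`hostname`-sh-gpg; duplicity --encrypt-key AA43E426 --sign-key AA43E426 /home/username scp://user@backup_serv/backup/home >>$HOME/duplicity.log 2>&1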

After backing up my home directory, I verified the backup with the following command:

$ duplicity --verify --encrypt-key AA43E426 \
--sign-key AA43E426 \
scp://user@backup_serv/backup/home \
/home/username

Suppose that I accidentally removed my home directory on my client machine. To recover it from the backup server, I would use the following command:

$ duplicity --encrypt-key AA43E426 \
--sign-key AA43E426 \
scp://user@backup_serv/backup/home \
/home/username
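
Restores do not have to be all or nothing, either. As a rough sketch, Duplicity's --file-to-restore option, combined with a restore time such as -t 3D ("three days ago"), pulls a single file out of the archive; the file path and time here are purely illustrative:

$ duplicity -t 3D --file-to-restore Documents/report.txt \
--encrypt-key AA43E426 --sign-key AA43E426 \
scp://user@backup_serv/backup/home \
/home/username/Documents/report.txt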

However, my GnuPG and SSH keys are normally stored in my home directory, and without them I cannot recover my backups. Hence, I first restored my GPG and SSH keys from the CD on which I previously saved them.
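
As a rough sketch of that recovery step, assuming the exported key files suggested earlier and a CD mounted at /media/cdrom (the mount point and filenames are assumptions, not part of the original setup):

$ mkdir -p ~/.ssh
$ cp /media/cdrom/id_dsa ~/.ssh/
$ chmod 600 ~/.ssh/id_dsa
$ gpg --import /media/cdrom/gnupg-public.asc
$ gpg --import /media/cdrom/gnupg-secret.asc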

Duplicity also can clean up backups on the backup server that are older than a specified time. Given this capability, I added the following command to my crontab to remove any backups more than two months old:

$ duplicity --remove-older-than 2M \
--encrypt-key AA43E426 --sign-key AA43E426 \
scp://user@backup_serv/backup/home
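
As with the backup itself, this runs from cron rather than interactively. A hedged crontab sketch, run monthly on an illustrative schedule; depending on the Duplicity version, --force may be required for the deletion to actually take place:

0 3 1 * * . $HOME/.keychain/`hostname`-sh; duplicity --force --remove-older-than 2M --encrypt-key AA43E426 --sign-key AA43E426 scp://user@backup_serv/backup/home >>$HOME/duplicity.log 2>&1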

This command conserves disk space, but it limits how far back I can recover data.

Conclusion

This solution has worked very well for me. It provides the key functionality I need and meets all of my requirements. It is not perfect, however. Duplicity currently does not support hard links; it treats them as individual files. Hence, restoring a backup that contains hard links produces separate copies of the data rather than a single file with multiple links to it.

Despite Duplicity's lack of support for hard links, it is still my backup solution of choice. Development of Duplicity seems to have picked up recently, and perhaps this phase of development will add hard-link support; perhaps I will find the time to add it myself. Either way, this provides an unattended, encrypted, redundant network backup solution that takes very little money or effort to set up.

Andrew J. De Ponte is a security professional and avid software developer. He has worked with a variety of UNIX-based distributions since 1997 and believes the key to success in general is the balance of design and productivity. He awaits comments and questions at cyphactor@socal.rr.com.

Comments

Duplicity for Windows

Nic Sandfield

Those with Windows clients should check out the wonderful Duplicati implementation.

awesome

Neal

This article is fantastic. Great work. Just what I needed to jumpstart my move to this solution without having to learn too much before I get it working.

Thanks again.

-N

Any updates on sourcing of components?

gmaya

Andrew:
Are there any updates on sourcing of components and their features?

I started by looking at

Cristiano

I started by looking at small form-factor motherboards that I might use. I had used Mini-ITX motherboards in a number of other projects and knew that there was close to full Linux support for it. Given that this project did not require a fast CPU, I decided on the EPIA Mini-ITX ML8000A motherboard, which has an 800MHz CPU, a 100Mb network interface and one 32-bit PCI slot built in to it.

Unclear

adeponte

I am having difficulty understanding what you are specifically referring to. If you are referring to the hardware and its functionality, not much has changed since the article was released. If not, please drop me an e-mail at cyphactor@socal.rr.com with further questions.

Is something missing....?

PatrickT

When I read this article, I was led to believe that since the author has "12 computers, which run a combination of Linux, Mac OS X, and Windows. Losing my work is unacceptable!" we were going to see a solution that provided for backup of all the OSes he listed. Unfortunately, it appears only Linux-like OSes are supported. Foiled again!

Patrick

Try BackupPC

Muyiwa Taiwo

You may want to check out BackupPC here. I've done a write-up here about integrating Windows Active Directory clients with the BackupPC server.

Limitations of Reality

adeponte

You are correct: the article does lead you to believe I have 12 computers running a variety of operating systems (Linux, Mac OS X, and Windows). The limitation of reality is that there is a word limit for articles, so I was not able to cover every aspect. Getting it working on Mac OS X is pretty close to what is required for Linux. However, Windows is a completely different experience; it required a huge amount of work on my part, and I have not had a chance to write it all up yet in final form (if I can remember all that I did). Work has been consuming most of my time as of late, but I am still trying to get something out to help people like yourself. My ultimate goal is to expand the current solution into a more complete, feature-filled solution that is pretty trivial to set up. Sadly it isn't there yet, but it is on the back burner. If you have any questions, feel free to e-mail me at cyphactor@socal.rr.com.

Actually, you will also have

Anonymous

Actually, you will also have the added complication of file system issues if backing up the forked HFS+ file system on the Mac to the single fork file system on the Linux box.

Backup for Windows

Tabare Perez

Maybe a solution for your Windows machine is a free program called Cobian Backup (http://www.educ.umu.se/~cobian/cobianbackup.htm). It works very well.

Best regards.
Tabare

Rsync backup for Windows to a Linux server

Alan

Not that Rsync is the best solution out there (I do really like the Duplicity backup solution outlined above), but there is a way to use Cygwin and Rsync to back up to a Linux server.
Check it out here: http://www.gaztronics.net/rsync.php. I have not tried it, but I may if I cannot get Duplicity to play well with Cygwin.

Try using this page--Running Duplicity in Cygwin

Alan

I haven't set this up yet, but tomorrow's the day. I will try to post to let you know how it goes. See this site for instructions on running duplicity in Cygwin. I don't see why it wouldn't work.... http://katastrophos.net/andre/blog/2006/04/03/duplicity-042-on-cygwin/
