Presenting squid-deb-proxy! Speed up your update downloads!

Are you like me and have multiple Ubuntu machines under one roof? Are you tired of downloading the same update multiple times? Sick of what seems to be duplicate work? Let me introduce you to my little friend... squid-deb-proxy.

Squid-deb-proxy is a new package for Ubuntu 10.04, and it's designed to make your life easier and allow faster updates if you manage more than one Ubuntu machine. Conceptually, squid-deb-proxy consists of two pieces: a client and a server. The server package is "squid-deb-proxy" and the client package is "squid-deb-proxy-client". The "squid-deb-proxy" server package is essentially a squid caching server with an out-of-the-box configuration that caches .deb packages and makes them accessible to the local area network. The "squid-deb-proxy-client" package is essentially an include file for your standard apt configuration that makes apt aware of the squid-deb-proxy.
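If you're curious, that include looks something like the drop-in below (the exact path, file name and option spelling may vary between releases, so treat this as a sketch rather than gospel):

// /etc/apt/apt.conf.d/30autoproxy -- installed by squid-deb-proxy-client
// Tells apt to run a small helper that looks for a proxy advertised via
// avahi and prints its URL; if none is found, apt connects directly.
Acquire::http::ProxyAutoDetect "/usr/share/squid-deb-proxy-client/apt-avahi-discover";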

To install the server, simply run "sudo apt-get install squid-deb-proxy avahi-utils" on the machine that you wish the server to be on. This will install the squid caching server and the avahi (Bonjour) auto-configuration network utilities and start them, so your new caching squid proxy will begin broadcasting its availability on your network. Then, a "sudo apt-get install squid-deb-proxy-client" on each Ubuntu 10.04 machine will install the apt configuration. You'll want to install the client on the server as well, so whenever the server downloads updates, those updates get cached by the squid proxy; this also allows the server to install already-fetched updates via the proxy.
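Putting that together, the whole setup boils down to a few commands (the avahi-browse check is an optional sanity test; "_apt_proxy._tcp" is, as best I can tell, the service type the proxy advertises, so adjust if your version differs):

# On the machine that will act as the server:
sudo apt-get install squid-deb-proxy avahi-utils

# On every Ubuntu 10.04 machine, including the server itself:
sudo apt-get install squid-deb-proxy-client

# Optional: confirm the proxy is being advertised on the LAN.
avahi-browse -rt _apt_proxy._tcp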

Once this is done, squid-deb-proxy is transparent to the user. Each machine's apt program will look on the network for a squid-deb-proxy and, if it finds one, pass its requests through it. The proxy will cache any .deb packages that come through it and make them available to the next update client that needs them. The second client to request the same updates will pull them down from the squid proxy, rather than having to get them from the Internet. You get the benefit of a local repository without the hassle of setting one up!
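If you want proof that the cache is earning its keep, watch the proxy's access log on the server while a second machine grabs the same updates (the log path below is the package default as I recall it; adjust to match your install):

# On the server, while a client runs its updates:
sudo tail -f /var/log/squid-deb-proxy/access.log
# TCP_MISS lines mean the proxy had to fetch the file from the Internet;
# TCP_HIT lines mean it was served straight from the local cache.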

The beautiful part about the squid-deb-proxy solution is that it is completely transparent. If you have the squid-deb-proxy client installed on your laptop and you choose to download an update while on a business trip, your laptop will grab the updates from the main repository in your sources.list file, since no proxy on that local area network is broadcasting its services via avahi. There's no need to modify your sources.list in any way, because apt becomes proxy-aware automagically. It's really cool stuff.

______________________

Bill Childers is the Virtual Editor for Linux Journal. No one really knows what that means.

Comments


Does it work?

rEnr3n

I installed this a couple of weeks ago, but I noticed that it doesn't work. I tried updating the server first, before the client, but the client didn't retrieve the packages from my server. It still retrieves the packages from the Internet. I followed the steps, but I think I'm still missing something, or the transparency just doesn't work for me. How do I know that it is working? What should I look for? Or should I try an alternative?

Try AptOnCD

Graeme

I have been using AptOnCD (install via: apt-get install aptoncd) to distribute all *.deb updates to many other systems, so I only ever download updates from one PC. It's simple to use and works like a charm - you can even add your own *.deb files that didn't come via the updates channel.

Answers

Charles

@Jesseca:
Squid is a popular caching HTTP proxy server. It is often used for security and performance reasons on networks for web browsing. The performance principle of a caching proxy is the same as for any type of cache: machine A requests a file (e.g. a web page or a package); the proxy fetches it, then passes it back to A. Later, machine B requests the same file. The proxy already has the file, so it doesn't have to go over the Internet to fetch it; it can return it to B immediately.
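You can watch this happen with any squid proxy by requesting the same file twice and checking the X-Cache header squid adds to its responses (proxy-host:8000 below is a placeholder for your squid-deb-proxy box, 8000 being its default port if memory serves, and you should substitute a real .deb URL from the archive):

# First run prints "X-Cache: MISS" (fetched from the Internet);
# a second run prints "X-Cache: HIT" (served from the cache).
curl -s -o /dev/null -D - -x http://proxy-host:8000 \
  http://archive.ubuntu.com/ubuntu/pool/main/h/hello/hello_2.4-3_amd64.deb \
  | grep -i X-Cache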

@sshvetsov:
I use apt-cacher-ng, and like it. The only real advantage I see to squid-deb-proxy is the "business trip" scenario: with apt-cacher-ng, you have to temporarily comment out the proxy setting while you're away, or apt won't work (or will be really slow if you're VPN'd to your home network!)

@Funtime:
I'm 99.5%+ sure that it won't upload a file back to squid -- I don't think squid even has the capability to accept such an upload.

avahi tools

Gilbert Dion

apt-get can't find avahi-tools. How come, and what should I do?

I think you mean avahi-utils?

Tal

I think you mean avahi-utils?

avahi-tools!

Gilbert Dion

I pasted it from the article:
"sudo apt-get install squid-deb-proxy avahi-tools"

apt-get can't find avahi-tools.

Um, why don't we just merge

MLC

Um, why don't we just merge some p2p features into the updates sometime? I'm sure that'd speed things up to a ridiculous extent. XD

Sounds a lot like apt-cacher-ng

sshvetsov

...only instead of installing squid-deb-proxy-client on every client machine and avahi on the server, you insert the line

Acquire::http::Proxy "http://apt-cacher-ng-host:3142/";

into /etc/apt/apt.conf on each client. Otherwise it works exactly the same: there's no need to change sources.list files, and it has been available in Ubuntu since version 8.04.

Backwards Transparency?

Funtime

If I am on travel with one of my client systems and I download an update, when I connect it back to my home network, will it advise the squid-deb-proxy server that it has an updated package which is not in the squid-deb-proxy server cache? And will the server then retrieve the package from the client? Or would the update need to be run on the server as well in order to retrieve new packages?
