Staying Current without Going Insane

How tools like SuSE's YaST2, Red Hat's up2date and Debian's apt-get can help you maintain some measure of patch sanity.

Security is a moving target, and chasing it can be positively dizzying. One of the most unrelenting and tedious security tasks on a powerful system like Linux is keeping its myriad applications, commands and libraries current. As anyone who subscribes to the BugTraq mailing list can attest, new software vulnerabilities are constantly being discovered, exploited and patched. But given the hundreds of software packages installed by the typical Linux distribution, how can you ever hope to keep up?

The bad news is, you can't. Even if you had a way of instantly patching against every new vulnerability published on every vulnerability and incident report mailing list, sooner or later you would become the subject of an incident report yourself; some vulnerabilities don't become common knowledge until they've been exploited.

The good news is that since you can't possibly keep up, any progress you do make is a win. If you get cracked via a three-week-old software vulnerability, at least you're more elite than someone who gets cracked via a three-year-old 'sploit. (On the other hand, either way you're cracked.)

In all seriousness, though, from a purely statistical standpoint, fewer unpatched bugs means fewer exploitable vulnerabilities. Despite how thankless and endless the task seems, it is worth trying to keep your Linux software current. And luckily, recent versions of popular Linux distributions include new tools for automating much of this task, including SuSE's YaST2, Red Hat's up2date and Debian's apt-get. (Some of these tools are even secure!)
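
In case you haven't used these tools, here's roughly what keeping a system current looks like with each of them, run as root. Treat this as a minimal sketch rather than gospel: the exact flags and module names vary between distribution releases, so check each tool's man page before relying on it:

    # Debian: refresh the package lists, then apply all available upgrades
    apt-get update && apt-get upgrade

    # Red Hat: download and install every available updated package
    up2date -u

    # SuSE: launch YaST2's Online Update module
    yast2 online_update

Note that apt-get and up2date are happy to run non-interactively, while YaST2's Online Update is primarily an interactive tool; we'll see later that this matters if you want to schedule updates.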

Packages vs. Source

I'll start with a key piece of advice that I've arrived at after years of skepticism: wherever possible, stick to your distribution's supported packages. To many people, this probably sounds obvious; why build from source if you don't have to? But to some of us old-school types, building from source is as much a habit as a skill. Maybe this is because way back in the early 1990s (you know, back before we had Slashdot or powered flight), there was a lot less software for Linux than there is now, which meant that much of what we ran had to be built from source code developed on other platforms. Additionally, the whole concept of packaging Linux into distributions was much less mature: there were fewer distributions, and they didn't change nearly as frequently.

(Have you ever had to compile ps from source because your brand-new kernel wasn't backward-compatible with the older version of ps from your distribution? Some of us have. We also had to walk ten miles to school each day through snake-filled swamps, etc.)

But things are different now. Don't get me wrong: I don't mean that we should all be slaves to binary packages. Sometimes you need features that are available in the very latest version of an application but not in your distribution's version of it; sometimes you want a leaner-and-meaner build than the one-size-fits-all juggernaut that your distribution provides. However, there are some important advantages to sticking with binary packages most of the time.

The first is convenience: downloading a large group of applications from a single site is faster and easier than downloading each from its developer's site (which is why we have distributions in the first place), and installing a binary package is much faster and less prone to mishaps than compiling from scratch. Convenience is not something we *nix bigots admit to valuing, but there it is.
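
To make the convenience point concrete, compare the two paths side by side. This is only a sketch, and the package and version names are illustrative, not a recommendation:

    # The package route: one command, and the files land where your
    # distribution expects them
    rpm -Uvh openssh-3.4p1-1.i386.rpm

    # The source route: fetch, unpack, configure, compile and install,
    # repeating for any libraries the configure step finds missing
    tar xzf openssh-3.4p1.tar.gz
    cd openssh-3.4p1
    ./configure
    make
    make install

Multiply the second route by a few dozen applications, and the appeal of packages becomes obvious.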

The second advantage of packages is stability: the major distribution packagers put a good deal of testing and research into deciding whether to include a given application in their distribution, and if so, which version is the most stable and which compile-time options best suit their distribution. There have been notable exceptions to this, but most distributions nowadays do quite well with quality assurance.

Stability is unquestionably a big factor in security: where there are bugs there are vulnerabilities. Even bugs that don't have obvious security ramifications often can be exploited in, for example, denial-of-service attacks (the object of which is to crash a system, which some bugs do effectively).

This leads us to one of the more frustrating paradoxes in application security: although some of the most widespread vulnerabilities on the Internet stem from poorly maintained applications (i.e., obsolete and/or known-vulnerable versions), newer is not always better. The security community has rightfully lambasted Microsoft over the years for being slow to acknowledge and provide patches against security vulnerabilities in their products. But we complain equally bitterly when Microsoft does provide a speedy patch that hurts stability, because the resulting instability can negate any benefit gained by fixing the original bug.

Ignoring for a moment stability's desirability in its own right, ask yourself this: supposing an application has an obscure buffer-overflow condition that is theoretically exploitable for root privileges (but only by a skilled assembly programmer with a working knowledge of the RC4 stream cipher), and you patch it with code that introduces a denial-of-service opportunity that can be exploited even by attention-span-deficit script kiddies—how much security have you really gained? The answer to this will vary depending on the precise circumstances and on who you ask, but the important point is that software upgrades often have ramifications of their own.

I'm beating this point to death, but it's an important point because it debunks the notion of instant software updates being some sort of panacea. (Plus it's been on my mind for a long time; it's taken me years to fully understand why, for example, the very latest version of OpenBSD still ships with a hack of BIND v.4.9.8, which in computer/dog years is really ancient.)

It's also important because, getting back to my original topic, there's something to be said for letting your Linux distributor make the difficult patching decisions, and thus for waiting to patch a vulnerable application until your distribution releases an official (and hopefully tested) patch, which of course is part of the deal when you rely on binary packages rather than cold hard source code. It's difficult enough to keep up with Linux distributors' application updates; I can only imagine how futile it would be to patch and recompile all my critical applications myself and then wait anxiously to see whether the patching broke anything else.
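
Happily, waiting for your distributor doesn't have to mean waiting blindly. Most of these tools can show you which official updates are pending without actually installing anything; for instance, apt-get has a simulate flag (again, a sketch; see the man page for your release):

    # Refresh the package lists, then do a dry run: report which
    # packages would be upgraded, but install nothing
    apt-get update
    apt-get -s upgrade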

So to summarize these pearls of wisdom (at least I hope they're pearls, instead of some other small spherical secretion): keeping current is a virtue, using binary packages is a virtuous form of laziness (props to Larry Wall) and relying on secondhand security updates from your Linux distributor rather than “going cowboy” isn't laziness at all, it's prudence.
