Package Management - Avoiding the Two Step

apt-get, up2date, yum, pkgtool, dpkg, rpm -- we have lots of ways to avoid compiling programs. For the most part, I don't think it's because we don't like to compile programs, but rather because most modern package management tools take care of dependencies, versioning, etc. I must admit, I even avoid the traditional "make; sudo make install" -- because I don't want to make my system messy. What I wonder is whether my desire to keep the system "in order" sacrifices some of the advantages compiling garners.
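For context, here's the "two step" (well, several steps) in question, contrasted with the package-manager one-liner. The package name foo is just a placeholder:

```shell
# The traditional source route: unpack, configure, build, install.
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure          # pick compile-time options here
make                 # compile
sudo make install    # scatter files under /usr/local

# The package-manager route: one command, dependencies included.
sudo apt-get install foo
```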

Here's a quick list of pros and cons off the top of my head. I'd love to hear your thoughts on the matter. Does package management help determine the distribution you use? Do package managers in general annoy you?

Pros for using a package management system

  • Installing applications is fast
  • Dependencies are usually automatically installed
  • Many distributions notify when updates are available
  • Uninstalling is easier
  • It's what most people do, so peer support is prevalent
  • Did I mention dependencies are usually automatically installed?

Cons for using a package management system

  • Compile time options are chosen by the package maintainer, not you
  • The newest version is often not available right away, so your compiling friends will make fun of you
  • You get very little control over where and what is installed
  • Your CPU will get lazy and overweight if it never has to compile your stuff

I could add to both lists, but the trend I see is that package management gives convenience, and compiling gives freedom. It's pretty clear that as a community, we're pretty big on freedom, so does that mean using apt-get is stealing our rights?

Uh, no. See, the beauty is that even though things like apt-get and Synaptic make installing programs as easy as double-clicking on setup.exe, we still have a choice as to whether or not we pick the convenience of package management. It's the freedom to choose that makes Linux and open source so great.

Now it's your turn. What do you think?


Shawn Powers is an Associate Editor for Linux Journal. You might find him chatting on the IRC channel, or Twitter



Package Managers vs Compiling

John Keels

Currently, I am using OpenSUSE 10.3 and am actually quite happy with YaST. I sure would not want to have to compile everything I install. However, there are times I am willing to compile a package, especially if it's a program I need that is not in the repositories. For instance, I recently wanted Geany, which I am sure some of you are familiar with (it is an IDE). I could not find it in the YaST repositories, so I downloaded and compiled it myself, and it works fine. One of the good things about free software, and Linux in particular, is that we can all choose a distribution that works the way we want. If some people want easy package management, then Ubuntu with Synaptic or even OpenSUSE 10.3 with YaST are really quite easy (and YaST is not so slow anymore, though still a bit more so than Synaptic perhaps). If other people want Slackware or Gentoo, where packages are more frequently compiled for careful optimization, then that's great too. We can each have our own thing! :)

Compiling some critical stuff, no mess

Sir Robin


I like to keep my installation clean, so that nothing interferes with the cleanly installed, package-manager-given stuff, BUT I have ways around that and reasons to use them.

The most critical things I love to compile with personal optimization and feature choices. I have Fedora 7 on one computer and Debian Etch on a second.

First, the kernel: compiling and installing it without the package manager won't really mess anything up, and removing old kernel installations is easy. There is a LOT you can do to optimize the kernel for your needs, especially by leaving unnecessary features and modules out. Official binary releases are compiled with everything but the kitchen sink, so there is support for whatever hardware or features someone might need -- but no single person needs more than a small fraction of the modules. For example, if you only have ext3 filesystems and CD-ROMs, there is a huge load of filesystems you won't need support for.
Also, on Fedora I actually know how to install the latest kernel source RPM, then compile and pack it into a nice RPM file, just like the pre-compiled one :)
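A minimal sketch of that kind of trimmed-down kernel build. The kernel's own Makefile even has packaging targets, though the exact target names depend on your kernel version:

```shell
# Start from a working config, then prune it down.
cd /usr/src/linux
make menuconfig        # drop filesystems, drivers, etc. you don't need
make                   # build the trimmed kernel and modules
sudo make modules_install install   # installs alongside, not over, old kernels

# Recent kernels can also package themselves; on an RPM system:
make binrpm-pkg        # produces an installable binary kernel RPM
```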

On RPM systems you can also build a binary RPM from the SRPM, with some optimization options available. The result is an RPM that is equivalent to one offered by a package repository in terms of package compliance.
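That workflow can be as short as one command. Here foo-1.0-1.src.rpm stands in for whatever source package you grabbed, and the output path assumes a default rpmbuild setup:

```shell
# Rebuild a binary RPM from a source RPM, honoring your local
# build flags (e.g. %optflags in ~/.rpmmacros):
rpmbuild --rebuild foo-1.0-1.src.rpm

# The finished package lands under ~/rpmbuild/RPMS/<arch>/ and
# installs like any repository package:
sudo rpm -Uvh ~/rpmbuild/RPMS/x86_64/foo-1.0-1.x86_64.rpm
```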

Thirdly: when I want the most feature and optimization choices and the latest version there is, I simply compile from sources in the traditional way. The magic is in the 'configure' script -- among other things, I use '--prefix=/opt/usr' so the program will be installed under the /opt hierarchy and will not mess up the finely managed installations that package managers put under the /usr directories.
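The whole dance with the prefix trick looks something like this (the --enable flag is a hypothetical example; `./configure --help` lists the real ones for any given program):

```shell
./configure --prefix=/opt/usr --enable-some-feature   # feature flag is illustrative
make
sudo make install    # everything lands under /opt/usr; /usr stays pristine

# Then put /opt/usr/bin on your PATH, e.g. in ~/.bashrc:
export PATH=/opt/usr/bin:$PATH
```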

The kernel, X, the window manager, any software that consumes noticeable amounts of resources and is often in use -- it's really good for your system's performance to compile these yourself. And your default installations and locations will stay in order and won't get "messy".

The newest version is often

Just passing by

The newest version is often not available right away, so your compiling friends will make fun of you

Get kinder friends.

Your CPU will get lazy and overweight if it never has to compile your stuff

Get Folding@Home.

You get very little control over where and what is installed

No real solution to this, but how about getting into BDSM roleplaying? It won't help with control over the system, but feeling generally in control of something/someone might help (dependencies: dominant role).

Compile time options are chosen by the package maintainer, not you

There are times in life you just want stuff to get installed and working without any hassle; for everything else, there's "make checkinstall".
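For the curious, checkinstall really does exist: it runs the install step, watches what gets written, and wraps the result into a native package your package manager can remove later. A sketch:

```shell
./configure
make
# Instead of `sudo make install`, let checkinstall intercept it and
# build a .deb/.rpm/.tgz that dpkg/rpm/pkgtool can cleanly uninstall:
sudo checkinstall
```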


Shawn Powers

hehehe. You made me chuckle.



Leslie Viljoen

I use Synaptic for simple things, but I use Wajig on the command line for more advanced things. Wajig provides a common command-line interface for Debian's many package tools, which means you only learn one set of commands to use apt-get, dpkg, apt-cache, etc.

Wajig also requests the root password if you forget to 'sudo', which saves me time.
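A taste of the common-interface idea (package name is a placeholder, and subcommand spellings may vary between wajig versions; wajig's own help lists them all):

```shell
wajig update            # like apt-get update
wajig install foo       # like sudo apt-get install foo (asks for the password itself)
wajig listfiles foo     # like dpkg -L foo
wajig remove foo        # like sudo apt-get remove foo
```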

For an overview, see:

What about mix packaging ?

Fabien Niñoles

With different needs come different solutions, no? So why get stuck with only one package management scheme? On a Debian system, here is my current practice in terms of package management:

  1. aptitude+testing (or stable, when it has just been released and testing is in a big mess). Testing is stable enough for my needs, and aptitude not only takes care of dependencies but also removes them when I don't need them anymore.
  2. apt-src+unstable: I use these when I need a more recent version compiled against my current libraries, a quick fix, or a different set of options enabled in the package. apt-src is a wonderful package management tool with features to keep your patches, upgrade your package, install build-dependencies, etc. And, in the end, everything gets managed inside your apt repository.
  3. debianized upstream: Sometimes I need the upstream source, either because it is more recent or because it is still not available in Debian. When the source is not already debianized, I try to debianize it. There are a lot of script packages to help with the job, and if the source is well done, most of it is automated. My preferred packages for this are dh-make (especially for autotools sources) and consors. There are also some nice revision control tools (*-buildpackage) to help you keep in sync with upstream. The result is not necessarily up to the standard of a real package (that takes far more time), but it's good enough for my needs.
  4. upstream+stow: When debianizing the sources seems too complex (because of a bad build script or whatever), I try to make a clean install using stow. Stow is a small installer that takes an install directory and symlinks its files into your own directory structure. For example, I can have gcc installed into /usr/local/stow/gcc-4.2.4-1/..., and `stow gcc-4.2.4-1` will install it inside /usr/local, creating only the necessary symlinks and warning about conflicts. `stow -D gcc-4.2.4-1` will remove the package. I always put a version on my stow directory and create a tar.bz2 archive of my installed packages to ease restoration in case of problems.
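The stow cycle from item 4, spelled out (the package name and version are just the example's):

```shell
# Install the package into its own tree under /usr/local/stow:
./configure --prefix=/usr/local/stow/gcc-4.2.4-1
make && sudo make install

# Symlink it into /usr/local (stow warns about conflicts):
cd /usr/local/stow
sudo stow gcc-4.2.4-1

# Later, remove every symlink it created:
sudo stow -D gcc-4.2.4-1
```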

With these methods, my system stays very clean most of the time, and I can always find exactly where stuff came from and how to uninstall it. There is even another tool, checkinstall, which I have never tried; it can help you keep track of what `make install` does on your system. I have never had an occasion to try it, so it's still not part of my package management toolbox, but it's surely a serious candidate.

Hope this will help someone,


Arch plug

mrtiro

As an Arch Linux fanboy, I have to mention its package manager. Dissatisfaction with package managers was the primary reason for my distro hopping. I've tried most of the big Linux distributions (Debian, Slackware, Gentoo, Fedora, etc.), and the only package manager capable of stopping my distro hopping was pacman from Arch. It does dependency resolution, but when you uninstall a package, the default is to uninstall only that package. You can also uninstall the dependencies (those that aren't required by other packages) with a command flag. It is also easy to compile your own packages from source with an ancillary program (makepkg, which uses PKGBUILD files, similar in concept to Gentoo ebuilds). For most programs, it is dead simple to roll your own packages, which is the type of solution I think goblin is looking for.
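Those flags, for anyone curious (foo is a placeholder package name):

```shell
sudo pacman -S foo      # install foo plus its dependencies
sudo pacman -R foo      # remove only foo (the default behavior)
sudo pacman -Rs foo     # also remove deps nothing else requires

# Rolling your own: put a PKGBUILD in a directory, then:
makepkg -si             # fetch build deps, build, and install the package
```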

Most (F)OSS people probably

mats

Most (F)OSS people would probably agree that freedom of choice is a good thing. Nonetheless, in the real world I think the whole compiling-your-own vs. prepackaged-applications question is more an issue of needs and trade-offs than a matter of beauty: "To achieve result C, I need functionality B, which is only available in the SVN version of application A". There you go, rolling your own.
Does package management help determine the distribution you use?
Definitely. The simpler the better -- a point of view that is certainly debatable, but personally I just cannot stand monster package managers from hell (tm) anymore.
Installing applications is fast
Not necessarily. Think of YaST, for example, which can take literally dozens of minutes on an older machine just to sync up with its repositories.
Dependencies are usually automatically installed[...]Uninstalling is easier
Dependency checking is a very double-edged blade, given that some package managers out there tend to automatically *un*install dependencies should the user decide to remove a package from his/her system. Over the years, I have found that the ability to easily remove, e.g., a Slackware package (a process that does not automatically remove all depending packages, as apt or YaST would) without breaking an unaware user's entire system might be the more sensible approach.

"Installing applications is

Deanjo

"Installing applications is fast
Not necessarily. Think Yast for example, in which it can take literally dozens of minutes on an older machine just to sync up with its repositories."

That's been cured. Now YaST is probably one of the fastest package managers out there. It's a night and day difference in speed comparing, say, openSUSE 10.1 vs. 10.3, and it's faster yet in 11.0.

A debian child

Anonymous

A Debian child you should know about: sidux.

Dependency manager?

JK Wood

I'm really glad you mentioned pkgtool. There's a myth floating about that Slackware doesn't have package management, which is a bald-faced lie. We don't have automated dependency resolution, which I am perfectly okay with. I've used other distros, notably Ubuntu, which have a terrible habit of attempting to uninstall half the system when I want to remove what would constitute one package under Slackware.

Perhaps the best part of Slackware's package management lies in the fact that you CAN compile software without making your system messy. It's trivial in most cases to create a package from source, which is helped significantly by the availability of community resources that provide build scripts vetted by veteran Slackware users.
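The from-source-to-package cycle on Slackware, roughly. The script and package names are placeholders, and the output path is the conventional one for build scripts:

```shell
# A SlackBuild script configures, compiles, and produces a package:
./foo.SlackBuild                       # typically leaves a package in /tmp

# Install and (later) cleanly remove it with the native tools:
installpkg /tmp/foo-1.0-x86_64-1.tgz
removepkg foo
```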

apt-get and synaptic

lapubell

I don't know if it is just the way my brain works, but I have never felt at home on a system that doesn't use apt-get or Synaptic. I started my journey into Linux with Fedora and quickly left it for pure Debian. Then, after getting too confused being a newb and not understanding much about how to configure things in Linux, I heard about Blag and switched back to RPMs. Ever since hearing of Linux Mint I have been a fan and have stuck with it (including converting my girlfriend over, who is not a techie). At work I am using pure Ubuntu, but deep down they are really the same.

The only thing I think needs to be a little more polished is the .deb installer. For end users to really use Linux with no problems, that double-click-to-install software experience is the biggest piece missing. When I find a .deb file and download it, sometimes the installation goes super smooth, and other times it throws a red error message that says which dependency is missing. Couldn't it search the apt cache and flag the dependency for installation, with verification?
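On the command line, the workaround for that red error already exists; a graphical installer could presumably do the same two steps (foo.deb is a placeholder):

```shell
sudo dpkg -i foo.deb        # may fail with unmet dependencies...
sudo apt-get -f install     # ...then pull the missing deps from the repositories
```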

Let the tar.gz folks laugh at us for using two-month-old versions, while we all laugh at the Windows crowd that is still using a six-year-old OS.

Package management verses compiling

Anonymous

You are confusing two separate concepts. Package managers are not necessarily an alternative to compiling. The Gentoo distro, for instance, has an excellent package manager called Portage, which not only handles dependencies but also compiles the installed source code for you. The added flexibility this scheme offers comes at the expense of waiting hours (days in some cases) for an install to complete.

Well, yes...

Shawn Powers

You're right, but I was really talking about package management versus downloading a tar.gz file and compiling it by hand. Portage, for example, works much like binary-only package managers, but compiles during the process. Functionally, it works much like apt-get, so even though it compiles, it does so in an orderly manner. :)


More flexibility

Anonymous

But you have more flexibility with Portage than you have with any other binary solution. You can choose to enable such and such an option before compiling (via a USE flag). So you get better customization of your package, in addition to the benefit of having it optimized for your CPU. The only thing that differs from the conventional "./configure; make; make install" is that you don't define where to install it (--prefix). For almost any other option, there is usually a USE flag. So, in that sense, Portage is really closer to "pure compiling" than to "apt-get".
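USE flags can be set globally or per invocation. A sketch, where the flags match the arts/esd example from this era and the package name is purely illustrative:

```shell
# Globally, in Portage's make.conf:
#   USE="esd -arts"

# Or one-off, just for this build (package name is hypothetical):
USE="esd -arts" emerge some-media-player
```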

Gentoo USE flags

Fabien Niñoles

Last time I looked, a change in a USE flag would not recompile every package that depends on it. This led me to some problems when I tried to switch from, say, arts to esd. My older packages were still looking for an arts daemon while the newer ones were using esd. I also found it quite difficult to patch a package without having my changes overwritten by emerge.

Are those issues fixed now?


Binary install in Gentoo

goblin

Maybe I would only want to compile 4-5 packages a year. Do I have the option with Gentoo/Portage to install binary packages for 99.9% of my installation (like with yum et al.), and then do the compiling install for the remaining 0.1%?

That would really present the best of both worlds to me: Convenience, flexibility and minimal time spent on installation.

RE: Binary install in Gentoo

zenforce

For large packages like OpenOffice.org or Firefox, Gentoo provides binaries in the package tree. Gentoo doesn't really give you an easy option of using binaries for most things and compiling only a few others, unless you or someone you know hosts a binary repository with packages built with the options you want. That is useful in environments where you have several identical machines: one can serve as the build machine, and you install the results on the others. Out of the box, though, Gentoo is a source-based distro; by default it will compile.

One of the really tough points of providing pre-rolled packages is that it makes accommodating individual USE flag differences more difficult. Binary distros ship the many optional features of packages as additional packages. With Gentoo, this would increase the number of items in the package manager considerably and be a lot of extra work for maintainers.

I know it's a long shot, but maybe other distros should look into including something similar to USE flags, so that, for example, when multimedia packages get installed, they always include support for the Feature X you want enabled. It may be easier to add that capability to a distro that already breaks packages into pieces than to splinter Gentoo's system into an infinitely larger number of packages.
