Package Management - Avoiding the Two Step
apt-get, up2date, yum, pkgtool, dpkg, rpm -- we have lots of ways to avoid compiling programs. For the most part, I don't think it's because we don't like compiling programs, but rather because most modern package management tools take care of dependencies, versioning, etc. I must admit, I even avoid the traditional "make; sudo make install" -- because I don't want to make my system messy. What I wonder is whether my desire to keep the system "in order" sacrifices some of the advantages that compiling confers.
Here's a quick list of pros and cons off the top of my head. I'd love to hear your thoughts on the matter. Does package management help determine the distribution you use? Do package managers in general annoy you?
Pros for using a package management system
- Installing applications is fast
- Dependencies are usually automatically installed
- Many distributions notify when updates are available
- Uninstalling is easier
- It's what most people do, so peer support is prevalent
- Did I mention dependencies are usually automatically installed?
Cons for using a package management system
- Compile time options are chosen by the package maintainer, not you
- The newest version is often not available right away, so your compiling friends will make fun of you
- You get very little control over where and what is installed
- Your CPU will get lazy and overweight if it never has to compile your stuff
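That last point about control is worth a concrete illustration. The "messy system" worry can be tamed even without a package manager by installing into a prefix you own. Here's a minimal sketch: the toy Makefile below is a stand-in for a real project's build system (real projects would use "./configure --prefix=..." or similar), and it assumes GNU make 3.82 or later, which supports .RECIPEPREFIX.

```shell
# Sketch only: build in a scratch directory, install under your own prefix.
src="$(mktemp -d)" && cd "$src"

# A toy Makefile standing in for a project's real build system.
# (.RECIPEPREFIX lets us use '>' instead of tabs in recipes.)
cat > Makefile <<'EOF'
.RECIPEPREFIX := >
PREFIX ?= /usr/local
install:
> mkdir -p $(PREFIX)/bin
> printf '#!/bin/sh\necho hello\n' > $(PREFIX)/bin/hello
> chmod +x $(PREFIX)/bin/hello
EOF

# Install under $HOME/.local instead of scattering files across /usr:
make install PREFIX="$HOME/.local"
"$HOME/.local/bin/hello"   # prints "hello"
```

Because everything lands under one prefix, uninstalling is just a matter of deleting files you know you put there -- the compile-it-yourself answer to the package manager's tidy removal.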
I could add to both lists, but the trend I see is that package management gives convenience, and compiling gives freedom. As a community, we're pretty big on freedom, so does that mean using apt-get is stealing our rights?
Uh, no. See, the beauty is that even though things like apt-get and Synaptic make installing programs as easy as double-clicking setup.exe, the difference is that we get to choose whether or not we pick the convenience of package management. It's the freedom to choose that makes Linux and open source so great.
Now it's your turn. What do you think?