At the Forge: Why Linux?
Break out the champagne! This month, Linux Journal is celebrating its 100th issue, and I've decided to take a break from my exploration of open-source web/database technologies to join the party.
There are plenty of good reasons for Linux users (and advocates of open-source software in general) to be happy. Despite the downturn in the high-tech economy, open-source software development continues at an extremely rapid pace. When Linux Journal was first published, few people had ever heard of the free operating system created by a Finnish student. Nowadays, many people have heard of Linux, even if they don't understand what it is or what it can do for them.
Indeed, while many of my clients know that I push for open-source solutions, they are always curious to know why I favor them and, more importantly, why choosing such solutions is in their interest as well. So at the risk of preaching to the converted, this month's column reviews some of the reasons why Linux is such an excellent platform for building server-side applications. I hope some of the ideas I put forth here will help you evangelize free software solutions to your own colleagues and clients in the years to come.
Hackers are interested in technologies and tools that teach new skills and perspectives. But in the real world, people are interested in getting their jobs done as quickly and cheaply as possible. Software is a means to an end, rather than an end in and of itself.
For this reason, I've found that the best way to sell people on open-source software is to say that it does more and costs less. Neither factor is enough by itself: it's easy to find expensive, high-quality software, and pointless to install cheap, poor-quality software. As consumers, my clients are always eager to get more for less, and free software appeals to them in this way.
When I pitch solutions to my clients, I begin by explaining that I'm offering them something they might have thought impossible: inexpensive software that does what they want, without crashing. When I explain to Windows users that I have yet to see a Linux system crash in over six years of running dozens of systems, they are shocked and incredulous. When I tell them that this software is freely available on the Internet, they find it even harder to believe.
My clients often wonder who is supporting the software and what happens if things go wrong. They are relieved to hear that not only can I offer them the support they need but that they can look for support elsewhere if they don't approve of my work. This, of course, contrasts sharply with the attitude and restrictions that many consulting firms impose on software installations. The open-source approach is thus friendlier to consumers than the traditional software model, reducing costs and encouraging competition.
Of course, not all free software is of high quality, and not all consultants really know what they're doing. The community development process can produce excellent results, but that doesn't mean everything released on the Internet is guaranteed to be safe and stable. Indeed, it's clear that many programs, including some popular ones, were uploaded without undergoing any testing. Programs like these give the entire open-source community a bad name and often do more harm than good. Several times per year, clients call me in to fix a program they have downloaded that worked fine at first but eventually proved to be insecure, unstable or full of bugs.
Even if you find that your server depends on a bug-ridden, insecure, open-source application, all is not lost. That's because the nature of free software ensures that you can modify it to suit your needs or fix it when problems arise. Seen in this light, shared-source licenses, which allow users to view the source code but not to modify or fix it, miss the point. Buying a house or a car entitles you to fix it on your own; why should software be any different?
True, the shared-source license does mean that more people will look over the code, so security and stability problems will be identified and fixed more quickly. But being able to read the source code isn't nearly as important as being able to improve it. Moreover, folding these improvements back into the community version means that everyone else will benefit from your adjustments and be able to make further improvements. Thus, contributing to the community process is in the interest of everyone who uses open-source software; it's not simply a nice thing to do.
Because I tend to use mature tools such as Linux, Apache, Perl and Python, it's relatively rare for me to find bugs in the software I download. But several times per year, I will discover a problem or limitation in the software I use. Having access to the source code means I can fix the problem myself and get up and running as quickly as possible, and it also means that others will not have to suffer through the bugs I've already fixed.
It's ironic that I can still use this argument today, given that a similar problem with printer drivers was what drove Richard Stallman to found the Free Software Foundation, whose GNU Project has been crucial to the success of Linux and free software. It's also amazing to discover how quickly we get used to having the source available and to being able to inspect or modify every part of our computer systems.
Along the same lines, Linux systems tend to come with “batteries included”, to borrow a phrase from the Python world. I recently began work on a project that will be deployed on Solaris, and I soon remembered how much richer and better-stocked a typical Linux distribution is when compared with a standard Solaris installation. True, I can spend half a day downloading and installing gcc, Perl, Python and the rest. But after years in which gcc was available on every machine I ran, it felt like I had been thrown back into the Dark Ages of UNIX.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as one that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
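As a concrete illustration of stringing these tools together, the following sketch builds a small demo directory (a stand-in for /home, since the paths and the search term here are purely illustrative) and then combines find and grep to list every .log file containing a particular entry:

```shell
# Create a small demo tree standing in for /home (illustrative paths).
mkdir -p /tmp/logdemo/sub
echo "ERROR: disk full"   > /tmp/logdemo/app.log
echo "all systems normal" > /tmp/logdemo/sub/quiet.log

# Combine two single-purpose tools:
#   find -type f -name '*.log'  locates the log files;
#   -exec ... {} +              hands the batch of filenames to grep;
#   grep -l 'ERROR'             prints only the names of matching files.
find /tmp/logdemo -type f -name '*.log' -exec grep -l 'ERROR' {} +
```

The same pattern extends to any predicate find supports (size, modification time, ownership) without either tool needing to know anything about the other.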
Cron traditionally has been considered another such tool, this one for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model: it considers neither the total cost of ownership nor the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here, just cold, hard, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.