At the Forge - Issue 200
So, Linux Journal has reached issue 200! As many of you know, I've been writing for this magazine for much of that time. According to my count, this is my 168th monthly column. I started back in 1996, long before I got married, became a father or began my PhD studies. It's hard to remember a time before Linux Journal was a standard item on my monthly calendar.
When I look back over the years, it's amazing how many things have changed when it comes to Web technologies. And yet, so many things also have remained the same. This month, I celebrate this issue of the magazine with a bit of nostalgia, reminding you where we've been and describing where we're headed. Along the way, I discuss some of the topics I intend to address in the future in this space.
At least a few readers of this column presumably remember a time when the Web and Internet weren't ubiquitous. My children always are amazed to hear that I was one of the only kids in my grade to have a home computer. It's hard for them to understand that when my mother told us that we should look something up, she meant we should drive to the local public library, find books (in a paper card catalog) on the subject and search through those books to find the answer. Today, the Internet in general and the Web in particular are fixtures in our daily lives. But back in 1988, just after I started college, my friends gave me a funny look when I asked them if they had Internet e-mail addresses. When we put the MIT student newspaper on the Web in 1993, we had to tell people how to install a Web browser on their computers. All of this is clearly a thing of the past. If nothing else, it's hard to find an advertisement without a URL at the bottom inviting you to learn more.
After decades of discussion and development of hypertext systems, it wasn't necessarily obvious that the World Wide Web, the brainchild of Tim Berners-Lee, would become a major hit. And yet, to those of us who used it in those early days, the Web had a number of clear advantages over its competitors. It was easy to set up a server and site. The protocols were simple to understand, easy to implement and easy to debug (because they were text-based). The addresses were unique, easy to read and easy to write. Clarity, ease of implementation and ease of use were critical in jump-starting the Web revolution. The success of a simple, easy-to-use approach is easy to spot today as well—look no further than Twitter, LinkedIn or Facebook.
The biggest thing missing from the early Web was the ability to write custom applications. It was simple to set up a server that would make HTML (and other) files available to the general public. But it was the invention of CGI—a standard protocol that allowed HTTP servers to communicate with external programs—that made it possible for programmers to write dynamic Web applications. The idea that the Web was a new application platform was a bit hard for many of us to swallow. I remember bristling at my title, “Web application developer”, when I worked at Time Warner in 1995, saying it was ridiculous to think that we were developing “real” software applications. Today, of course, Web applications have overtaken their desktop counterparts in many aspects.
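The CGI contract described above is simple enough to sketch in a few lines: the server hands the request to an external program through environment variables, and the program writes its response, headers first, to standard output. Here is a minimal illustration in Python (the variable names follow the CGI convention; the script itself is mine, not from the original column):

```python
#!/usr/bin/env python3
# Minimal CGI-style program: the Web server passes request details in
# environment variables (QUERY_STRING, REQUEST_METHOD, etc.) and reads
# the response -- headers, blank line, body -- from standard output.
import os
import sys

def handle_request(environ):
    # QUERY_STRING holds everything after the "?" in the URL.
    query = environ.get("QUERY_STRING", "")
    body = "Hello from CGI! Your query string was: %s\n" % query
    # A CGI response is just headers, a blank line, then the body.
    return "Content-Type: text/plain\r\n\r\n" + body

if __name__ == "__main__":
    sys.stdout.write(handle_request(os.environ))
```

The simplicity is the point: any language that can read environment variables and print to stdout can be a Web application, which is exactly why CGI opened the platform to so many programmers. The cost, as the column notes later, was performance, since the server launched a new process for every request.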
The Apache Web server was one of the most important contributors to Web development in a number of ways. It was one of the first well-known open-source projects that was clearly superior to any of its commercial competitors. (Did you even know that there was once a market for commercial HTTP servers?) Apache's power and flexibility convinced many large companies that they should cooperate and communicate with, and even contribute to, open-source projects that did not compete directly with their core businesses. If I remember my history correctly, I believe it was IBM's interest in donating money to Apache's development, but the developers' lack of any formal infrastructure that could accept the money (let alone sign a contract) that led to the development of the Apache Software Foundation, one of the most prominent players in the Open Source community today.
Apache also demonstrated the advantages of modular software design. Because Apache was intended to serve many different populations, its developers created it as a set of modules, each of which could be included or excluded from the final product, depending on a site's needs.
Finally, Apache made it possible to create custom Web applications without having to suffer from the performance problems associated with CGI programs or from the development time associated with writing custom HTTP-enabled applications. By writing your own module (in C), you could do just about anything, attaching your custom functionality to one or more of the hooks that Apache posted during an HTTP request's life span. Eventually, it became possible to write custom applications using Perl and Python, rather than just C—and anyone who moved from CGI programs in Perl to mod_perl benefited from a tremendous increase in both speed and flexibility.
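The hook mechanism is worth pausing on, because it is what made Apache modules so flexible. The following toy sketch (my own illustration in Python, not Apache's actual C API) shows the general idea: modules register callbacks for named phases of the request life span, and the server runs each phase's hooks in order for every request:

```python
# Toy illustration of hook-based request handling, in the spirit of
# Apache's module API (this is NOT Apache's real API). Modules attach
# callbacks to named phases; the server runs them for each request.
HOOKS = {"auth": [], "handler": [], "log": []}

def register(phase, func):
    """A module calls this to attach a callback to a request phase."""
    HOOKS[phase].append(func)

def serve(request):
    """Run every registered hook, phase by phase, for one request."""
    for phase in ("auth", "handler", "log"):
        for hook in HOOKS[phase]:
            result = hook(request)
            if result is not None:
                # A hook that returns a value produces the response.
                request["response"] = result
    return request.get("response")

# An example "module" attaching to the content-generation phase:
register("handler", lambda req: "Hello, %s" % req["path"])
```

Because the hooks run inside the server process, there is no per-request process launch, which is the essence of the speedup that mod_perl delivered over CGI.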
By the end of the 1990s, most people were using a relational database behind the scenes to keep track of their data, after discovering that text files just weren't fast or flexible enough to do the trick. Many of us used commercial databases, wishing that someday we could enjoy the power of SQL without having to fork over enormous amounts of money to a large corporation. And indeed, starting in the late 1990s, things began to improve, both in terms of open-source licensing and functionality. MySQL was re-issued under the GNU General Public License and started to move in the direction of ACID compliance, and PostgreSQL began to improve its usability, shedding such issues as a laughably small maximum tuple width.
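The advantage of a relational database over flat text files comes down to declarative, structured queries over typed, indexed data. A tiny example using SQLite (chosen purely for illustration, via Python's standard library; the databases of that era were MySQL and PostgreSQL) shows the shape of the win:

```python
# Why SQL beat flat text files: declare a schema, then let the
# database handle storage, indexing and sorting. SQLite is used
# here only because it needs no server to run.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)",
                 [("Alice",), ("Bob",)])
# One declarative line replaces hand-written file parsing and sorting.
rows = conn.execute("SELECT name FROM users ORDER BY name").fetchall()
```

With a text file, that `ORDER BY` would be a hand-rolled parse-and-sort in every program that touched the data; with SQL, the schema and the query engine do the work once, for everyone.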