At the Forge - Issue 200

Reuven reminisces about the past of the Web, describes its present state and makes some predictions for the future.

So, Linux Journal has reached issue 200! As many of you know, I've been writing for this magazine for much of that time. According to my count, this is my 168th monthly column. I started back in 1996, long before I got married, became a father or began my PhD studies. It's hard to remember a time before Linux Journal was a standard item on my monthly calendar.

When I look back over the years, it's amazing how many things have changed when it comes to Web technologies. And yet, so many things also have remained the same. This month, I celebrate this issue of the magazine with a bit of nostalgia, reminding you where we've been and describing where we're headed. Along the way, I discuss some of the topics I intend to address in the future in this space.

The Past

At least a few readers of this column presumably remember a time when the Web and Internet weren't ubiquitous. My children always are amazed to hear that I was one of the only kids in my grade to have a home computer. It's hard for them to understand that when my mother told us that we should look something up, she meant we should drive to the local public library, find books (in a paper card catalog) on the subject and search through those books to find the answer. Today, the Internet in general and the Web in particular are fixtures in our daily lives. But back in 1988, just after I started college, my friends gave me a funny look when I asked them if they had Internet e-mail addresses. When we put the MIT student newspaper on the Web in 1993, we had to tell people how to install a Web browser on their computers. All of this is clearly a thing of the past. If nothing else, it's hard to find an advertisement without a URL at the bottom inviting you to learn more.

After decades of discussion and development of hypertext systems, it wasn't necessarily obvious that the World Wide Web, the brainchild of Tim Berners-Lee, would become a major hit. And yet, to those of us who used it in those early days, the Web had a number of clear advantages over its competitors. It was easy to set up a server and site. The protocols were simple to understand, easy to implement and easy to debug (because they were text-based). The addresses were unique, easy to read and easy to write. Clarity, ease of implementation and ease of use were critical in jump-starting the Web revolution. The success of a simple, easy-to-use approach is easy to spot today as well—look no further than Twitter, LinkedIn or Facebook.

The biggest thing missing from the early Web was the ability to write custom applications. It was simple to set up a server that would make HTML (and other) files available to the general public. But it was the invention of CGI—a standard protocol that allowed HTTP servers to communicate with external programs—that made it possible for programmers to write dynamic Web applications. The idea that the Web was a new application platform was a bit hard for many of us to swallow. I remember bristling at my title, “Web application developer”, when I worked at Time Warner in 1995, saying it was ridiculous to think that we were developing “real” software applications. Today, of course, Web applications have overtaken their desktop counterparts in many respects.
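To give a flavor of just how simple the CGI model was (and still is), here is a minimal sketch in Python. The server runs the script as an external program, passes request details through environment variables and relays whatever the script prints back to the browser. The script itself is purely illustrative:

    #!/usr/bin/env python3
    # Minimal CGI sketch: the Web server sets request data in
    # environment variables, runs this program once per request
    # and sends its standard output back to the client.
    import os

    query = os.environ.get("QUERY_STRING", "")  # e.g. "name=Reuven"

    # HTTP headers come first, then a blank line, then the body.
    print("Content-Type: text/html")
    print()
    print("<html><body>")
    print("<p>Query string: %s</p>" % (query or "(empty)"))
    print("</body></html>")

The one-process-per-request model made this wonderfully easy to write and debug, and just as wonderfully slow under load—which is exactly the problem the Apache modules described below were meant to solve.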

The Apache Web server was one of the most important contributors to Web development in a number of ways. It was one of the first well-known open-source projects that was clearly superior to any of its commercial competitors. (Did you even know that there was once a market for commercial HTTP servers?) Apache's power and flexibility convinced many large companies that they should cooperate and communicate with, and even contribute to, open-source projects that did not compete directly with their core businesses. If I remember my history correctly, it was IBM's interest in donating money to Apache's development, combined with the developers' lack of any formal infrastructure that could accept the money (let alone sign a contract), that led to the creation of the Apache Software Foundation, one of the most prominent players in the Open Source community today.

Apache also demonstrated the advantages of modular software design. Because Apache was intended to serve many different populations, its developers created it as a set of modules, each of which could be included or excluded from the final product, depending on the site's needs.
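In practice, tailoring the server was largely a matter of editing its configuration. The following httpd.conf fragment is only a sketch; the exact module names and file paths vary across Apache versions and distributions:

    # Load only the functionality this site needs.
    LoadModule dir_module     modules/mod_dir.so
    LoadModule cgi_module     modules/mod_cgi.so
    # Leave out what you don't need, simply by not loading it:
    # LoadModule rewrite_module modules/mod_rewrite.so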

Finally, Apache made it possible to create custom Web applications without having to suffer from the performance problems associated with CGI programs or from the development time associated with writing custom HTTP-enabled applications. By writing your own module (in C), you could do just about anything, attaching your custom functionality to one or more of the hooks that Apache exposed during an HTTP request's life span. Eventually, it became possible to write custom applications using Perl and Python, rather than just C—and anyone who moved from CGI programs in Perl to mod_perl benefited from a tremendous increase in both speed and flexibility.
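As an illustration of what running inside the server looked like, here is a sketch of a content handler written for mod_python, the Python analogue of mod_perl. It assumes mod_python is installed and that the appropriate PythonHandler directive points at this file, which few sites configure today:

    from mod_python import apache

    def handler(req):
        # This function is attached to Apache's content-generation hook;
        # it runs inside the Apache process, with no per-request fork.
        req.content_type = "text/plain"
        req.write("Hello from inside Apache\n")
        return apache.OK

Because the interpreter stayed resident in the server, each request avoided the startup cost that made plain CGI so slow.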

By the end of the 1990s, most people were using a relational database behind the scenes to keep track of their data, after discovering that text files just weren't fast or flexible enough to do the trick. Many applications used commercial databases, and those of us who built them wished that someday we could enjoy the power of SQL without having to fork over enormous amounts of money to a large corporation. And indeed, starting in the late 1990s, things began to improve, both in terms of open-source licensing and functionality. MySQL was re-issued under the GNU General Public License and started to move in the direction of ACID compliance, and PostgreSQL began to improve its usability, shedding such issues as a laughably small maximum tuple width.
