ATF Jubilee Edition

To celebrate his 50th column, Reuven casts his vision to the future of web development and suggests some current trends that will affect how the job gets done.

Welcome to the jubilee installment of At the Forge. This is the 50th column that I have written for Linux Journal (or for SSC's short-lived Websmith magazine) since early 1996. Over the last few years, we have explored a large number of Web-related technologies, techniques and applications, ranging from simple CGI programs to sophisticated database-backed applications written using mod_perl.

This month, I want to spend a bit of time prognosticating, looking into the future of web application development. On the one hand, things have never been more exciting for web application developers; the technology continues to advance at a remarkable rate, making it easier and easier to create sophisticated applications. At the same time, the increasingly crowded field of embedded programming languages, application servers and database adaptors makes it harder to decide which technology is most appropriate.

Because this column describes where I believe web technologies and application development are headed in the coming years, it should also serve as a sort of guideline for what future issues of ATF will contain. You can think of this month's installment as an indication of where my consulting firm is headed professionally, and thus what you can expect me to suggest and describe in the year (or more!) ahead. Since this is Linux Journal and Linux is my company's primary server platform, I will focus here on items that run with Linux and, preferably, those that are free software.

Where Have We Been?

Web application development began soon after the Web itself was formed. Ever since the first dynamically generated content was sent to the first browser—an act which predates the CGI standard, to say nothing of Netscape, Internet Explorer and Apache—programmers have been designing increasingly sophisticated applications for use on the Web.

CGI, or the “common gateway interface”, soon arrived on the scene. CGI got its name because dynamically generated content was originally a means of giving a web interface to non-web applications. With the advent of CGI, it was suddenly possible to create portable server-side programs. Most web applications continue to be written using CGI, because of its simplicity and extreme platform independence, as well as the fact that web space providers can give their clients CGI access without endangering the server's stability.

You can write a CGI program for any web server, in any language, on any operating system, and be virtually guaranteed that it will work. However, CGI has a number of drawbacks. In particular, it requires that the web server spawn a new process for each HTTP request aimed at a CGI program. In other words, a web site that receives 100 hits/minute is spawning more than one new process every second.

By itself, this should not scare you. After all, a basic Linux box should be able to handle the creation of one new process each second, right? However, the size of the new process, as well as the speed with which it starts up, are both important factors.

Perl, my programming language of choice for the last few years, has proven itself as a powerful means for creating CGI programs. The CGI.pm module provides an amazing array of functions that do nearly everything you would ever want from a CGI program (as well as a number of things that I would never consider doing). Moreover, Perl includes a powerful pattern-matching engine, along with modules that handle most popular Internet standards and protocols. The DBI (database interface) module has proven to be an additional boon, making it easy to include the output from an SQL query in a dynamically generated page.
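
For instance, a minimal sketch of such a CGI program (assuming a hypothetical PostgreSQL database named "test" that contains a "users" table with "username" and "email" columns) might look like this:

    #!/usr/bin/perl -wT

    use strict;
    use CGI;
    use DBI;

    # Connect to the (hypothetical) database; RaiseError aborts on failure
    my $dbh = DBI->connect("DBI:Pg:dbname=test", "username", "password",
                           {RaiseError => 1});

    my $query = new CGI;
    print $query->header("text/html");
    print $query->start_html(-title => "Registered users");

    # Retrieve each row from the users table and display it as a list item
    my $sth = $dbh->prepare("SELECT username, email FROM users");
    $sth->execute;

    print "<ul>\n";
    while (my ($username, $email) = $sth->fetchrow_array)
    {
        print "<li>$username ($email)</li>\n";
    }
    print "</ul>\n";

    $sth->finish;
    $dbh->disconnect;
    print $query->end_html;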

However robust, flexible and secure Perl might be, the CGI standard was never designed for producing a large volume of dynamically generated pages on the fly. Each invocation of a CGI program written in Perl forces the computer to create a new process, load Perl into memory, load your program into memory, compile your program into Perl's internal opcodes and then, finally, interpret it using the Perl run time mechanism. This all takes time and means that CGI programs will not scale well over the long term. Indeed, it does not take a lot of concurrently running CGI programs to bring a typical server to its knees.

At the same time, CGI has been successful because it's so easy to use. With no other API can you write a “hello, world” program as simple as the following:

    #!/usr/bin/perl -wT

    use strict;
    use CGI;

    # Create a CGI object and emit a minimal HTML page
    my $query = new CGI;
    print $query->header("text/html");
    print $query->start_html;
    print "<P>Hello, world!</P>\n";
    print $query->end_html;