Tcl/Tk: The Swiss Army Knife of Web Applications

Tcl/Tk offers many uses to the web programmer. Mr. Schongar describes a few of them.
Processing User Data

Since you'll want to do more than just print out some environment variables or static data, here's how to process incoming data in Tcl. The first step involves getting access to the environment variables, which we've covered. This will tell us whether the data is coming in by the GET method, stored in the QUERY_STRING environment variable, or as a POST, stored in standard input. Let's find out using the slightly modified version of our previous program shown in Listing 2. Yes, this program is a lot longer. The extra length, however, comes from dealing with getting the user input, formatting it and storing it in an array. Once I explain how those parts work, I'll show you a much shorter way of writing this program.
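The GET-versus-POST decision can be sketched in a few lines of Tcl. This is an illustrative helper, not the exact code from Listing 2; env is Tcl's global array of environment variables:

```tcl
# Fetch the raw, still-encoded CGI data. POST data arrives on
# standard input (CONTENT_LENGTH bytes); GET data arrives in
# the QUERY_STRING environment variable.
proc readCgiData {} {
    global env
    if {[info exists env(REQUEST_METHOD)] &&
            $env(REQUEST_METHOD) == "POST"} {
        return [read stdin $env(CONTENT_LENGTH)]
    } elseif {[info exists env(QUERY_STRING)]} {
        return $env(QUERY_STRING)
    }
    return ""
}
```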

The parse (cgiParse) and decoding (urlDecode) procedures are taken from Brent Welch's Practical Programming in Tcl/Tk, with minor modifications. The parse routine is reasonably straightforward. It determines whether the data is stored in QUERY_STRING or standard input, then stores it into a variable called text.
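As a rough sketch of the parsing step (a simplified variant, with the raw data passed in as an argument and the urlDecode procedure assumed from Listing 2), the heart of cgiParse boils down to one split and one loop:

```tcl
# Store decoded name/value pairs from raw CGI data into the
# global array named by arrayName. Pairs are joined by & and
# each name is joined to its value by =, so splitting on both
# characters yields an alternating name/value list.
proc cgiParse {text arrayName} {
    upvar #0 $arrayName cgi
    foreach {name value} [split $text &=] {
        set cgi([urlDecode $name]) [urlDecode $value]
    }
}
```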

Special characters can cause problems for CGI programs, so the browser encodes percent signs and slashes as their hexadecimal equivalents and spaces as plus signs before sending form data. Your program must convert the data back to its original form. Doing that in C and other languages can be tedious, but Tcl makes it fairly easy. As you can see, each time data is processed, whether for the name or the value of a variable, it is sent to urlDecode. There, the regsub command works its magic. To use it, specify (in this order) a search pattern, the original data, its replacement and the variable in which to store the result.
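A decoder along these lines (a sketch in the spirit of Welch's version, not a verbatim copy) shows regsub at work:

```tcl
# Reverse www-url-encoding: + becomes a space, and each %XX
# becomes the character with that hexadecimal value.
proc urlDecode {data} {
    # Plus signs stand in for spaces.
    regsub -all {\+} $data { } data
    # Backslash-protect characters that subst would interpret.
    regsub -all {([][$\\])} $data {\\\1} data
    # Turn %XX into a [format] command, then let subst run it.
    regsub -all {%([0-9a-fA-F][0-9a-fA-F])} $data \
        {[format %c 0x\1]} data
    return [subst $data]
}
```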

Notice that the foreach loop at the end of the cgiParse procedure is doing two things that our previous foreach loop didn't do. First, it is specifying more than one temporary value for use in each loop. That is, the first time through the loop, element 1 of the list will be stored in name, and element 2 will be stored in value. The next loop will use elements 3 and 4, and so on. Any number of variables can be specified in this way, which makes processing long lists a breeze. The second thing it is doing differently is directly using the results of a command as the list, rather than creating a variable to hold the list first. You can do it either way, but in this example you save a fractional bit of processing time and memory.
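For example, a two-variable foreach walks a flat list in pairs:

```tcl
foreach {name value} {color blue size large qty 3} {
    puts "$name = $value"
}
# Prints:
#   color = blue
#   size = large
#   qty = 3
```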

Once the parsing and decoding libraries have been laid out, the program starts its run. The content header is sent to the browser, and cgiParse is run in order to store all user-entered values (from a form or some other way) into the variable array cgi. Then it loops through each element in the cgi array and prints out the names and values of all the elements.

One benefit of the way the parsing functions are set up is that you can test user-input values on the command line. Since cgiParse doesn't rely on finding a GET or POST method, it takes the data wherever it can find it, defaulting to the command line. So you can easily test your CGI script before uploading it to the server, without having to build an elaborate wrapper to set environment variables.
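For instance, assuming your script is saved as mycgi.tcl (a hypothetical name), you could run it directly with name=value pairs as arguments:

```
tclsh mycgi.tcl name=Fred color=blue
```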

Function Libraries—Yours and Everyone Else's

Tcl procedures, or procs, are your subroutines. If you have created some procs, you can easily put them in their own Tcl script, then use the source command to load those scripts so they are ready for use. To keep your code to a minimum, you may want to use the cgiParse and urlDecode routines shown in Listing 2. If you saved them as “cgistuff.tcl”, you could rewrite the script in Listing 2 as:

source cgistuff.tcl
puts "Content-type: text/html\n"
cgiParse
foreach foo [array names cgi] {
   puts "Variable: $foo Value: $cgi($foo)"
}
The source command loads and executes a Tcl script, so be careful that you don't have any unwanted commands hiding in that script outside of a procedure.

Before you go off writing too many of your own procedures, though, you'll want to take a look at what is already available. A lot of talented people have put time and effort into writing well-documented, very functional procedure libraries, such as Don Libes' cgi.tcl library, which covers everything from basic parsing to cookies and file uploads. (See Resources.)


White Paper
Linux Management with Red Hat Satellite: Measuring Business Impact and ROI

Linux has become a key foundation for supporting today's rapidly growing IT environments. Linux is being used to deploy business applications and databases, trading on its reputation as a low-cost operating environment. For many IT organizations, Linux is a mainstay for deploying Web servers and has evolved from handling basic file, print, and utility workloads to running mission-critical applications and databases, physically, virtually, and in the cloud. As Linux grows in importance in terms of value to the business, managing Linux environments to high standards of service quality — availability, security, and performance — becomes an essential requirement for business success.

Learn More

Sponsored by Red Hat

White Paper
Private PaaS for the Agile Enterprise

If you already use virtualized infrastructure, you are well on your way to leveraging the power of the cloud. Virtualization offers the promise of limitless resources, but how do you manage that scalability when your DevOps team doesn’t scale? In today’s hypercompetitive markets, fast results can make a difference between leading the pack vs. obsolescence. Organizations need more benefits from cloud computing than just raw resources. They need agility, flexibility, convenience, ROI, and control.

Stackato private Platform-as-a-Service technology from ActiveState extends your private cloud infrastructure by creating a private PaaS to provide on-demand availability, flexibility, control, and ultimately, faster time-to-market for your enterprise.

Learn More

Sponsored by ActiveState