Scripting GNU in the 21st Century

Scripting in the GNU environment and parsing HTML in bash.

Most people encounter shell scripts as attempts to write portable utilities that run properly on most, if not all, UNIX workalikes. Instead of making the best use of the shell and related programs, these scripts restrict themselves to the lowest common denominator of portable features. The extreme example of this style of programming is the configure scripts generated by the autoconf program.

But this is 2004, and the full GNU environment is now commonplace. The advanced shells and utilities now are the default user environment for GNU/Linux systems, and they are available as install options on BSD-based systems. Even proprietary UNIXes often have complete sets of GNU software laid atop them to bring them up to date with the modern world. Because GNU software can be obtained at little or no cost, there is no excuse to continue scripting in a retrograde proprietary environment.

The BART Script

I live in the San Francisco Bay Area, mere walking distance from one of the Bay Area Rapid Transit (BART) stations. I do not drive, so I rely on the system for my trips downtown. The BART Web site offers an on-line trip planner, but I perform the same sort of query often and find the interface less convenient than a command-line script.

In order to save time, I decided to write a shell script that would fetch the train arrival information for my station and display it in a colored ASCII table on stdout. It should accept station codes for any arbitrary trip but use defaults specified in a per-user configuration file. I did not want to write the schedule analysis code, so I decided to perform a screen scrape of the BART trip planner. wget would submit the trip planner form, and the resulting Web page would be formatted with various tools.

Getting Started

The first line of most shell scripts begins with #!/bin/sh, which causes the script to be interpreted by the venerable old Bourne shell. This is used largely because classic Bourne is the only shell guaranteed to be on all UNIX and UNIX-like systems. Because this script is designed to work in a modern GNU system, it begins with #!/bin/bash. Users of BSD systems may wish to change it to #!/usr/local/bin/bash or perhaps #!/usr/bin/env bash.

Using bash instead of the classic Bourne shell provides us with some useful features we can put to good use in a moment. For one thing, bash allows us to break down our script using functions. The bash string manipulation routines also can save us some time by performing operations in-line that otherwise would have to be fed into an external sed or awk process.
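As a quick illustration of those in-line string operations (the station variable here is invented for the example, not taken from the script):

```shell
#!/bin/bash
# In-line string manipulation with bash parameter expansion --
# no external sed or awk process required.
station="ROCKR:Rockridge"	# hypothetical "code:name" pair

code=${station%%:*}	# remove longest match of ":*" from the end -> ROCKR
name=${station#*:}	# remove shortest match of "*:" from the front -> Rockridge
len=${#name}	# string length -> 9

echo "$code $name $len"
```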

Any good program has configuration files, so we set up traditional rc files. The rc at the end of many configuration filenames stands for run commands, and it refers to the fact that the configuration file is sourced like a script.


	test -r /etc/bartrc && source /etc/bartrc
	test -r ~/.bartrc && source ~/.bartrc

We also should set the default departure and arrival station codes to Rockridge and Embarcadero, respectively. We use the compact bash syntax for an alternate value if a variable is undefined, so users can set the BARTSTART and BARTDEST variables in their own environments if they like.


	BARTSTART=${BARTSTART:-ROCKR}
	BARTDEST=${BARTDEST:-EMBAR}

Functions

The first function we write is the basic usage guideline message, which helps guide development of the rest of the program.


	function usage {
		echo "Usage:"
		echo " $(basename $0) [-hl] [ [<source>] <destination> ]"
		echo "  To change defaults, set the BARTSTART and BARTDEST"
		echo "  variables in your ~/.bartrc"
		echo
		echo "Flags:"
		echo "  -l, --list	List station codes with names"
		echo "  -h, --help	This message"
	}

We now have a simple usage command available that prints out the argument format for the script. Notice that we used $(basename $0) to determine the filename of the script automatically. We also allow an optional destination station code as an argument, which may be preceded by an optional departure station code.
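The article does not show the argument-parsing loop at this point, but a minimal sketch consistent with the usage message might look like the following (the liststations function is assumed to be defined elsewhere in the script):

```shell
#!/bin/bash
# Sketch of an option loop matching the usage message above; this is
# an assumption, not the article's actual parsing code.  usage and
# liststations are assumed to be defined elsewhere in the script.
while [ $# -gt 0 ]; do
	case "$1" in
		-h|--help) usage; exit 0 ;;
		-l|--list) liststations; exit 0 ;;
		-*)        usage; exit 1 ;;
		*)         break ;;
	esac
	shift
done

# One remaining argument names the destination; two name the
# departure and destination stations, in that order.
case $# in
	1) BARTDEST=$1 ;;
	2) BARTSTART=$1; BARTDEST=$2 ;;
esac
```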

Grabbing the Form Output

HTTP has two methods for submitting selections to a form, GET and POST. The POST method is the more powerful of the two, but the GET method allows us to specify values in the URL itself. This makes the GET method more convenient for scripting, because we can specify all relevant form fields as an argument to a simple tool, such as wget.

First, we set up the base URL to the form, specifying options to minimize the amount of formatting around the data.


	baseurl="http://bart.gov/textonly/stations/schedule.asp?ct=1&format=quick&print=yes"

Looking at the form's HTML source code, we determine which fields have which names and begin to construct additions to the above URL. The date we're interested in is the current moment, and we use the date command's own formatting options to construct the date and time portion of the form.


	date_now=$(date +"&time_mode=departs&depart_month=%m&depart_date=%d&depart_time=%I:%M+%p")

The $( ... ) syntax is simply a more explicit version of the backticks, allowing us to use the output of a command as part of a line of shell code.
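Unlike backticks, the $( ... ) form also nests without backslash escapes, which matters once command substitutions get more involved:

```shell
#!/bin/bash
# Command substitution: backticks and $( ... ) produce the same
# result, but only $( ... ) nests cleanly.
kernel=`uname -s`	# old-style backticks
kernel=$(uname -s)	# modern form, same result

# Nesting with backticks would require backslash escapes.
parent=$(basename $(dirname /tmp/bart/schedule))	# -> bart
echo "$kernel $parent"
```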

Next, we use the BARTSTART and BARTDEST variables to enter the stations in which we are interested.


	stations="&origin=${BARTSTART}&destination=${BARTDEST}"

Then, we use the wget utility to submit the form, redirecting all warning messages to /dev/null so as not to confuse our script. The full function looks like this:


	function submitform {
		baseurl="http://bart.gov/textonly/stations/schedule.asp?ct=1&format=quick&print=yes"
		date_now=$(date +"&time_mode=departs&depart_month=%m&depart_date=%d&depart_time=%I:%M+%p")
		stations="&origin=${BARTSTART}&destination=${BARTDEST}"

		wget -O - -o /dev/null ${baseurl}${date_now}${stations}
	}
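The remainder of the script formats this page for display. As a rough illustration only, and not the author's actual formatting code, the returned HTML can be reduced toward plain text by stripping tags and blank lines with sed:

```shell
#!/bin/bash
# Rough illustration only -- not the article's actual formatting
# code.  Strip HTML tags and drop blank lines from a page on stdin.
function striptags {
	sed -e 's/<[^>]*>//g' -e '/^[[:space:]]*$/d'
}

# Typical use:  submitform | striptags
echo '<td>12:34 PM</td>' | striptags
```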

______________________

Comments


Re: Scripting GNU in the 21st Century


It should be noted that the LSB is POSIX-based.

Re: Scripting GNU in the 21st Century


Overall, for someone learning bash, this is probably a reasonable example. I have many similar ones myself (grabbing satellite wildfeed data, for example).

However, as a means of introducing a newcomer to bash, or as a convincing description of why the newcomer should be using bash, I feel it falls short.

For example, some simple timing shows that his "$(basename $0)" construct is almost 100x slower than using "${0##*/}", although he does use another version of the same construct later, meaning that he is aware of it!
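The equivalence the commenter relies on is easy to check: ${0##*/} is a parameter expansion handled inside the shell, while $(basename $0) forks a subshell and an external process for every call.

```shell
#!/bin/bash
# ${var##*/} strips the longest leading match of "*/" -- the
# directory part -- inside the shell, with no fork to basename.
script=/usr/local/bin/bart
[ "${script##*/}" = "$(basename "$script")" ] && echo "same result"

# Compare the cost of 1000 of each with the shell's builtin 'time':
time for i in $(seq 1000); do : "${script##*/}"; done
time for i in $(seq 1000); do : "$(basename "$script")"; done
```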

His repeated use of the backslash character as a line continuation does not improve the readability of the script; in fact, it makes it worse. Leave it out! Yes, it still works -- if the line ends with a character which indicates there's more needed, the line continuation character is redundant. The pipe symbol (vertical bar) is such a character.

In general, all variable usage should be enclosed in double quotes (ie, "Dollar signs in Double quotes"). This technique is only wrong 1 time out of 100, so the programmer will be correct 99% of the time. :) Yes, it may mean the double quotes are redundant in some cases, but there's a lot to be said for consistency, and hence, readability. Only when dealing with word splitting (where you want the word split), will the double quotes be incorrect.
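The commenter's quoting rule in action (the variable here is invented for the example):

```shell
#!/bin/bash
# Unquoted expansions undergo word splitting; double quotes stop it.
station="12th Street"

set -- $station		# unquoted: splits into two words
echo $#			# -> 2

set -- "$station"	# quoted: remains one word
echo $#			# -> 1
```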

He appears to use brackets in "if" statements when the POSIX (?) technique would be double parentheses (brackets are for string comparisons, parens are for numeric comparisons, and the doubled form of each is recommended since they turn off I/O redirection, wildcarding, and word splitting). Maybe he doesn't know?
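For reference, the doubled forms the commenter mentions are bash extensions rather than POSIX sh: [[ ]] handles string and pattern tests, (( )) handles arithmetic, and neither word-splits or globs its operands.

```shell
#!/bin/bash
# [[ ]] for string/pattern tests, (( )) for arithmetic -- both are
# bash extensions, and neither word-splits or globs its operands.
station="EMBAR"
count=3

[[ $station == EMB* ]] && echo "pattern match"	# glob match, no quoting needed
(( count > 1 )) && echo "arithmetic true"

# The classic single bracket needs careful quoting instead:
[ "$station" = "EMBAR" ] && echo "string equal"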

Lastly, the "liststations" function seems overly complex to me. First, is it really necessary to specify the entire XPath all the way from the "html" element down to the "select" element and its attribute? I haven't seen the data, but I'd be willing to bet that just the "select" element and attribute would be enough (since select is non-functional outside of forms anyway, and the attribute specifies the name of the select element!).

Regardless of whether simplification is possible, change the delimiter of the regexpr! Use something other than a slash and avoid LTS ("leaning toothpick syndrome", per Larry Wall). With the text thus cleaned up visually, maybe let sed also do the elimination of text up to and including the equals sign. That eliminates the need for cut, although it may hurt readability.

Additionally, sed can also replace the while loop: tell sed to match on the "select" element and attribute, then read three more lines into the hold space, appending them to what's already there. Now run a substitution on the hold space and print the result. (Or if you're not comfortable with sed, use awk, and you still eliminate the cut and the while loop.) YMMV.

Overall, I agree with the other comment posted here that the standard for scripts should be POSIX, not a particular tool. Of course, POSIX has its own problems (quite a few, actually!), but that's a decision that individual organizations need to make: portability vs. speed/usability.

The sh POSIX standard


The sha-bang ( #!) at the head of a script tells your system that this file is a set of commands to be fed to the command interpreter indicated. The #! is actually a two-byte [1] "magic number", a special marker that designates a file type, or in this case an executable shell script (see man magic for more details on this fascinating topic). Immediately following the sha-bang is a path name. This is the path to the program that interprets the commands in the script, whether it be a shell, a programming language, or a utility. This command interpreter then executes the commands in the script, starting at the top (line 1 of the script), ignoring comments. [2]

#!/bin/sh
#!/bin/bash
#!/usr/bin/perl
#!/usr/bin/tcl
#!/bin/sed -f
#!/usr/bin/awk -f

Each of the above script header lines calls a different command interpreter, be it /bin/sh, the default shell (bash in a Linux system) or otherwise. [3] Using #!/bin/sh, the default Bourne Shell in most commercial variants of Unix, makes the script portable to non-Linux machines, though you may have to sacrifice a few Bash-specific features. The script will, however, conform to the POSIX [4] sh standard.

REF:

1) Advanced Bash-Scripting Guide
2) The Single UNIX Specification, Version 3

The Horror


While it's a neat hack to parse HTML using bash, and I respect the author's significant contributions to Free Software (LNX-BBC, GAR - "We're not worthy!"), isn't this really a sign that scripting activities on GNU/Linux (and UNIX systems, if you must) should really be employing proper languages like Python and [insert favourite "agile" language here]?

Re: The Horror


No. These days anything goes and Bash is appropriately qualified. Respect is due to people who enjoy time-tested languages.

Re: Scripting GNU in the 21st Century


Somebody go tell those who are making the 'GNU' autoconf and automake??

Re: Scripting GNU in the 21st Century


.. of course, the entire point of GNU autotools is to enable you to code programs such that they compile regardless of what is and isn't available on the build, host and target platforms, and if we start assuming they're fully GNU compatible it sort of defeats the point a bit, no?

