Work the Shell - Special Variables I: the Basics

Dave begins a new series of columns on shell variable notation.

There I was, trying to come up with a topic for this column, when I did what I usually do when stumped: I sent a question out to my Twitter followers. This time, I got a great answer, from John Minnihan: “How about how special vars inside a script, for example, #!/bin/bash script="${0##*/}" current=`dirname "$0"` cd $current; make?”

That's a good topic, so let's dig into it, starting with the basics this month, shall we?

The Easy Special Variables

The basic notation of variables in the shell is $varname, but I bet you've already used a few special notations without really thinking about it. For example, want to know how many positional parameters (aka starting arguments) you received when the script was invoked? Using $# gives you the value:

echo "you gave me $# parameters"

Want to get a specific positional parameter from the starting command line? That's done with other special variables: $1, $2, $3 and so on. These are rather odd cases actually, and the shift command shifts them all down one, so you easily can parse and trim command flags.

Try this snippet to see what I'm talking about:

echo "arg1 = $1" ; shift ; echo "now arg1 = $1"
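To see how shift helps with flag parsing, here's a minimal sketch of a loop that consumes flags from the front of the argument list; the flag names -v and -o are purely illustrative:

```shell
#!/bin/sh
# Sketch: walk the argument list with shift, consuming flags as we go.
# The flags -v and -o are just made up for this demo.
set -- -v -o out.txt file1 file2   # simulate a starting command line
verbose=0
output=""
while [ $# -gt 0 ] ; do
  case "$1" in
    -v) verbose=1 ;;
    -o) shift ; output="$1" ;;     # a flag that takes a value
    *)  break ;;                   # first non-flag argument: stop parsing
  esac
  shift
done
echo "verbose=$verbose output=$output remaining=$#"
```

After the loop, only the non-flag arguments remain in $1, $2 and so on, which is exactly what you want before the real work of the script begins.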

The variable $0 is a special one in this sequence. It's the name of the script or program invoked. This can be quite helpful, because it means you can add multiple commands to your Linux shell with a single shared script base.

Let's say that you want to add “happy” and “sad” as two new command-line options, but you want to do it within a single script. Easy! Write the script, save it as “happy”, create a symbolic link so that “sad” points to “happy”, and put this in the script itself:

if [ "$0" = "happy" ] ; then
  echo "I am so darn happy too, hurray!"
else
  echo "Sorry you're sad. Why not take a walk?"
fi

See how that works? It turns out that there's a nuance to this usage, however: you often get the full path in the $0 variable, so most people use $(basename $0) instead of $0 directly.
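Here's a quick sketch of that fix, using a hard-coded path as a stand-in for a typical $0 value:

```shell
#!/bin/sh
# Sketch: basename strips the leading path, so the name comparison
# works no matter how the script was invoked.
script=/usr/local/bin/happy        # stand-in for a typical $0 value
name="$(basename "$script")"
echo "$name"                       # prints: happy
if [ "$name" = "happy" ] ; then
  echo "I am so darn happy too, hurray!"
fi
```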

Checking Your Status

Another special variable that you might have encountered is the status variable, $?. In a script, this contains the return value of the most recently executed (external) command.

This is where you need to read man pages so you know what to expect on success and failure, but as an example, consider the test command. According to the man page, “if [the expression] evaluates to true, it returns a zero (true) exit status; otherwise it returns 1 (false). If there is no expression, test also returns 1 (false).”

This means you could do this:

test 1 -eq 3
if $? ; then

Quick, now, would we be within this conditional statement or not? That's where it's tricky, because in the shell, zero = true (success) and nonzero = false (failure), which is opposite to how we naturally think of conditional tests (well, how I think of them, at least). In fact, the above would end up evaluating 1: the test fails, so $? is set to 1, and that value is itself false, so we would not enter the conditional.

Now, using test like this is a sort of daft example, but what if you wanted to create a subdirectory and then test to see if it was successful? That's a perfect use for $?, actually:

mkdir "$newdir"
if [ $? -ne 0 ] ; then
   echo "We failed to make the directory $newdir"
fi

It turns out that you also can streamline this sort of thing by having the “if” directly evaluate the return code:

if mkdir "$newdir" ; then

That's a better coding style, although it can be confusing if you are used to having conditional expressions be value tests, not actually commands that do something.
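Putting that style together with an else branch, a sketch might look like this (the directory name is made up for the demo):

```shell
#!/bin/sh
# Sketch: "if" evaluates mkdir's exit status directly, so success
# and failure branches read naturally.
newdir="/tmp/demo.$$"              # hypothetical directory name
if mkdir "$newdir" ; then
  echo "created $newdir"
else
  echo "We failed to make the directory $newdir" >&2
  exit 1
fi
rmdir "$newdir"                    # clean up after the demo
```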

A Few More Useful Special Variables

A special variable that I use with great frequency for helping create temporary file names is $$, which expands to the current process ID in the system. For example:

$ echo $$
3243
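A common sketch of the temporary-file trick looks like this (the filename is hypothetical, and mktemp is the more robust modern alternative):

```shell
#!/bin/sh
# Sketch: $$ (the shell's process ID) makes a filename unique enough
# for casual use, since two scripts can't share a PID at once.
tmpfile="/tmp/myscript.$$"         # hypothetical temp-file name
echo "scratch data" > "$tmpfile"
cat "$tmpfile"                     # prints: scratch data
rm -f "$tmpfile"                   # always clean up your temp files
```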

If you're doing a lot with subshells or spawning subcommands, another useful variable is $!, which is the process ID of the most recently spawned background command. I've never used this in any of my shell scripts, but you might find a situation where it's helpful.
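If you do want to try it, a minimal sketch pairs $! with wait:

```shell
#!/bin/sh
# Sketch: $! captures the PID of the most recently launched
# background job, which you can later hand to wait or kill.
sleep 1 &                          # spawn a background command
pid=$!
echo "background job is process $pid"
wait "$pid"                        # block until that specific job exits
echo "job $pid finished"
```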

The last example I'll talk about here is most useful when you want to hand starting parameters to subshells. The two options are $* and $@, and it's so convoluted to explain the difference that it's easier just to demonstrate.

Let's start with a tiny script that simply reports how many parameters it's given:

#!/bin/sh
echo "I was given $# parameters"
exit 0

I'll call that subshellcount.sh and utilize it like this:

#!/bin/sh
echo "you gave me $# variables and the first is $1"
echo "unprotected parameters:"
./subshellcount.sh $1 $2 $3 $4
echo "or, more succinctly:"
./subshellcount.sh $*
echo "but when we put \$* in quotes:"
./subshellcount.sh "$*"
echo "by comparison, same thing with \$@:"
./subshellcount.sh "$@"

Watch what happens when I invoke it with three parameters, one of which has a space embedded:

$ sh test.sh I love "Linux Journal"
you gave me 3 variables and the first is I
unprotected parameters:
I was given 4 parameters
or, more succinctly:
I was given 4 parameters
but when we put $* in quotes:
I was given 1 parameters
by comparison, same thing with $@:
I was given 3 parameters

Can you see the difference here? When we don't take steps to protect the space in the third positional parameter (either by referencing $3 directly or by using $@ without quotes), it splits into two parameters in the subshell, and we get a count of four.

Quoting by itself doesn't do the trick either, because of the difference between $@ and $*. With the latter, everything expands without “breaking out of” the quotes, so $* ends up being a single positional parameter to the subshell. Fortunately, $@ works exactly as we'd like, and the subshell gets three parameters, not one, not four.

It seems a bit trivial, but when you start working with filenames that have spaces in them, for example, you quickly will learn just how tricky it is to get all of this correct!
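The same rule matters inside your own loops too. Here's a sketch that simulates the same three starting parameters with set -- and walks them with a quoted "$@":

```shell
#!/bin/sh
# Sketch: quoting "$@" in a for loop keeps each parameter intact,
# spaces and all, just as it does when handing them to a subshell.
set -- I love "Linux Journal"      # simulate three starting parameters
count=0
for arg in "$@" ; do
  count=$((count + 1))
  echo "parameter $count: $arg"
done
```

With for arg in $* instead, the loop would run four times, splitting “Linux Journal” into two words.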

I'm going to stop here, and starting next month, we'll delve into the more obscure and complex shell variable notation. It's interesting stuff.

Dave Taylor is a 26-year veteran of UNIX, creator of The Elm Mail System, and most recently author of both the best-selling Wicked Cool Shell Scripts and Teach Yourself Unix in 24 Hours, among his 16 technical books. His main Web site is at www.intuitive.com, and he also offers up tech support at AskDaveTaylor.com. You also can follow Dave on Twitter through twitter.com/DaveTaylor.

