Debugging Web Sites

Finally, Some Scripting

To identify these rewrite failures, I had to create a script—and fast. After all, while the internal links still might work, the thousands of external links from sites like Popular Science, the Wall Street Journal, Wired and elsewhere were now broken. Yikes—not good at all.

I started out on the command line with a URL that I knew failed. Here's what happened when I used curl to grab it on the new site:


$ curl http://www.askdavetaylor.com/
↪schedule-facebook-photo-upload-to-my-fan-page.html | head -5

% Total  % Received % Xferd  Average Speed  Time  Time  Time Current
                             Dload  Upload  Total Spent Left Speed
0     0  0    0     0     0      0     0 --:--:-- --:--:-- --:--:--
0<!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="UTF-8" />
<h3>Nothing found for
Schedule-A-Facebook-Photo-Upload-To-My-Fan-Page</h3>
100 31806   0 31806  0   0  110k  0 --:--:-- --:--:-- --:--:-- 110k
curl: (23) Failed writing body (0 != 754)

Ugh, what a mess this is, and it's not surprising, because I forgot to add the --silent flag when I invoked curl.
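
The fix is easy: --silent (or just -s) suppresses the transfer statistics and other chatter, so the same test looks like this:


$ curl --silent http://www.askdavetaylor.com/
↪schedule-facebook-photo-upload-to-my-fan-page.html | head -5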

Still, there's enough in that messy first dump to provide a big clue. It's a 404 error page, as expected, and the <h3> indicates just that:


<h3>Nothing found for ...

So that's an easy pattern to search for:


curl --silent URL | grep '<h3>Nothing found for'

That does the trick. If grep produces any output, the link failed and the server handed back its 404 page; if the link worked, the page carries the proper title of the article, the words "Nothing found for" never appear, and grep outputs nothing.
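
As an aside, if you'd rather test grep's exit status than capture its output, the -q flag does the same job. Here's the idea (the url variable is just for illustration):


# $url stands in for the rewritten link being tested
if curl --silent "$url" | grep -q '<h3>Nothing found for' ; then
  echo "* URL $url fails to resolve."
fi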

That's most of the needed logic for the script. The only other step is to simulate the rewrite rule so that all the links that do work aren't flagged as a problem. Easy:


newname="$(echo $name | sed 's/\.html/\//')"

This is a super-common sequence in my scripts: a command substitution $( ) echoes a variable's current value just to push it through a sed substitution, in this case replacing .html with a trailing slash. Both the literal dot and the replacement slash need a leading backslash (the slash because it's also sed's delimiter), hence the slightly cryptic pattern.
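
To see it in action, here's that same failing filename pushed through the expression:


$ echo schedule-facebook-photo-upload-to-my-fan-page.html \
    | sed 's/\.html/\//'
schedule-facebook-photo-upload-to-my-fan-page/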

Wrap this in a for loop that steps through all possible *.html files, and here's what it looks like:


for name in *.html ; do
  newname="$(echo $name | sed 's/\.html/\//')"
  test=$($curl $base/$newname | grep "$pattern")
  if [ -n "$test" ]
  then
    echo "* URL $base/$name fails to resolve."
  fi
done

That's a bit boring, though: while I'm at it, I'd like to know how many URLs were tested and how many errors were encountered. I mean, why not, right? Quantification = good.

It turns out to be easy, requiring only two new variables (both of which need to be set to zero at the top of the script):


for name in *.html ; do
  newname="$(echo $name | sed 's/\.html/\//')"
  test=$($curl $base/$newname | grep "$pattern")
  if [ -n "$test" ] ; then
    echo "* URL $base/$name fails to resolve."
    error=$(( $error + 1 ))
  fi
  count=$(( $count + 1 ))
done

Then at the very end of the script, after all the specific errors are reported, a status update:


echo ""; echo "Checked $count links, found $error problems."

Great. Let's run it:


$ bad-links.sh | tail -5

* URL http://www.askdavetaylor.com/whats_a_fast_way_to_add_a_
↪store_and_shopping_cart_to_my_site.html fails to resolve.

* URL http://www.askdavetaylor.com/whats_amazons_simple_
↪storage_solution_s3.html fails to resolve.

* URL http://www.askdavetaylor.com/whats_my_yahoo_
↪account_password_1.html fails to resolve.

* URL http://www.askdavetaylor.com/youtube_video_
↪missing_hd_resolution.html fails to resolve.

Checked 3658 links, found 98 problems.

Phew. Now I know the special cases and can apply custom 301 redirects to fix them. By the time you read this article, all will be well on the site (or better be).
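
For sites running Apache, each of those special cases then gets its own one-line rule in the .htaccess file. Here's the general shape, using one of the URLs flagged above and a made-up destination:


# one custom redirect per stubborn URL; the destination shown
# here is only a placeholder for the page's real new address
Redirect 301 /whats_my_yahoo_account_password_1.html
 ↪http://www.askdavetaylor.com/some-new-address-for-that-page/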

______________________

Dave Taylor has been hacking shell scripts for over thirty years. Really. He's the author of the popular "Wicked Cool Shell Scripts" and can be found on Twitter as @DaveTaylor and more generally at www.DaveTaylorOnline.com.
