An Ajax-Enhanced Web-Based Ethernet Analyzer

Combine Ruby, Ajax and bash with CGI scripts to monitor server-bound processes.

Running dns-watcher.rb

It's time to give dns-watcher.rb a spin:

sudo ruby dns-watcher.rb

The output from one such invocation is shown in Figure 1. Note that there are not 50 lines of output, as might be expected. Remember, the program's if statement checks to see whether the captured DNS message is a query going to the server and processes the message only if it is. All other DNS messages are ignored by the program, even though they still contribute to the overall count of DNS packets processed.
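In outline, the decision the capture loop makes looks something like the fragment below. This is a paraphrase of the logic just described, not the published listing; the names used here (dns, packet, SERVER_IP and count) and the helper methods borrowed from the ruby-pcap and net-dns conventions are assumptions:

if dns.header.query? and packet.ip_dst.to_s == SERVER_IP
  # a query headed for our DNS server -- report it
  puts "#{count}: DNS query from #{packet.ip_src}"
end
count += 1   # every captured DNS packet still counts toward NUMPACKETS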

Figure 1. Running dns-watcher.rb from the Command Line

To run the analyzer for a longer period of time, change the NUMPACKETS constant to some value greater than 50. As shown in Figure 1, it took the analyzer just over 40 seconds to process 50 DNS messages (on my PC, on my network segment; your mileage will vary). It is not unreasonable to assume that changing the constant value to something like 250 could result in several minutes of processing. Obviously, piping the output to a disk file or to less allows you to review any results at your leisure.
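For example, either of the following does the trick (the watcher.log filename here is just an illustration):

sudo ruby dns-watcher.rb > watcher.log

sudo ruby dns-watcher.rb | less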

Creating a Web-Based Ethernet Analyzer

With my little analyzer up and running, I started thinking it would be cool if I could provide a Web-based interface to it. As every Web developer knows, long-running, server-bound processes and the Web tend not to go together; there's nothing worse than waiting at a browser for long periods while such a process executes. Over the years, a number of solutions to this problem have been proposed, involving techniques such as redirection, cookies, sessions and the like. Although such techniques work, I've always thought they were rather clunky, and I've been on the lookout for something more elegant. Having just completed Reuven M. Lerner's excellent series of LJ articles on Ajax programming [see the October, November and December 2006 issues of LJ], I wondered whether I could combine my analyzer with an Ajax-enabled Web page, updating part of the page with the output from the analyzer as and when it was generated.

My strategy is simple enough. I provide a starter Web page that kicks off the network analysis on the Web server as a backgrounded CGI process and then redirects to another Web page, which displays the results in an HTML text-area widget, updating the text area as the analysis produces output. The little HTML Web page in Listing 2 gets things moving. All this Web page really does is provide a link that, when clicked, calls the startwatch.cgi script. The latter is a straightforward CGI program, written as a bash script. Here's the entire script:

#! /bin/sh

echo "Content-type: text/html"
echo ""

# Start the analyzer in the background, logging its output for the results page.
sudo /usr/bin/ruby /var/www/watcher/dns-watcher.rb \
         > /var/www/watcher/dns-watcher.log &

echo '<html><head>'
echo '<title>Fetching results ... </title>'
echo '<meta http-equiv="Refresh" content="1;'
echo 'URL=/watcher.html">'
echo '</head><body>Fetching results ... </body>'
echo '</html>'

The key line of the script is the one that invokes Ruby and feeds the interpreter the dns-watcher.rb program, redirecting the latter's standard output to a file called dns-watcher.log. Note the trailing ampersand at the end of this command, which runs the analyzer as a background process. The script finishes by sending a short HTML Web page to the browser that redirects to the analysis results page, called watcher.html, which is shown in Listing 3.

The results Web page loads some JavaScript code (dns-watcher.js) in its header section, then lays out a simple body containing an initially empty text-area widget called watcherarea. A call to the startWatcher JavaScript function occurs as soon as the browser loads the body section of the results page.
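The idea behind startWatcher is straightforward: poll the log file that startwatch.cgi creates and copy whatever it finds into the text area. The sketch below illustrates that approach; it is not the published dns-watcher.js, and in particular the log file's URL and the two-second polling interval are assumptions:

function startWatcher() {
    var request = new XMLHttpRequest();

    request.onreadystatechange = function () {
        if (request.readyState == 4 && request.status == 200) {
            // Copy the latest analyzer output into the text area.
            document.getElementById("watcherarea").value = request.responseText;
        }
    };

    // Fetch the current contents of the analyzer's log file; the query
    // string discourages the browser from serving a cached copy.
    request.open("GET", "/watcher/dns-watcher.log?t=" + new Date().getTime(), true);
    request.send(null);

    // Poll again in a couple of seconds so the text area keeps up.
    setTimeout(startWatcher, 2000);
}

Re-fetching the whole log on every poll is crude, but it keeps the sketch short and ensures the text area always shows the complete picture so far.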

______________________

Comments

Nice!

millerb

Thanks for the excellent follow-up to your TPR article. There are plenty of use cases for a network analyser that do not require promiscuous mode, which would simplify the approach. In general, I prefer not to operate in promiscuous mode, since I am only interested in monitoring point-to-point traffic.

As a consultant who frequently performs network analysis and tuning, I am keenly interested in a solution like this. While I like the Ajax/Apache approach, my customers aren't likely to be crazy about me installing the entire kit on their boxes. Wrapping this approach into one executable would be ideal. Or, simply installing an agent which would send the results to an Apache instance running on my laptop :-)

Cheers

Wrapping this approach into one executable would be ideal

barryp

Thanks for the positive comment. If all you need is the results, all your customers need is Ruby (or Perl) installed on their boxes, with a little script that wakes up every now and then and sends stuff to you. My article simply used the analyzer as a way to generate a lot of server-side data, which allowed me to demo the Ajax solution.

--
Paul Barry
IT Carlow, Ireland
http://glasnost.itcarlow.ie/~barryp

broken link

Anonymous

Thanks for this nice article!

There is only one thing I want to add:
The above link to the sources appears to be dead...
Here is the fixed one:
ftp://ftp.linuxjournal.com/pub/lj/listings/issue157/9614.tgz
