An Ajax-Enhanced Web-Based Ethernet Analyzer

Combine Ruby, Ajax and bash with CGI scripts to monitor server-bound processes.

Listing 4 contains the dns-watcher.js code. A lot of what happens here has been covered by Reuven's excellent Ajax articles. The code starts by declaring some global variables that are used throughout the remainder of the code:

var capturing = false;
var matchEnd = new RegExp( "END run" );
var r = new getXMLHttpRequest();

The capturing boolean is set to true while the analyzer is capturing traffic, and to false otherwise. A regular expression is created to match against a string containing the words “END run”. Finally, an Ajax request object is created with a call to the getXMLHttpRequest method, which is taken directly from Reuven's examples.
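Reuven's helper isn't reproduced here, but a minimal sketch of such a cross-browser factory might look like the following (the exact feature tests are an assumption on my part, not his code):

```javascript
// A sketch of a cross-browser XMLHttpRequest factory in the spirit of
// Reuven's helper; the feature tests below are assumptions, not his code.
function getXMLHttpRequest() {
  if (typeof XMLHttpRequest != "undefined") {
    return new XMLHttpRequest();                     // modern browsers
  } else if (typeof ActiveXObject != "undefined") {
    return new ActiveXObject( "Microsoft.XMLHTTP" ); // older Internet Explorer
  }
  return null;                                       // no Ajax support at all
}
```

Whatever the internals, the important point is that the returned object supports the open/send/responseText interface used in the rest of the script.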

The startWatcher method starts the heavy lifting: it arranges for the updateCaptureData method to be called every 1.5 seconds, then sets capturing to true:

function startWatcher() {
  setInterval( "updateCaptureData()", 1500 );
  capturing = true;
}

It is within the updateCaptureData method that the Ajax call occurs, with the request object being used to execute another CGI script that accesses the dns-watcher.log disk file and returns its contents. (Listing 5 contains the get_watcher_data.cgi script, which is written in Ruby.) Once the CGI script has been invoked on the Web server, a call to displayCaptureData occurs:

function updateCaptureData() {

  if (capturing) {
    r.open( "GET", "/cgi-bin/get_watcher_data.cgi", false );
    r.send( null );
    displayCaptureData();
  }
}

The displayCaptureData method is adapted from Reuven's code and processes the results of the Ajax call, which are available from the request object. These are used to update the watcherarea text-area widget within the results Web page:

var te = document.getElementById( "watcherarea" );
te.value = r.responseText;

Note the use of the following line of JavaScript to scroll the text area to the bottom of the results:

te.scrollTop = te.scrollHeight;

And, finally, note that the displayCaptureData method sets the capturing boolean to false as soon as a line that matches the regular expression appears within the data coming from the Ajax request (see Figures 1 and 2 to convince yourself that this in fact matches at the end of the network capture):

if ( matchEnd.test( te.value ) ) {
  capturing = false;
}
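Pieced together, the fragments above suggest a displayCaptureData along these lines (a sketch only; the globals come from the start of the listing, r is the Ajax request object and watcherarea is the text area on the results page):

```javascript
var capturing = true;                      // from the globals shown earlier
var matchEnd = new RegExp( "END run" );    // matches the analyzer's final line

// A sketch of how the displayCaptureData fragments fit together.
function displayCaptureData() {
  var te = document.getElementById( "watcherarea" );
  te.value = r.responseText;               // replace contents with latest data
  te.scrollTop = te.scrollHeight;          // keep the newest lines in view
  if ( matchEnd.test( te.value ) ) {       // has "END run" appeared yet?
    capturing = false;                     // stop issuing further requests
  }
}
```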

This check is very important. Without it, the Web browser continues to send an Ajax request to the server every 1.5 seconds for as long as the watcher.html results page is displayed within the browser, even after the analyzer has finished and isn't generating any more data. With this check in the code, the Ajax behavior is switched off, reducing the load on the Web server (and keeping the Apache2 access log from quickly growing large).
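A variation worth noting: startWatcher could keep the value returned by setInterval, so the timer itself can be cancelled with clearInterval rather than left firing a no-op every 1.5 seconds. A sketch of that approach (watcherTimer and stopWatcher are hypothetical names, not from the article, and the empty updateCaptureData is a stand-in for the real one):

```javascript
var capturing = false;
var watcherTimer = null;             // hypothetical: holds the timer id

function updateCaptureData() { }     // stand-in for the article's version

function startWatcher() {
  watcherTimer = setInterval( updateCaptureData, 1500 );
  capturing = true;
}

function stopWatcher() {
  if (watcherTimer != null) {
    clearInterval( watcherTimer );   // the browser stops the periodic calls
    watcherTimer = null;
  }
  capturing = false;
}
```

With this in place, displayCaptureData could simply call stopWatcher() when it spots the "END run" line, and the browser would stop waking up entirely.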

To deploy my solution, I created a simple shell script to copy the required components into the appropriate directory locations on my Web server (which is Apache2 on Ubuntu):

sudo cp watcher.html /var/www/
sudo cp startwatcher.html /var/www/
sudo cp dns-watcher.js /var/www/js/
sudo cp dns-watcher.rb /var/www/watcher/
sudo cp get_watcher_data.cgi /usr/lib/cgi-bin/
sudo cp startwatch.cgi /usr/lib/cgi-bin/

These directory locations may not match those of your Apache2 installation, so adjust accordingly. You also may need to create the js and watcher directories. And, of course, make sure the CGIs have their executable bit set.





millerb writes:

Thanks for the excellent follow-up to your TPR article. There are plenty of use cases for a network analyser that do not require promiscuous mode, which would simplify the approach. In general, I prefer not to operate in promiscuous mode since I am only interested in monitoring point-to-point traffic.

As a consultant who frequently performs network analysis and tuning, I am keenly interested in a solution like this. While I like the Ajax/Apache approach, my customers aren't likely to be crazy about me installing the entire kit on their boxes. Wrapping this approach into one executable would be ideal. Or, simply installing an agent which would send the results to an Apache instance running on my laptop :-)


Wrapping this approach into one executable would be ideal

barryp writes:

Thanks for the positive comment. If all you need is the results, all your customers need is Ruby (or Perl) installed on their boxes, with a little script that wakes up every now and then and sends stuff to you. My article simply used the analyzer as a way to generate a lot of server-side data, which allowed me to demo the Ajax solution.

Paul Barry
IT Carlow, Ireland


broken link

Anonymous writes:

Thanks for this nice article!

There is only one thing I want to add:
The above link to the sources appears to be dead...
Here is the fixed one: