Run, RabbIT, Run: Making 56k Go Fast
As those of you who have been following my articles here know, I'm stuck on a 56k dial-up. So when something called RabbIT crossed Freshmeat, I had to check it out. RabbIT is a caching, compressing web proxy written in Java. It filters ads by URL fragment and works, more or less, with HTTP/1.1. I say more or less because I didn't have any problems feeding it HTTP/1.1 from Galeon, but the web page claims only "almost complete" compliance.
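To see why the compression half of that matters on a slow link, here's a back-of-the-envelope sketch: the kind of repetitive HTML a news site serves squeezes down dramatically under gzip, which is exactly what a compressing proxy does to the stream before it hits your modem. (The markup below is invented for illustration.)

```shell
# Generate 200 lines of typical repetitive table-row markup,
# then compare the raw size against the gzipped size.
i=0
while [ $i -lt 200 ]; do
    echo '<tr><td class="story"><a href="/article">headline</a></td></tr>'
    i=$((i + 1))
done > /tmp/page.html
ORIG=$(wc -c < /tmp/page.html)
GZ=$(gzip -c /tmp/page.html | wc -c)
echo "original: $ORIG bytes, gzipped: $GZ bytes"
```

On text like this, the gzipped stream is a small fraction of the original, and every byte saved is a byte your 56k modem doesn't have to pull.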
To run RabbIT, you need an upstream machine with a faster link that you can run a process on (a lot of hosting sites let you do this), Java 1.1 or better, ImageMagick and a browser that groks compressed streams and JPEGs. My upstream box is a 1.1GHz Athlon running Red Hat 7.3, IBM Java 2.14, ImageMagick and a 384k SDSL line. The downstream system is a 1.1GHz Duron running Debian Woody, with Galeon upgraded to 1.2.7.
Being Java, this little application is all but point, click and surf. I pulled down the tarball, exploded it into a directory and twiddled the configuration file a bit. I moved the cache size down from 12 to 5MB and told it to cache only images, because most of my surfing is to places like Slashdot, where the text on a given URL changes all the time. Then I tinkered a bit with the ad blocking and saved it off. After making sure the paths to java and convert were in my PATH, I ran the jr script to start the server. The server cranked just fine, so I went into Galeon's Preferences:Network menu, set it to Manual Configuration, entered my upstream information, closed the dialog and reloaded a test page.
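For the curious, the tweaks amount to a couple of lines in a properties-style file. The key names below are illustrative only, not RabbIT's actual ones; the shipped config file documents the real settings in its comments:

```
# Illustrative key names only -- check the comments in the
# shipped config file for RabbIT's actual spellings.
cache_size=5            # down from the default 12 (MB)
cache_only=image/*      # cache images; skip ever-changing HTML
```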
It worked, but the Dust Puppy on Slashdot's home page was still animated, which meant it was still a GIF. And the images seemed to be loading at more or less the same rate, although the text sure was coming in fast. Ergo, there had to be a problem with the converter. Sure enough, a check of the log file said /usr/bin/convert wasn't found. convert, according to rpm -ql ImageMagick, is in /usr/X11R6/bin. Got to fix that.
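The check itself is a one-liner worth knowing; something along these lines asks rpm where the package put convert, with a PATH lookup as a fallback for systems without rpm:

```shell
# Ask the package manager where convert landed; the path differs
# across distros. Fall back to a plain PATH lookup if rpm is absent.
CONVERT=$(rpm -ql ImageMagick 2>/dev/null | grep '/convert$' | head -n 1)
[ -n "$CONVERT" ] || CONVERT=$(command -v convert || true)
echo "convert is at: ${CONVERT:-not found}"
```

Once you know the real path, either symlink it into somewhere RabbIT expects or point the config at it directly.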
The README on the web site says this thing can be configured remotely, but when I fed Galeon the appropriate URL, I received a 400 Bad Request caused by a Java exception. Well, that's okay; I'm a command-line type anyway--I'll just go edit the file. With the fix in place, I hit the URL to shut down the proxy, which, like all of the metapages, uses htaccess-type authentication. I verified it was down and then tried to restart it. The script said seems to run on [hostname], try to use that... and failed to start. After a few tries, I figured out the PID file it writes on startup was still intact, so I rmed it, and the server restarted fine. Now all those big GIFs were low-resolution JPEGs, and they loaded in a hurry.
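The guard in the start script boils down to "refuse to start while the PID file exists", which is why a stale file left over from an unclean shutdown wedges it. A toy sketch of that logic--file name, messages and structure all invented for illustration, not RabbIT's actual script:

```shell
# Toy sketch of a stale-PID-file guard; everything here is invented
# for illustration, not taken from RabbIT's jr script.
PIDFILE=$(mktemp -u /tmp/rabbit-demo.XXXXXX)

start_server() {
    if [ -f "$PIDFILE" ]; then
        echo "seems to run already; rm $PIDFILE if it does not"
        return 1
    fi
    echo $$ > "$PIDFILE"
    echo "started"
}

start_server              # writes the PID file and "starts"
start_server || true      # refused: the PID file is still there
rm -f "$PIDFILE"          # the fix: clear the stale PID file...
start_server              # ...and startup succeeds again
rm -f "$PIDFILE"
```

A more robust script would check whether the PID in the file actually belongs to a live process before refusing, but the simple version above is the failure mode I hit.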
The JPEGs weren't good quality at all, though. But not to worry; this is configurable. Go back through the dance of kill the server, edit the config, start the server and reload. A setting of 30 on the --quality line for the converter flags makes User Friendly look a lot better, and it doesn't take that much longer to load. There's a noticeable lag, but it's still a good bit faster than pulling the full-size originals.
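You can preview the size-versus-quality tradeoff outside the proxy with ImageMagick directly; -quality takes 0-100, with lower values giving smaller, uglier JPEGs. (logo: is ImageMagick's built-in sample image, used here so the example is self-contained; the snippet skips quietly if ImageMagick isn't installed.)

```shell
# Re-encode ImageMagick's built-in sample image at two quality
# levels and compare the resulting file sizes.
if command -v convert >/dev/null 2>&1; then
    convert logo: -quality 30 /tmp/logo-q30.jpg
    convert logo: -quality 90 /tmp/logo-q90.jpg
    ls -l /tmp/logo-q30.jpg /tmp/logo-q90.jpg
else
    echo "ImageMagick not installed; skipping"
fi
```

A few minutes of this kind of experimenting will tell you where your own eyes stop noticing the difference; 30 was my sweet spot.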
I played with things for a while, loading various graphics-intensive sites (CNN made a good test site). I noted that while there was a bit of a delay as RabbIT downloaded images for the first time, once the cache was active and the data started flowing downstream, the pages loaded in a hurry. For sites like Slashdot or LiveJournal, where many of the graphics are the same, this is a boon.
Then Galeon took that opportunity to crash, which gave me a beautiful acid-test scenario. One of the nice things about Galeon is its session recovery. This is not as pleasant as it sounds on the downstream end of a 56k link, especially with 24 windows open, as I had. Restoring that setup from bookmarks--one of the session-restore options Galeon offers--would have taken me 15 minutes or so, done carefully so as not to overwhelm the modem link. When Galeon asked to restore the session, instead of generating bookmarks, I told it to restore everything as it was. I watched the modem churn for a bit, then saw a sea of tabs with blue labels--Galeon's way of saying it has finished downloading a page. The whole process took four minutes, and I had all my text back in under three. By no means is this a scientific benchmark (the web site has figures if you're really curious), but it was pretty clear to me that RabbIT is a good way to make life at 56k a lot happier.
Obviously, the ability to configure remotely needs work. And the Basic, RFC 2068-based authentication for the metapages needs upgrading, although the README admits this. Stunnel might be a good temporary alternative; time did not permit testing such an arrangement. But the configuration file is pretty easy to work with, and RabbIT can be configured to accept connections only from a given IP or range of IPs. You also can go so far as to set up proxy authentication, so you have to log in to use it. Add those two features to a stunnel setup, and you have something that should get by the stickiest of security officers. And simply by editing the config file, you can pass or not pass, cache or not cache, objects fairly arbitrarily.
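For the curious, the stunnel arrangement I have in mind would look roughly like this--ports, hostnames and certificate paths below are made up for illustration. The upstream box wraps RabbIT's local-only listener in TLS, and the downstream box unwraps it and hands the browser a plain local proxy port:

```
; HYPOTHETICAL stunnel.conf fragments -- ports, hosts and paths invented.
; Upstream box: expose RabbIT (listening only on localhost) over TLS.
[rabbit]
accept  = 9667
connect = 127.0.0.1:9666
cert    = /etc/stunnel/rabbit.pem

; Downstream 56k box: point the browser at localhost:9666.
[rabbit]
client  = yes
accept  = 127.0.0.1:9666
connect = upstream.example.com:9667
```

With RabbIT restricted to connections from localhost and proxy authentication turned on, the tunnel becomes the only way in.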
All in all, RabbIT seems to be a nice system. Being Java, it requires a fair amount of memory but not a lot of CPU; even during the 24-window reload it managed to get only 9% of the CPU on the Athlon. Overall, it is reasonably well documented. It gets my thumbs up.
RabbIT is available from www.khelekore.org/rabbit under a BSD-no-ads-type license.
Next Week: For those of you wondering "What happened to the Ultimate Linux Box?", I have an answer. The kind folks at Monarch Computers, with whom we're designing the ULB, should have a test-bed en route to us as you read this. The first thing I'll be testing is some low-noise cooling components, welcome news to those of you who posted feedback about noisy workstations.
Glenn Stone is a Red Hat Certified Engineer, sysadmin, technical writer, cover model and general Linux flunkie. He has been hand-building computers for fun and profit since 1999, and he is a happy denizen of the Pacific Northwest.