Linux in the Real World
The United States Army Publications and Printing Command (USAPPC) is, as implied by its name, the part of the Army charged with the creation, publication, and distribution of all Army publications, from simple forms up to complete technical manuals. The Command uses CD-ROMs to distribute a list of some 40,000 publications to the field units, which use it as the basis for submitting orders for publications. These CDs are mailed out quarterly, but the process of getting the CDs cut and distributed can take as long as four or five months. See the problem? By the time some customers get their copy of the CD, it has already been replaced by the next one and may contain obsolete listings. How could this dilemma be solved? Linux, of course!
An ideal platform for distributing or ordering publications is the World Wide Web. The Command already has a TCP/IP link to the Internet; all it would take is a system to run as the server. The Command had looked into putting up a web server in the summer of 1995 but decided against it because they were told it would cost $60,000 to implement. This is where I became involved. After being informed of the situation in November 1995, I asked if I could attempt to set up a web server on one of the Pentiums the Command had. After being told that, due to budget constraints, no money could be spent on this project, I was given a PC. Now for the fun stuff.
Like most Linux enthusiasts, I have a number of CDs with various versions of Linux distributions at home. I brought them in and set to work. First, I installed the Fall release of Slackware 3.0. After figuring out the type of Ethernet card installed (nothing like blind guessing), the installation went smooth as silk. But there were some minor problems running 3.0, so I dropped back to version 2.3 for my base. I installed and configured the entire system in half a day. (It was easy after doing it hundreds of times on my home system.)
Now to find an HTTP server. I looked at the usual choices: NCSA, CERN, Apache... All are good programs, but I ended up going with WN. WN is a fast, flexible HTTPD with built-in search and image-map capabilities as well as very strong security. It can be found at ftp://ftp.acns.nwu.edu/pub/wn/, and further information is available on WN's home page at hopf.math.nwu.edu/. The main reasons for this choice were its easy installation and its built-in search engine, which was perfect for what was needed. Once I had the system up and working, I started building the pages.
It turned out that constructing the web site was more effort than loading the OS. My first task was to set up an ordering system for publications. Since the application to process orders was on the mainframe, I set up a form that takes the necessary input and saves it to a file. Then, every night, a cron job sends the contents of this file to the mainframe using NCFTP. This way, the current system—with all its editing and security checks—can be used, and the procedures for submitting orders by e-mail and paper that were already in place did not have to change.
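The batch half of this flow can be sketched roughly as follows. The spool path, record layout, mainframe host, and cron schedule here are all invented for illustration; the article gives no specifics about the actual CGI or NCFTP invocation, and ncftpput itself is a batch tool that shipped with later NcFTP releases:

```shell
#!/bin/sh
# Hypothetical sketch of the order spool. The real file names and
# mainframe details are not given in the article.
SPOOL=/tmp/orders.txt

# The web form's handler would append one order record per line,
# something like:
echo "unit=HQ-1|pub=TM-9-1234|qty=5" >> "$SPOOL"

# A nightly cron job would then push the spool file to the mainframe
# in batch mode and empty it, along these (hypothetical) lines:
#   0 2 * * * ncftpput -u orders mainframe.example.mil /incoming \
#             /tmp/orders.txt && : > /tmp/orders.txt
```

Keeping the web side down to "append a record to a file" is what let the existing mainframe editing and security checks stay in charge of actually validating the orders.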
Next came the task of putting the contents of the publications CD-ROM on the web site. Using the program that came with the CD, I generated extract files for all the different types of publications, seven files totaling over 39 MB in all. I put these on the server and, using the built-in search capabilities of WN, created a form to view and/or search the files for user-defined strings.
Once this was working, I started work on having a job run on the mainframe that would extract the publications data in the correct format from the original source that was used to make the CDs. This can be retrieved via NCFTP as often as needed so the current data is always available on the web site.
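A crontab fragment for that kind of nightly pull might look like the following. The host name, login, local directory, and remote file pattern are all hypothetical, and ncftpget is the batch companion to ncftpput from later NcFTP releases; only its general usage is assumed here:

```shell
# Hypothetical crontab fragment: refresh the publications extract
# files from the mainframe every night at 3 a.m.
0 3 * * *  ncftpget -u pubs mainframe.example.mil \
           /home/httpd/docs/pubs 'PUBS.EXTRACT.*'
```

Because the extracts come straight from the source used to cut the CDs, the web site can be refreshed on whatever schedule the mainframe job allows instead of waiting months for the next pressing.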
Right now, I'm working with the section that produces forms in electronic format. These forms, which are in Perform Pro and Formflow formats, are also distributed to customers on CD-ROM. I am currently building a page where customers can search and download the forms they need using FTP. This should be working by the time this article is published.
Future plans for this system include linking it up with a dial-up BBS so that customers without direct Net access will be able to access the ordering and search systems with the data shared between the BBS and the web site. From there, who knows? If you'd like to see what has been done on the USAPPC site, the address is www-usappc.hoffman.army.mil.
Because of this system, the cost savings for publications ordering and distribution will be quite large. All this was made possible by Linux; without Linux, there would be no USAPPC web site at all.
And I'd still be hacking away at JCL.
Joe Klemmer (firstname.lastname@example.org) is a 33-year-old civilian Information Systems employee of the US Army, and has worked for them for over 10 years. A follower of Linux since version 0.12, he enjoys giving away Linux CDs to spread the faith. Other than Linux, his passions include his wife, Joy, and their four ferrets and six finches (as of this writing).