Linux in the Real World
The United States Army Publications and Printing Command (USAPPC) is, as implied by its name, the part of the Army charged with the creation, publication, and distribution of all Army publications. These include everything from simple forms up to complete technical manuals. The Command uses CD-ROMs to distribute a list of some 40,000 publications to the field units, which use it as the basis for submitting orders for publications. These CDs are mailed out quarterly, but the process of getting the CDs cut and distributed can take as long as four or five months. See the problem? By the time some customers get their copy of the CD, it has already been superseded by the next one and may contain obsolete listings. How could this dilemma be solved? Linux, of course!
An ideal platform for distributing or ordering publications is the World Wide Web. The Command already has a TCP/IP link to the Internet; all it would take is a system to run as the server. The Command had looked into putting up a web server in the summer of 1995 but decided against it because they were told it would cost $60,000 to implement. This is where I became involved. After being informed of the situation in November 1995, I asked if I could attempt to set up a web server on one of the Pentiums the Command had. After being told that, due to budget constraints, no money could be spent on this project, I was given a PC. Now for the fun stuff.
Like most Linux enthusiasts, I have a number of CDs with various versions of Linux distributions at home. I brought them in and set to work. First, I installed the Fall release of Slackware 3.0. After figuring out the type of Ethernet card installed (nothing like blind guessing), the installation went smooth as silk. But there were some minor problems running 3.0, so I dropped back to version 2.3 for my base. I installed and configured the entire system in half a day. (It was easy after doing it hundreds of times on my home system.)
Now to find an HTTP server. I looked at the usual choices: NCSA, CERN, Apache... All are good programs, but I ended up going with WN. WN is a fast, flexible HTTPD that has built-in search and image-map capabilities as well as very strong security. It can be found at ftp://ftp.acns.nwu.edu/pub/wn/. Further information may be found at WN's home page: hopf.math.nwu.edu/. The main reasons for this choice were its easy installation and the built-in search engine, which would be perfect for what was needed. Once I had the system up and working, I started building the pages.
It turned out that constructing the web site was more effort than loading the OS. My first task was to set up an ordering system for publications. Since the application to process orders was on the mainframe, I set up a form that takes the necessary input and saves it to a file. Then, every night, a cron job sends the contents of this file to the mainframe using NCFTP. This way, the current system—with all its editing and security checks—can be used, and the procedures for submitting orders by e-mail and paper that were already in place did not have to change.
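The nightly job amounts to a short shell script: check the append-only spool file the order form writes to, ship it with a one-shot NcFTP upload, then empty it for the next day. Here is a minimal sketch of that idea; every path, host, and account name is a hypothetical placeholder, and the upload command is only echoed so the sketch runs without a mainframe to talk to:

```shell
#!/bin/sh
# Sketch of the nightly order transfer. All paths, hosts, and account
# names below are illustrative placeholders, not the actual setup.

send_orders() {
    spool=$1    # file the web order form appends to
    host=$2     # mainframe FTP host
    user=$3     # transfer account
    # Skip the upload when no orders came in today.
    [ -s "$spool" ] || { echo "no orders to send"; return 0; }
    # The real job would execute the command below; this sketch only
    # prints it so the script stays runnable anywhere.
    echo "ncftpput -u $user $host /orders $spool"
    : > "$spool"    # empty the spool once the batch has shipped
}

# Example: send_orders /var/spool/weborders/orders.txt mainframe.example.mil webxfer
```

Running `send_orders` from cron each night keeps the mainframe's existing edit and security checks in the loop, since the orders arrive there just as e-mailed or paper orders would.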
Next came the task of putting the contents of the publications CD-ROM on the web site. Using the program that came with the CD, I generated extract files for all the different types of publications—seven files totaling more than 39MB. I put these on the server and, using the built-in search capabilities of WN, created a form to view and/or search the files for user-defined strings.
Once this was working, I started work on having a job run on the mainframe that would extract the publications data in the correct format from the original source that was used to make the CDs. This can be retrieved via NCFTP as often as needed so the current data is always available on the web site.
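Scheduling that retrieval is a one-line cron job. A sketch of the crontab entry follows; the host, account, and directory paths are hypothetical, not the Command's actual configuration:

```shell
# Illustrative crontab entry -- host, account, and paths are placeholders.
# At 02:30 each night, pull the fresh publication extracts from the
# mainframe into the directory WN's search form serves:
30 2 * * *  ncftpget -u webxfer mainframe.example.mil /var/www/pubs '/extracts/*.txt'
```

Because the extracts come from the same source used to cut the CDs, the web site's search data can be refreshed as often as the mainframe job runs, rather than quarterly.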
Right now, I'm working with the section that produces forms in electronic format. These forms, which are in Perform Pro and Formflow formats, are also distributed to customers on CD-ROM. I am currently building a page where customers can search and download the forms they need using FTP. This should be working by the time this article is published.
Future plans for this system include linking it up with a dial-up BBS so that customers without direct Net access will be able to access the ordering and search systems with the data shared between the BBS and the web site. From there, who knows? If you'd like to see what has been done on the USAPPC site, the address is www-usappc.hoffman.army.mil.
Because of this system, the cost savings for publications ordering and distribution will be quite large. All this was made possible by Linux; without Linux, there would be no USAPPC web site at all.
And I'd still be hacking away at JCL.
Joe Klemmer (email@example.com) is a 33-year-old civilian Information Systems employee of the US Army, and has worked for them for over 10 years. A follower of Linux since version 0.12, he enjoys giving away Linux CDs to spread the faith. Other than Linux, his passions include his wife, Joy, and their four ferrets and six finches (as of this writing).