It was P.T. Barnum who famously said, “I don't care what you say about me, but just spell my name right.”
So that's our rationale, here at Linux Journal, for enduring a bit of publicity that came our way via the December 14, 2000 issue of The Wall Street Journal. The “Digits” column on page B6 (the Technology Journal page) leads off with a six-inch item titled “Linux Battles”. Normally we scan pubs like the Journal for tidbits about anything and everything that might remotely relate to Linux. But this time the news wasn't just close to home—it was home. The story was about Linux Journal.
“Digits” is the WSJ's form of “UpFRONT” and shares the same appetite for irony. Unfortunately, the irony in question here involved the apparent fact that our modest little on-line store sold police-style barricade tape that says “Microsoft Free Zone” while the company that actually runs the store, WAS, Inc., was hardly Microsoft-free. It seems that the site was at least partly served up by (shudder) Microsoft Windows NT—at least while WAS moved its operation onto some kind of UNIX.
We have been working with WAS to hasten the end of this irony and expect to return the store to an equally Microsoft- and news-free condition.
Founded in 1989 at Michigan State University, the Julian Samora Research Institute has one purpose: to generate, transmit and apply knowledge that will serve the needs of Latino communities in the Midwest. Grants to the Institute help to fund empirical research done by scholars and to publish their research as books or monographs. This research looks at relevant social, economic, educational and political conditions of Latino communities in both the US as a whole and the Midwest in particular. The Institute's forthcoming data will serve as a resource on and for Latinos.
The Institute started publishing small books and reports a decade ago. Since then, it has increased both its research and publishing volume ten times over. Up until three years ago, the Institute could get away with publishing a book on paper and then filing it away.
Danny Layne, who divides his time between network administration and publishing production, says: “If we needed to print a book, we'd pull it out of the file and then put it away.”
To keep up with the volume of research it had to publish, the Institute found itself producing more electronic files—files that kept getting larger and more complex. Researchers broke down chapters into multiple files. One book could consist of 20 different files. Researchers also generated electronic charts and graphics along with PowerPoint presentations. Books got published not only in hard-copy format but also on-line on the Institute's web site. Layne says: “To this end, we were generating new types of files that we never had before.”

Disk space on a desktop personal computer couldn't handle the volume being churned out. Layne says that the Institute didn't want to start adding large hard disk drives to its desktop PCs. “If one PC's disk drive failed, then we'd have to restore files from a previous backup tape and recreate what we lost. That's inefficient.”

So, with technology funds from the government and the University, the Institute decided to buy a central storage system to house all publications and the files for the web site. Since the Institute has a small computing staff and limited resources, the storage device had to be highly reliable, easy to set up and maintain, and able to accommodate more storage space with the addition of more disk drives as needed. Layne notes, “Our search for a storage system brought us to Winchester Systems. We purchased a FlashDisk external RAID storage system with seven 9GB disk drives.”
Layne observes that just three years ago the Institute had virtually no storage—only a few desktop PCs. “During this time, the FlashDisk has allowed a small research department, within a large university, to turn itself into a publishing powerhouse on a small purse. Some of the other departments on campus are in awe of our storage system. And there are good reasons for it.” Two side-by-side Dell PowerEdge 2200s, one running Windows NT and the other Linux, plug directly into the FlashDisk. It provides fast, highly reliable RAID 5 storage to multiple servers with different operating systems. This feature eliminates the expense of buying storage for each server. Layne says that managing one storage system is easier than managing two or three of them.
The Windows NT server, which connects to the intranet within the building, functions as a central repository for all active publications and for the databases used to inventory and to track these publications. The FlashDisk allows each researcher to have his or her own storage space, apart from the desktop. Using either a Windows-based PC or a Mac, researchers can access Windows NT, Mac and Linux files stored on the FlashDisk.
Cross-platform programs allow the system to function as a quasi-network attached storage filer. These programs include Services for Macintosh running on the Windows NT server and Netatalk running on the Linux server. The latter program permits printers to function as network devices.
The FlashDisk also contains a large collection of artwork. Overall, the FlashDisk provides the researchers with fast access to a large bank of files: everything from text to graphics, regardless of the format, over an intranet. When a book is no longer going to be published, it gets archived to a CD-ROM or a DVD. Meanwhile, the Linux server, which connects to the external network, contains all of the Institute's web files, as well as the web site itself. About 700 web pages reside on the FlashDisk. The web site gets about 3,000 hits each day (100,000 hits a month). Setting the space aside on the FlashDisk to store the Linux files, as well as the Linux operating system, turned out to be easier than Layne thought it would be: “We just followed the FlashDisk's instructions in the manual and made one telephone call to technical support, and then we were up and running,” he says.
While the FlashDisk provides a large amount of disk space, Layne wants to avoid having it become clogged with multiple versions of old files. He says that keeping storage neat and trim shouldn't become a time-consuming burden for a network administrator. “We've mentored our researchers to perform a number of storage housekeeping procedures. After all, they're responsible for overseeing their flow of information, including creating it, updating it, storing it, deleting it or archiving it.” For example, researchers learn how to name their files so they can easily locate them and remove them if they get old. Layne has also put a regular storage clean-up program in place. Researchers have to go through their storage space and either delete multiple copies of files or move old files to a CD. While researchers do a good job of maintaining their space, Layne says that the Institute's publishing volume has a healthy appetite for more storage space.
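A housekeeping pass of the kind Layne describes can be sketched with a single find command. (The directory and the one-year cutoff here are assumptions for illustration, not the Institute's actual policy.)

```shell
#!/bin/sh
# Hypothetical housekeeping sketch: list files not modified in roughly
# a year as candidates to move to CD and then delete. The default
# directory and the 365-day cutoff are illustrative assumptions.
workdir=${1:-/home/researcher}
find "$workdir" -type f -mtime +365 -print
```

A network administrator could run a sketch like this periodically and hand each researcher the resulting list, leaving the decision to archive or delete with the person who owns the files.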
According to Layne, “We're planning to upgrade our 9GB drives to 18GB drives to double our amount of storage. We can do this inexpensively because Winchester Systems will give us credit toward a trade-in on the drives. The service folks at Winchester Systems must feel like the repair people at Maytag. The FlashDisk has never broken down, not even hiccuped.”
Money from the University will allow the Institute to produce audio and video clips for the Web. Layne says, “We've already tested accessing and storing multimedia on the FlashDisk. Everything worked fine.”
—Elizabeth M. Ferrarini
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
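The find-plus-grep combination mentioned above can be sketched in one line. (The directory default and the search string “disk full” are illustrative assumptions, not from any particular system.)

```shell
#!/bin/sh
# A minimal sketch of stringing find and grep together: list every
# .log file under a directory that contains a particular entry.
# The default path and the search string are illustrative assumptions.
logdir=${1:-/home}
find "$logdir" -name '*.log' -type f -exec grep -l 'disk full' {} +
```

The `-l` flag makes grep print only the names of matching files, which is usually what you want when the pipeline feeds another tool.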
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
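For reference, a traditional cron entry looks like the line below (the script path is a hypothetical example).

```shell
# Hypothetical crontab line: run a report script at 2:30 a.m. daily.
# Fields: minute hour day-of-month month day-of-week command
echo '30 2 * * * /usr/local/bin/nightly-report.sh'
```

Entries like this are installed with `crontab -e`; the question the webinar takes up is whether this simple model still scales to modern scheduling needs.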
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
- Google's SwiftShader Released
- Interview with Patrick Volkerding
- SUSE LLC's SUSE Manager
- My +1 Sword of Productivity
- Tech Tip: Really Simple HTTP Server with Python
- Murat Yener and Onur Dundar's Expert Android Studio (Wrox)
- Managing Linux Using Puppet
- Non-Linux FOSS: Caffeine!
- SuperTuxKart 0.9.2 Released
- Returning Values from Bash Functions
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide