Koha: a Gift to Libraries from New Zealand
Koha is the Maori word for a gift or donation. It's also the name of an integrated library system from New Zealand. Written for the Horowhenua Library Trust (HLT), it was licensed under the GPL and is now in use by libraries around the world.
In 1999, HLT made a momentous decision. They were using a 12-year-old integrated library system (ILS) that was no longer being developed. They knew the system wasn't Y2K-compliant, and they realized it no longer fit their needs. HLT also knew that buying a new system would cost them a lot of money up front and would require capital improvements they couldn't afford to make (communication lines and gear to support the new system).
Considering all of these factors, HLT, in consultation with Katipo Communications, decided to write their own system. They then decided to release this new system under the GPL, ensuring that other libraries could benefit from the work and also cooperate in future development of the system. This decision has had far-reaching effects.
Koha was developed during the fourth quarter of 1999 and went into production on January 1, 2000. There was a brief flurry of work on the system, and it was released to the world early that year. Koha won two awards in 2000: the 3M award for Innovation in Libraries and the ANZ Interactive Award (Community/Not-for-Profit Category).
Initially, Koha was picked up by other libraries in New Zealand (many of them hiring Katipo for support). One early adopter, Mike Mylonas, caught the vision of open-source software in libraries and began to contribute to the project. Mike currently supports Koha for four private libraries, one for his current employer and three for nonprofit organizations.
It didn't take long for Koha to cross the Pacific. In the fall of 2000 the rural Coast Mountain school district in British Columbia, Canada, was looking for a solution for their library needs. They had been running a home-brew system built on Apple II computers, and it had finally died. Finding the money for a proprietary solution would be difficult (a small elementary school in New England recently received a quote for $20,000 to install a new ILS—proprietary library automation isn't cheap), so they set one of their network technicians to the task of finding a better option.
Steve Tonnesen, Coast Mountain's network engineer, came across Koha and started to evaluate it. It took him about two days to get Koha up and running. Once he had that base to work from, he started hacking. He cleaned up the circulation interface, added importing tools and wrote a Z39.50 client for querying other libraries. Z39.50 is a standard protocol libraries use to exchange data about books. Word of this new option spread quickly, and he soon had three schools running the new system. Steve's changes went back into the main Koha system, and he became a member of the development team.
During April and May of 2002, Koha development took another big step. Project leadership had always come from Katipo, but the development team was now much more international, and new development goals were being proposed. One of the first steps was the beginning of the 1.2 release cycle. These releases have focused on building basic functionality and greater stability. So far, there have been four releases in this series. New features include an installation script; a fully template-driven on-line public access catalog (OPAC), which supports both translation and customization; and bundled user documentation.
Right now, development is proceeding in earnest on the 1.4 series, which features a new database schema that supports several flavors of MAchine Readable Cataloging (MARC), the cataloging standard used by libraries. The first development release in this series (1.3.0) was made on September 24, 2002. A second release occurred in October, and a 1.4.0 release is expected to occur in the first quarter of 2003.
Koha is pretty undemanding as library systems go and runs handily on a stock Linux server. HLT is a library with 25,000 patrons at four locations and a collection of 80,000 items. They run over 1,200 transactions a day on a system with dual P3 1GHz processors and 1GB of RAM.
At the Immaculate Heart of Mary School library in Madison, Wisconsin, Robert Maynord installed Koha on an AMD 1800-based system with 256MB of RAM. Coast Mountain's systems run on 200MHz Pentiums with 64MB of RAM located in each school.
Getting Koha running in a library used to be a rather daunting task, but two easy methods are now available. The easiest is to download the CD image, burn it to a CD and boot the new Koha server from that disc. You also can use the install script to set up Koha on your own hardware.
The CD can be run as a demo system, using the included data, or it can be used as your server. If you choose to use it as your server, you will need to create a set of data files on your server's hard drive. The CD provides an interactive tool to do this.
If you'd rather install your own copy, the process is a bit more involved, but it still isn't difficult. Before you get started, you should make sure some basic components are installed, namely Perl, Apache and MySQL. You'll need a few Perl modules as well, but the install script helps you take care of those. The install script has made installing Koha pretty painless. An upgrade script has also been written to ease the burden of keeping the system up to date.
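A minimal pre-flight check for those prerequisites might look like the following shell sketch. The command names used to detect Apache and MySQL (apachectl, mysql) are assumptions that vary by distribution, and the real install script does its own, more thorough checking, including the Perl modules.

```shell
#!/bin/sh
# Hypothetical pre-flight check before running the Koha install script.
# The command names below are illustrative; adjust for your distribution.
for cmd in perl apachectl mysql; do
    if command -v "$cmd" >/dev/null 2>&1; then
        echo "found: $cmd"
    else
        echo "missing: $cmd"
    fi
done
```

Anything reported missing can be installed from your distribution's packages before running the install script.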
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
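That find-plus-grep combination can be sketched in one line. The directory, file contents and search pattern below are made up for illustration; in practice you would search /home (or wherever your logs live) for whatever entry you need.

```shell
#!/bin/sh
# Set up two small example log files in a scratch directory
# (the path and the ERROR pattern are illustrative).
mkdir -p /tmp/demo-logs
printf 'ok\nERROR: disk full\n' > /tmp/demo-logs/a.log
printf 'all quiet\n' > /tmp/demo-logs/b.log

# find locates every .log file; grep -l names only the files that match.
find /tmp/demo-logs -name '*.log' -exec grep -l 'ERROR' {} +
# prints /tmp/demo-logs/a.log
```

Each tool does one job: find selects files by name, grep selects them by content, and -exec glues the two together without any temporary files.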
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just cold, hard, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.