Windows PCs vs. X Terminals: A Cost Comparison
The Mark O. Hatfield Library at Willamette University has used networked X terminals in its public and staff computing environments since 1995. The original workstations were NCD and Tektronix thin clients, but over the last two years, we have been replacing the systems with recycled PC hardware that otherwise would have been scheduled for replacement.
An X terminal differs from a standalone personal computer in that it relies on a networked computing model: applications such as the desktop environment (window manager), Web browser and office software are hosted and run on a centralized application host elsewhere on the network. The application host usually runs on heavy-duty server hardware, leaving the terminal with the comparatively trivial tasks of responding to input from peripherals such as the keyboard and mouse and drawing graphics to the monitor.
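In X11 terminology the roles are the reverse of everyday usage: the X server runs on the terminal, because it owns the display, while the applications are X clients running on the host. A minimal sketch of the model, assuming a hypothetical terminal named xterm01 that permits connections from the host:

```shell
# Run on the application host. The browser executes here, consuming the
# host's CPU and memory, but all drawing is sent over the network to the
# X server on the terminal. "xterm01" is an illustrative hostname.
DISPLAY=xterm01:0 mozilla &
```

In practice the terminal must authorize the host's connections (via xhost or Xauthority) before this will display.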
The Hatfield library currently has 25 of these X terminal systems deployed as specialized staff computers and kiosk-style public computing stations. They connect to two X client application hosts powered by x86-based PC server hardware running Red Hat Enterprise Linux, XFree86, XDM, the GNOME 2.4 desktop and the Mozilla Web browser.
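The plumbing on each side of that arrangement can be sketched as follows; the hostname apphost1 is illustrative, and exact file paths may vary by distribution. The host's XDM must be configured to accept XDMCP queries, and each recycled PC simply starts a bare X server pointed at the host:

```shell
# On the application host: enable XDMCP by commenting out the line in
# /etc/X11/xdm/xdm-config that disables the query port:
#   ! DisplayManager.requestPort: 0
# (restart xdm after editing)

# On the recycled PC acting as a terminal: start a local X server that
# requests a graphical login session from the host over XDMCP.
X -query apphost1
```

The terminal then shows the host's XDM login screen, and every session launched from it runs on apphost1.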
In public computing environments, the model of centralized network computing has several advantages over traditional standalone workstations in that it:
Provides a no-cost alternative to expensive desktop deployment and cloning packages, such as Symantec Ghost, because new software is added to a single machine rather than deployed and run on each individual workstation.
Provides a centralized environment that is superior to standalone computing for backing up and maintaining user data as well as company proprietary data.
Enables institutions to maintain a homogeneous, flexible software environment even on PC hardware that has been purchased over a period of several years.
Extends the lifespan of personal computer hardware, yielding an overall decrease in investment in new hardware.
Is not susceptible to Windows-based viruses and spyware. Security patches need to be applied to only one system in order to update all of the terminals it serves.
As a rule of thumb, personal computing hardware at this institution is recycled out of the system after five to six years. Hardware and software manufacturers recommend a three-year purchasing cycle. The cost of replacing the 25 workstations deployed in our various labs and on staff workstations with new personal computers would be roughly $25,000.
Instead, we are replacing Windows with Linux on PCs that are six years old or even older and keeping those systems in service. In addition to reducing the cost of new hardware and software purchases, this approach extends the return on investment of hardware already purchased.
Because these systems are being recycled out of service, there is no additional input cost for personal computing hardware. These literally are systems that otherwise would be thrown out.
Instead of purchasing new PCs, we purchase server hardware on a four-year cycle. Historically, one dual-processor x86 system can power applications for up to 25 X terminals.
By adopting this model, we have extended the lifecycle of our antiquated desktop hardware to seven to ten years, and we still are able to run current applications, including modern desktop environments, proprietary Java applications, Web browsers and office software.
Desktop PC model: 25 new PCs every three to six years.
Linux X terminal model: two new application hosts (server hardware) purchased every four years with a two-year stagger, adding recycled PCs as they become available.
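The comparison above can be put in rough numbers. The figures here are assumptions for illustration: $1,000 per desktop PC (implied by the $25,000 estimate for 25 workstations) and an assumed $4,000 per application host, over a 12-year window:

```shell
pcs=25
pc_cost=1000      # assumed per-PC price, from $25,000 / 25 workstations
server_cost=4000  # assumed price of one dual-processor application host

# Desktop model: replace all 25 PCs every 6 years -> 2 refreshes in 12 years.
desktop_total=$(( pcs * pc_cost * 2 ))

# X terminal model: two hosts on staggered 4-year cycles means one new host
# every 2 years -> 6 hosts in 12 years; the recycled PCs themselves are free.
terminal_total=$(( server_cost * 6 ))

echo "Desktop model:    \$$desktop_total"
echo "X terminal model: \$$terminal_total"
```

Under these assumptions the terminal model costs roughly half as much over the period, before counting the reduced administration effort.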