A Modern, Low-Resource Linux Distribution, and All the Real Reasons Why We Need It
GNU/Linux distributions keep improving at a very fast pace. Every release adds support for new hardware, new features and security improvements, both for server and desktop applications.
Unfortunately, this triumphant march has a pretty big downside. Although excellent software can be obtained freely, hardware never will be. Up-to-date, full-featured Linux applications, especially desktop ones, require almost as many hardware resources as proprietary ones.
Of course, anyone expecting to do serious video editing on a computer more than two years old should think again, regardless of the operating system it happens to run. The problem is that even system administrators and experienced desktop users often find that the same things they were doing yesterday become slower after an upgrade. If your CPU is 20 times faster than it was ten years ago, why does it often take the same amount of time, or even more, to get from powering up to reading e-mail? Worse still, people who have only obsolete computers available and limited programming knowledge are almost locked out of the Free Software world entirely.
Want the real reasons why this is a serious problem? Read on.
The standard attitude about unnecessarily heavy programs is "why should we care when desktop hardware is as cheap as it is today?". An example is this strategy letter, which, among other things, says, "I don't think anyone will deny that on today's overpowered, under-priced computers, loading a huge program is still faster than loading a small program was even 5 years ago. So what's the problem?"
The problem is that (even when it's true) this is a very limited and egotistic attitude: today's computers are "under-priced" only for 20% of the world's population. The rest still have to work many months or years to buy the hardware that makes KDE or GNOME look fast.
Even those with enough income to throw away a perfectly good, working PC every two years should not be forced to do so if their needs have not changed. Unfortunately, the two most frequent answers that one hears when raising this issue are:
"become a programmer and recompile/write from scratch yourself": a snobbish answer, and impossible in most cases.
"use an old distro": why? Why should anyone use a kernel with limited firewalling capabilities, compiled with obsolete libraries? Why should anyone run the open door that Sendmail was some years ago?
Schools, families, developing countries, and public and private offices with almost no budget (a pretty big segment nowadays) must save on all costs, no matter how low they already are. Often, the only PCs they can afford are donated and really old, and Free Software must not abandon them. Besides, homework, word processing and spreadsheets don't need multimedia capabilities.
Domination of all desktop and wireless clients is crucial in the long run. Whoever controls the majority of clients and inexperienced users eventually enslaves all servers too, regardless of quality.
Don't think that reducing the hardware requirements of Free Software confines it to die in some (big) obsolete hardware graveyard. Actually, the opposite is true.
Think of all the new, low-cost Internet appliances that are restricted to being only that, because the ability to run a current distribution would double their price. Even more important, think of mobile computing. We are all supposed to surf, compute and produce wirelessly "real soon now" with really tiny boxes: watch-sized PDAs, third-generation cell phones, whatever. If mainstream Linux quickly slims down its many existing desktop applications, it will dominate this market before the others finish saying "Hardware improves rapidly; let's just wait until they make the Pentium IV as small as a StrongARM."
Computers are useful, cool and among the most polluting kinds of domestic waste. They should be dumped (separately) only when they physically break, not because Super OS 2002 is free but won't run on anything slower than one gigahertz.
Basic desktop computing is quickly placing itself next to the alphabet in the list of tools necessary to fully express oneself and build one's destiny. As such, it must be free not only from patents and licenses, as free as Free Speech, but it should also cost (hardware included) as close to zero as possible.
Equal opportunity is what Free Software is really about. I feel bad whenever I hear Free Software programmers still saying, "as long as I have the source and can program as I like, learn to compile by yourself and don't bother me"--even to grade-school kids without money.
Articles about Digital Rights and more at http://stop.zona-m.net. CV, talks and bio at http://mfioretti.com.