A Modern, Low-Resource Linux Distribution and All the Real Reasons Why We Need It
GNU/Linux distributions keep improving at a very fast pace. Every release adds support for new hardware, new features and security improvements, both for server and desktop applications.
Unfortunately, this triumphant march has a pretty big downside. Although excellent software can be obtained freely, hardware never will be. Up-to-date, full-featured Linux applications, especially desktop ones, require almost as many hardware resources as proprietary ones.
Of course, anyone expecting to do serious video editing on a computer more than two years old should think again, regardless of the operating system. The problem is that even system administrators and experienced desktop users often find that the same tasks they performed yesterday become slower after an upgrade. If your CPU is 20 times faster than it was ten years ago, why does it often take the same amount of time, or even more, to get from power-up to reading e-mail? Worse, people who have only obsolete computers and limited programming knowledge are all but locked out of the "Free" Software world.
Want the real reasons why this is a serious problem? Read on.
The standard attitude toward unnecessarily heavy programs is, "why should we care when desktop hardware is as cheap as it is today?" An example is this strategy letter, which, among other things, says, "I don't think anyone will deny that on today's overpowered, under-priced computers, loading a huge program is still faster than loading a small program was even 5 years ago. So what's the problem?"
The problem is that (even when it's true) this is a very limited and egotistical attitude: today's computers are "under-priced" only for 20% of the world's population. The rest still have to work for months or years to buy the hardware that makes KDE or GNOME look fast.
Even those with enough income to throw away a perfectly good, working PC every two years should not be forced to do so if their needs have not changed. Unfortunately, the two most frequent answers that one hears when raising this issue are:
"Become a programmer and recompile or write everything from scratch yourself": a snobbish answer, and impossible in most cases.
"Use an old distro": why? Why should anyone use a kernel with limited firewalling capabilities, compiled against obsolete libraries? Why should anyone run the open door that Sendmail was a few years ago?
Schools, families, developing countries, and public and private offices with almost no budget (a pretty big segment nowadays) must save on all costs, no matter how low those costs already are. Often, the only PCs they can afford are donated and really old, and Free Software cannot abandon them. Besides, homework, word processing and spreadsheets don't need multimedia capabilities.
Domination of all desktops, wired and wireless, is crucial in the long run. Whoever controls the majority of clients and inexperienced users eventually enslaves all servers too, regardless of quality.
Don't think that reducing the hardware requirements of Free Software confines it to die in some (big) obsolete hardware graveyard. Actually, the opposite is true.
Think of all the new, low-cost Internet appliances that are restricted to being only that, because the ability to run a current distribution would double their price. Even more important, think of mobile computing. We are all supposed to surf, compute and produce wirelessly "real soon now" with really tiny boxes: watch-sized PDAs, third-generation cell phones, whatever. If mainstream Linux quickly slims down its many existing desktop applications, it will dominate this market before the others finish saying, "Hardware improves rapidly; let's just wait until they make the Pentium IV as small as a StrongARM."
Computers are useful, cool and among the most polluting kinds of domestic waste. They should be dumped (and recycled separately) only when they physically break, not because Super OS 2002 is free but won't run on anything slower than one gigahertz.
Basic desktop computing is quickly taking its place next to the alphabet on the list of tools necessary to express oneself fully and build one's own destiny. As such, it must be free not only of patents and licenses, as free as Free Speech, but it should also cost (hardware included) as close to zero as possible.
Equal opportunity is what Free Software is really about. I feel bad whenever I hear Free Software programmers still saying, "as long as I have the source and can program as I like, learn to compile it yourself and don't bother me"--even to grade-school kids without money.
Articles about Digital Rights and more at http://stop.zona-m.net. CV, talks and bio at http://mfioretti.com.