If you have been following my postings over the last year, you will have read about my attempts to migrate to Linux. Some have been partially successful; others have been unmitigated disasters. I have heard comments ranging from "Linux is for smart people" to "You are right" when I comment that the installation process should not be as hard as it sometimes is.
Now, before I regale you with my latest tale, I should point out that I have been using Linux since the early 1990s, when you did have to be a rocket scientist to install it. In fact, many of the early network drivers for 3Com boards came out of NASA and its use of Linux. Compiling applications was routine, and debugging the compile errors usually took longer than the compiling itself. If you were going to use Linux with more than what the distribution installed for you (which was sometimes barely more than the kernel, a shell and vi or emacs), you had to be smart. I installed a number of Slackware and Red Hat systems during the early days, mainly for fun, eventually settling on Red Hat as my preferred distribution.
By 2000, installing Linux was certainly less painful, but hardly a walk in the park. A wider range of drivers and other necessary modules came prepackaged, either as an RPM or as a binary distribution (such as Java support), but a great deal of software still came only in roll-your-own fashion. In some cases, Apache specifically, rolling your own was the only way to get all the pieces and parts working together correctly.
Flash forward to 2009, and I would argue that Linux is mature. If I were to use an analogy, I would put it at the same level as Windows 2000 or NetWare 5.1. It generally installs without problem, and if you use Red Hat Enterprise Linux or SUSE's commercial version, you have dedicated support and stable technology ready to deploy in your datacenter. Now, before I get flamed, I am not comparing Linux to Windows 2000 feature for feature. That is a fruitless effort. There are features in Linux today that never will be incorporated into Windows, and there are features just now coming into Windows that have been in the core distributions or the kernel since the very beginning. I am simply trying to put a stake in the maturity ground. The reason I chose Windows 2000 is that it was really the first version of Windows you did not have to fight to install. Yes, Redmond goofed a couple of things, and if you had a RAID array or controller behind your system, it could be a real struggle (especially if you were still using EISA technology), but the entire installation process worked better than it had at any point prior.
And that is where I feel the Linux install process has gotten to. It is easier to install Linux today than it ever has been, but it still has a fair distance to go to be foolproof. This is not all Linux's fault. Many of the devices sold as commodity hardware today really are Windows-only, either in terms of driver support or in anything beyond basic functionality. I have ranted about this before, as have others, and I do not want to plow that ground again.
What I want to do is highlight a recent experience that makes me feel good about the Linux installation experience. I have said before that the Linux desktop is ready for general consumption, whether by private individuals or the commercial enterprise, but that one of its biggest drawbacks is the installation process. So it was with a certain amount of trepidation that I set about converting my Asus Eee PC from the installed Windows XP Home to some form of Linux.
One of the major leaps of the last few years, one that makes me just giddy, is the Live CD. Originally made popular by the Knoppix distribution, today you can find Live CDs for almost every distribution. It is a great marketing tool. But what I really like, and where I find it a great utility, is the ability to put the Live CD onto a flash drive. I did this originally with Knoppix, creating an SD card-based rescue drive that I could use on a variety of systems. I still carry it with me, along with USB keys holding Fedora 11 and the Ubuntu Netbook Remix. What I like, besides the great market potential, is the ability to test-drive hardware before you go through the sometimes destructive process of installing the operating system, only to find this or that piece of hardware does not work. This alone makes any Live CD worth its weight in gold.
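At its simplest, putting a Live CD onto a flash drive is a raw copy of the image to the USB device. A minimal sketch, assuming an image that boots from a raw copy (older images needed helper tools such as UNetbootin); the write_live_usb name and the /dev/sdX placeholder are mine for illustration, and the target device is completely overwritten:

```shell
#!/bin/sh
# write_live_usb ISO DEVICE
# Copies a Live CD image byte for byte onto a USB device with dd.
# DESTRUCTIVE: everything on DEVICE is overwritten.
write_live_usb() {
    iso=$1
    dev=$2
    # Refuse the /dev/sdX placeholder so the example cannot fire by accident.
    if [ -z "$dev" ] || [ "$dev" = "/dev/sdX" ]; then
        echo "set a real USB device (e.g. /dev/sdb) first" >&2
        return 1
    fi
    # bs=4M speeds up the copy; conv=fsync flushes data before dd exits.
    dd if="$iso" of="$dev" bs=4M conv=fsync && sync
}

# Example (replace /dev/sdX with the device reported by lsblk or dmesg):
# write_live_usb ubuntu-netbook-remix.iso /dev/sdX
```

Double-check the device name before running it; dd will happily overwrite your hard disk if you point it at the wrong one.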
So I booted my netbook into my copy of Ubuntu and played with it. I tested the features, made sure the hardware worked and confirmed I could live with the changes. Nothing seemed too alien, so I rebooted and selected the installation option. For those who never have played with the Remix, you have the opportunity to install the operating system straight from the Live CD (which in this case is really a USB stick). The installation options are either a dual-boot installation (where it installs next to Windows) or a full-blown destructive installation that blows away the existing OS. Initially, I went with the dual-boot option, as I was not sure I had finished copying all of my Windows files. This was less than satisfactory. Not enough swap space was allocated, nor enough space for root, and as a result, Ubuntu was slow. Glacially slow. Unacceptably slow. Now, I will admit I did not tweak the install; I just took the defaults. So shame on me.
After I made sure I had backed up the files I needed, I regrouped and got serious. First, I did a custom disk format. I prefer 2x my RAM as my swap space (a holdover from the early days; perhaps it uses more disk than needed, but it has worked for me, so that is what I do). Also, because I have only one disk, I do not see a need to partition it into separate root, var and usr filesystems. Instead, I just set the initial partition to root and let the filesystem fall out below that. If I had a multi-disk system, I might set it up differently, but on a single spindle, I don't see the value, and I have been slapped by the limitations of restrictive partitioning before. There really were not a lot of other choices to make beyond setting the time zone and keyboard type, and then I said "install"!
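The 2x rule of thumb above is easy to sketch in shell. A minimal example (the recommended_swap_mb helper name is mine, purely for illustration) that reads installed RAM from /proc/meminfo and doubles it:

```shell
#!/bin/sh
# Rule of thumb from the article: swap partition = 2x installed RAM.
# recommended_swap_mb RAM_MB -- print the suggested swap size in MB.
recommended_swap_mb() {
    echo $(( $1 * 2 ))
}

# /proc/meminfo reports MemTotal in kB; convert to MB before doubling.
ram_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
ram_mb=$(( ram_kb / 1024 ))

echo "RAM: ${ram_mb} MB -> suggested swap: $(recommended_swap_mb "$ram_mb") MB"
```

On the Eee PC's 1GB of RAM, this rule would suggest a 2GB swap partition, which you then enter in the installer's manual partitioning screen.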
The installation was quick and painless. Dare I say bing, bang, boom? It was that quick. One reboot later, the system was up and ready to go. I was running Ubuntu, and shortly thereafter the first set of patches was presented for me to install. The base installation is sufficient for most users. The basic package has Evolution, Firefox, Pidgin and a couple of other tools with which most folks can get up and online. To this I added a few things. I prefer Thunderbird to Evolution as my email package, so I installed that, a few packages for Amateur Radio (Xastir and FBB being two of them) and a few other little things I felt were necessary. In all, I have about 4GB of operating-system files installed on my 160GB disk. As I write this (in OpenOffice.org Writer), I am connected wirelessly to a hotel network in Toronto and have done all the things I normally would do in Windows. In fact, my wife has even checked her email (she has her own account) and done some web surfing with no difficulties.
All Linux installations should be this easy and this straightforward. Yes, Linux is an incredibly flexible and incredibly powerful operating system. This is one of its strengths. But if the OS is going to make a dent in the desktop market, especially the non-technical, end-user market, the installation experience offered by the Ubuntu Netbook Remix is the one I would hold up as the gold standard. It certainly has been one of the easiest Linux installations I have done in quite some time.