The Move to Linux - Netbook Remix
If you have been following my postings over the last year, you will have read about my attempts to migrate to Linux. Some have been partially successful; others have been unmitigated disasters. I have heard comments ranging from "Linux is for smart people" to "You are right" when I point out that the installation process should not be as hard as it sometimes is.
Now, before I regale you with my latest tale, I should point out that I have been using Linux since the early 1990s, when you did have to be a rocket scientist to install it. In fact, many of the early network drivers for the 3Com boards came out of NASA and its use of Linux. Compiling applications was routine, and debugging the compile errors usually took longer than the compiling itself. If you were going to use Linux with more than what the distribution installed for you (which was sometimes barely more than the kernel, a shell and vi or emacs), you had to be smart. I installed a number of Slackware and Red Hat systems during those early days, mainly for fun, eventually settling on the Red Hat model as my preferred distribution.
By 2000, installing Linux was certainly less painful, but hardly a walk in the park. There was a greater range of drivers and other necessary modules prepackaged for use, either as an RPM or some binary distribution (such as Java support), but a great deal of software still came only in roll-your-own fashion. In some cases, Apache specifically, rolling your own was the only way to get all the pieces and parts working together correctly.
Fast forward to 2009, and I would argue that Linux is mature. If I were to use an analogy, I would put it at the same level as Windows 2000 or NetWare 5.1. It generally installs without problem, and if you utilize Red Hat Enterprise Linux or SuSE's commercial version, you have dedicated support and stable technology ready to deploy in your datacenter. Now, before I get flamed, I am not comparing Linux feature-for-feature to Windows 2000. That would be a fruitless effort. There are features in Linux today that will never be incorporated into Windows, and there are features just now coming into Windows that have been in the core distributions of Linux, or in the kernel, since the very beginning. I am simply trying to put a stake in the maturity ground. The reason I chose Windows 2000 is that it was really the first version of Windows that you did not have to fight to install. Yes, Redmond goofed a couple of things, and if you had a RAID array or board behind your system, it could really be a struggle (especially if you were still using EISA technology), but the entire installation process worked better than it had at any point prior.
And that is where I feel the Linux install process has gotten to. It is easier to install Linux today than it has ever been, but it still has a fair distance to go to become foolproof. This is not all Linux's fault. Many of the devices sold as commodity hardware today really are Windows-only, either in terms of driver support or of anything beyond simple functionality. I have ranted about this before, as have others, and I do not want to plow that ground again.
What I want to do is highlight a recent experience that makes me feel good about the Linux installation experience. One of the things I mentioned before is that the Linux desktop is something that is ready for general consumption, either by the private individual or the commercial enterprise, but one of the largest detractors is the installation process. So it was with a certain amount of trepidation that I set about converting my Asus Eee PC from the installed Windows XP Home to some form of Linux.
One of the major leaps in the last few years that makes me just giddy is the Live CD. Originally made popular by the Knoppix distribution, Live CDs are available today for almost every distribution. It is a great marketing tool. But what I really like, and where I find it to be a great utility, is the ability to put the Live CD onto a flash drive. I did this originally with Knoppix, creating an SD-card-based rescue system that I could use on a variety of machines. I still carry it with me, along with USB keys with Fedora 11 and Ubuntu Netbook Remix. What I like, besides the great market potential, is the ability to test drive hardware before you go through the sometimes destructive process of installing the operating system, only to find that this or that piece of hardware does not work. This alone makes a Live CD worth its weight in gold.
So I took my copy of Ubuntu, booted my netbook into it and played with it. I tested the features, made sure the hardware worked and confirmed that I could live with the changes. Nothing seemed too alien to me, so I rebooted and selected the installation feature. For those who have never played with the Remix, you have the opportunity to install the operating system straight from the Live CD (which in this case is really a USB stick). The installation options are either a dual-boot option (where it installs next to Windows) or a full-blown destructive installation that blows away the existing OS. Initially I went with the dual-boot option, as I was not sure I had finished copying all of my Windows files. This was less than satisfactory. Not enough swap space was allocated, nor enough space for root, and as a result, Ubuntu was slow. Glacially slow. Unacceptably slow. Now, I will admit I did not tweak the install; I just took the defaults. So shame on me.
After I made sure I had backed up the files I needed, I regrouped and got serious. First, I did a custom disk format. I prefer 2x my RAM as my swap space (a holdover from the early days; perhaps it uses more disk than needed, but it has worked for me, so that is what I do). Also, because I have only one disk, I do not see a need to partition it into root, var and usr. Instead, I just set the initial partition to root and let the file system fall out below that. If I had a multi-disk system, I might set it up differently, but when it is a single spindle, I don't see the value and have been slapped by the limitations of restrictive partitioning. There really were not a lot of other choices to make beyond making sure the time zone and keyboard type were set, and then I said "install!"
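The 2x-RAM rule of thumb above is easy to compute before you sit down with the partitioner; a minimal sketch of the arithmetic (the actual partitioning is left to the installer), reading physical memory from /proc/meminfo on any Linux system:

```shell
# Apply the 2x rule of thumb for swap sizing described above:
# read total physical RAM and double it.
ram_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
swap_mb=$(( ram_kb * 2 / 1024 ))
echo "RAM: $(( ram_kb / 1024 )) MiB -> suggested swap: ${swap_mb} MiB"
```

Whether twice RAM is still the right ratio on modern hardware is debatable, but as the text says, it is a rule that has worked for the author.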
The installation was quick and painless. Dare I say bing, bang, boom? It was that quick. One reboot later, and the system was up and ready to go. I was running Ubuntu, and shortly thereafter the first set of patches was presented for me to install. The base installation is sufficient for most users. The basic package has Evolution, Firefox, Pidgin and a couple of other tools that most folks can use to get up and online. To this I added a few things. I prefer Thunderbird to Evolution as my email package, so I installed that, along with a few packages for Amateur Radio (Xastir and FBB being two of them) and a few other little things that I felt were necessary. In all, I have about 4GB worth of operating system files installed on my 160GB disk. As I write this (in OpenOffice Writer), I am connected wirelessly to a hotel network in Toronto and have done all the things that I normally would do in Windows. In fact, my wife has even checked her email (she has her own account) and done some web surfing with no difficulties.
All Linux installations should be this easy and this straightforward. Yes, Linux is an incredibly flexible and incredibly powerful operating system. This is one of its strengths. But if the OS is going to make a dent in the desktop market, especially the non-technical, end-user market, then the installation experience offered by the Ubuntu Netbook Remix is the one I would hold up as the gold standard. It has certainly been one of the easiest Linux installations I have done in quite some time.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful ones, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
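The find-plus-grep tool chain described above is a one-liner; a minimal sketch, using a throwaway directory to stand in for /home and "ERROR" as the particular entry being searched for (both are illustrative choices, not from the original text):

```shell
# Build a small sample tree, then find every .log file under it
# and list the ones that contain the entry "ERROR".
dir=$(mktemp -d)
mkdir -p "$dir/home/user"
echo "ERROR: disk full"   > "$dir/home/user/app.log"
echo "all systems normal" > "$dir/home/user/quiet.log"
find "$dir/home" -name '*.log' -exec grep -l 'ERROR' {} +
```

Only app.log is printed: find supplies the file list, grep -l does the matching, and the `{} +` form batches filenames into as few grep invocations as possible.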
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
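For context, the traditional cron approach the webinar builds on is one crontab line per job; a hypothetical entry (the script path is an illustrative assumption) that runs a nightly report at 2:30am:

```
# m  h  dom mon dow  command
30   2  *   *   *    /usr/local/bin/nightly-report.sh
```

The five time fields are simple and reliable, but they say nothing about job dependencies, failure handling or cross-machine coordination, which is where the "is it enough?" question comes from.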
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.