The Linux Desktop's Next Challenge: Layer 8
Computerworld's Preston Gralla blogged the other day that Microsoft, in a Securities and Exchange Commission (SEC) filing, admitted that it is afraid of Linux, specifically Ubuntu maker Canonical and one of the granddaddies of Linux distributions, Red Hat. And with Windows 7 now released to large-volume customers, the desktop debate is back in the crosshairs.
These sorts of discussions, especially on sites like Computerworld and here on Linux Journal, tend to bring out the same comments from the evangelists (using the term loosely) that now is the time to move to Linux on the desktop, and in response you get people saying, "The last time I tried to install Linux...." Both points are valid, and in many cases there is merit to each position.
As I have talked about many times in the past, installing Linux is not always easy, especially for the non-technical user, on hardware that is anything beyond standard, or both. However, as someone who has installed first-generation versions of Linux, I can assure you that the installation process has come a long way in a very short period of time. I can also assure you that installing Windows is not always a walk in the park either.
But I am also convinced that Linux has a lot to offer people as a desktop solution, especially on some of the new, less powerful netbooks that are beginning to be marketed as the next wave in laptops. Linux has a scalable footprint, which has always been the Achilles' heel of the Windows model, robust application support for the programs people want to use, and the ability to connect seamlessly and easily (for the most part) to most of the services people want to use.
However, today I want to talk not about individual desktops, but about the view from the enterprise: not a dozen or so machines in a purpose-built environment, but thousands of machines in a truly heterogeneous environment where ease of use, manageability and program operations are essential, and where saying "Let's migrate to Linux" takes on a whole new level of complexity and challenge.
Let me start with a story. About two years ago, I interviewed for a job with a federal agency. (For those outside the United States, let me give you some background. The U.S. government is divided into three branches: Legislative (Congress), Judicial (the courts, like the Supreme Court) and Executive (everything else). The Executive branch is divided into Cabinet-level units called Departments, like the Department of Defense or the Department of Homeland Security. These Departments generally have subunits within them, usually but not always called agencies, such as the Federal Aviation Administration (FAA) or the United States Secret Service.) I was interviewing with a contractor who had a job at Agency P, a subunit of Department C. The contract was a review and validation of the desktop operating system being used by Agency P to do its day-to-day work. Its platform at the time was Windows XP, and Microsoft Vista had just been released. I was being interviewed to be the lead architect on the project. My first question: is this a rubber-stamp exercise to validate or invalidate the move to Windows Vista? I was assured it was not. It was supposed to be a true and open exploration of Windows XP, Windows Vista, Apple's OS X and Linux. I noted that they did not specify a desktop environment like KDE or GNOME, just the operating system, Linux.
We talked back and forth for a bit, and then I started asking my tough questions. Is this an agency-wide conversion, just the desktops in a certain group, or part of a larger Department-level effort? I was told it was agency-wide but not Departmental, meaning the parent organization was not involved. Those of you who have been involved in Federal contracting already have alarm bells going off, especially if you have worked in organizations with a strong centralized IT presence and watched agencies just do their own thing. It is worse when it is only parts of an agency, but I digress. In this particular case, Agency P is one of those special agencies that does not interact much with its parent Department, something I knew from my first go-around with them. (This was not the first time I had worked with Agency P; I had been on its help desk some 15 years prior, but I did not feel the need to mention that.)
What sort of application and migration reviews were being considered for the back-end systems, especially email, web services and databases? You would have thought I had asked them to explain quantum physics. The short answer was that there were no plans to either review or migrate away from the currently installed base of Microsoft server products, including Exchange, SQL Server and IIS. Did the project have the support of the agency's CIO? The answer was no; it was being run by a senior manager in the Information Technology group. The interview was quickly concluded after that.
Surprisingly, I did not get the job, mainly because I said I did not want it. It was a disaster waiting to happen. Now, before you start emailing me, consider this: if this project was to become more than a simple validation, which I was led to believe it would, then you have to consider all the angles. It is not enough to compare the features of Microsoft Word with OpenOffice.org Writer and say "Yup, same functionality." In most cases, application functionality is not even an important driver. What is important is the integration of the application into the existing infrastructure, both at the software level and at the user level. Evolution is a wonderful desktop email program (or not; your mileage may vary), but it is not Outlook. For most people reading this, the answer is "So?", but for average users, changing their email program is a traumatic experience, as anyone who has migrated people will tell you. Whether it is to Outlook from Notes or to GroupWise from cc:Mail, most of us with scars have done the migration once and hated every minute of it.
I highlight only one application, but most would argue it is a pretty significant one. When you start talking about swapping desktops, from a software perspective you have a large number of things to think about, even if you are just doing your own. Interoperability, testing of document formats (although that has sort of standardized...OK, not really), management tool integration, email, web services, IM tools: all of these things have to work together, and when you are talking about a fixed back-end solution, making them work together becomes more than a simple challenge. This is not to say that it cannot be done, but it requires a high level of buy-in and support from upper management, resources and trained personnel, and that support was not there at Agency P.
It is also expensive.
The cost of software for any enterprise-level project is usually less than ten percent of the total cost of the project. Let me say it again: the cost of the software is usually less than ten percent. Using Open Source software will save you a buck, but licensing is not what ends up killing your budget. In any migration done properly, you have to train your people, and the more complicated (or alien) the application, the more training you need to provide, both to your users and to your support staff. When you get into a migration, it becomes important to manage the training and the expectations. Linux has come a long way in, well, let us call it normalizing, the desktop. I hesitate to say making it look more Windows-like, but really, that is what is happening. Further, the Linux desktop has most of the tools that most users need on a regular basis, so the real problem comes down to integration: connections to file shares, databases, email servers, printers, etc., etc., etc....you know the drill. Sometimes making those connections is simple and straightforward, such as connecting to file shares. Sometimes, they are not.
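To illustrate the "simple" end of that spectrum, a Linux desktop can attach to a Windows (SMB/CIFS) file share with a single mount command or a one-line /etc/fstab entry. This is a minimal sketch; the server name, share name, mount point and credentials file below are hypothetical examples, not anything from a real environment:

```shell
# One-off mount of a Windows/Samba share (run as root).
# //fileserver/shared, /mnt/shared and the credentials file are
# hypothetical; substitute your own server, mount point and account.
mount -t cifs //fileserver/shared /mnt/shared \
      -o credentials=/etc/samba/cred.txt,uid=1000,gid=1000

# Or make the mount persistent with an /etc/fstab entry:
# //fileserver/shared  /mnt/shared  cifs  credentials=/etc/samba/cred.txt,uid=1000,gid=1000  0  0
```

Keeping the user name and password in a root-readable credentials file (rather than on the command line) avoids exposing them in the process list or in shell history.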
I think many would agree that doing it the other way, connecting a Windows desktop to a Linux back end, is considerably less of a challenge as a first step, and certainly less expensive: fewer people are needed for the migration and to maintain the environment, and the working environment is one that more users are familiar with.
And currently, that is probably the more significant issue. As much as Linux and Mac OS X have made inroads in the commercial market, the dominant desktop sold to the home consumer is still Windows. Having to adapt to a different desktop between home and work is a challenge for a large number of people, and that issue, more than the technology involved, is a much harder one to overcome. As anyone who has done an integration and migration project can tell you, layers 1-7 of the OSI stack are easy. Layer 8, the socio-electronic-financial layer, is hard.
Practical Task Scheduling Deployment
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as one that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
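The find-plus-grep combination described above can be sketched in a couple of lines. This demo builds a small throwaway directory tree instead of searching the real /home, so the paths and log contents here are illustrative only:

```shell
# Build a small sandbox of log files to search (stand-in for /home).
mkdir -p /tmp/logdemo/user1 /tmp/logdemo/user2
echo "ERROR: disk full" > /tmp/logdemo/user1/app.log
echo "INFO: all good"   > /tmp/logdemo/user2/app.log

# Find every .log file under the tree, then have grep list the ones
# containing the entry we care about. -print0/-0 keep filenames with
# spaces or newlines intact; grep -l prints matching file names only.
find /tmp/logdemo -name '*.log' -print0 | xargs -0 grep -l 'ERROR'
```

Swap `/tmp/logdemo` for `/home` and `'ERROR'` for whatever entry you need, and you have the one-off tool the paragraph describes, built from two standard utilities and a pipe.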
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to upgrade your job-scheduling infrastructure. The second part presents an actual planning and implementation framework.
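For readers who have not outgrown cron yet, its interface is a five-field time specification followed by a command. The entries below are a hypothetical sketch of a typical crontab, not taken from the webinar; the script paths are invented for illustration:

```shell
# crontab -e opens this file; fields are:
# minute  hour  day-of-month  month  day-of-week  command

# Rotate logs nightly at 02:30 (hypothetical script path).
30 2 * * *      /usr/local/bin/rotate-logs.sh

# Sync reports every 15 minutes, weekdays only (Mon=1 .. Fri=5).
*/15 * * * 1-5  /usr/local/bin/sync-reports.sh
```

Cron's limits show up exactly where this format does: there is no dependency between jobs, no retry on failure and no awareness of whether the previous run finished, which is the gap the webinar's "beyond cron" framework addresses.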
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance of this open architecture to bear for your organization. There are no smoke and mirrors here, just cold, hard, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.