Computerworld’s Preston Gralla blogged the other day that Microsoft, in a Securities and Exchange Commission (SEC) filing, admitted that it is afraid of Linux, specifically of Ubuntu maker Canonical and of Red Hat, one of the granddaddies of Linux distributions. And with Windows 7 now released to large-volume customers, the desktop debate is back in the crosshairs.
These sorts of discussions, especially on sites like Computerworld and here on Linux Journal, tend to bring out the same comments from the evangelists (using the term loosely) that now is the time to move to Linux on the desktop, and in response you get people saying, "The last time I tried to install Linux...". Both points are valid, and in many cases each position has merit.
As I have talked about many times in the past, installing Linux is not always easy, especially for the non-technical user, on hardware that is anything beyond standard, or both. However, as someone who has installed first-generation versions of Linux, I can assure you that the installation process has come a long way in a very short period of time. I can also assure you that installing Windows is not always a walk in the park either.
But I am also convinced that Linux has a lot to offer people as a desktop solution, especially on some of the new, less powerful netbooks that are beginning to be marketed as the next wave in laptops. Linux has a scalable footprint, which has always been the Achilles' heel of the Windows model; robust application support for the programs people want to use; and the ability to connect seamlessly and easily (for the most part) to most of the services people want to use.
However, today I want to talk not about individual desktops but about the view from the enterprise. Not a dozen or so machines in a purpose-built environment, but thousands of machines in a truly heterogeneous environment where ease of use, manageability, and program operations are essential, and where saying "Let's migrate to Linux" takes on a whole new level of complexity and challenge.
Let me start with a story. About two years ago, I interviewed for a job with a federal agency. (For those outside the United States, some background: the U.S. government is divided into three branches: Legislative (Congress), Judicial (the courts, like the Supreme Court) and Executive (everything else). The divisions of the Executive branch, often referred to as Cabinet-level, are called Departments, like the Department of Defense or the Department of Homeland Security. These Departments generally have subunits within them, usually but not always called agencies, such as the Federal Aviation Administration (FAA) or the United States Secret Service.) I was interviewing with a contractor who had a job at Agency P, a subunit of Department C. The contract was a review and validation of the desktop operating system being used by Agency P to do its day-to-day work. Its platform at the time was Windows XP, and Microsoft Vista had just been released. I was being interviewed to be the lead architect on the project. My first question: is this a rubber-stamp exercise to validate or invalidate the move to Windows Vista? I was assured it was not. It was supposed to be a true and open exploration of Windows XP, Windows Vista, Apple's OS X, and Linux. I noted that they did not specify a desktop environment like KDE or GNOME, just the operating system, Linux.
We talked back and forth for a bit, and then I started asking my tough questions. Is this an agency-wide conversion, just the desktops in a certain group, or part of a larger Department-level effort? I was told it was agency-wide but not Departmental, meaning the parent organization was not involved. Those of you who have been involved in federal contracting already have alarm bells going off, especially if you have worked in organizations with a strong centralized IT presence and watched agencies just do their own thing. It is worse when it is parts of an agency, but I digress. In this particular case, Agency P is one of those special agencies that does not interact much with its parent Department, something I knew from my first go-round with them. (This was not the first time I had worked with Agency P; I had been on its help desk some 15 years prior, but I did not feel the need to mention that.)
What sort of application and migration reviews were being considered for the back-end systems, especially email, web services and databases? You would have thought I had asked them to explain quantum physics. The short answer was that there were no plans to either review or migrate away from the currently installed base of Microsoft server products, including Exchange, SQL Server and IIS. Did the project have the support of the agency's CIO? The answer was no; it was being run by a senior manager in the Information Technology group. The interview was quickly concluded after that.
Surprisingly, I did not get the job, mainly because I said I did not want it. It was a disaster waiting to happen. Now, before you start emailing me, consider this: if the project was to become more than a simple validation, which I was led to believe it would, then you have to consider all the angles. It is not enough to compare the features of Microsoft Word with OpenOffice.org Writer and say, "Yup, same functionality." In most cases, application functionality is not even an important driver. What is important is the integration of the application into the existing infrastructure, both at the software level and at the user level. Evolution is a wonderful desktop email program (or not; your mileage may vary), but it is not Outlook. For most people reading this, the answer is "So?", but for average users, changing their email program is a traumatic experience, as anyone who has migrated people will tell you. Whether it is to Outlook from Notes or to GroupWise from cc:Mail, most of us with scars have done the migration once and hated every minute of it.
I have highlighted only one application, but most would argue it is a pretty significant one. When you start talking about swapping desktops, from a software perspective you have a large number of things to think about, even if you are just doing your own: interoperability, testing of document formats (although that has sort of standardized...OK, not really), management-tool integration, email, web services, IM tools. All of these things have to work together, and when you are talking about a fixed back-end solution, making them work together becomes more than a simple challenge. This is not to say that it cannot be done, but it requires a high level of buy-in and support from upper management, resources and trained personnel, and that support was not there at Agency P.
It is also expensive.
The cost of software for any enterprise-level project is usually less than ten percent of the total cost of the project. Let me say it again: the cost of the software is usually less than ten percent. Using open-source software will save you a buck, but that is not what ends up killing your budget. In any migration done properly, you have to train your people, and the more complicated (or alien) the application, the more training you need to provide, both to your users and to your support staff. When you get into a migration, it becomes important to manage the training and the expectations. Linux has come a long way in, well, let us call it normalizing, the desktop. I hesitate to say making it look more Windows-like, but really, that is what is happening. Further, the Linux desktop has most of the tools that most users need on a regular basis, so the real problem comes down to integration: connections to file shares, databases, email servers, printers, etc., etc. You know the drill. Sometimes making those connections is simple and straightforward, such as connecting to file shares. Sometimes it is not.
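To put that ten-percent figure in perspective, here is a back-of-the-envelope sketch with entirely hypothetical numbers for a 1,000-seat migration; the point is not the dollar amounts, which I have invented for illustration, but how small the software line is next to labor, training and support:

```shell
# Hypothetical budget for a 1,000-seat desktop migration (all figures invented)
software=120000    # licenses and subscriptions; near zero with open source
labor=450000       # migration engineering and project management
training=300000    # user and support-staff training
support=250000     # help-desk surge during and after cutover
testing=180000     # application and interoperability testing

total=$((software + labor + training + support + testing))
pct=$((software * 100 / total))
echo "Total: \$${total}; software share: ${pct}%"
```

Even with commercial licensing in that sketch, software lands under ten percent of the total; zeroing it out with open source trims the budget, but the other ninety-plus percent is still there.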
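As a sketch of the "simple and straightforward" case, connecting a Linux desktop to an existing Windows file share typically comes down to smbclient and cifs-utils. The server, share, domain and user names below are hypothetical placeholders, and the actual commands are shown as comments since they depend on your environment:

```shell
# Hypothetical names: fileserver.example.com, the "projects" share,
# the EXAMPLE domain and the user alice are all placeholders.
SERVER=fileserver.example.com
SHARE=projects
MOUNTPOINT=/mnt/projects

# Browse what the server exposes (smbclient, from the samba package):
#   smbclient -L //$SERVER -U alice

# Mount the share so desktop applications see it as a local directory
# (mount.cifs, from the cifs-utils package):
#   sudo mount -t cifs //$SERVER/$SHARE $MOUNTPOINT \
#       -o username=alice,domain=EXAMPLE,uid=$(id -u),gid=$(id -g)

echo "//$SERVER/$SHARE -> $MOUNTPOINT"
```

Once mounted, the share is just another directory as far as the user's applications are concerned, which is exactly the kind of invisible integration a migration needs.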
I think many would agree that doing it the other way, connecting a Windows desktop to a Linux back end, is considerably less of a challenge as a first step, and certainly less expensive, because fewer people are needed for the migration and to maintain the environment, and the working environment is one more users are familiar with.
And currently, that is probably the more significant issue. As much as Linux and Mac OS X have made inroads in the commercial market, the dominant desktop sold to the home consumer is still Windows. Having to adapt to a different desktop between home and work is a challenge for a large number of people, and that, more than the technology involved, is the harder issue to overcome. As anyone who has done an integration and migration project can tell you, layers 1 through 7 of the OSI stack are easy. Layer 8, the socio-electronic-financial layer, is hard.