Closing the Chasm
In most big bookstores, the books you see first don't last. They're out on the front tables for you to sample. Even if they sell well, there's no guarantee they'll stay on the shelves for months, much less years. Even the biggest bookstores don't have room to stock more than a fraction of the new books that wash in and out, like foam on a tide.
Those that stay are part of the culture. You expect to find them at a bookstore because their appeal endures. With fiction, the appeal may last years, decades or centuries. Nonfiction books, however, bear the burden of relevance. Among the nonfiction categories, business books don't age as rapidly as sports and travel titles, unless they're about technology. Books about technology trends and companies tend to age as well as last week's meat.
When I look at my bookshelves, I wince at titles like The Big Tech Score and Managing Inter@ctivity. It's not that any of these books contain bad information; they simply speak of a time that is going or gone.
One exception is the work of Geoffrey A. Moore. Although his books are packed with examples from companies whose names are no longer in use, his insights about technology adoption remain relevant, especially (and ironically) for Linux. In fact, I think Moore's technology adoption model is a handy way to make sense of Linux's quietly growing success inside large enterprises. It also will be useful when it is time to grok Linux's inevitable success on desktops as well.
Moore's model is an old one: the adoption curve, which dates back to Everett M. Rogers' Diffusion of Innovations, first published in 1962 and still going strong. Moore's contribution is to break the curve into pieces (Figure 1).
His main focus is the chasm between early adopters and the early majority. In Crossing the Chasm (1991) and Inside the Tornado (1995), Moore described the width of the chasm in terms of cultural opposites. According to Moore, the techie innovators and visionary early adopters on the left side of the chasm are radically different from the pragmatic early majority (Table 1).
What appeals to both groups is also radically different. Techies and visionaries care about product qualities such as speed, design, ease of use, price, novel features and functions, impressive demos and trade-press coverage. Pragmatists care about reliable solutions, third-party support, de facto standards, cost of ownership, quality of support and success of colleagues.
Notice something funny? Did Linux skip a generation here? As a product, isn't Linux almost profoundly pragmatic? And, isn't it strange that what made Linux's visionaries excited was a roster of mostly pragmatic values?
The graph in Figure 1 isn't Moore's; it's Don Norman's version of Moore's curve. Dr Norman (www.jnd.org), a scientist whose dozens of books include such classics as The Design of Everyday Things, The Invisible Computer and Things That Make Us Smart, is a fan of Moore and his books. About Crossing the Chasm he says (www.jnd.org/dn.mss/life_cycle_of_techno.html):
The classic marketing book for high-technology companies, widely read and discussed, but almost never followed. Why is it so difficult for a high-technology company to understand that late adopters of a technology are very, very different from the technology enthusiasts who made the company successful? Because the whole culture of the company is based on its wildly successful teenage years, and high-tech companies hate to grow up. Immaturity is embedded in the culture. Technology is easy to change. Culture is hard.
But the culture Moore and Norman talk about is on the supply side—it's the culture of companies who make their living selling technology. It's different with Linux. “Linux company” has always had an oxymoronic quality. Lately I've begun to think there never has been such a thing as a “Linux company”. Linux is too deep, too infrastructural, too free. Yes, you can productize and brand it, just as Pepsi productizes filtered water and sells it as Aquafina. But, like air and water, Linux is too elemental to be a product in itself. Here's how Don Norman puts the distinction:
There is a big difference between infrastructure products, which I call non-substitutable goods, and traditional products, substitutable goods. With traditional goods, a company can survive with a stable, but non-dominant market share. Coke and Pepsi both survive. Cereals and soaps have multiple brands. With infrastructure goods, there can be just one. MS-DOS won over the Macintosh OS, and that was that. MS-DOS transitioned to Windows, and the dominance continued. VHS tape triumphed over Beta. Most infrastructures are dictated by the government, which assures agreement to a single standard. When there is no standard, as in AM stereo or digital cellular options in the US, there is chaos.
When Don Norman wrote that paragraph, DOS/Windows was de facto infrastructure. What's happened since then is what many in our community have expected for a very long time: free forms of UNIX finding adoption as universal infrastructure. Sure enough, today most web servers run on Linux or BSD. Sony, Matsushita, Philips and a pile of other consumer electronics giants are codeveloping a free and open embedded Linux distro for their own future product generations. DOS/Windows, which quietly used to serve as the OS for countless cash registers and point-of-sale (POS) terminals, is rapidly being replaced by new models that run on Linux.
In Inside the Tornado, Geoffrey Moore subdivides early adoption into two stages: The Bowling Alley and The Tornado. The Bowling Alley is “a period of niche-based adoption in advance of the general marketplace, driven by compelling customer needs and the willingness of vendors to craft niche-specific whole products.” The Tornado is “a period of mass-market adoption, when the general marketplace switches over to the new infrastructure paradigm.” These are followed by Main Street, “a period of after-market development when the base infrastructure has been deployed and the goal now is to flesh out its potential.”
For web servers, database servers, rendering farms, point-of-sale systems and a variety of other niche categories, Linux has been bowling strikes for several years now. Now we're moving inside the tornado, big time.
But it's a stealth tornado, because few companies on the supply side (IBM is a huge exception) even bother to make a big thing about it. The real leadership is happening on the demand side, among the pragmatists.
Here's Linux. It's lying around looking useful. A lot of technologists know their way around or can learn it easily. It's free. There's no vendor lock-in. There are no royalties or other fees to pay. It's extremely useful as building material. There are lots of tools you can use to build with it, and most of those are free too. The only problem is PR; it still looks like it's a Visionary Thing.
But that image fits a pattern too. This one is described by Clayton Christensen in another classic, The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail (Harvard Business School Press, 1997). Christensen says Great Firms are innovative by nature (witness Microsoft's defensive talk about its right to innovate), but their innovations are incremental. They improve the proven in gradual steps; they don't tip over their cash cows, and for very good reasons.
One is dependence on customers and investors as sole resources. Christensen says:
While managers think they control the flow of resources in their firms, in the end it is customers and investors who dictate how money will be spent because companies with investment patterns that don't satisfy their customers don't survive. The highest-performing companies in fact are those that are best at killing ideas that their customers don't want. As a result, these companies find it very difficult to invest adequate resources in disruptive technologies—lower-margin opportunities that their customers don't want—until their customers want them. And by then it's too late.
Other reasons for avoiding disruptive technologies are that “small markets don't solve the growth needs of large companies” and “markets that don't exist can't be analysed”. On that last point, Christensen says:
Because the vast majority of innovations are sustaining in character, most executives have learned to manage innovation in a sustaining context, where analysis and planning were feasible. In dealing with disruptive technologies leading to new markets, however, market researchers and business planners have consistently dismal records. In fact, based upon evidence from the disk drive, motorcycle, and microprocessor industries, the only thing we may know for sure when we read experts' forecasts about how large emerging markets will become is that they are wrong.
Linux is even more disruptive than any of Christensen's examples, because its primary movement isn't from companies to customers—it's from hackers to customers. The creators of free tools, components and applications make the products of their labors available to anyone. In this respect, Linux resembles such natural building materials as rocks, wood, iron and concrete. The difference is that Linux occurs in human nature.
So companies don't “adopt” Linux so much as they mine and harvest it. Only, because it's digital stuff, there's no scarcity. That makes it infinitely less expensive than cheap lumber, rock, minerals and petroleum. All of which make Linux that much easier to adopt and that much more disruptive.
The remaining challenge, then, is cultural. There the chasm remains intact. On one side we have a bunch of techies who care deeply about the principles that brought Linux into the world and that made it so useful to so many at so little cost. They get worked up over ethical issues and about licenses like the GPL, which respects the nature of Linux and its development while hardly caring at all about its commercial potential. On the other side we have a bunch of techies who care deeply about solving problems and making things work, who hardly care at all about the political and moral issues that get the first side so worked up.
In fact, the two sides aren't opposed. They're just different constituencies with different priorities. The market logic is and, not or. We'll know we've bridged that chasm when we stop making a big deal about it. At that point we'll all be on Main Street.
Doc Searls is senior editor of Linux Journal.
Practical Task Scheduling Deployment
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
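The find-plus-grep combination described above fits on one line. A minimal sketch, assuming the /home directory and the search string "ERROR" as placeholder values:

```shell
# Find every regular .log file under /home and search each one
# for the placeholder string "ERROR"; -H prefixes each match
# with its filename even when only one file is searched.
find /home -name '*.log' -type f -exec grep -H 'ERROR' {} +
```

Using `-exec ... {} +` hands find's results to grep in batches, which avoids spawning one grep process per file.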
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
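For context, cron's unit of configuration is a one-line crontab entry: five time fields followed by a command. A minimal sketch, where the rotation script path is a hypothetical placeholder:

```
# min  hour  dom  mon  dow  command
# Run a (hypothetical) log-rotation script every day at 2:30 a.m.
30 2 * * * /usr/local/bin/rotate-logs.sh /var/log/app.log
```

Anything beyond this model—job dependencies, retries, cross-host coordination—is where cron alone starts to fall short.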
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here, just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.