Understanding Infrastructure

Is Linux infrastructure? Or is it just another operating system, like Windows, MacOS and various Unixes?

How about the Internet? Is the Net infrastructure? Or is it just the #3 "service" in the "triple play" sold by your local phone or cable company?

How about the half million or more free software and open source applications out there? Are they infrastructural building material? Or are they just a bunch of computer applications that are exceptional mostly because nearly all of them cost nothing?

What is "infrastructure" anyway? Look it up on Google and you get over a hundred million results. Relatively few pertain to what infrastructure meant in the first place, which the American Heritage Dictionary says was about eight decades ago:

Usage Note: The term infrastructure has been used since 1927 to refer collectively to the roads, bridges, rail lines, and similar public works that are required for an industrial economy, or a portion of it, to function. The term also has had specific application to the permanent military installations necessary for the defense of a country. Perhaps because of the word's technical sound, people now use infrastructure to refer to any substructure or underlying system. Big corporations are said to have their own financial infrastructure of smaller businesses, for example, and political organizations to have their infrastructure of groups, committees, and admirers. The latter sense may have originated during the Vietnam War in the use of the word by military intelligence officers, whose task it was to delineate the structure of the enemy's shadowy organizations. Today we may hear that conservatism has an infrastructure of think tanks and research foundations or that terrorist organizations have an infrastructure of people sympathetic to their cause. The Usage Panel finds this extended use referring to people to be problematic, however. Seventy percent of the Panelists find it unacceptable in the sentence FBI agents fanned out to monitor a small infrastructure of persons involved with established terrorist organizations.

Linux and the Internet are infrastructure, in the sense that they qualify for the American Heritage Dictionary's #1 meaning: "An underlying base or foundation especially for an organization or system". Infra is Latin for under, and you can find Linux under countless applications on devices that range from supercomputers to cell phones to wireless picture frames. Linux is also the platform of choice for many (perhaps most) sturdy and popular Web sites and services, including Google's search and Amazon's S3 and EC2 (to name too few among way too many).

And you'll find the Internet under Linux. If we're stacking up connected people and devices, nothing could be more infra, more fundamental, than the Net itself. In fact, the Net is so infra that two legacy forms of widespread commercial infrastructure — telephony and television — now run on it as well. Your cable and phone bills may suggest otherwise, but the fact remains that voice and video are just two among many forms of data that travel via the Internet protocol.
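The claim that voice and video are just more data can be made concrete at the socket level: the payload of an IP datagram is opaque bytes, and the network treats a buffer of voice samples no differently from a line of text. Here is a minimal sketch in Python (the loopback address and the stand-in payloads are illustrative; real telephony adds protocols like RTP on top, but those ride inside payloads the IP layer never inspects):

```python
import socket

# Two payloads: a line of text and a buffer that could be raw PCM voice samples.
# To the Internet protocol stack, both are just opaque bytes in a datagram.
text_payload = "hello, world".encode("utf-8")
audio_payload = bytes([0, 127, 255, 64] * 4)  # stand-in for voice samples

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
addr = receiver.getsockname()

for payload in (text_payload, audio_payload):
    sender.sendto(payload, addr)
    data, _ = receiver.recvfrom(4096)
    # The network delivers the bytes unchanged, with no idea what they "mean".
    assert data == payload

sender.close()
receiver.close()
print("both payloads delivered as undifferentiated bytes")
```

That indifference to payload is exactly what lets telephony and television run over the same pipes as everything else.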

Yet neither Linux nor the Net are generally seen as infrastructure. If you're not a Linux geek or a Net-head, your answers to the opening questions above are likely to be the latter ones. How come?

Here are a few reasons:

  1. While Linux and the Net are clearly infra, they are not structure in the physical sense. Linux is software, while the Net is a set of protocols. There are soft forms of structure, but those are not what the word brings first to mind.
  2. They are products of neither public nor private enterprise. They are odd breeds that exceed the scope of both categories. Linux began with a single programmer and grew to become the project of a development community comprised of thousands of individual programmers — with nearly as many different employers. (Significantly, most of those programmers will tell you they are not under anybody's command, at least as far as their kernel hacking work is concerned.) The Internet grew out of academic and defense work, while its most familiar subsystem, the World Wide Web, grew out of work at high energy physics laboratories. And while both Linux and the Net were popularized by commercial activities, they are better built to support business than to make money for themselves. This is why...
  3. They generate relatively little wealth. Instead they support an incalculable sum of it. That is, while much money has been made with both Linux and the Net, that sum is dwarfed by the amount of money made because of both. Their main job is to support countless purposes other than their own "business models", which don't exist. In this sense they are like geology, sunlight and atmosphere: they are free in both the free-as-in-freedom and free-as-in-beer senses of the word. It's as silly to ask Linux and the Net for their business models as it is to ask the same of H2O. Sure, you can sell bottled water, but how big is that business compared to what the oceans support?
  4. They are seen as external to base economic activity. Which is buying and selling goods. As I wrote in Greater Goods, "Abundant free software production and use might be seen as a network externality, resulting from the network effects caused by cost-free goods that are easily obtained and used — which is fine. But there is a cost to this perspective." That cost is a near-universal disregard for the foundational economic importance of these essential goods. For today's economy, the roles of Linux and the Net are better conceived as internal rather than external, much as the role of the Earth is internal to everything that relies on it.
  5. Their ideals are "NEA": Nobody owns them, Everybody can use them, and Anybody can improve them. Note that these are ideals, rather than facts. What matters is that free software and open source programmers have principles anchored in an understanding of software that is largely antithetical to conventional notions about simple property. Even if it's "mine", it's right for others to take it, copy it, use it, modify it, and improve on it. And, in the founding case of the GPL, insist that the same freedoms are kept as permanent fixtures upon which there can be no other improvement.

NEA ideals are what make free and open source software highly generative. It is the generativity of Linux and the Net that makes both function as an essential yet poorly understood form of infrastructure: a kind that serves ecological as well as geological and architectural functions. As generative technologies, they support origination, production and reproduction to an extreme of fecundity that shames the most reproductive species. In addition to a host of commercial applications and services, Linux and the Net today support nearly the entire FOSS (Free and Open Source Software) portfolio. This proliferation derives from a nature as practical (and in some cases as essential) as any element in the periodic table, yet with unrestricted variety. Which is why their population is growing on pace to outnumber the world's species — while evolving faster than any of them.

In his new book, The Future of the Internet — and How to Stop It (Yale University Press, 2008), Jonathan Zittrain defines generativity as "a system's capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences." In an earlier research paper, The Generative Internet, he explains,

Generativity denotes a technology’s overall capacity to produce unprompted change driven by large, varied, and uncoordinated audiences. The grid of PCs connected by the Internet has developed in such a way that it is consummately generative. From the beginning, the PC has been designed to run almost any program created by the manufacturer, the user, or a remote third party and to make the creation of such programs a relatively easy task. When these highly adaptable machines are connected to a network with little centralized control, the result is a grid that is nearly completely open to the creation and rapid distribution of the innovations of technology-savvy users to a mass audience that can enjoy those innovations without having to know how they work.

Linux and the rest of the FOSS population fit this description, yet exceed it to the degree that they are not products of "audiences", but rather of engineers, working in coordinated ways to create and improve the code itself.

FOSS code is pure building material. The free, abundant and practical nature of this building material gives it some qualities of commodities; yet its generative nature is an exception to traditional economic constructs. It inconveniences economic belief systems that anchor their perspective in the work of business or government, because it grows abundantly outside either context.

In The Future of the Internet, Jonathan Zittrain shows how the Net and PC operating systems are generative by locating them at the waists of hourglasses:

Both make possible an endless variety of invention and innovation both above and below them. They are like a universal joint making the stuff above independent of the stuff below.

Note that both are not at the bottoms of these illustrations. Their infra roles are in the middle. Their native flexibility is a form of structure that is both sturdy and liberating.

Yet being in the middle presents a conceptual problem. Because Linux and the Net run on media and hardware, they seem to be dependent variables of those. They are higher up in "the stack", and therefore less infra than the stuff below them.

If we conceive infrastructure as a hierarchy of physical dependencies, then Linux and the Net are both subordinate to what they run on, and to what carries them.

Yet if we conceive infrastructure as something one must understand deeply in order to create architecture and devise strategy, then it is essential to understand what we are dealing with here — and to understand it on its own new terms, as well as on those that have been around a long time but need to be revisited.

That revisiting can come from many angles.

Take for example Craig Burton's point of view.

Back in the 1980s I watched in amazement as Novell utterly changed the Local Area Network (LAN) conversation from one about "pipes and protocols" to one about services. Before Novell, LANs were all silos. Arguments were about which "pipe" topology (ring vs. bus vs. star) was best, and which datalink protocol (Ethernet vs. Token Ring) was best. Magazines like Data Communications were fat as phone books, and packed with stories that compared the isolated stacks of Corvus, Digital, IBM, Sytek, Wang and a bunch of other companies. The silo'd nature of these stacks was rarely visited, because the entire market was comprised of "your choice of silo".

Then Novell came along and said, "We don't care what wire you use, or what protocols you use on them. You need services, and we'll give you that, starting with file and print. Those run on our Network Operating System, called Netware." Almost overnight, the game changed. A whole new industry grew on top of Netware and other Network Operating Systems (NOSes), and later on top of PC operating systems. You might say that Linux took off because it turned UNIX into a NOS. Wisely, Novell today is a Linux company (that continues to sell Netware to customers that want it). Yet the generative role played by Linux today is one that was pioneered by Netware.

In fact, one of the reasons I coined the expression "markets are conversations" (long before it found a home in The Cluetrain Manifesto) was that I saw the LAN market change utterly, almost overnight, when the whole market shifted its core topic from pipes & protocols to services. The main dude who changed that conversation was Craig Burton. Way back then, Craig blew my mind. He still does. (So did his wife Judith, who implemented Craig's insights to amazing effect.)

When Craig talks infrastructure today, he uses the Burton Matrix:

Here he locates infrastructure in the upper right corner, a combination of Open and Public Domain:

He also notes that the process of moving from scarcity to ubiquity is one of commoditization. The arrow is a strategic vector. If you want to create new markets, or disrupt old ones, you create ubiquitous infrastructure. That's what happened with Linux and the Net.

Another mentor is my old friend Stephen Lewis. We met as fellow philosophy majors at the same small college in the late 1960s. Steve was an outstanding student whose intelligence, scholarship and powers of articulation humbled me. In the four decades since then he has accumulated an inventory of wisdom and experience that informs an understanding of infrastructure that encompasses — among much else — his photographic studies of Ottoman architecture and his cultural roots in working class New York. In a recent blog post Steve sourced Joshua B. Freeman’s Working Class New York (New York 2000), calling it "a penetrating examination of the unique ethos, economic history, and social and physical infrastructure of the City..." — and quoting this passage:

Endlessly frustrated by its difficulties and brutalities, try as I may I find it difficult to imagine living elsewhere. What keeps me in New York is neither the high culture of museums and concert halls nor the unrivaled opportunities for working, eating, and spending that New Yorkers revel in. Rather it is a sensibility that is distinctly working-class — generous; open-minded but skeptical; idealistic but deflating of pretension; bursting with energy and a commitment to doing.

That sensibility is also generative. Note that Joshua Freeman locates his city's generativity neither in physical plant nor within clichéd architectural and cultural frames, but at the base level of a culture: the working people who get stuff done. Sound familiar? Welcome to the new software business.

It is wrong to assume, as we have been doing throughout history, that those primarily responsible for the foundations of civilization are its leading figures and institutions. While those leaders are certainly involved, full respect must be given to the invention, as well as the hard work, done by the uncredited many.

Take the matter of device drivers for the Linux kernel. The development of these requires a symbiotic relationship between the manufacturers of devices and the programmers who make those devices useful. Over the last several years, Greg Kroah-Hartman and other kernel hackers have been working to bring these parties together. Here's one excerpt from his latest Driver Project Status Report:

A funny thing happened though, what I figured would be a project flooded with requests for companies to get hardware working turned into anything but that.

Two major things then happened, both of which I could have never expected:

  • The number of developers who said they would be willing to help out in creating these drivers was amazing. As of today, we have over 300 different people who have signed up to be a developer of a Linux driver, volunteering their talents and time to help Linux out. This large developer base is a shining example of how strong and large the Linux community is.
  • Very few companies signed up for drivers.

It's this last point that made me worry. Yes, a number of companies did ask for drivers to be written, and we have done so, but not as many as I originally imagined.

Then,

I tried my best with a general announcement of, "Tell me all of the hardware that you know of that is not supported by Linux!" The response by users was overwhelming. My inbox was flooded with hundreds of messages, and the wiki page: http://www.linuxdriverproject.org/twiki/bin/view/Main/DriversNeeded was created.

In other words, the driver project, like the kernel project at the heart of Linux, is run by individuals who are "generous; open-minded but skeptical; idealistic but deflating of pretension; bursting with energy and a commitment to doing". Big name vendors are involved, but they're not at the heart of the project. They are dependent rather than independent variables.

This applies to physical infrastructure as well. Several weeks ago, guided by Gordon Cook (a guru of post-telecom architecture and economics), I joined a couple dozen other infrastructure obsessives in Loma Linda, California, where we were schooled by James Hettrick, who led the build-out of Loma Linda's fiber-based connectivity plant. James grew up on a ranch in northern Montana where self-sufficiency and resourcefulness extended to telecoms infrastructure: they rolled their own. Too far from civilization to interest Ma Bell or any other phone company, his family and other ranchers built their own phone system. What James learned growing up is still being applied today at what we're starting to call "Layer Zero". He is equipping individuals and small contractors to do the work of building out high-capacity "pipes" from the edge in. Costs of fiber and other cabling (along with wireless) are going down, while the variety of tools and materials is going up. (You can see more in this series of captioned photos from James' session in Loma Linda.)

The ability to make our own infrastructure, independent of BigCo silos and BigGov regulatory constraints, is what Bob Frankston has been advocating for many moons. One sample:

Instead of viewing the Internet as something we connect to we should view it as radiating from us and our devices. Our home networks reach out through community networks and beyond finding paths that work.

It’s akin to shifting our reference from choosing which scheduled train we can take to simply driving our own cars and using whatever path we can navigate.

It’s time we took control over our local transport – just as we own the wires in our homes and the roads in our communities we must also own our local information transports.

At Frankston.com you'll find 159 documents that mention "infrastructure", and all of them challenge common assumptions bound within what Bob aptly calls The Regulatorium (a term that shows up in 64 of Bob's documents).

I interviewed Bob at some length for the current issue of Linux Journal. I'm still kicking myself for not remembering that I took this picture ...

... of Bob at the Computer History Museum last December. The caption under his picture on the museum's wall reads, "For advancing the utility of personal computers by developing the VisiCalc electronic spreadsheet". Thank Bob and Dan Bricklin for that one, and Bob for an achievement of which he is equally proud: helping make home networking happen.

It's important to note that Bob calls the Net "a prototype" and remembers its beginnings as "a class project" back when he was at MIT.

His point: Making infrastructure doesn't have to start with the Net, or with any one given thing. The risk in doing that is playing inside a frame that the telcos and cablecos still own. Soon enough you're paying by the stroll rather than using our own industry to create a new one based on our own abundant connections and smarts.

I've been saying for years that both the model and the destiny of the computing and networking industries is construction. "In a mature software industry", I wrote here, "Microsoft will be no more or less important than, say, Georgia Pacific or Kaufman & Broad." This is not merely inevitable. It is already happening.

What we are going to see, over the next few years, is the growth of a new building industry around the physical infrastructure required to bring the Internet to everybody — one that grows outside the existing connections provided by phone and cable TV carriers, but will also connect to them. The carriers aren't going to make the first move here. They're too anchored to their old telephone- and television-based business models. The smart ones will work with the James Hettricks of the world to get the job done. In the long run, the job will require both the big and the small, the innovator and the disruptor. Consider the natural generativity of the construction business and you start to see what is bound to happen here.

And it's not just "bottom-up". You can't make microprocessors and circuit boards in your basement. Without large backbones and routers we'd still be talking to printers and file servers on networks that remain terminally local. Linux and academic papers may grow on the human equivalent of trees, but real infrastructure requires real capital investment, sometimes in the many billions of dollars. In the vast ecology of real-world business, there is much that can only be done by large commercial and government efforts, usually in concert with one another — especially if the nature of those efforts is inherently infrastructural. Even if you fill your neighborhood with fiber or wireless data paths, you're not going to lay trunks along railroads or under oceans. And even if you could, you would still have to do business with other backbone companies that will wish to be paid for carrying your bits over their pipes. Bob Frankston, David Isenberg and Richard Bennett will find common ground. Because too much will happen for any one angle to take in the whole picture. And there's just too much work to be done.

On the Linux side, we need to take up Jonathan Zittrain's challenge to preserve the generativity of the Net and the devices we connect to it. The other route is toward what he calls an "applianced" future. Already millions of appliances run on Linux. There is a good chance your TV and set top box are among them. They might be hackable, but why bother? They weren't made for that.

The world, however, was made to support hackability. That's the nature of rocks, trees, water, dirt, animal parts and natural materials. It's also true of the natural materials produced by the human mind — the kind we call software. That hackability-support is what gives us infinite varieties of infrastructure.

What we need now is to start understanding new forms of infrastructure on their own terms, and to understand more deeply what infrastructure has been all along.

What is there about the nature of infrastructure — going back long before the 1920s — that can help us understand what Linux, the Net and their like are about?

What do Linux, the Net and other generative forms of infrastructure add to our understanding of the topic itself? Can we align infrastructure and generativity? Or is that too much of a stretch?

Answering these kinds of questions requires examining topics at a depth one cannot plumb just with news coverage, or by framing queries with the parochial interests of categories and factions — including such academic disciplines as computer science, history, political science and economics. We are in new territory here.

Where do we start? One place is the comments here. I'll be interested to see how those go. There are already conversations taking place around this topic. If you're interested in contributing, let me know.

[Note on 21 April 2008] I just did a copy edit on most of this piece, changing the opening to better present the central questions I'm raising here.

______________________

Doc Searls is Senior Editor of Linux Journal

Comments


very good article, thanks

Posted by Kayser

very good article, thanks man, I learned a lot here


What a great essay, Doc.

Posted by Yule Heibel

What a great essay, Doc. I'll have to bookmark and reread several times...

Some thoughts... While I can't speak to the software/open source issue, I've been thinking about infrastructure a lot -- from the perspective of urbanism. When you describe "infrastructure as a hierarchy of physical dependencies," I see how useful this is for rethinking physical and social urban infrastructure. We know that's in trouble (it's crumbling, it's going to be a money sinkhole, etc.), and anyone who has looked at how municipal governments work with (or at odds with) senior levels of government (provincial or state, plus federal levels) knows that silo mentalities totally *rule*. Here in Victoria, we're looking at a $1.2 billion infrastructure project for sewage treatment, and the two levels of government (local and provincial) are feuding because both say that the other side didn't tell them about information that was wanted. Walled garden? It's a bloody fortification... Then there are those infrastructures that are supposed to support social programs, including mental hospitals and detox facilities -- they're not working, either, and our homeless now include not only poor people, but people who should be in some pipeline of institutional support because they're mentally ill or addicted (or both, typically). It all gets off- or downloaded to citizens now, as if we could individually step into the breach, without infrastructural support.

Maybe government is where we need open source most of all -- as a way of thinking and as a way of "architecting" infrastructure.

In The Nature of Economies, Jane Jacobs wrote about settlements (cities), and how they thrive or wither depending on receipt of imports and production of exports. Her point was, however, that it's hardly a mechanical thing, with balance sheets deciding on success or failure. She argued that cities work like nature does, meaning that within what she called "the conduit" (the environment of the actual process that begins with imports going in and ends with exports going out -- the economy in human terms, the ecology in natural terms), imports undergo changes: depending on how they are being "stretched" or "differentiated" (her words) in the conduit, imports or resources expand and multiply. Looking at Jonathan Zittrain's hourglass diagram, I'm thinking, "that's a conduit," and part of what good infrastructure does is help stretch the imports within that environment. As you say, infrastructure isn't the skin, the covering, the pipe, or the wires in the walls: infrastructure should support differentiation and stretching.

Anyway, sorry to be so dreadfully off-topic for what you're actually after, but "infrastructure" is a hot button for me these days. Oh, if only government went in for an open source make-over...

You've given me lots to think about here, thanks!

Much to think about

Posted by Doc Searls

Thanks, Yule. You've made my week. Or perhaps longer.

I am taken lately with the belief that understanding infrastructure is critical not only for building and maintaining civilization's essentials, but for bridging chasms of opinion that make constructive discourse impossible.

A few years ago a Republican friend from Utah said two things that have stuck in my mind. One was, "There are two parts to democracy. Elections and governance. And governance is where the work actually gets done." The other was, "Most people, regardless of political philosophy, just want the roads fixed."

Ah, but there are new roads now. New conduits, as you put it. (Cognitive linguists will tell you that a great many things are understood in terms of conduits, including roads, traffic, communications, and anything else that employs the preposition "through".) I'm hoping that we can start a conversation, and perhaps an academic discipline, around understanding infrastructure. Because I think we have a long way to go. And we barely understand how we got where we already are.


Linux and meaninglessnesesesesesses

Posted by cprise

The Internet is generative because it is a platform with predictable interfaces that bring application developers and users together. The devs have their APIs and the users have UIs and the twain shall always meet.

Linux or LAMP does the same mostly by accident: Because the Windows + Netscape (later IE) combination became ubiquitous, the web browser is assumed to be the central user interface. Meanwhile the highly skilled community of web developers and sysadmins, who either liked or could deal with a certain fluidity, assembled the LAMP components as a chosen platform. So interest in web sites tends to fuel deployment on Linux on the back-end.

But not much else.

Where desktop platforms are concerned, Apple and Microsoft have provided relatively predictable platforms for both their users and application developers (esp. for those exploring a move from the former to the latter group). Linux is invisible here because there is no predictable API (the minimum spec for an LSB system provides little desktop functionality), and, as those Walmart shoppers (and many cellphone users) have experienced, it has little meaning or impact for users' high-level needs, since everything at that level is too heavily customized/balkanized for users to identify or exploit the common thread (which may amount to nothing more than a similar kernel -- again meaningless even to power users).

People want a certain control over their machines. The systems must be like houses with familiar doors, windows and garages, allowing the owners to furnish them to their whim WITHOUT structural intrigues. But "Linux" culture dictates where you, the non-architect, can safely shop for furniture (it's called the "repo") so it will fit through the doorframe. Acquiring furniture independently is frowned upon because no one can agree between distros on minimum door width or even whether the floors should be level. OTOH the 'fun-house' atmosphere attracts a lot of hackish pseudo-architectural types who like to play with piecemeal (but not comprehensive) design variations. But families and small businesses tend to stay away...

Needless to say, when Samantha-user starts to graduate from writing OOo scripts to creating her own applications, she will find the infeasibility of simply moving the finished binary to her friends' other "Linux" systems to be less than inspiring. She will see her Windows and Mac peers tossing programs around between them and surely be compelled to switch.

Even platforms with tiny marketshare can be inspiring enough to seed killer applications. But you have to have a platform. A PC platform with a first-class UI (not a web browser).

Of course, you may believe that the Web is the future for all apps; or even that personal computing is an aberration, and thin clients and terminals are the proper order of things. Or through ingrained bias you may frequently imply such things in your writing without realizing or admitting it to yourself.

So I must ask: If one can't figure out the generativity of the PC revolution, then how can one protect the generativity of the Internet?

Infrastructures for generativity

Posted by Doc Searls

cprise,

Do you think Linux and its communities, in the long run, are incapable of producing "A PC platform with a first-class UI"? How about a mobile device platform?

Do you think that only producers of walled developer gardens (such as Microsoft and Apple) are capable of producing those?

Do you think the "generativity of the Internet" must also rely only on commercial platforms supporting walled gardens?

If your answers are "yes" to those questions, we might not have much to talk about.

Meanwhile, I hope some Linux developers of mobile and PC (not just browser) first-class platforms and UIs take your criticisms here as a helpful challenge.

