The New Vernacular
We can't talk about software without borrowing the language of construction. We call ourselves builders, designers, architects and engineers. We speak of structures, spaces, objects, frameworks, levels, partitions, elements, components and platforms. We assemble developing software into builds. Finished work takes on the rhetoric of real estate when we build sites with addresses and locations.
So what's going on here, beyond promiscuous linguistic leverage? For a while I've insisted that it means the software industry is maturing into a construction industry—one defined by skilled and reputable practitioners rather than by the sole-source suppliers we call “platform providers”. In a mature software industry, Microsoft will be no more or less important than, say, Georgia Pacific or Kaufman & Broad.
In the construction industry, open source is standard. When building materials and methods aren't secret, there's more to talk about, more people involved in the conversation and more people contributing to the improvement of those materials and methods. This is a bit more low-falutin' than the peer review process that Eric Raymond talks about, but it's the same thing. Peer review happens constantly in every living, growing industry, at every level. Consider Architectural Graphic Standards, a sourcebook for design and construction details first published in 1932. More than half its contents are new or revised every seven years.
The software industry is just a few decades old while construction is as old as civilization itself. This suggests we might have a few things to learn from the senior industry. Richard Gabriel seems to agree. In Patterns of Software (Oxford Paperbacks, 1996) he extols the virtues of building software for “habitability”. His structural ideal is a New England farmhouse:
The result is rambling, but each part is well-suited to its needs, each part fits well with the others....The inhabitants are able to modify their environment because each part is built according to the familiar patterns of design, use and construction and because those patterns contain the seeds for piecemeal growth.
In How Buildings Learn (Penguin Books, 1994), Stewart Brand calls the New England farmhouse a perfect example of “vernacular” construction. He writes:
What gets passed from building to building via builders and users is informal and casual and astute. At least it is when the surrounding culture is coherent enough to embrace generations of experience.
Vernacular is a term borrowed since the 1850s by architectural historians from linguists, who used it to mean the native language of a region....It means common in all three senses of the word—widespread, ordinary and beneath notice.
In terms of architecture, vernacular buildings are seen as the opposite of whatever is academic, high style, polite. Vernacular is everything not designed by professional architects—in other words, most of the world's buildings....Vernacular building traditions have the attention span to incorporate generational knowledge about long-term problems such as maintaining and growing a building over time. High-style architecture likes to solve old problems in new ways, which is a formula for disaster....
Vernacular buildings evolve. As generations of new buildings imitate the best of mature buildings, they increase in sophistication while retaining simplicity.
Is UNIX vernacular? Here's what Neal Stephenson says in his book In the Beginning Was the Command Line (William Morrow & Co., 1999):
The filesystems of UNIX machines all have the same general structure. On your flimsy operating systems, you can create directories (folders) and give them names like Frodo or My Stuff and put them pretty much anywhere you like. But under UNIX the highest level—the root—of the filesystem is always designated with the single character “/” and it always contains the same set of top-level directories:
and each of these directories typically has its own distinct structure of subdirectories. Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins like stones smoothed by a river.
It is this sort of acculturation that gives UNIX hackers their confidence in the system, and the attitude of calm, unshakable, annoying superiority. Windows95 and MacOS are products, contrived by engineers in the service of specific companies. UNIX, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture.
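You can see the vernacular Stephenson describes on any UNIX-like machine; a minimal sketch (the exact set of top-level directories varies from system to system):

```shell
# List the top level of the filesystem. The terse, lowercase,
# abbreviated names (bin, etc, tmp, usr, var and so on) recur on
# virtually every UNIX-like system, though the exact set varies.
ls /
```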
Vernacular architecture is the opposite of what Brand calls “Magazine Architecture”, architecture as art, rather than as craft. Here's the difference, according to Henry Glassie: “If a pleasure-giving function predominates, it is called art; if a practical function predominates, it is called craft.” It's hard to imagine anything more crafty and practical than a command-line interface.
Stewart Brand finds a lot of craft in what he calls “Low Road” buildings, which tend to be “low visibility, low-rent, no-style”. He says, “Most of the world's work is done in Low Road buildings...and even in rich societies the most inventive creativity, especially youthful creativity, will be found in Low Road buildings, taking full advantage of the license to try things.”
So, it's hardly a coincidence that Brand's ideal Low Road building is MIT's Building 20, where “The Tech Model Railroad Club on the third floor, E Wing, was the source in the early 1960s of most of the first generation of computer hackers who set in motion a series of computer technology revolutions (still in progress).”
We tend to think of revolutions as rapid, but the revolution that started in MIT's Building 20 predates Moore's Law, which was first published in 1965. It also predates the software industry by almost a generation. Any way you look at it, the hacker subculture has served as a cultural foundation for computing for a very long time. It has also persisted in a comparatively stable form while commercial fashions and revolutions have come and gone, again and again. I'm not saying UNIX does not have a commercial side; just that its cultural foundations are deeper than commerce.
In The Clock of the Long Now (Basic Books, 1999), Stewart Brand sorts civilization into six layers that change at different rates over time: nature, culture, governance, infrastructure, commerce and fashion/art.
From bottom to top they range from slow to fast. “The fast layers innovate”, he says. “The slow layers stabilize. The whole combines learning with continuity.”
This puts both hacker culture and software commerce into a fresh and interesting perspective. Positioned one level above nature, hacker culture is often concerned with the natural qualities of software. The GNU Manifesto, for example, says free software is “just like air”. Going up one level to governance, we find the hacker culture's obsessive concern with license agreements. These agreements manifest one layer up in infrastructure: software and protocols (a form of both governance and cultural agreement) that increase in value as they approach ubiquity.
I don't think it's a stretch to say that hacker culture has a natural understanding of what makes software truly valuable. We see it manifest in the Internet, which possesses three almost natural qualities: nobody owns it, everybody can use it and anybody can improve it.
These are built into the Net—and into free software development tools and licenses—for everybody and for fundamentally social reasons. That's why the GNU Project says, “people should be free to use software in all the ways that are socially useful.” This is embodied not only in the GNU tools and other free software, but in Linux, BIND, TCP/IP, sendmail, Apache, Jabber and SOAP—all of which (for the most part) nobody owns, everybody can use and anybody can improve.
These are values that do not—and cannot—come from business. They are not commercial values. However, the layer of software civilization we call commerce depends on the social infrastructure that has grown up out of the hacker culture and UNIX in general. This obviously includes the Net but also lots of other practical and ubiquitous stuff that's free as air.
It is important to understand how these layers relate because a lot of misunderstandings and bad decisions get made when, say, commercial interests attempt to impose self-interested infrastructure, or when governance attempts to socialize commerce.
Michael Polanyi says “comprehensive entities [such as civilization] are logical combinations of levels of reality”, and there are principles—boundary conditions—by which each lower level supplies conditions on which upper levels depend. Phil Mullins puts it this way: “A lower level imposes restrictions within which a higher level can come to operate; the lower level establishes boundaries but leaves open possibilities. The higher level cannot be exhaustively described in terms of the lower level...no level can gain control over its own boundary conditions and hence cannot bring into existence a higher level.” This is why the Net didn't create e-commerce, yet e-commerce depends utterly on the Net.
So, while the Free Software movement might appear anticommercial to many in the purely commercial sector of the software industry, in fact, it just isn't very interested in what happens at the commercial level, and is even less interested in what happens way up on the fashion level. There are more enduring, more purely cultural, more natural concerns.
When we look at the boundary conditions that flank infrastructure, we also see why software companies run into problems when they try to civilize their industry from the top down. Microsoft may have been successful at this but only temporarily. These days all of us, including many at Microsoft, are starting to feel software getting civilized from the bottom up, thanks to the Net and the culture that built it.
We see progress today in protocols like SOAP and XML-RPC, which were developed to support publishing on the Web (so, for example, you can write simultaneously for multiple sites other than your own). They're wide open and available to everybody. Dave Winer and his company, UserLand, were involved in both efforts. So were Microsoft and DevelopMentor, a training and education company. When I asked Dave how the efforts went, he said, “Working with Microsoft and DevelopMentor on both projects was the best collaborative development experience I have ever had.”
Microsoft comes from personal computing and always will. But it has to thrive in a world where computing is more and more social. And social computing has been built into UNIX since the beginning.
For software businesses, Craig Burton says, “the real challenge is to create ubiquitous infrastructure while generating shareholder value”. It isn't easy, but it helps to do two things. One is comply with your own engineers, who are busy making infrastructure whether you like it or not. That's what IBM did when it embraced Linux: a phenomenon that was growing like wildfire inside its own enterprise. The other is to recognize what works over the long term. The Long Now Foundation proposes seven guidelines for a long-lived, long-valuable institution:
Serve the long view (and the long viewer).
Foster responsibility.
Reward patience.
Mind mythic depth.
Ally with competition.
Take no sides.
Leverage longevity.
Institutions do learn—even the ones obsessed with art and fashion. Take Apple. Perhaps the most significant item among Apple's many rollouts early this year was the farthest from fashion and, therefore, the least reported. It happened down at the level of governance. After seventy thousand hackers jumped in to improve Apple's BSD-derived Darwin OS, Apple responded by adjusting its source code license in the direction of “the nature of things”. The new license may not be a clone of the GPL but it's a whole lot closer. Chris Bourdon, the Product Manager for OS X, told me, “You can take Darwin and do anything you like. It's there for everybody.”
Of course it is. It's UNIX.
Doc Searls is senior editor of Linux Journal and coauthor of The Cluetrain Manifesto.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
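The .log example above can be written as a single pipeline; a minimal sketch, with the path and the search string (“ERROR” here) standing in for whatever you're actually hunting:

```shell
# Find every .log file under /home, then print the names of the
# files that contain the entry we're looking for.
find /home -name '*.log' -exec grep -l 'ERROR' {} +
```

The `-exec ... {} +` form hands grep many filenames per invocation, which is faster than spawning one grep process per file.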
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide!