It's Elemental—Natural Advantages
I'm writing from the business class cabin of a United 777 en route from Chicago to Los Angeles. To my left is an LCD display that pulls out of the armrest. On screen is a map that tells me what I'm seeing out the window here on the right side of the plane.
The map scrolls a series of views, like you get in a flight simulator program. Right now it says we're heading toward the Rockies across Fort Collins, north of Denver, Colorado. About 30 miles to the right is Cheyenne, Wyoming, a flat pattern of streets and buildings surrounding the runways of an airport. The town spreads around a convergence of highways and railroads, drawn in faint lines across a landscape worn flat by almost constant winds. The story of the land's geology is told again by snow, which appears to be swept poorly by a gigantic broom, brushing northwest to southeast across endless stretches of ranches and farms.
We pass over the Front Range, the Medicine Bows, the Sierra Madres. In the distance I see the Big Horns, the Uintas—all rough features embossed on the sky above by forces below. Not many millions of years ago much of the scene out this window would have been as flat as Nebraska, but as the mountains came up under the plains, the soft soils were “de-roofed”, perhaps more by wind than by the rivers whose work is far more obvious. Even from up here you can see why they say a Wyoming weathervane is an anvil on a chain.
I take more than a passing interest in all this because I've been steeping myself for the last several years in the works of John McPhee, who does for geology what Shakespeare does for love—except the virtues of geology are not so self-evident. To McPhee, no rock has a story too dull to tell, which he proves, page after page, book after book. McPhee's series of books on American geology was only slightly condensed in 1998 into one fat work titled Annals of the Former World. It won the Pulitzer it deserved. My favorite McPhee book is Rising from the Plains, which weaves two tales into one: the story of Wyoming's deep past and that of geologist David Love, who has lived through the last 89 years of it. “A geological map is a textbook on one sheet of paper”, McPhee writes. And David Love, who grew up on a hardscrabble ranch in the very center of the state, was the primary author of both the 1955 and 1985 editions of the Geological Survey's Wyoming maps. He researched them mostly on foot and guesses he has spent a quarter of his life sleeping outside.
What draws me to people like McPhee and Love is the sense of grounded perspective they give us on a topic that should become increasingly fundamental as both Linux and the computer industry mature. That topic is infrastructure—or, to use a label I prefer, interstructure. I'm talking here about our base-level computing and communications environments. Linux is down there, sitting on top of the deeper and more universal environment we call the Internet. It's infrastructural stuff. Every piece of code we add or change lithifies into solid material we use to build the civilized world.
In geology the term competent refers to rock that's dependable. You can build a house on it or with it, and you can trust that it won't break if you climb a steep surface of it. It also has nothing to hide about itself. The same goes for code. Infrastructural code is naturally competent. It is also open to both examination and improvement. The intellectual and creative processes by which we improve infrastructural code are no less natural than the geological forces that turn granite into gneiss, limestone into marble and peridotite into serpentine.
Competence has another aspect. Here's David Love: “Human environment, good and bad, starts with the rock, coupled with the other two major necessities, water and air. Ruin one of these three basic essentials and humanity is in deep trouble.” Now substitute “Internet” for “rock” and “computing” for “humanity”, and Love's point starts to come home.
Infrastructure is what we depend on. And because it is naturally common and abundant, a large number of us understand how it works and what it's good for.
Explaining why the government of China decided to create its own Linux distribution, Red Flag, Matei Mihalca, head of Internet research for Asia-Pacific at Merrill Lynch, points to Linux's “transparency”. This is the inherent infrastructural advantage Linux enjoys over Windows. Even if Windows becomes 100% reliable, the reasons why will remain opaque as long as the source code remains closed. Build all you want with it, but don't ask it to serve as bedrock.
This doesn't mean there's no room for commercial developers in the business of making infrastructural bedrock. Dave Winer, the commercial developer behind SOAP and XML-RPC (both open protocols), says, “Ask not what the Internet can do for you, ask what you can do for the Internet.” Constantly improving Linux is one answer to that question.
Now the little map tells me we're starting to head past Las Vegas, which looks like a waffle pattern etched almost decoratively on the flat desert floor. In Basin and Range, John McPhee explains that the mountains of Nevada were mostly formed by extension: the distance between Salt Lake City and San Francisco is increasing, opening up. This started recently, in the Miocene, only about eight million years ago. The crust here has been spreading and breaking apart, and the heavier edges of each block have been sinking into the mantle while the lighter edges have floated up to form mountain ranges. Between them basins have opened and filled with erosional debris.
Many of Nevada's basins were recently also at the bottom of Lake Lahontan, a body of water as blue and subarctic as any in northern Canada. Many of today's ranges were islands in Lahontan's midst. Lahontan evaporated as the Sierra Nevada pushed out of California's deeps, casting a rain shadow to the east and starving Lahontan and its surrounding lakes of water. The last ice age ended ten thousand years ago, but the Earth continues to warm, leaving vast dry puddles where Lahontan and other Nevada lakes used to be. Las Vegas now reposes, oblivious to the coming deposits of mountain ranges eroding all around it.
As we begin our descent into Los Angeles, I find myself thinking that in the long run—in the period required for computing infrastructure to lithify—the real fight will not be between Linux and Windows, but between those who respect the nature of infrastructure and those who don't. Those who respect it see the Internet as the only platform worthy of the noun. Those who don't respect it see the Net as yet another plumbing system. Our fight with them will be over regulation intended to protect business models that require control over a plumbing system whose efficiencies the Internet threatens to render obsolete. But we'll win, because nature will be on our side.
Doc Searls is senior editor of Linux Journal.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as one that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem always to have the right tool for the job.
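That find-plus-grep combination can be sketched in a couple of lines. The directory and the search string here are invented for illustration (a temporary directory stands in for /home so the example is self-contained):

```shell
# Build a tiny sandbox so the example can run anywhere.
dir=$(mktemp -d)
printf 'started\nERROR: disk full\n' > "$dir/app.log"
printf 'all quiet today\n'           > "$dir/quiet.log"
printf 'ERROR: not a log\n'          > "$dir/notes.txt"   # wrong extension, should be skipped

# find selects the .log files; grep -l prints only the names of
# files that actually contain the entry we're looking for.
find "$dir" -name '*.log' -exec grep -l 'ERROR' {} +
```

Only app.log is printed: notes.txt fails the -name test, and quiet.log fails the grep. Swap in /home and your own pattern for real use.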
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
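For anyone who hasn't lived in a crontab lately, a cron schedule is just five time fields and a command per line; the backup script path below is hypothetical:

```
# min  hour  day-of-month  month  day-of-week  command
  0    2     *             *      *            /usr/local/bin/nightly-backup.sh
```

That terseness is cron's charm, and it is also why the "is it enough?" question comes up at all.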
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.