We need to protect the freedoms in which Linux was born and grew up.
I've been with Linux Journal since it was a gleam in Phil Hughes' eye, back in 1993. Phil's original plan was for something he called "a free software magazine". I was one of the friends Phil recruited to think and talk, mostly by e-mail, about how to make the magazine happen. The project was pretty far downstream when Phil sent the whole thing sideways with five words: "There's this kid from Finland...." That was the first I'd heard of Linus, or of Linux. But Phil was one of the world's experts on UNIX (having fathered many UNIX publications in previous years), and he was convinced that Linux was exactly the free operating system the world was waiting for. He was right.
And so Linux Journal was born, in March 1994, just as Linux itself arrived at version 1.0. Its first Editor in Chief was Bob Young, who knew almost nothing about Linux when Phil recruited him. ("He was selling circuit boards or something from a booth in the back of a tradeshow", Phil said.) Not long after that, Bob left to start a Linux company of his own, called Red Hat.
The first piece I wrote for Linux Journal was an interview with Craig Burton for an insert called Websmith. Craig sought me out, because he wanted to alert the Linux folks to LDAP, which he said was throwing a monkey wrench into Microsoft's plans to do for networked directories what it had done for desktop operating systems. Craig was right, and the wrench worked.
I started writing full time for LJ in 1998, covering the open-sourcing of Netscape's browser (now known as Firefox) and the creation of its new parent, Mozilla.org. This coincided with the birth of the open-source movement and the dot-com explosion, for which Linux itself was ground zero. The biggest IPOs of 1999 (a record IPO year) were Red Hat, Andover (which had earlier acquired Slashdot) and VA Linux (which later acquired Andover). Linux Journal also had offers at the time to sell out, but Phil turned them down. If he had said yes, some of us (especially Phil) would have scored big, but Linux Journal would have been long gone by now.
Back then, most of the staff worked in our Seattle headquarters. I worked remotely from California or wherever else I happened to be, and would fly in every couple months for meetings and work, and enjoyed it totally. We hit all the Linux tradeshows, plus O'Reilly's OSCon and Emerging Tech conferences. LJ was hot stuff, and so were our advertisers, most of which were venture-funded dot-com players. Then, when the crash hit in 2000, most of those advertisers vanished, leaving nobody to bill, sue or get on the phone. And then, after the attacks on 9/11/2001, companies everywhere dropped all kinds of discretionary expenses, including travel and advertising, and that hit us hard too. (I got an extra whammy by losing every one of the speaking gigs that were lined up at the time.)
Then, over the next few years, the Web became a "content delivery platform" for literally millions of blogs, sites that were "publishers" in name only, and "social networks", such as Facebook, Twitter, Instagram, Tumblr and the rest. So, while Linux Journal eagerly covered open-source CMSes (content management systems) like Drupal and WordPress, it also had to compete in a world filled with abundant low-grade "content" or worse: editorial matter scraped and republished, often without attribution, from legitimate sources (including Linux Journal), just to game Google and other advertising systems. That Linux Journal is still alive, and thriving, is testimony to amazing leadership and fortitude by Carlie Fairchild (our Publisher), Jill Franklin (our Executive Editor) and everybody else on our masthead.
But that's just us. There are bigger battles going on that I want to take this anniversary opportunity to talk about. One is over the future of journalism. The other is over the future of the Internet as an environment for developing and deploying Linux and other open platforms like it. Both are battles over the same organizing principle.
For the past year and a half, I've been a visiting scholar at the Arthur L. Carter Journalism Institute at NYU. My main work there has been helping professor Jay Rosen (who stars on Twitter as @JayRosen_NYU) with Studio 20, which is defined as "a consultancy that gets paid in problems". In academic terms, the class is clinical: students work on real-world problems with real-world publishers. During my time there, Studio 20 students have consulted Fast Company, Pro Publica, The Wall Street Journal, Quartz, Pando Daily, Syria Deeply, ABC News, TimeOut New York, Atavist, Al Jazeera America, DFM Thunderdome and others. Here are a few of the lessons I've gathered over the course of that time, both from the class and from experience with Linux Journal and other media:
The future of journalism is what Jay Rosen calls networked, Andrew Leonard calls open-source, and Dan Gillmor calls We the Media. Everybody with a stake in the output contributes input. While this comprises a kind of model, it is far from being a complete system. The media between individual contributors—telephony, texting, e-mail, blogging and postings on "social" and other media—are all fluxy and provisional. You use whatever works today, which might not be what worked yesterday or will work tomorrow. The old system was as solid and vertical as an office building—and mostly happened inside of buildings filled with paid staff working on finished pieces for publishing or airing at specific dates and times. The new system is all scaffolding, all the way down, with very few people getting paid for the work they do.
Most professional journalists who had work when the Web was born are out of the business, or under-employed within it. Many work for little or no money. Crazy as it may seem, they keep working because they believe the world needs what they do. (Rollo May once wrote that writers differ from other artists in this one significant respect: they suffer the illusion that the world really needs to hear what they have to say.) In this respect, journalists are a lot like Linux kernel hackers.
The portfolio of helpful skills for journalism today goes far beyond writing, photography, video and the ability to dig into a subject. I was amazed to hear new graduate students in journalism, when asked what skills they bring to the table, say stuff like "Ruby, Python and PHP". When we went around the room introducing ourselves on the first day of Studio 20, I heard only one student say "writing" and one say "investigative reporting". The rest all talked about their skills with software tools and services.
"Direct response" advertising is all the rage in digital media. This stuff might be called advertising, but it is instead directly descended from direct mail, better known as junk mail. It works on the assumption that surveillance-fed "big data" mills can give individuals an ideally personalized "advertising experience". For whatever good it does (such as keeping publishers alive), it is also why the most popular browser extensions and add-ons are ones that block ads and tracking. It also models spying for the NSA. In The Intention Economy, I call it a bubble, and I stand by that claim. If you want to know more about why it is doomed, read Don Marti, our former Editor in Chief, who is doing the world's deepest and most prophetic thinking and writing on the topic.
The subscription model is stronger than ever. Interesting fact: when Linux Journal went all-digital (dropping print), we lost a few subscribers but gained many more. This was, and remains, a Good Thing. But it's also one aspect of another thing....
We no longer own our stuff. We just have limited rights to use it. This is true of music, movies, books and countless apps on mobile phones and tablets. In a de facto sense, it's even true of much of the hardware we depend on, every day. Look at what it actually says in the terms and conditions you accept when you buy your smartphone or tablet, and the software that runs on it—even the stuff that costs nothing. Also look at who is snarfing up your usage data and what's being done with it. (If you can tell at all. Most of it is behind walls you can't penetrate.) Then consider this fact: there are few generic white-box phones or tablets. The existence of those in the computer market made both Linux and the Internet possible.
The commercial world has turned into a forest of silos. Every "loyalty program", every subscription system (ours included), and every Web site and service that requires its own login and password is a silo. Every one of these silos exists for the convenience of those who maintain it, and every one compounds the inconveniences suffered by the individuals who need to maintain countless separate "relationships" contained inside all these silos. Implicit in this "system" is the assumption that a captive customer or user is more valuable than a free one.
Next, the Internet.
Doc Searls is Senior Editor of Linux Journal
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
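That log-searching example can be strung together in one line from exactly the two tools mentioned. Here's a minimal sketch; the directory and search term are illustrative stand-ins (a sandbox under /tmp rather than the real /home), so the example is self-contained:

```shell
# Build a small sandbox to search (stand-in for /home).
mkdir -p /tmp/logdemo/alice /tmp/logdemo/bob
echo "ERROR: disk full" > /tmp/logdemo/alice/app.log
echo "all quiet"        > /tmp/logdemo/bob/app.log

# find locates every .log file under the directory; grep then
# searches each one for a particular entry. -H prefixes each
# matching line with the file it came from, and the {} + form
# passes many filenames to a single grep invocation.
find /tmp/logdemo -name '*.log' -exec grep -H "ERROR" {} +
```

The same pattern scales straight up: swap in `/home`, a different glob, or a different pattern, and the two small tools combine into exactly the log-auditing tool the paragraph describes.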
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
- SUSE LLC's SUSE Manager
- My +1 Sword of Productivity
- Murat Yener and Onur Dundar's Expert Android Studio (Wrox)
- Managing Linux Using Puppet
- Non-Linux FOSS: Caffeine!
- Doing for User Space What We Did for Kernel Space
- SuperTuxKart 0.9.2 Released
- Google's SwiftShader Released
- SourceClear Open
- Parsing an RSS News Feed with a Bash Script
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide