Let's do for news what we did for software
There have always been problems with distributing urgent public safety information. These problems show up, over and over, with every hurricane, tornado, flood and wildfire. At this moment in history, problems fall in three areas of responsibility (and, for that matter, responsiveness):
1. The old official channels (radio, TV, newspapers) are scaling back on live news coverage (or on news coverage, period).
2. The new official channels (websites and services, "reverse 911") are still, as we've been saying since 1995, "under construction".
3. The new unofficial channels (cell phones, blogs, RSS feeds, phone trees) are still no substitute for the Real Thing, whatever it will become.
Lately I've been thinking about some simple hacks we can do in #3 that will give some needed assistance to #s 1 and 2 as well.
What got me thinking was the Day Fire, which lasted almost a month. What began as a trash fire ended as the fifth-largest wildfire in California history. By the time it was contained early this week, the Day Fire covered 162,702 acres, or about 250 square miles, an area larger than Chicago. In the middle of its last week, the Day Fire was fought by nearly five thousand people, armed with 226 engines, 45 'dozers, 41 water tenders, 28 helicopters, 9 helitankers and more than 10 air tankers. Its costs so far have exceeded $70 million. As of yesterday (October 6), 831 personnel remain assigned to the fire. (Although it's contained, the fire is not yet out.)
Yet news coverage of the Day Fire was notably minimal even as rivers of dark smoke flowed over the heads of millions, and ash fell like snow. Why?
One reason was geography. The Day Fire burned across the Sespe Wilderness and the southern districts of the Los Padres National Forest, which together comprise one of the largest roadless regions in the continental U.S., even though it's under the northwest approach path to Los Angeles International Airport.
Another reason was the continued decline in the ability, and willingness, of local media to provide serious coverage of breaking news. The Ventura County Star covered the story aggressively. (And still is.) The Los Angeles Times did a good job too. Here in Santa Barbara, the News-Press is currently at war with its editors (25 so far have resigned) and its Day Fire coverage consisted mostly of Associated Press stories.
TV stations confine most live news coverage to news slots, and their coverage, though often vivid, is short on useful details (yet long on advertising and network show promos disguised as news).
There's only one regional news radio station with enough staff to assign reporters to the fire. That's KNX/1070 in Los Angeles. The station had some Day Fire coverage, but it was usually buried amidst coverage of other stuff happening in the country's largest radio market. Ventura's KVTA/1520 did the best it could with minimal staffing. In Santa Barbara, KZSB/1290 is a news station in name only, carrying BBC feeds and audio capsules of stale news from the morning paper, which was written the day before.
As for websites, KNX's is the best of a busy breed. The current trend for media outlets is to garbage up their websites with a zillion links and dozens of graphics and animations. Not to mention popup windows and other annoyances. Newspapers, which might have the most useful and deep information, like to put readers through registration gauntlets or to bury "content" behind paywalls. The Santa Barbara News-Press even puts its current daily news behind a paywall. I'm a long-time subscriber and I've long since given up on making my online subscription work. It's hard to imagine a more reader-hostile website.
For news on demand, everybody including nearly all news organizations large and small looked (and linked) to one source: InciWeb, an "Incident Information System" that aggregates a variety of official sources. These include the U.S. Forest Service (Inciweb's umbrella organization), the National Park Service, the National Oceanic and Atmospheric Administration, the Office of Aircraft Services, the U.S. Fire Administration, the Bureau of Indian Affairs and the Bureau of Land Management. Among others. Its About page explains,
The system was developed with two primary missions: The first was to provide a standardized reporting tool for the Public Affairs community during the course of wildland fire incidents. The second was to provide the public a single source of information related to active wildland fire information.
A number of supporting systems automate the delivery of incident information to remote sources. This ensures that the information on active wildland fire is consistent, and the delivery is timely.
Currently InciWeb is being tested within the U.S. Forest Service, and will be used nationally in 2007 Fire Season.
I won't criticize InciWeb. It's an essential effort, and in many ways it does a great job for a service that's both under development and under stress. It even has RSS feeds.
But it's also important to note that InciWeb failed last week, when the fire was spreading fastest and containment was at 15%. The site went down on Monday and didn't come up until late Thursday evening. And it had already been slow long before Monday's crash.
So Live Web coverage was left mostly up to citizen journalists putting up blog posts and photo streams on the likes of Flickr. If you wanted close-to-live news on the Web about the Day Fire last week, your main sources were the Ojai Post, Robert Peake, OjaiBlog, Bakersfield Californian, Flickr shots tagged 'dayfire', Technorati searches for blogs tagged 'dayfire', Libertatia Lab Reports, Sounding Circle, MaryLu Wehmeier and others, including my own blog. (My favorite pictures of the fire came from Drumwhistles. The most revealing may have been the ones I took while flying over the fires at dawn on September 18.)
So, on September 25th, while the Day Fire was pushing a mushroom cloud of smoke and ash up into the stratosphere, I wrote, "With InciWeb down and the mainstream media covering everything else, we need to collect Southern Californians who are paying steady attention (to the fire)," and added, "I'm going to work on some kind of River of News thing to aggregate and flow Day Fire news from citizen journals to the world. Doesn't mean I'm going to succeed. (I'm no techie, really.) But I'm going to try. Or stop when somebody else beats me to it, and point to that."
Here is how Dave Winer describes the River of News approach: "Instead of having to hunt for new stories by clicking on the titles of feeds, you just view the page of new stuff and scroll through it. It's like sitting on the bank of a river, watching the boats go by. If you miss one, no big deal. You can even make the river flow backward by moving the scrollbar up. To me, this more approximates the way I read a print newspaper -- actually, it's the way I wish I could read a print newspaper -- instead of having to go to the stories, they come to me. This makes it easier for me to use my brain's powerful scanning mechanism. It's faster, I can subscribe to more, and my fingers do less work."
Dave invented the first River of News style aggregator in 1999, and wrote the above in early 2005. Then on August 22 of this year, Dave found a perfect home for the concept: Blackberries, Treos and other hand-helds that can browse the Web but are all but useless for viewing complex, graphics-heavy websites — which happen to compose approximately 100% of the mainstream news outlets. And a lot of blogs, too. With nytimesriver and bbcriver, Dave showed River of News delivery was ideal for the media devices almost every adult carries in their pockets.
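The river metaphor lends itself to a simple sketch. Here's a minimal, hypothetical river-of-news aggregator in standard-library Python: it merges RSS 2.0 feed documents into one reverse-chronological stream, newest items on top. (The function name is mine, and I'm assuming pubDate values carry time zones, as RSS 2.0's RFC 822 dates usually do; this is not how Dave's aggregator is actually built.)

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def river_of_news(feeds_xml):
    """Merge RSS 2.0 documents into one reverse-chronological 'river'.

    feeds_xml: RSS 2.0 XML strings (already fetched, e.g. with urllib).
    Returns (datetime, title, link) tuples, newest first; items with
    no parseable pubDate sink to the bottom of the river.
    """
    items = []
    for xml_text in feeds_xml:
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            title = item.findtext("title", default="(untitled)")
            link = item.findtext("link", default="")
            pub = item.findtext("pubDate")
            try:
                when = parsedate_to_datetime(pub) if pub else None
            except (TypeError, ValueError):
                when = None
            items.append((when, title, link))
    # Newest first; flatten all feeds into a single scrollable stream
    dated = sorted((i for i in items if i[0] is not None),
                   key=lambda i: i[0], reverse=True)
    undated = [i for i in items if i[0] is None]
    return dated + undated
```

The output is just a flat list: render it top to bottom and you have the river, with no clicking into individual feeds required.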
What I wanted for the Day Fire wasn't news from just one source (such as any one newspaper, broadcast station or blog). I wanted all news about the fire from anybody who had anything to say or show: specifically, anybody writing "Day Fire" or using the "dayfire" tag in text or with a photograph.
The next day, David Sifry stepped up and made that happen, with the Day Fire News River. It aggregates feeds of keyword searches for "Day Fire" and tag searches for "dayfire", while also listing a variety of official sources, such as Inciweb.
For my part I created a temporary character called "FireHose", who started a blog with a list of "favorites" that could add a stream of posts to the Day Fire news river. While putting that together, I began to think that "hose" should be a species of "river" that comes into existence for the single purpose of aggregating topic searches about specific incidents: a fire, a hurricane, a flood, a snowstorm, a terrorist attack, whatever. As sites, Hoses of News would persist for archival purposes (no need to take them down), but their purposes would be as temporary as the incidents they address.
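A hose, in other words, is just a river with a topic filter in front of it. Here's a minimal sketch, again in standard-library Python (the function name and the choice of elements to search are my assumptions): keep only the feed items that mention one of the incident's keywords or tags in their title, description or category.

```python
import xml.etree.ElementTree as ET

def hose_filter(feed_xml, terms):
    """Return titles of RSS items that mention any incident term.

    terms: incident keywords/tags, e.g. ["Day Fire", "dayfire"].
    Matching is case-insensitive across title, description and category.
    """
    wanted = [t.lower() for t in terms]
    hits = []
    for item in ET.fromstring(feed_xml).iter("item"):
        fields = (item.findtext(f) for f in ("title", "description", "category"))
        text = " ".join(f for f in fields if f).lower()
        if any(t in text for t in wanted):
            hits.append(item.findtext("title", default="(untitled)"))
    return hits
```

Run every subscribed feed through a filter like this before merging, and the river carries only boats about the incident.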
Now here's the key. It should be easy to create a news hose. So easy, in fact, that any citizen (as well as any official with, say, a fire department) can make one. How do we create that ease?
I think we need companies close to (or in) the blogging business to create easy-to-make emergency Hose of News systems: the online equivalent of those "In case of fire, break glass" things they have in the hallways of schools and hospitals. It wouldn't take much.
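What would breaking the glass actually look like? Something close to filling in a single blank. A sketch, with hypothetical search-feed URL templates standing in for whatever patterns Technorati, Flickr and friends actually expose: given one incident tag, expand a stock list of templates into the new hose's subscription list.

```python
from urllib.parse import quote

# Hypothetical URL templates; real services would supply their own patterns.
SEARCH_FEED_TEMPLATES = [
    "http://blogsearch.example.com/feed?tag={tag}",
    "http://photos.example.com/feed?tag={tag}",
    "http://keywordsearch.example.com/feed?q={tag}",
]

def make_hose(tag, templates=SEARCH_FEED_TEMPLATES):
    """Instantiate a hose: one incident tag in, a subscription list out."""
    return [t.format(tag=quote(tag)) for t in templates]
```

Everything past that one tag is canned, which is the point: the "in case of fire" box needs only one input from the citizen or official who breaks the glass.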
Doc Searls is Senior Editor of Linux Journal