Setting TV Free
My 2006-vintage Sony Bravia flat-screen "Full HD" TV has Linux inside. I can tell because it comes with a two-page printout of the GPL, included almost as a warning. "Watch out", it seems to say. "This TV comes infected with freedom." Not that it's worth hacking: you can make breakfast in the time that passes between a click on the remote and a change on the screen. (I'm barely exaggerating here. Switching between the TV's eight HDMI inputs is amazingly slow.) But being a Linux device speaks volumes about what has happened to TV already, because the freedom it contains at the device level also ranges outward from the operating system to the network on which that operating system was born and grew up. That network was, and remains, the Internet.
TV wasn't designed to run on the Internet, and the TV industry continues to fight the drift of video production and consumption in a direction it calls "over the top", or OTT. The bottom is TV's distribution system, better known as cable. That fight is a movie we've seen before, and we know how it ends.
What we call cable today began as CATV, for Community Antenna Television, in the late 1940s. CATV was invented to meet the needs of TV viewers who couldn't get signals. Hollywood didn't have a problem with that, but it was freaked out by what it saw as a new distribution channel for entertainment. At the time, Hollywood had just two distribution channels for what we now call "content". One was movie theaters and the other was commercial TV, dominated by networks and their programs. Back then, there were three major TV networks: ABC, CBS and NBC. The Fox network came along later. PBS was noncommercial and mattered less, but for distribution, it was well aligned with its commercial companions on what was then still TV's "dial". Hollywood correctly saw cable as a threat to that system, because it turned consumers of television into customers for distribution of anything, since there was hardly any limit to what could be distributed on cable's system, especially on channels other than the ones already occupied by TV stations.
So Hollywood fought CATV. It did this by, among other things, forcing employees of movie theaters to wear buttons that said "Fight Pay TV". Joining Hollywood's cause was the Television Accessory Manufacturers Institute, which misspelled itself TAME. In Blue Skies: A History of Cable Television (Temple University Press, 2008), Patrick R. Parsons writes:
Cable, clearly, had no friends in the TV antenna business. TAME widely distributed flyers and press releases in communities considering CATV systems. TAME newspaper advertisements warned, "Here's Why Cable TV Is Bad for Our Community", and "How to Fight Community Antenna Systems".
At the heart of broadcasting from the start has been a scarcity imperative, based on the very fact of limited RF (radio frequency) spectrum. This started with AM radio, which operated on Long Wave (LW) and Medium Wave (MW) in Europe and other parts of the world, and on MW alone in the US, where we call it AM. Here we had channels every 10kHz from 540–1600kHz (later pointlessly extended to 1700kHz). The waves in that band—the 1MHz band—are hundreds of feet long, and are radiated by whole towers. When you see a set of free-standing or insulated guyed towers, you're probably looking at an AM station's transmitter, still operating inside conceptual and legal frameworks developed in the 1920s.
As transmitters and receivers became capable of using higher frequency bands, shortwave was added. Then, as the capacity to operate at still higher frequencies developed, the new technologies called TV and FM radio appeared on what came to be called "low-band" VHF (Very High Frequency) bands: roughly 54–88MHz for TV channels 2 to 6, and 88–108MHz for FM radio. (TV audio also used FM, so channel 6 audio could be heard below the bottom of the FM dial, at 87.75MHz.) When the first TV stations filled up low-band VHF, channels 7 to 13 were added in the high-band VHF range, which runs from 174–216MHz. Adding to the scarcity of these channels was the inability of signals to radiate on adjacent channels without interfering with each other. (Channels 4 and 5, used in New York, Chicago, Los Angeles and other major cities, were next to each other numerically, but had an open guard band between them.) So, in the 1950s, channels 14–83 were added in the UHF (Ultra High Frequency) band, which ranged from 470–890MHz.
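The channel arithmetic behind those band edges is simple, since each US analog channel occupied a 6MHz slot. Here's a small shell sketch (a hypothetical helper, not from the original) that maps a channel number to the lower edge of its slot, using the band edges just described:

```shell
# Map a US analog TV channel number to the lower edge of its 6MHz slot.
# Note the guard gap between channels 4 and 5 (72-76MHz) and the jump
# from low-band to high-band VHF at channel 7.
channel_to_mhz() {
  ch=$1
  if   [ "$ch" -ge 2 ]  && [ "$ch" -le 4 ];  then echo $(( 54  + (ch - 2)  * 6 ))
  elif [ "$ch" -ge 5 ]  && [ "$ch" -le 6 ];  then echo $(( 76  + (ch - 5)  * 6 ))
  elif [ "$ch" -ge 7 ]  && [ "$ch" -le 13 ]; then echo $(( 174 + (ch - 7)  * 6 ))
  elif [ "$ch" -ge 14 ] && [ "$ch" -le 83 ]; then echo $(( 470 + (ch - 14) * 6 ))
  else echo "not a broadcast channel"
  fi
}

channel_to_mhz 6   # 82  (so channel 6 audio sits at 87.75MHz, just below FM)
channel_to_mhz 14  # 470
```

Run channel 6 through it and you can see why its audio lands at 87.75MHz, just under the bottom of the FM dial.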
Delivering television signals to viewers' antennae was a brute-force undertaking. The maximum permitted ERP (effective radiated power) of stations in the US was 100,000 watts on low-band VHF, 316,000 watts on high-band VHF, and 5,000,000 watts on UHF. That was for analog TV, which ended with the "digital transition" in 2009. Digital transmission continues today on high-band VHF and UHF channels, at lower powers that are still quite high. For example, the limit is still 1,000,000 watts on UHF. Higher frequencies require higher powers because their shorter waves propagate less well across terrain and through structures. This is why height matters even more than power for over-the-air TV and FM. When you see freakishly tall towers standing out in the middle of nowhere, and the tops of high buildings and mountains bristling with antennae, you're looking at TV and FM transmitters.
Cable obsoleted all these concerns for most TV viewers, especially because cable added many more channels than were possible over the air. So did satellite TV, which boasted even more channels than were possible through cable. Between cable and satellite, over-the-air (OTA) TV became almost completely obsolete, even with digital transmission. Today, OTA serves less than 10% of the viewing population. One reason is that there is far more to see on cable and satellite. Another is that digital transmission tends to fail without clear line-of-sight signal paths between receivers and transmitters. Digital TV (DTV) also isn't designed for mobile use, so having a TV receiver in your phone or pad isn't a happening thing—especially since you're already getting video over the Internet, and from infinitely more sources. And that's what Hollywood and cable are fighting today.
You need to understand Hollywood and cable as one thing. Comcast's purchase of NBC a couple years back simply clarified what was already a fact. Hollywood and cable today are also of one mind in wanting to control the distribution of video content, and of the viewing "experience", which consists, as it has since the 1920s, of "tuning" among a finite number of "channels". The Internet threatens the old channel-based TV scarcity model, and the brute-force imperatives of its distribution system, by offering à la carte choices beyond calculation. Against this fate, Hollywood and cable together are no less opposed than Hollywood and the antenna makers were to CATV's rise in the 1960s. The difference is that cable is also in the business of providing Internet access as a service, which puts it on both sides of a battle between the past and the future.
On the side of the future, the biggest video player is Google, through YouTube. And, since mobile video comes to you mostly over the Internet, Google is the biggest player there as well. But the Net doesn't have channels in the old-fashioned radio and TV sense. Even the term "content" fails to contain the scope of what the Net supports for communications and the sharing of data. This means Google doesn't control video on the Net. Nor does anybody.
TV over IP—the Internet Protocol—is also called IPTV. That's what you get "over the top" of cable. With IPTV, some sources look like TV channels, and some even carry old analog-era channel numbers. But they are TV only to the degree that they carry forward legacy branding and business models, such as advertising and subscription fees. Their new context, however, is the uncontained abundance opened by the Internet Protocol, and by the capability of devices running Linux and other abundance-loving operating systems. This abundance is terminal for television.
So, if we look back on TV's history, we see three eras: first, over-the-air broadcast (TV 1.0); second, cable and satellite (TV 2.0); and third, TV over IP (TV 3.0).
We are in the third of those now: version 3.x. And, because TV v3 lives inside Internet v1, that will be as far as TV gets. There will be no TV 4.0.
In early May 2013, the Wall Street Journal carried a story titled "ESPN Eyes Subsidizing Wireless Data Plans" (http://online.wsj.com/article/SB10001424127887324059704578473400083982568.html). The gist: "Under one potential scenario, the company would pay a carrier to guarantee that people viewing ESPN mobile content wouldn't have that usage counted toward their monthly data caps." I can't help but wonder what ESPN's actual agenda here might be. Such a plan would not only violate the principle of network neutrality—that the network itself should not favor one kind of data, or data producer, over another—but turn all competing content producers into instant network neutrality activists. More likely ESPN, the most powerful program source in the cable world, senses which way the wind is blowing, and it's against broadcasting's old models. So it wants to shake things up a bit, and hasten history along.
It is not a coincidence that Senator John McCain introduced the TV Consumer Freedom Act (http://www.scribd.com/doc/140433670/TV-Consumer-Freedom-Act) at around the same time: a move Business Insider said "would dismantle cable as it's currently constructed". In a statement, McCain put his case this way:
This legislation has three principal objectives: (1) encourage the wholesale and retail "unbundling" of programming by distributors and programmers; (2) establish consequences if broadcasters choose to "downgrade" their over-the-air service; and (3) eliminate the sports blackout rule for events held in publicly-financed stadiums.
For over 15 years I have supported giving consumers the ability to buy cable channels individually, also known as "à la carte", to provide consumers more control over viewing options in their home and, as a result, their monthly cable bill.
The video industry, principally cable companies and satellite companies and the programmers that sell channels, like NBC and Disney-ABC, continue to give consumers two options when buying TV programming: first, to purchase a package of channels whether you watch them all or not; or, second, not purchase any cable programming at all.
This is unfair and wrong—especially when you consider how the regulatory deck is stacked in favor of industry and against the American consumer.
Like ESPN, McCain is watching the shifting tide of consumer sentiment. Cable's customers hate bundling, and cable itself. They get à la carte on the Net, and on devices native to the Net, and they want that for their TV channels and shows as well.
Once you unbundle TV and make it à la carte, you have nothing more than subscription video on the Net. There will still be lots of "channels" and "shows", and you will still pay for them. But the context will be IP, not TV.
This means there won't be a TV 4.0 because TV 3.0—TV over IP—will be the end of TV's line. And, after that happens, Hollywood and its buddies will discover, once again, that there is a lot more opportunity in a big new world they don't control than in a small old world they used to control.
In its locked-up little heart, my old flat-screen knows it carries the DNA of an IP TV. That DNA long ago decided the direction in which TV would evolve, whether Hollywood and the cable industry want it that way or not.
Doc Searls is Senior Editor of Linux Journal
Practical Task Scheduling Deployment
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
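That erector-set combination is easy to show. Here's a minimal sketch of the tool described above; the directory and search pattern are placeholders, not anything from the original:

```shell
# Find every .log file under a directory and list the ones containing a
# given entry -- the find-plus-grep combination described above.
search_logs() {
  dir=$1
  pattern=$2
  find "$dir" -name '*.log' -type f -exec grep -l "$pattern" {} +
}

# For example, to list logs recording authentication failures under /home:
# search_logs /home "authentication failure"
```

Each piece does one job: find selects the files, grep searches them, and -exec ... {} + strings them together efficiently by batching file names onto as few grep invocations as possible.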
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantage of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.