Toward the Anticommons
Under my desk you'll find three computers--reasonably state-of-the-art Linux and Win2K systems and a more elderly Mac. One of my technology literacy goals involves keeping all three of them chatting happily, a networking feat that Linux does much to enable. Lately, though, I've been messing around with Win2K's desktop. Specifically, I've been trying to determine whether it's possible to liberate oneself from the Windows interface and customize the desktop in the way that's so easy, and so much fun, when you're running the likes of X, GNOME and KDE. So, I've found myself back in the world of commercial software after a Linux-induced hiatus of roughly four years--let me tell you, it's a shock.
It's not just the rampant commercialism, even though I've forked over so many $10, $25 and $50 registration fees that I'm starting to understand what Judge Sanders Sauls (of Florida fame) meant by the phrase "nibbled to death by ducks". It's the patent or patent-pending notices. There's hardly a utility I've downloaded that isn't based on some ostensibly gee-whiz algorithm that has either won a US patent or is about to. The spatial 3-D audio enhancer for WinAmp, I'm told, is based on a "series of patented algorithms". There's a patent pending for a program that lets you jot down notes about web sites you visit, and then automatically displays the note when you revisit the site. A compression program's patented technology enables you to view the contents of compressed files without actually decompressing them.
If you're muttering in disbelief at this point, welcome to the club. What's going on? It's simply that the US Patent and Trademark Office (PTO), egged on by a patent-friendly US federal judiciary, is handing out too many broadly phrased software patents (for examples, visit http://www.bustpatents.com). Claiming that they have no reliable way to ascertain the existence of prior art (previous invention) in software, overworked PTO examiners are granting tens of thousands of software patents annually--and among them are patents granting monopoly rights to technologies clearly invented by others and in widespread, public use.
What's the danger in this situation? Is it really so bad? Proponents of strong intellectual property protection argue that a flawed patent system is better for innovation (and, therefore, for global competitiveness and social welfare) than no system at all, but I'm not so sure. In this article, I'll highlight the mounting evidence--reliable, quantitative evidence backed by rigorous mathematical models--that the software patent avalanche poses a major threat to the innovativeness (and, therefore, the competitiveness) of the US software industry. Other countries, notably those of the European Union, haven't yet followed the US lead--and they could find themselves holding a significant competitive advantage if the US system indeed proves dysfunctional.
Proponents of strong intellectual property protection believe that any patent system, even one as flawed as the one that's dishing out tens of thousands of software patents annually in the US, is better than none at all. If you reply that the US software industry developed quite nicely without such protection, they'll respond that the industry would have done even better if strong patent protection had been available. Sure, there are grounds aplenty for arguing that the indiscriminate issuance of broadly phrased patents is a seriously bad thing; for example, Boston University law professor Maureen O'Rourke warns of the development of an anticommons, in which "rights are held by so many different patentees that the costs for anyone to accumulate all the required licenses to enable production [are] prohibitive" (O'Rourke 1999). In the absence of rigorously researched evidence, though, very few people will change their minds.
So where is the evidence? Until recently, most of the studies investigating the relationship between patents and technological innovation have supported a strong patent system. However, many of these studies incorporate highly questionable assumptions. For example, Rose (1999) argues that patent applications increased significantly during those periods when the courts generally looked on patents favorably (1892-1930 and 1983-present). One could therefore argue that a strong patent system encourages innovation. As Rose concedes, however, the association between patent-friendly courts and increases in patent applications could be produced by a third factor, such as economic boom and bust cycles.
Only recently have scholars adopted a finer-grained approach, one that examines individual industries in light of their distinctive characteristics. In a recent working paper, James Bessen and Eric Maskin (1999) observe that research and development in the semiconductor and software industries is both sequential (one invention paves the way for the next) and complementary (several potential innovators pursue kindred but slightly different development pathways). Socially valuable innovation may not occur until one of the contestants pushes the process to a marketable conclusion.
In industries characterized by sequential and complementary innovation, Bessen and Maskin argue, a company is more likely to prosper--contra the claims made by advocates of strong intellectual property protection--if other companies imitate its products. The result may well be an innovation that "lifts all the boats" and substantially increases the chance that the imitated company will come up with a profitable innovation in the future. In such industries, strong patent protection may be counterproductive, as was commonly recognized in the early years of the semiconductor industry. A Bell Laboratories executive, commenting on the firm's 1956 decision to drop all remaining license fees for its transistor patents, stated: "We realized that if this thing [the transistor] was as big as we thought, we couldn't keep it to ourselves and we couldn't make all the technical contributions. It was to our interest to spread it around. If you cast your bread upon water, sometimes it comes back angel food cake" (Tilton 1971, cited in Bessen and Maskin 1999). In the semiconductor and software industries, Bessen and Maskin conclude, the path to success seems counterintuitive--you gain by encouraging your competitors to imitate your products--but there's plenty of historical and theoretical evidence that this is, indeed, the case (see Bessen and Maskin's paper for the details).
So where do patents come in? Bessen and Maskin's conclusions suggest that, in industries characterized by sequential and complementary innovation, everyone's interest is served when strong patent protection is not available; the most favorable climate is one in which outright imitation doesn't bring the threat of a courtroom battle (or a demand for licensing fees).
In reply, patent proponents will surely insist that there's no necessary conflict between technological innovation and a strong patent system; indeed, they say, the patent system is actually designed to enable precisely the sort of imitation that Bessen and Maskin discuss, but under circumstances in which innovators are rewarded rather than victimized by their imitators. Imitation is part of the patent system; when you receive a patent, you're forced to disclose the science or technology that underlies it. In the pharmaceutical industry, such disclosures and licensing arrangements enable competing firms to develop non-infringing improvements that have steadily and significantly advanced the state of the art, with (putatively) enormous benefits to the public.
But the picture changes dramatically when innovation is achieved only at the cost of repeated imitative sequences. Designed to examine the incentives to innovate in an industry in which sequential innovation is the norm, Bessen and Maskin's mathematical model shows that strong patent protection is dysfunctional over the long run. Here's why, in a nutshell: As the cycles of innovation accumulate, so do the costs and inefficiencies introduced by licensing and patent-based imitation. Before long, firms lose the incentive to innovate. Prices rise, research and development drops off and, ultimately, the public pays the price. In short, Bessen and Maskin's model shows that indiscriminate software patents lead precisely to the anticommons--and technological stagnation--predicted by O'Rourke.
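The mechanism is easy to see in a deliberately simplified toy calculation (my own illustrative numbers, not Bessen and Maskin's actual model): if the innovator in round k must license the k patented inventions its work builds on, licensing costs grow with the depth of the sequence while the payoff of a successful innovation stays flat, and eventually the net return on innovating goes negative.

```python
# Toy sketch, NOT Bessen and Maskin's model: hypothetical numbers showing how
# per-round licensing costs compound over sequential innovation cycles until
# the marginal payoff of innovating turns negative.

ROUNDS = 10          # sequential innovation cycles
BASE_PAYOFF = 100.0  # value a firm captures from one successful innovation
LICENSE_FEE = 15.0   # cost per license under strong patent protection

def cumulative_payoffs(license_fee):
    """Return the running net payoff of innovating, round by round.

    In round k the innovator licenses the k earlier patented inventions
    its work builds on, so costs rise linearly with the depth of the
    sequence while the payoff per innovation stays flat.
    """
    payoffs = []
    total = 0.0
    for k in range(ROUNDS):
        net = BASE_PAYOFF - license_fee * k  # k prior patents to license
        total += net
        payoffs.append(total)
    return payoffs

open_regime = cumulative_payoffs(license_fee=0.0)     # free imitation
patent_regime = cumulative_payoffs(LICENSE_FEE)       # strong patents

for k, (a, b) in enumerate(zip(open_regime, patent_regime)):
    marginal = BASE_PAYOFF - LICENSE_FEE * k
    flag = "  <- incentive to innovate is gone" if marginal < 0 else ""
    print(f"round {k}: open={a:7.1f}  patented={b:7.1f}{flag}")
```

With these made-up parameters the two regimes start out identical, but under strong patents the cumulative payoff flattens and then falls once licensing costs exceed the payoff of a new innovation; the real model derives this dynamic formally rather than by assuming the numbers.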
Does this model accurately mirror reality? Consider this: the availability of software patents has done little to encourage research and development among the leading software patent holders; in fact, R&D spending has held steady or declined since software patents became available.