Open Source in MPEG

The Convenor of MPEG, Dr. Chiariglione, gives the history of the Moving Picture Experts Group and explains the characteristics of the MPEG open-source software process.
The First MPEG Standards

My work experience has been in a telecommunications research establishment. The telecommunications industry used to be characterized by considerable innovation in the network infrastructure, where no investment was spared, and by reluctance to invest in terminal equipment. This was partly because terminals were alien to its culture (even though the more enlightened individuals were aware that unless there were new digital terminals there would not be much need for network innovation), and partly because the terminal was technically and legally outside its competence. The attitude was “Let the manufacturing industry do the job of developing terminals.” Unfortunately, the telecommunications manufacturing industry, accustomed to being pampered and to the low-risk business of telco orders grounded in solid CCITT standards, had no desire to invest in something based on the whims of end users it did not understand. The consumer electronics industry, which knew end users better and was accustomed to making business decisions based on its own judgment of a product's validity, still considered telecommunications terminals outside its interest. This explains why, at the end of the 1980s, there was virtually no end-user equipment based on compression technologies, with the exception of facsimile. To make cheap and small terminals, one would have needed ASICs (Application-Specific Integrated Circuits) capable of performing the sophisticated signal-processing functions demanded by compression algorithms.

I saw the attempts being made by both Philips and RCA in those years to store digital video on CDs for interactive applications (called CD-i and DVI, respectively) as an opportunity to ride on a mass market of video compression chips that could also be used for video communication devices. What was required was the replacement of the laborious and unpredictable “survival-of-the-fittest” market approach of the consumer electronics world with a regular standardization process.

MPEG-1

So MPEG started in January 1988, with the addition a few months later of a mandate for audio compression and for the function needed to multiplex and synchronize the two streams (called “systems”). In four years the first standard, MPEG-1, was developed. Interestingly, neither of the two original target applications, interactive CD and digital audio broadcasting, is currently a large user of the standard (video communication has not become too popular either). On the other hand, MPEG-1 is used by tens of millions of Video CD and MP3 players. One feature of MPEG-1 is remarkable: it was the first audio-visual standard that made full use of simulation for its development. By way of contrast, the laboratory at which I worked had taken part in the development of a 1.5-2Mbps videoconference codec using three 12U racks and minimal support from computer simulation. Even more significant for its future implications was the fact that MPEG-1, a standard in five parts, has a software implementation that appears as part 5 of the standard (ISO/IEC 11172-5).

MPEG-2

In July 1990, MPEG started its second project, MPEG-2. While MPEG-1 was a very focused standard for well-identified products, MPEG-2 addressed a problem everybody had an interest in: how to convert the 50-year-old analogue television system to a digital, compressed form in such a way that the needs of all possible application domains were supported. This was achieved by developing two system layers. One, called the MPEG-2 Transport Stream (TS), was designed for the error-prone environments (such as cable, satellite and terrestrial channels) targeted by the transmission application domains. The other, called the MPEG-2 Program Stream (PS), was designed to be software-friendly and was used for DVD. The idea was that MPEG-2 would become the common infrastructure for digital television, something that has indeed been achieved if one considers that at any given moment more bits are carried by MPEG-2 TS than by IP. The title of the standard, “Generic Coding of Moving Pictures and Associated Audio”, formally conveyed this intention. By the time MPEG-2 was approved (November 1994), the first examples of real-time MPEG-1 decoding on popular programmable machines had been demonstrated. This was, if any were needed, a further incentive to continue the practice of providing reference software for the new standard (ISO/IEC 13818-5).
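The design difference between the two layers can be made concrete. Below is a minimal C sketch, written for this article rather than taken from the reference software, of how a demultiplexer might read the header of one 188-byte Transport Stream packet: the fixed packet size and the 0x47 sync byte make it easy to resynchronize on an error-prone channel, and the 4-bit continuity counter exposes lost packets. The field layout follows ISO/IEC 13818-1, but the type and function names are invented here.

```c
#include <stdint.h>
#include <stdio.h>

#define TS_PACKET_SIZE 188  /* fixed size aids resynchronization after errors */
#define TS_SYNC_BYTE   0x47

/* Fields a demultiplexer needs from one TS packet header. */
typedef struct {
    int transport_error;   /* set when the channel decoder could not correct errors */
    int payload_start;     /* payload_unit_start_indicator */
    unsigned pid;          /* 13-bit packet identifier selecting the elementary stream */
    unsigned continuity;   /* 4-bit counter used to detect lost packets */
} ts_header;

int ts_parse_header(const uint8_t *pkt, ts_header *h)
{
    if (pkt[0] != TS_SYNC_BYTE)
        return -1;                          /* out of sync: rescan for 0x47 */
    h->transport_error = (pkt[1] >> 7) & 1;
    h->payload_start   = (pkt[1] >> 6) & 1;
    h->pid             = ((unsigned)(pkt[1] & 0x1F) << 8) | pkt[2];
    h->continuity      = pkt[3] & 0x0F;
    return 0;
}

int main(void)
{
    /* A hand-made example packet: sync byte, payload_unit_start set,
       PID 0x0100, continuity counter 7; the payload is left empty. */
    uint8_t pkt[TS_PACKET_SIZE] = { 0x47, 0x41, 0x00, 0x17 };
    ts_header h;
    if (ts_parse_header(pkt, &h) == 0)
        printf("PID=0x%04X start=%d cc=%u\n", h.pid, h.payload_start, h.continuity);
    return 0;
}
```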

MPEG-4

In July 1993, MPEG started its third project, MPEG-4. Its first goal is reflected in the original title of the project, “very low bitrate audio-visual coding”. Even though no specific mass-market applications were in sight, many sensed that the digitization of narrowband analogue channels, such as the telephone access network (the Internet was not yet a mass phenomenon), would provide interesting opportunities to carry video and audio at bitrates well below 1Mbps, roughly the lowest bitrate supported by MPEG-1 and MPEG-2. For that bitrate range it was clear that, unlike with the other MPEG standards, a decoder could very well be implemented on a programmable device, and that there might eventually be more software-based than hardware-based implementations of the standard. This is the reason the reference software, part 5 of MPEG-4 (ISO/IEC 14496-5), has the same normative status as the traditional text-based descriptions in the other parts of MPEG-4.

MPEG-4 became a comprehensive standard, as signaled by its current title, “coding of audio-visual objects”. The standard supports the coded representation of individual audio-visual objects whose composition in space and time is signaled to the receiver. The different objects making up a scene can even be of different origins, natural and synthetic.
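To make the object idea concrete, here is a toy C sketch, purely illustrative and not the normative MPEG-4 scene-description (BIFS) syntax: each independently coded object carries the position and the time interval at which the receiver should compose it into the scene.

```c
#include <stdio.h>

/* Conceptual model only: an audio-visual object plus the composition
   information (where and when) that the receiver is told about. */
typedef struct {
    const char *name;   /* e.g. a natural video object or a synthetic face */
    double start, end;  /* seconds during which the object is in the scene */
    int x, y;           /* position at which it is composed */
} av_object;

/* Render every object that is active at presentation time t. */
static void compose(const av_object *scene, int n, double t)
{
    printf("t=%.1fs:\n", t);
    for (int i = 0; i < n; i++)
        if (t >= scene[i].start && t < scene[i].end)
            printf("  render \"%s\" at (%d,%d)\n",
                   scene[i].name, scene[i].x, scene[i].y);
}

int main(void)
{
    av_object scene[] = {
        { "natural video background", 0.0, 60.0,   0,  0 },
        { "synthetic 3-D logo",       5.0, 15.0, 320, 40 },
    };
    compose(scene, 2, 10.0);  /* both objects are active at t = 10s */
    return 0;
}
```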

None of this means, however, that a particular implementation of the standard is necessarily “complex”. An application developer may choose, among the many profiles (dedicated subsets of the full MPEG-4 tool set), the one best suited to the application being developed. For all these reasons, it is expected that MPEG-4 will become the infrastructure on top of which the currently disjointed world of multimedia will flourish.
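One way to picture a profile, under the simplifying assumption that coding tools can be modeled as flags (the tool and profile names below are illustrative, not the normative MPEG-4 definitions), is as a bitmask: a terminal conforming to a profile can decode a stream only if the stream uses no tool outside that subset.

```c
#include <stdio.h>

/* Illustrative tool set; not the normative MPEG-4 tool list. */
enum {
    TOOL_RECT_VIDEO   = 1 << 0,  /* conventional rectangular video */
    TOOL_SHAPE_CODING = 1 << 1,  /* arbitrarily shaped objects */
    TOOL_SPRITES      = 1 << 2,  /* background sprites */
    TOOL_FACE_ANIM    = 1 << 3,  /* synthetic face animation */
};

/* A stream is decodable if it needs no tool outside the chosen profile. */
static int decodable(unsigned profile_tools, unsigned stream_tools)
{
    return (stream_tools & ~profile_tools) == 0;
}

int main(void)
{
    unsigned small_profile  = TOOL_RECT_VIDEO;                     /* minimal subset */
    unsigned larger_profile = TOOL_RECT_VIDEO | TOOL_SHAPE_CODING;
    unsigned stream         = TOOL_RECT_VIDEO | TOOL_SHAPE_CODING;

    printf("small profile:  %s\n", decodable(small_profile, stream)  ? "yes" : "no");
    printf("larger profile: %s\n", decodable(larger_profile, stream) ? "yes" : "no");
    return 0;
}
```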
