Streaming Media

Frank describes the hardware and software technology used on the server side of the streaming process.

Modern technology is not merely growing fast; its growth is accelerating. There are so many areas of specialized technology that it has become impossible for one person to fully comprehend the intricacies of the subsystems that combine to produce the latest buzzword. Just as we now use acronyms inside acronyms to succinctly describe the latest innovation, we encapsulate hundreds of person-years of work into a single abstract expression, then piece the abstractions together to create larger, more complex systems, which themselves soon become just another piece of a larger puzzle.

A logical question to ask is “does anyone need to understand systems at that level of detail?” Can't we just piece together all the black boxes knowing only their input/output (I/O) specifications and achieve a working solution? The answer is clearly no. We don't need to understand the most minute detail of every technology we use, but if we are going to make the best use of our technology, we must look deeper than just the visible specifications of each subsystem.

Multimedia content delivery is a good example of a task that requires this type of detailed understanding. Multimedia is simply a combination of two or more types of communications media, such as audio, video, graphics, text or any other element that can stimulate the human perceptual senses. This article looks at streaming media—the process of using computer-based technology for time-dependent delivery of multimedia data as opposed to time-independent data delivery—to show how a fairly deep understanding of many complex technologies must be achieved in order to make these subsystems function together to achieve the desired result. The steps outlined in the streaming process can be presented linearly and, as such, might appear to be independent black boxes. However, the ideal implementation of each step requires knowledge and a consideration of all of the steps in the process.
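
To make the time-dependent versus time-independent distinction concrete, the short Python sketch below contrasts a file-transfer-style sender, which pushes data as fast as the link allows, with a streaming-style sender that paces its output to the playback clock. The chunk size, rate and stand-in transport function are arbitrary values chosen purely for illustration; they are not drawn from any particular streaming product.

    import time

    # Hypothetical parameters, chosen only for illustration: the client plays
    # back fixed-size chunks at a known rate.
    CHUNK_SIZE = 1024          # bytes per chunk
    CHUNKS_PER_SECOND = 8      # playback rate the receiver expects

    def send_time_independent(data, send):
        # File-transfer style: push everything as fast as the link allows.
        for offset in range(0, len(data), CHUNK_SIZE):
            send(data[offset:offset + CHUNK_SIZE])

    def send_time_dependent(data, send):
        # Streaming style: pace transmission to the playback clock so the
        # receiver can start playing long before the transfer completes.
        interval = 1.0 / CHUNKS_PER_SECOND
        deadline = time.monotonic()
        for offset in range(0, len(data), CHUNK_SIZE):
            send(data[offset:offset + CHUNK_SIZE])
            deadline += interval
            delay = deadline - time.monotonic()
            if delay > 0:
                time.sleep(delay)

    if __name__ == "__main__":
        payload = bytes(CHUNK_SIZE * 16)                  # 16 chunks of dummy data
        send_time_dependent(payload, lambda chunk: None)  # stand-in transport

The pacing loop is the essential difference: because data is delivered at roughly the rate it is consumed, a receiver can begin playback after buffering only a small portion of the content.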

Overview of a Streaming Media System

The streaming media process is simply a way to communicate data. This communication can originate from many sources, and it can be targeted to many destinations. The target of these communications is called the receiver. Each receiver can also be the source of data for other “downstream” receivers. The four permutations of source/destination streaming media flow are: One to One, One to Many, Many to One and Many to Many. Each of these four combinations requires a set of common streaming media services, colored by the specific requirements of the source and destination mix. Protocols have been developed by the Internet Engineering Task Force (IETF) to enable all aspects of streaming media, and the industry is only beginning to fully implement these designs.
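
As a concrete illustration of the One to Many case, here is a minimal Python sketch that sends packets to an IP multicast group, the kind of delivery mechanism that IETF streaming protocols such as RTP are commonly layered over. The group address, port, packet size and rate are placeholders chosen for the example; a real server would carry encoded media frames with proper protocol headers and timing.

    import socket
    import time

    # Placeholder values for the example: every receiver that joins this
    # group and port sees the same packets (One to Many).
    MCAST_GROUP = "239.0.0.1"  # administratively scoped multicast address
    MCAST_PORT = 5004          # arbitrary; 5004 is conventionally used for RTP
    TTL = 1                    # keep packets on the local network

    def stream_to_group(chunks, packets_per_second=25):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, TTL)
        interval = 1.0 / packets_per_second
        for chunk in chunks:
            sock.sendto(chunk, (MCAST_GROUP, MCAST_PORT))
            time.sleep(interval)  # crude pacing; real servers use media timestamps
        sock.close()

    if __name__ == "__main__":
        # Send 100 dummy packets; a real server would send encoded media frames.
        stream_to_group(bytes(512) for _ in range(100))

Because every receiver that joins the group sees the same packet stream, the sender's cost does not grow with the size of the audience, which is what makes multicast attractive for the One to Many case.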

On the hardware side, a system should be designed with growth and scalability in mind. A relatively small up-front investment in higher-performing components can provide large savings later on. It is also very easy to waste money upgrading components that do not affect the areas of the system where performance bottlenecks actually exist. We will follow the logical path from data creation to transmission to the receiver and discuss the details as we go.

Data Creation

There is an entire industry devoted to the task of creating multimedia data. Historically, most high-end, “Hollywood-grade” multimedia has been created on expensive, highly proprietary workstations. The PC revolution produced lower-cost alternatives for many of the tasks that previously required those costly systems. These alternatives appeared first on Macintosh computers and later migrated to Windows-based systems, but they remained proprietary solutions that locked the user into a dependency on the vendor.

There has recently been a move to Linux. Besides the “free beer” aspect of Linux, end users see a real advantage in it because they can free themselves from dependency on any single vendor. Time-critical content creation is the hallmark of Hollywood production, so companies cannot afford to depend on even the most accommodating vendor. In order to protect their own businesses, they must be in a position to do the whole job themselves.

The Visual Effects Society is an industry group with about 24 member companies, each of which bases its business on performing the various tasks required to create multimedia data. The society has announced a desire to move entirely to Linux within the next few years, but much work needs to be done on the Linux OS in order to make that possible. Fortunately, the migration of streaming technology to the Linux OS is well underway.

If you want to provide streaming media services, the entire process depends on the data you intend to stream. Will you stream live video? DVD or CD-ROM content? Will the data include computer-generated graphics or some composition of several media types? How will the data be mixed and manipulated to produce the final content that you intend to stream to your clients? These questions must first be answered at a strategic level: what is your planned business? Then the answers must be researched based on the equipment, skill sets, time and financial resources available. Those parameters must be considered not only in terms of the data creation itself, but also in terms of the remaining services you must provide for the complete solution. If, for example, your business is to stream recorded content to paying clients, you had better understand the requirements of your target audience. Will they pay per view, or will they expect “free” content? What mix of client technology do you have to support with your streams? These and many other questions, which we touch on as we explore each step in the process, affect your decisions about the software and hardware you need in your system.
