Streaming Media

Frank describes the hardware and software technology used on the server side of the streaming process.

Modern technology is not just growing fast; its growth is accelerating. There are so many areas of specialized technology that it has become impossible for one person to fully comprehend the intricacies of the subsystems that combine to produce the latest buzzword. Just as we now use acronyms inside acronyms to describe the latest innovation succinctly, we encapsulate hundreds of person-years of work into a single abstract expression, then piece the abstractions together to create larger, more complex systems, which themselves soon become just another piece of a larger puzzle.

A logical question to ask is “does anyone need to understand systems at that level of detail?” Can't we just piece together all the black boxes knowing only their input/output (I/O) specifications and achieve a working solution? The answer is clearly no. We don't need to understand the most minute detail of every technology we use, but if we are going to make the best use of our technology, we must look deeper than just the visible specifications of each subsystem.

Multimedia content delivery is a good example of a task that requires this type of detailed understanding. Multimedia is simply a combination of two or more types of communications media, such as audio, video, graphics, text or any other element that can stimulate the human perceptual senses. This article looks at streaming media—the process of using computer-based technology for time-dependent delivery of multimedia data as opposed to time-independent data delivery—to show how a fairly deep understanding of many complex technologies must be achieved in order to make these subsystems function together to achieve the desired result. The steps outlined in the streaming process can be presented linearly and, as such, might appear to be independent black boxes. However, the ideal implementation of each step requires knowledge and a consideration of all of the steps in the process.
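To make the time-dependent aspect concrete, the sketch below shows the one thing a streaming sender must do that a plain file server does not: pace its transmissions so data leaves at roughly the media's playback rate rather than as fast as the network allows. It is a hypothetical illustration only; the destination address, port and 4Mbit/s bitrate are assumptions chosen for the example, and a real server would use RTP framing, media-clock timestamps and congestion feedback instead of a simple sleep loop.

    /* Minimal sketch: pace outgoing UDP packets to a target bitrate.
     * Hypothetical illustration, not code from any particular server. */
    #include <string.h>
    #include <time.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    #define PACKET_BYTES 1316               /* seven 188-byte MPEG-TS packets */
    #define TARGET_BPS   (4 * 1000 * 1000)  /* assumed 4Mbit/s media bitrate */

    int main(void)
    {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        struct sockaddr_in dest = { 0 };
        dest.sin_family = AF_INET;
        dest.sin_port = htons(5004);                      /* hypothetical receiver port */
        inet_pton(AF_INET, "192.0.2.10", &dest.sin_addr); /* hypothetical receiver */

        unsigned char packet[PACKET_BYTES];
        memset(packet, 0, sizeof packet);                 /* placeholder payload */

        /* Gap between sends so the average rate matches TARGET_BPS. */
        double seconds = (double)PACKET_BYTES * 8.0 / TARGET_BPS;
        struct timespec gap = { 0, (long)(seconds * 1e9) };

        for (;;) {
            sendto(sock, packet, sizeof packet, 0,
                   (struct sockaddr *)&dest, sizeof dest);
            nanosleep(&gap, NULL);                        /* wait for the next send slot */
        }
    }

The point of the sketch is the pacing: the server's job is to keep data arriving on the receiver's playback schedule, not simply to move bytes as quickly as possible.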

Overview of a Streaming Media System

The streaming media process is simply a way to communicate data. This communication can originate from many sources, and it can be targeted to many destinations. The target of these communications is called the receiver. Each receiver can also be the source of data for other “downstream” receivers. The four permutations of source/destination streaming media flow are: One to One, One to Many, Many to One and Many to Many. Each of these four combinations requires a set of common streaming media services, colored by the specific requirements of the source and destination mix. Protocols have been developed by the Internet Engineering Task Force (IETF) to enable all aspects of streaming media, and the industry is only beginning to fully implement these designs.
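Among those IETF designs, RTP (RFC 3550) is the protocol most commonly used to carry the media packets themselves, with RTSP handling session control and RTCP carrying reception-quality feedback. As a point of reference, the sketch below lays out the fixed RTP header a sender fills in for every outgoing packet. It is an illustration of the wire format only; real implementations pack these fields with explicit shifts and masks, because C bit-field layout is compiler- and endian-dependent.

    /* RTP fixed header, RFC 3550 section 5.1 (layout sketch only). */
    #include <stdint.h>

    struct rtp_header {
        uint8_t  version:2;        /* always 2 for current RTP */
        uint8_t  padding:1;        /* set if the payload ends with padding */
        uint8_t  extension:1;      /* set if a header extension follows */
        uint8_t  csrc_count:4;     /* number of contributing-source entries */
        uint8_t  marker:1;         /* e.g., marks the last packet of a video frame */
        uint8_t  payload_type:7;   /* identifies the codec/profile in use */
        uint16_t sequence_number;  /* increments per packet; lets receivers detect loss */
        uint32_t timestamp;        /* sampling instant in media clock units */
        uint32_t ssrc;             /* synchronization source identifier */
        /* followed by csrc_count 32-bit CSRC entries, then the payload */
    };

The sequence number and timestamp are what allow a receiver, whether it is a single client or one of many in a multicast group, to reorder packets and reconstruct the original timing.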

On the hardware side, a system should be designed with growth and scalability in mind. A relatively small up-front investment in higher-performing components can provide large savings later on. It is also very easy to waste money upgrading components that do not affect the areas of the system where performance bottlenecks actually exist. We will follow the logical path from data creation to transmission to the receiver and discuss the details as we go.

Data Creation

There is an entire industry devoted to the task of creating multimedia data. Historically, most high-end, “Hollywood-grade” multimedia was created on expensive, highly proprietary workstations. The PC revolution produced lower-cost alternatives for many of the tasks that previously required expensive solutions. These alternatives appeared first on Macintosh computers and later migrated to Windows-based systems, but they remained proprietary and locked the user into a dependency on the vendor.

There has recently been a move to Linux. Beyond the “free beer” aspect, end users see a real advantage in Linux because it frees them from dependency on any single vendor. Time-critical content creation is the hallmark of Hollywood production, so companies cannot afford to depend on even the most accommodating vendor. To protect their own businesses, they must be in a position to do the whole job themselves.

The Visual Effects Society is an industry group with about 24 member companies, each of which bases its business on performing the various tasks required to create multimedia data. The society has announced a desire to move entirely to Linux within the next few years, but much work needs to be done on the Linux OS to make that possible. Fortunately, the migration of streaming technology to the Linux OS is well underway.

If you want to provide streaming media services, the entire process depends on the data that you intend to stream. Will you stream live video? DVD or CD-ROM content? Will the data include computer-generated graphics or some composition of several media types? How will the data be mixed and manipulated to produce the final stream you deliver to your clients? These questions must first be answered at a strategic level: what is your planned business? Then the answers must be researched based on the equipment, skill sets, time and financial resources available. Those parameters must be considered not only in terms of the data creation itself, but also in terms of the remaining services you must provide for the entire solution. If, for example, your business is to stream recorded content to paying clients, you had better understand the requirements of your target audience. Will they pay per view, or will they expect “free” content? What is the mix of client technology that you have to support with your streams? These and many other questions, which we touch on as we explore each step in the process, affect your choice of the software and hardware you need in your system.
