Amateur Video Production Using Free Software and Linux
In 1999 I purchased my first DVD player. My wife and I had a small collection of VHS tapes containing videos that we wanted to view using our new purchase. Furthermore, optical media is very convenient and stable, and the idea of storing our video collection on CD-R discs was very attractive to us. What followed was an in-depth investigation that has blossomed into an interesting hobby. In this article, I cover how to digitize analog video sources for storage and manipulation on a computer, tools for editing video on a computer and some options for publishing digital videos. One publishing option I present is storage in the video CD (VCD) format, which is compatible with many DVD players. All of these steps are performed using free software.
I am a software developer, not a video producer, so please bear with me as you read this article.
The first obstacle I encountered in my work to convert my videos was how to digitize the analog VHS tapes. Because I wanted to convert standard analog video tapes, IEEE 1394 (Apple calls this interface FireWire; Sony calls it i.LINK) would not suffice; though extremely powerful, it defines a purely digital interface. Instead, I decided to purchase a video capture card. Many vendors produce these cards, which take standard analog video streams and digitize them for storage or display on a computer. I bought a Hauppauge WinTV PCI video capture card that works nicely with Linux for around $80 US. Incidentally, the Linux driver framework for video capture cards is named Video4Linux.
There are a few important considerations to make when purchasing a video capture card, though they are becoming less relevant as the speed of computers continues to increase. Because capturing video from most analog sources must occur in real time, writing raw video to disk requires a very fast hard drive. In my experience, even a 10,000 RPM SCSI drive has difficulty storing raw 24-bit video with a resolution of 640 x 480 and a frame rate of 29.97 frames/second. Think about it: around 30 frames per second, 640 x 480 = 307,200 pixels per frame, and each pixel is 24 bits. In order to store uncompressed video of this quality, a hard drive needs to write 2.21 x 10^8 bits, or around 26MB every second!
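The arithmetic above is easy to verify with a few lines of Python (the 30 frames/second figure is the nominal NTSC rate, rounded up from 29.97):

```python
# Raw video bandwidth: width x height pixels, 24 bits per pixel, ~30 fps.
width, height = 640, 480
bits_per_pixel = 24
frames_per_second = 30

bits_per_second = width * height * bits_per_pixel * frames_per_second
megabytes_per_second = bits_per_second / 8 / (1024 * 1024)

print(bits_per_second)                 # 221184000, i.e. about 2.21 x 10^8 bits
print(round(megabytes_per_second, 1))  # about 26.4 MB every second
```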
Don't run out and buy an expensive high-speed disk array quite yet—an alternative exists. Compressing the raw video before writing it to disk shifts some work away from the hard disk. Compression can be done either by a dedicated processor, shifting work to video capture card compression hardware, or in software, shifting work to the system's CPU. Since my system has two 1,000MHz CPUs, my cheap Hauppauge card, which lacks compression hardware, performs just fine. If your computer's CPU is a little slower, it may make more sense to invest in a video capture card with hardware compression capabilities and save a relatively expensive CPU upgrade for later.
Capturing raw or losslessly compressed video is ideal for editing purposes, but capturing using a carefully chosen lossy technique such as MJPEG, which stores each frame using JPEG still image compression, is a realistic compromise. JPEG compression can be performed relatively quickly in software. In addition, many hardware video compressors output MJPEG.
Even when compressing a video stream before writing it, hard disk speed is important in digitizing video. It follows that the filesystem used is a large factor in performance. I have experimented with the ext2, ReiserFS and XFS filesystems. My experience is that capturing video to an XFS filesystem generally outperforms capturing to ext2- or ReiserFS-formatted disks. XFS has the additional benefit over ext2 of being a journaling filesystem.
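A quick way to gauge whether a particular disk and filesystem combination can sustain the required write rate is a simple sequential-write test with dd. The file name here is arbitrary; point it at the filesystem you plan to capture to, and compare the reported rate against the ~26MB/s figure above:

```shell
# Write 64MB sequentially and report throughput; conv=fdatasync makes dd
# flush the data to disk before printing its rate, so the page cache
# cannot inflate the figure.
dd if=/dev/zero of=ddtest.tmp bs=1M count=64 conv=fdatasync
rm ddtest.tmp
```

This measures only sequential writes, which is the access pattern video capture produces, so it is a reasonable first approximation.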
Andrew Morton's low-latency kernel patch also seems to help the digitization process. I find that with Andrew's patch I am able to perform minor tasks on my computer while capturing video without losing too many frames.
As I am from the United States, I am interested in using the National Television System Committee analog video format (NTSC). Many Europeans may be more interested in PAL, which has similar properties. If you live elsewhere, a little research will reveal the analog video format used in your region. My VHS tapes are encoded using NTSC. NTSC has a range of acceptable resolutions and frame rates; when capturing from a VHS source I generally capture 640 x 480 frames at a rate of 23.976 frames/second. Though VCDs, being digital, don't have a video norm such as NTSC, DVD players generally use the frame rate that a VCD contains to decide what type of analog signal they will send to the television to which they are connected. For example, if I encode VCDs at 25 frames/second, my DVD player outputs a PAL signal that looks distorted on my NTSC television. If I encode the same video stream at 23.976 frames/second, a valid NTSC frame rate, my DVD player outputs an NTSC signal to my television.
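The odd-looking 23.976 and 29.97 figures both come from NTSC slowing nominal rates by a factor of 1000/1001, which a quick calculation confirms:

```python
from fractions import Fraction

# NTSC rates are nominal rates (24 fps film, 30 fps video)
# slowed by a factor of 1000/1001.
ntsc_video = Fraction(30000, 1001)  # broadcast video, "29.97"
ntsc_film = Fraction(24000, 1001)   # film-derived rate, "23.976"

print(round(float(ntsc_video), 2))  # 29.97
print(round(float(ntsc_film), 3))   # 23.976
```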
Digital media streams found on a computer are generally stored as a wrapping format containing one or more audio and video tracks. Examples of wrapping formats are AVI and QuickTime. QuickTime has the advantage of being well defined by Apple, supported on Linux and able to store video streams much larger than 4GB. Within the wrapping format, different compression techniques such as MJPEG, OpenDivX, Ogg Vorbis and MPEG audio may be used. These compression/decompression techniques are often called codecs. Wrapping formats such as QuickTime also can contain storage-intensive raw digital audio and video.
I have found that streamer, part of the xawtv package, performs the digitization task nicely. Using streamer, my system can capture 640 x 480 video at a frame rate of 23.976 frames/second from my video capture card and compress it in real time to an MJPEG-encoded QuickTime file before writing it to disk.
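An invocation along the following lines captures from the card into an MJPEG QuickTime file. The option values shown are illustrative, not a recipe from my setup; check streamer's help output for the formats and devices your version supports:

```shell
# Capture 30 seconds of 640x480 MJPEG video plus stereo audio into a
# QuickTime file (streamer infers the container from the .mov extension).
# Device node and option values are illustrative; see "streamer -h".
streamer -c /dev/video0 -s 640x480 -r 23.976 -t 0:30 \
         -f mjpeg -F stereo -o capture.mov
```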