Making MPEG Movies with Axis Network Cameras

by Kyle Amon

One of the earliest companies to realize the commercial value of Linux for embedded applications was a Swedish company by the name of Axis Communications. Founded in 1984, they specialize in network-attached peripherals and produce, among other interesting products, a line of network-addressable cameras. There are several models offering various features with differing performance characteristics, but they all produce compressed JPEG images. These images can be viewed live, directly from the cameras themselves, or via intermediary web servers in order to offload high hit rates to more powerful hardware. Although these cameras don't contain an inherent ability to produce MPEG movies, their images can be archived on another system and turned into MPEG movies using freely available software MPEG encoders.

When Axis introduced their network cameras in November 1996 with the Axis NetEye 200, they weren't running embedded Linux yet. Rather, they employed a proprietary Axis operating system called OSYS running on their ETRAX 4 chip design. By the summer of 1998, however, Axis was already on its way to completing their eLinux/ETRAX port for use as the basis of all future products. As a result, eLinux/ETRAX is now the engine behind all their current products.

In June 2000, they released the Axis 2100 network camera containing the new ETRAX 100 chip, Axis' fifth-generation, optimized system-on-a-chip design that includes a 100MIPS 32-bit RISC CPU, 10/100Mbps Ethernet controller, advanced DMA functionality and a wide range of I/O interfaces. These cameras use a ¼ inch Sony progressive scan RGB CCD for image capture and incorporate the new Axis ARTPEC-1 chip providing hardware JPEG compression, delivering up to ten frames per second.

The Axis 2100 camera, as well as the more full-featured Axis 2120 camera released October 2000, both run eLinux/ETRAX, Axis' port of the 2.0.38 Linux kernel with uClinux patches for MMU-less processors. This kernel includes Axis' Journaling Flash File System (JFFS), a log-structured filesystem designed to be used on Flash-ROM chips that remains consistent even through power-downs and crashes, thus obviating the need for fsck. The eLinux/ETRAX kernel also includes a Bluetooth stack, and although the 2100 series cameras don't support Bluetooth themselves, it is worth noting since Axis would be more than happy to have you use an eLinux/ETRAX solution for your next project. They offer a prototyping developer board, are willing to customize it to suit, and their eLinux/ETRAX distribution, along with accompanying documentation, is available on their developer web site.

Setting up the Environment

I recently installed a couple of Axis 2100 network cameras for a client wanting to monitor his business from home. I put in a channel-bonded, 2BRI ISDN line with compression, providing a network connection of about 300Kbps between his home and his business. Once the cameras and the network connection were in place, he could easily monitor activity at his business via a web browser from his workstation at home. This was great for viewing activity in real time, but he also needed the ability to view the day's activity later. To accomplish this, I also configured the cameras to archive images to the hard drive of a nearby Linux server and wrote a couple of Perl scripts to create MPEG movies from these images once a day. One script makes MPEG-1 movies, the other makes MPEG-2 movies, and both are discussed here.

Since these cameras don't have the ability to make MPEG movies themselves, most of the actual moviemaking activity takes place on a nearby Linux server. This is where the JPEG snapshots from the Axis network cameras are archived and later turned into MPEG movies. Although it should not need mentioning, this server requires plenty of free disk space because working with large numbers of image files consumes disk space at an astonishing rate. Furthermore, a number of factors affect the amount of disk space required. The dimensions of the images archived and the amount of compression used on them greatly affect the size of each image. The frequency with which new images are archived and the length of the movie created from them also greatly affect the amount of space required for image archival. All of these factors, along with the options used during the MPEG encoding process itself, affect the size of the resulting MPEG file. Multiply all of this by the number of cameras the images are archived from, and it becomes evident that a little careful planning up front goes a long way toward minimizing disk-space headaches later on.
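A little back-of-the-envelope arithmetic makes the planning concrete. The figures below (12KB per 320 x 240 JPEG, four images per second, a ten-hour business day, two cameras) are illustrative assumptions only, not measurements from the installation described here:

```shell
#!/bin/sh
# Back-of-the-envelope archive sizing; all figures are illustrative assumptions.
KB_PER_IMAGE=12      # typical size of a compressed 320x240 JPEG (assumed)
IMAGES_PER_SEC=4     # archive rate per camera
HOURS_PER_DAY=10     # length of the business day
CAMERAS=2            # number of cameras archiving to this server

PER_CAMERA_KB=$((KB_PER_IMAGE * IMAGES_PER_SEC * 3600 * HOURS_PER_DAY))
TOTAL_KB=$((PER_CAMERA_KB * CAMERAS))
echo "Per camera: $((PER_CAMERA_KB / 1024)) MB/day"
echo "Total:      $((TOTAL_KB / 1024)) MB/day"
```

At those assumed rates, a single camera consumes over 1.6GB per day before encoding even begins, so even a modest installation benefits from aggressive rotation of old images.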

Another factor to consider is that software-MPEG encoding is an extremely processor-intensive activity. The Linux server used for the encoding process therefore needs to have a sufficient amount of horsepower to handle this job. The actual amount of horsepower needed is dependent on the same factors mentioned above regarding disk-space requirements, so proper planning here, as always, is the most important ingredient of sustained success.

Once the Linux server is ready, the Axis network cameras need to be configured to archive their JPEG images, numbered sequentially, to a directory on this server. The images can be pushed from the cameras utilizing their own FTP facilities, or they can be pulled off by a program running on the Linux server. While I am still experimenting with the best method to achieve this, I am currently using a program developed by Axis for their 200 series cameras that is no longer officially supported by them. The program is called eye_get, and while it doesn't seem to be available directly from Axis anymore, the source code for it can still be found online.

The eye_get program compiles easily on Linux and comes with a nice man page. It is fairly flexible with many options, including the ability to pull images from multiple sources when using a configuration file. I'm currently using /etc/eye_get.conf as a configuration file with the following contents:

-s0 -r/cgi-bin/image320x240.jpg -f
-s0 -r/cgi-bin/image320x240.jpg -f -ma cam2

Each line of the configuration file configures image archiving from a separate Axis network camera. The -s option tells eye_get how often to retrieve images, in seconds; an argument of 0 tells it to retrieve images continuously, as fast as resources will allow. The -r option specifies the name of the image file to retrieve, and the -t option specifies where to put this file and what to call it. The -t option modifies the name of the stored image file according to the arguments given. Using arguments d and n tells eye_get to insert the current date and the sequential number of the retrieved image into its name, such that the fourth image retrieved on February 3, 2001 from cam1, for example, would be named cam1_010203_4.jpg. The -l option specifies a file to use for logging, and the -mh and -ma options specify an SMTP server and e-mail address, respectively, for use in error notification.
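The naming scheme is easy to sketch. The following shell function reconstructs the filename eye_get would store for a given camera prefix, date and image counter; it is my own illustration of the scheme, not code taken from eye_get itself:

```shell
#!/bin/sh
# Reconstruct the archive filename eye_get would produce for a given
# camera prefix, YYMMDD date and image counter (a sketch of the naming
# scheme only, not eye_get's actual code).
make_name() {
    prefix=$1; yymmdd=$2; seq=$3
    printf '%s_%s_%s.jpg\n' "$prefix" "$yymmdd" "$seq"
}

make_name cam1 010203 4    # fourth image on February 3, 2001
```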

At the start of each business day, eye_get is run via cron with the following command:

/usr/local/bin/eye_get -i /etc/eye_get.conf >/dev/null &

At the end of each business day, it is stopped via cron with the following command:

kill `ps ax | grep '[e]ye_get' | awk '{print $1}'`

This provides the set of date-stamped, sequentially numbered JPEG images from each Axis network camera that is used to create MPEG movies of daily activity.
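Put together, the two cron jobs might look like the following crontab entries. The 8:00-to-18:00, Monday-through-Friday schedule is an assumed example of a business day, not the actual schedule used:

```
# Start archiving at 8:00 and stop at 18:00, Monday through Friday
# (illustrative schedule only).
0 8  * * 1-5  /usr/local/bin/eye_get -i /etc/eye_get.conf >/dev/null &
0 18 * * 1-5  kill `ps ax | grep '[e]ye_get' | awk '{print $1}'`
```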
Making MPEG-1 Movies

The MPEG-1 (ISO/IEC 11172) standard specifies a compressed audio and video bit stream with progressive frame coding, designed for early CD-ROMs and other applications producing data rates of up to 1.5Mbps. While the specification provides for sampling and bitrate parameters beyond those commonly used, what is generally referred to as MPEG-1 is really only a functional subset of MPEG-1 known as Constrained Parameters Bitstreams (CPB). The Source Input Format (SIF) of CPB, derived from the CCIR-601 digital television standard, uses 352 x 240 images for NTSC and 352 x 288 for PAL/SECAM. Video sequences are composed of a series of groups of pictures (GOPs), and each GOP is composed of a sequence of frames. Each of these frames may be an I (intra), P (predicted) or B (bidirectional) frame. The video stream is further divided into slices, macroblocks, blocks and motion vectors, but detail at that depth is beyond the scope and requirements of this article.

To encode the archived JPEG images from Axis network cameras into an MPEG-1 video file, I wrote a Perl script (see Listing 1). It uses the UCB MPEG-1 encoder, mpeg_encode, to perform the actual encoding and works by generating the parameter file required by mpeg_encode at runtime. This encoder is therefore obviously required to run the script successfully. It is available in deb package format as ucbmpeg_1r2-6.deb in the Debian non-free section, as well as in RPM package format as mpeg_encode-1.5b-2.i386.rpm from rpmfind. The original source tarball is available from the Berkeley Multimedia Research Center web site, along with a PostScript users' guide that I highly recommend if you want to customize the encoding parameters.

Since the MPEG-1 standard doesn't allow for framerates below 23.976 frames/second, the script fudges the framerate in the temporal direction by specifying each input file multiple times, depending on the rate at which the source JPEG images from the Axis camera were archived. I also tweaked the parameters of the generated parameter file for maximum speed of execution and minimum size of the resulting MPEG-1 file. Although these parameters are suitable for my purposes, they are undoubtedly not the best for all circumstances, so you may want to experiment with modifying them for your own needs.
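The fudge can be sketched in a few lines of shell: at an assumed 24 frames/second target with four archived images per second, each source image simply appears six times in the encoder's input list. This is an illustration of the idea, not an excerpt from the script:

```shell
#!/bin/sh
# Repeat each input filename enough times to stretch the archived image
# rate up to the encoder's framerate (24 fps target assumed for illustration;
# the real script derives this from its --ips argument).
FPS=24
IPS=4                          # images per second archived by the camera
REPEAT=$((FPS / IPS))          # times each image must appear in the list

expand_inputs() {
    for img in "$@"; do
        i=0
        while [ "$i" -lt "$REPEAT" ]; do
            echo "$img"
            i=$((i + 1))
        done
    done
}

expand_inputs cam1_010203_1.jpg cam1_010203_2.jpg | wc -l   # 12 lines
```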

To display the script's usage and syntax, execute it with --help

or with no arguments at all, and it will inform you that running the following command: --src /some/dir --dst /tmp/movie.mpg --cam cam1 --ips 4 --date
for example, would make an MPEG-1 movie in file /tmp/mmddyy_movie.mpg, out of JPEG files beginning with the string cam1, in directory /some/dir, where the source JPEG files were generated at four images per second. This and whatever additional glue may be required for a particular application is now all that is needed to make MPEG-1 movies from Axis network cameras.
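For reference, a generated mpeg_encode parameter file has roughly the shape shown below. The specific values are illustrative guesses in the format mpeg_encode expects, not the exact parameters the script emits:

```
PATTERN          IBBPBBPBBPBB
OUTPUT           /tmp/movie.mpg
BASE_FILE_FORMAT JPEG
INPUT_CONVERT    *
GOP_SIZE         12
SLICES_PER_FRAME 1
INPUT_DIR        /some/dir
INPUT
cam1_010203_*.jpg [1-240]
END_INPUT
PIXEL            HALF
RANGE            10
PSEARCH_ALG      LOGARITHMIC
BSEARCH_ALG      CROSS2
IQSCALE          8
PQSCALE          10
BQSCALE          25
REFERENCE_FRAME  ORIGINAL
```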
Making MPEG-2 Movies

The MPEG-2 (ISO/IEC 13818) standard specifies a compressed audio and video bit stream with either interlaced or progressive frame coding designed for broadcast TV using the CCIR 601 standard as a reference, providing for data rates from 2 to 10Mbps. It was later extended to include HDTV and other applications producing data rates of up to 50Mbps. With MPEG-2, the most common combinations of sampling and bitrate parameters have been grouped into levels. The two most commonly used levels are main and low. Main level is used for CCIR 601 sampling rates, and low level is used for SIF sampling rates. Coding algorithms have been grouped into profiles. Using MPEG-2's main profile and main level is analogous to using MPEG-1's CPB with CCIR 601 sampling limits.

To encode the archived JPEG images from Axis network cameras into an MPEG-2 video file, I wrote a second Perl script (see Listing 2). It uses the MSSG MPEG-2 encoder, mpeg2encode, to perform the actual encoding and works by generating the parameter file required by mpeg2encode at runtime. This encoder is part of the MSSG MPEG-2 Video Codec and is obviously required to run the script successfully. It is available in RPM package format as mpeg2vidcodec-1.2-1.i386.rpm from rpmfind and can be converted into deb package format with Debian's alien utility. The original source tarball is available from the MPEG Software Simulation Group web site.

Unlike the MPEG-1 script, the MPEG-2 script depends on two external utilities in addition to the encoder. The identify utility is used to get the image dimensions of the source JPEG files, and the convert utility is used to convert them into YUV format prior to encoding. Both of these utilities are part of the ImageMagick collection of image manipulation utilities and libraries. ImageMagick is available in deb package format as part of the Debian distribution and in RPM package format as part of the Red Hat distribution. The original source tarball is available from the ImageMagick web site.
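The two ImageMagick invocations can be sketched as below. To keep the example runnable without ImageMagick installed, the functions merely assemble the command lines as strings; the exact flags are my assumptions about typical usage, not lines lifted from the script:

```shell
#!/bin/sh
# Build (but do not run) the two ImageMagick commands a script like this
# would rely on. The flags shown are assumed typical usage, not verbatim
# excerpts from the script.
dimensions_cmd() {
    # identify reports image geometry; %w and %h are width and height
    echo "identify -format \"%wx%h\" $1"
}
to_yuv_cmd() {
    # convert infers the raw YUV output format from the .yuv suffix
    echo "convert $1 ${1%.jpg}.yuv"
}

dimensions_cmd cam1_010203_1.jpg
to_yuv_cmd cam1_010203_1.jpg
```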

Since the MPEG-2 standard shares the 23.976 frames/second framerate limit of the MPEG-1 standard, the MPEG-2 script fudges the framerate in the temporal direction in the same manner, although it accomplishes this quite differently due to the differing requirements of mpeg_encode and mpeg2encode. In most respects, though, the two scripts work similarly. The parameters of mpeg2encode's generated parameter file have also been tweaked for speed of execution and minimum size of the resulting MPEG-2 file, but you may want to experiment with modifying these parameters for your own needs as well.

To display the script's usage and syntax, execute it with --help

or with no arguments at all, and it will inform you that running the following command:

--src /some/dir --dst /tmp/movie.mp2v --cam cam1 --ips 4 --date

for example, would make an MPEG-2 movie in the file /tmp/mmddyy_movie.mp2v out of the JPEG files beginning with the string cam1 in the directory /some/dir, where the source JPEG files were generated at four images per second.

If you investigate the script closely, you may notice that although the mpeg2encode utility can't use JPEG files for input directly, it is capable of converting them to YUV format itself, yet the script uses the ImageMagick convert utility to perform this conversion separately. If you are tempted to modify the script to let mpeg2encode do the conversion, be forewarned: the reason I used an external utility is that mpeg2encode likes to consume all available memory when asked to perform this conversion itself.

Kyle Amon founded GNUTEC, Inc. and is a principal of IPDefence. He has spoken about Linux security for SANS and is coauthor of the Red Hat Linux Installation and Configuration Handbook by QUE. A member of USENIX, SAGE, ACM and ICSA, he has championed Linux and free software since Linux kernel discussion first hit Usenet.
