I've been deeply involved with MPEG for several months now, so your article on MPEG-1 playback programs caught my eye (LJ May 2001). I found several factual errors about MPEG in the article.
First, MPEG-2 video does not necessarily have better video quality than MPEG-1: the differences between the video portions of the MPEG-1 and MPEG-2 standards are fairly minor. An MPEG-1 file with a 720 x 480 frame size, compressed to 6 Mbit/s, is legal and will be very similar in appearance to a 720 x 480, 6 Mbit/s MPEG-2 movie.
Second, the author states that “not all MPEG-1 files are entirely compliant”. The MPEG standards define what a compliant decoder is expected to be able to handle, but not how a compliant encoder works. In other words, a compliant decoder is not one that is “tolerant”, but one that adheres closely to the letter of the ISO 13818 standard, so as not to be surprised by the output of a novel (but legal!) encoder. Unfortunately for free software authors, these documents are expensive: the basic set of MPEG-2 PDFs (ISO 13818-1 through -3) costs $424, and the complete set is $1,390 (from http://webstore.ansi.org/). It's little wonder that commercial MPEG products far outstrip free ones in nearly all cases.
Third, the author states that an “MPEG-1 audio stream...is an MP3 file”. This is not always true, and is not even likely. MPEG-1 defines three different audio encodings (or “layers”). Layer I audio is the most basic, but it isn't used very often. Layer II is the most common: video CDs and MPEG files you download from the Internet almost always use Layer II audio. Layer III (aka MP3) is optional, so most MPEG decoders don't include support for it. Since few decoders support Layer III audio, most encoder creators also don't bother including support for it.
Just a note to let you know that the certification sword cuts both directions (Editorial focus, LJ May 2001). As CIO of a multimillion-dollar corporation, it is my job not only to run the IS department but also to hire and fire employees. One of the first questions I ask is whether the prospective employee has any certifications. If so, I politely tell them “I'll let you know if we can use you” and promptly throw their application into the garbage can.
Before you ask: no, I am not certified. I have, however, taught pre-MCSE classes at Unisoft Institute of Technology in Houston and was horrified to learn that I had to teach both the way the test worked and the way the real world worked (pre-MCSE in this case really meant A+ and Networking certification). This effectively meant that I held two classes in one, which was difficult, to say the least. Additionally, long ago, before computers were my profession, I was an ASE-certified mechanic. Having passed 10 ASE certifications, I can tell you that they are just as much a joke as computer certifications. I quickly realized that even holding all the certifications I did, and even after graduating from a top automotive technical school with a 4.0 GPA and Alpha Beta Kappa National Honor Society membership, I was not a very good mechanic.
To me, being certified means that the person does not have enough knowledge or experience to get the job on their own merits and hopes that this piece of paper will help them, and it does not. In my experience the only time certifications help you is when you are applying to a business where the person responsible for creating hiring policies is not a real computer technician.
I am forced to deal with MIS and IS degrees from recent college graduates as well as a plethora of certifications on a regular basis. Unfortunately, I have found that the people who have neither a degree nor a certification but who have been working with computers for ten years are much better equipped to handle the job. At least if they are inexperienced I can teach them the way things really work instead of attempting to retrain them after they have their degree or certification.
I very much enjoyed the April 2001 Linux Journal article “Linux on Carrier Grade Web Servers”. You did a nice job of describing the software choice, hardware environment and test results. I look forward to future articles discussing the other LVS implementations (direct routing and IP tunneling) and comparing their stability and performance with that of the NAT implementation.
I enjoyed your articles on Linux Certification (LJ May 2001). I thought the “real-life” experience was very telling, although perhaps toned down a bit to protect the vendors.
Here are my thoughts on what I read:
We can earn an extra $10K per year by becoming certified? Really? Who can? New grads? Having worked on, supported and/or maintained SVR4, AIX, HP-UX and Solaris for twenty years, I can't imagine getting another $10K just because I had some Linux certificate.
I looked at some of the questions from Red Hat's and Sair's study guides and tests. What a crock! “What's the fdisk type code for a Linux swap partition?” Who cares?! Look it up by typing l to list the types. Forgot that command? Type ? to list them all. Better yet were the impossible-to-understand questions and answers on Sair's test. Their “correct” answer for wc -l * is that it returns the total number of lines in the files. Gee, my experience is that it shows the line count for each file, followed by the total, but that answer isn't available. Yes, they want the entire Linux community to help improve the tests, but if they can't get the simple things right, I'd hate to see how they do with the hard topics.
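The wc -l behavior is easy to check for yourself. A minimal sketch (the file names here are arbitrary):

```shell
# Create two small test files with known line counts.
printf 'a\nb\n' > one.txt       # 2 lines
printf 'c\nd\ne\n' > two.txt    # 3 lines

# With more than one file argument, wc prints a count
# per file and then a "total" line -- not just the total.
wc -l one.txt two.txt
#   2 one.txt
#   3 two.txt
#   5 total
```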
Finally, the exam companies could learn a good lesson from the FCC and ARRL. The amateur radio exams are also multiple choice, but they are composed of a certain number of sub-elements. Each sub-element has a number of required topics. Each topic has a number of published questions and answers, with references to the rules and regulations. The effect is that the actual exam might have only 25 questions, but those questions are pulled from a pool of several hundred, and each critical element is covered.