On standards and standards bodies
What does it mean to be open?
My copy of the Oxford dictionary defines open as "unconcealed circumstances or condition". Way back in the day when the GNU operating system was getting going, the project coined the mantra: "Free software is a matter of liberty, not price. To understand the concept, you should think of free as in free speech, not as in free beer."
Last month, I talked about transparency and how important it is in software and systems. Just as important are standards and, more important still, following those standards. Today in Computerworld, a different issue has been raised: the value of standards.
Way back, last year, there was a ratification of a standard by the International Organization for Standardization (ISO), the same group of people that brought you the stupid label guy (ISO 9000), IS-IS routing (does anyone really use it?) and, of course, the OSI stack (Please Do Not Throw Sausage Pizza Away). The standard that was ratified was the Open XML standard. Now, I am not enough of a geek to be able to accurately reflect the arguments for the Microsoft (ratified) version and the non-Microsoft (not ratified) version that came to pass. I won't lob too many stones at Redmond (that bastion of standardization), but I will highlight one point. Some countries are less than happy with ISO and, in fact, are so dissatisfied that they are questioning not only the Open XML standard, but the value of any ISO standard at a national level.
My father used to work for the telephone company, back before Judge Greene broke up AT&T. He has since moved on, dabbled in the computer industry and is currently working on smart buildings. One of his constant complaints is the lack of standardization in the computer industry. And this from a man who helped a couple of companies actually make money back when computers were expensive items. In many ways, I have shared his frustrations. He is management, but technical enough to grasp most of the issues. I am a technician and have had to wrestle with the standardized non-standards in the industry. Even something as simple as a PCI slot is enough to drive you nuts (and if you have been around for a while, we all remember the headaches of EISA, and "where is the disk…"). Standards are important, but for a standard to be accepted, it has to work, and it has to work well. We can all look at the standards wars between Betamax and VHS (or LightScribe and Labelflash) to see how important, or how mind-numbing, the different standards can be and how much they can affect the technology that is adopted. And as we have seen, better does not always win.
But when countries start questioning the entire standardization process, or worse, as is the case with the fight over Open XML, start accusing the standards body of being unduly influenced by corporate concerns, we have a real issue that needs to be examined more deeply. Standards bodies cannot afford even to be thought of as driven by a corporate perspective, despite the fact that many standards start out that way. Standards bodies, to be of any value, must be independent and must be willing to consider, up to a reasonable point, objections to the standard. If not, then the whole idea of a standard is moot.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as one that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
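As a quick sketch of that find-plus-grep combination: the snippet below builds a small throwaway directory tree (using /tmp/demo_home rather than the real /home, and a made-up "ERROR" entry, purely for illustration), then uses find to locate every .log file and grep -l to print only the files that contain the entry.

```shell
# Set up a tiny example tree (hypothetical paths, for demonstration only).
mkdir -p /tmp/demo_home/user1 /tmp/demo_home/user2
echo "ERROR: disk full" > /tmp/demo_home/user1/app.log
echo "all good"         > /tmp/demo_home/user2/app.log

# find locates every .log file; grep -l lists only those containing "ERROR".
find /tmp/demo_home -name '*.log' -exec grep -l 'ERROR' {} +
# prints /tmp/demo_home/user1/app.log
```

The `-exec … {} +` form hands find's results to grep in batches, which is both simpler and faster than looping over filenames by hand.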
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
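For reference, the classic approach the webinar measures against: a crontab entry consists of five time fields (minute, hour, day of month, month, day of week) followed by the command to run. This hypothetical entry (the script path is made up for illustration) runs a log-rotation job every night at 2:30 a.m.:

```
30 2 * * * /usr/local/bin/rotate-logs.sh
```

Entries like this are edited with `crontab -e`; the question the webinar raises is what to do when dependencies, retries and cross-host coordination outgrow this one-line format.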
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
- Murat Yener and Onur Dundar's Expert Android Studio (Wrox)
- SUSE LLC's SUSE Manager
- My +1 Sword of Productivity
- Non-Linux FOSS: Caffeine!
- Managing Linux Using Puppet
- Tech Tip: Really Simple HTTP Server with Python
- Parsing an RSS News Feed with a Bash Script
- Google's SwiftShader Released
- SuperTuxKart 0.9.2 Released
- Doing for User Space What We Did for Kernel Space
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide