Open Source: It's Not Just for Linux Anymore
I was involved in an email discussion the other day with a fellow Amateur Radio operator about a program called UI-View, a Windows-based application for the Automatic Position Reporting System (APRS). In the course of our discussion, I asked about the state of the source code, pointing out that some of the interfaces should be reviewed to take advantage of newer mapping tools. I was informed that the source code had been destroyed on the author's death, at his request. This made me pause.
I paused not because the author was dead; I knew that. In fact, Roger Barker, G4IDE, had been dead for several years before I got around to using his software. I paused because I was absolutely stunned that any Amateur, a member of a community that prides itself on innovation, experimentation and community mentoring, would willingly destroy the source code for what is arguably a professional-grade piece of software: well thought out and very functional. It made me pause and review the software I use as an Amateur.
A large amount of the software in use by the Amateur community is open source. Most (dare I say all?) of it is Linux-based, which is no surprise. In my toolbox, I have software for logging contacts, running APRS, programming my radios (and each program is slightly different for each radio) and doing packet work. Yet every single one of those programs is closed source (or, at the very least, its authors are not openly saying they will share the source).
Perhaps the developers have good reasons for this. Perhaps they are unaware of the advantages of open sourcing their code. One of the biggest, to my mind, is keeping the code going after the original author moves on (whether through death or frustration). Good applications show their value, and people pick up the torch and keep them going, even when the originating author is done. Other advantages, as have been stated elsewhere, include fewer errors in the code, a more rapid time to completion and those gee-whiz! moments of insight that move the state of the art forward.
Certainly there are issues related to resources. It is not cheap to procure a compiler and development environment for the Windows platform or to learn the interfaces, nor is it easy to test software destined for a community that has been described as the cheapest group of individuals on the planet. My tongue is only slightly in my cheek. Amateur Radio operators will acquire any old piece of technology and keep it running if there might be value in it. I have seen some amazing things come out of these junk drawers, from antennas to interfaces between old tube radios and modern computers. And the computer technology runs the gamut from modern laptops running Vista and Linux to clunkers that barely boot, running DOS...version 2. So I can understand why a developer might want to make a buck with his or her code.
One of the more interesting funding approaches I have seen, outside of the shareware model, is the wishlist model in the open source community, although it appears less often these days. I first saw it with Tobi Oetiker's MRTG program all those years ago, with his CD wish list. Today he has a much more robust funding model, primarily spurred by the ability to target ads, something that was not as well defined in the late 1990s. But by and large, the model in the open source community has been to crank out the code and make it available for the joy of doing it, or because the developer or group of developers saw a problem and a solution and thought others could benefit from it.
On the Windows side, however, this generally is not the case. There are a number of Windows programs that are open source, but many of them did not start as Windows programs. Notable examples are Pidgin and Wireshark, two programs that started in the Linux realm and were ported to Windows because of demand. Programs that begin as native Windows development do not seem to be written with the same sense of...well, openness.
So, to those who code, especially those who code on the Windows platform: I am not opposed to your recouping your costs, but I would encourage you to open source your code. You might be pleasantly surprised at the results. And to those who already do, my thanks.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
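The find-and-grep combination described above can be sketched as a one-liner. This is an illustrative example, not from the original text: the `/home` starting directory matches the article's scenario, and the search string "ERROR" is a placeholder for whatever entry you are after.

```shell
# Find every .log file under /home and print the names of the files
# that contain the string "ERROR". The "{} +" form passes file names
# to grep in batches, so grep is invoked as few times as possible.
find /home -name '*.log' -exec grep -l 'ERROR' {} +
```

Swapping `grep -l` for plain `grep` would print the matching lines themselves rather than just the file names, which is often the next step once you know where to look.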
Cron traditionally has been considered another such tool, one for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to upgrade your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here, just cold, hard, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.