Looking for Algae--the Next Voyage
The Brazilian winter was almost over, and while the mild winters in Florianopolis allowed me to work on the Agape, the coming of spring meant that it was time to set sail for new adventures.
Grayson, the youngest of the Pollywogs, showed up early in the morning at the dock, as I got ready to sail. "What are you doing?", he asked. "I am getting ready to study algae", I answered. "Algae!", he exclaimed, "why are you looking for that?"
Algae is very important to our lives. It is estimated that 73% to 87% of the net global production of oxygen comes from algae. It is at the bottom of the food chain for most life in the sea, and it is being studied as a way to produce food for the teeming masses of tomorrow. Yet algae, like a lot of other things, is being threatened by global warming and other aspects of pollution.
How does this fit in with computing? Most of the readers of Linux Journal have at least one computer in their houses. I lost count of my own computer stock at about 15, and some of them are real electrical power-eaters. A lot of them have really dangerous chemicals in them, like lead and acidic materials. Fortunately, over time, power requirements per CPU and graphics cycles have gone down, as have costs for the hardware. Manufacturers, either through legislation or social and civic concern, have moved to making their systems from more environmentally-friendly components.
Unfortunately, we still have desktop systems today whose power draw is measured in the hundreds of watts. My desktop machine has a power supply rated at 450 watts, and I recently saw an IBM workstation rated at 850 watts. Although it is true that these machines do not soak up that much power continuously, the power meter on the house does cycle quite fast when they are turned on.
Normally, we do not think too much about the cost of electricity when we run these systems, because either we do not pay for the electricity (our school or office does) or we are careful to turn off the computer when we are finished with it. However, several colliding factors may change our thinking for us.
First, the number of computers is poised to increase dramatically. It took us 60 years or more to manufacture the first billion general-purpose computers. Due to the drop in price of computers, it is estimated that the next billion computers will take only a few years. In the USA alone, there are families that have a computer for each child, for each parent and a home router. At some periods of the night, all of these systems may be on while Mom and Pop work and the kids scan the Net, do homework, play games or call their friends over VoIP.
The next thing to consider is that computers are replacing other types of electronics in our daily lives. VoIP telephony, multimedia systems (think LinuxMCE) and messaging systems of all types are beginning to be commonplace. Each of these systems shares a common characteristic: they do not function if the computer is turned off. These duties used to be managed by fairly low-power specialized devices, such as TiVos, or by drone devices, such as analog telephones.
Some people are replacing these relatively low-powered specialized devices with "media centers", relatively high-powered CPUs that act as servers for all of our daily needs, complete with "not-so-thin" clients that exist in every room. These systems have to be "always on" to do their jobs. This moves the demand for electricity to new highs.
I met a man who had five petabytes of data in his home cellar--a combination of TV shows he had recorded and movies he had copied onto his server. When I mentioned how cool it was, he said, "No, it is actually quite hot."
This brings about another point: it takes power to run the computers, and it takes power to cool the computers. In hot climates, the power for cooling the computers is almost equal to the power to run them.
Finally, all of this is compounded when you are "off the electric grid". When you are not hooked up to a dependable, reliable, adequate, inexpensive power supply, "always on" computing really suffers.
"Off the grid" could mean a city facing long brown or black outs. Having an uninterruptable power supply for mission-critical computing (and what could be more mission-critical than recording "The Simpsons") gets exponentially more expensive as the power requirements go up.
"Off the grid" also could mean people who have to generate their own power. This could be due to being in the middle of the Amazon jungle, far from any power lines, or it could mean being on the wrong side of a mountain in upstate New Hampshire. Or, it could mean being on a small island atoll or a small sailboat in the ocean. When you are "off the grid", the costs of generating your own electricity often increase dramatically.
In Brazil, the world's largest hydroelectric plant is called Itaipu. It generates 14 gigawatts of power. Itaipu can generate enough electric power for 40 million desktop systems at 350 watts of power apiece. It will take another 25 "Itaipus" for the next billion computers, and another 25 "Itaipus" (more or less) to cool those next billion computers.
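The arithmetic behind those figures is easy to check for yourself. Here is a rough sanity check in shell (the 14-gigawatt and 350-watt figures come from the text above; this is a back-of-the-envelope sketch, not an engineering calculation):

```shell
# Back-of-the-envelope check of the Itaipu figures (rounded numbers).
itaipu_watts=14000000000   # Itaipu's output: 14 gigawatts
desktop_watts=350          # one desktop drawing 350 watts

# How many 350-watt desktops one Itaipu could power:
echo $(( itaipu_watts / desktop_watts ))              # 40000000

# How many "Itaipus" a billion such desktops would need:
echo $(( 1000000000 * desktop_watts / itaipu_watts )) # 25
```

Double that last number to account for cooling, and you have roughly 50 Itaipu-sized plants for the next billion computers.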
We need to do our computing in smarter ways.
Today, there is a lot of concentration on low-power, environmentally-friendly computing. Computers are being manufactured to be RoHS-compliant, which means they are easier on the environment when they are discarded.
It also means concentrating the computing power on the task at hand. It does not take much power to move data from an Ethernet port to a disk, or vice versa. Therefore, server systems that can do this simple task can be made to be very low power. It means turning off the high-power gaming machine when you are not playing the game, but leaving on the low-power server to capture your next TV show, or to accept and channel your next VoIP call.
The Linux Foundation has developed a new initiative to work on power management. This is good news. Linux can become (along with being the choice for supercomputing, embedded systems and server farms) the "green operating system". We should help both the Linux Foundation and Linux move along this path: if not to save ourselves money on electricity and cooling, if not to help those "off the grid", then to help save the algae.
I finished stowing the gear on board the Agape. My time ashore was over, and it was time to shove off and look for new algae fields to study. I was fairly sure the Pollywogs would miss me, and I certainly would miss them and my other friends who visit the restaurant "Alideia dos Piratas".
As Grayson helped me with the docking lines, he asked me when I would be back and when he would see me again. I gave Grayson one last hug and a solemn salute before I climbed aboard.
"Perhaps never", I answered, "but when you really need me, I will return."
I did not look back as the Agape sailed away from Florianopolis, where I had spent so many happy months and met so many very wonderful people.
Jon "maddog" Hall is the Executive Director of Linux International (http://www.li.org), a nonprofit association of end users who wish to support and promote the Linux operating system. During his career in commercial computing, which started in 1969, Mr Hall has been a programmer, systems designer, systems administrator, product manager, technical marketing manager and educator. He has worked for such companies as Western Electric Corporation, Aetna Life and Casualty, Bell Laboratories, Digital Equipment Corporation, VA Linux Systems and SGI. He is now an independent consultant in Free and Open Source Software(FOSS) Business and Technical issues.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
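That log-searching combination can be expressed as a single find-plus-grep command. The directory layout and the search string in this sketch are hypothetical, so it builds a tiny /home-like tree first to have something to search:

```shell
# Hypothetical demo: build a small /home-like tree, then combine
# find and grep to list every .log file containing a given entry.
tmp=$(mktemp -d)
mkdir -p "$tmp/home/user"
echo "connection refused" > "$tmp/home/user/app.log"
echo "all quiet"          > "$tmp/home/user/quiet.log"

# find locates the .log files; grep -l prints only matching filenames.
find "$tmp/home" -name '*.log' -exec grep -l "connection refused" {} +

rm -r "$tmp"
```

Against a real system you would simply run the find command against /home itself; the `-exec ... {} +` form hands the found files to grep in batches rather than one process per file.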
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register now!