Satellite Remote Sensing of the Oceans
The remote sensing group at Southampton University Department of Oceanography (SUDO) works on many different aspects of Earth observation for applications such as climate monitoring, pollution control and general oceanography. Linux first appeared in the Oceanography Department in 1994 and has steadily become the workhorse OS.
Earth observation is now a familiar topic to almost everyone. Many of us see satellite images of the Earth every day, whether in the news, TV documentaries or the national weather bulletin, and more Earth observation satellites are being launched all the time. Here at SUDO we deal primarily with observation of the world's oceans, and we use many types of satellite data to study various surface features. Satellites produce vast amounts of data, which makes large satellite images difficult to work with. The days when a low-powered PC could churn through a satellite dataset are long gone: as the hardware gets faster, we are expected to process ever more images.
We use IDL (Interactive Data Language) to process all our satellite and meteorological data. The package lets us read, warp and manipulate images easily, and it provides many useful built-in mathematical functions for detailed analysis of our data. IDL (http://www.rsinc.com/) is now in its fifth release and is used extensively by the satellite community all over the world. IDL places heavy demands on the system; on anything less than the quickest 486, it is painfully slow. These days IDL is available for almost every type of platform.
Sooner or later, when a machine is pushed to its limits, you have to think about the operating system (OS) you are using. When jobs take hours or even days to run, the last thing you need is for your system to crash or hang. Equally, when the system isn't fully loaded, you want to run several jobs at once to make more efficient use of your resources. A few years ago we discovered that Linux let us do all of this. As usual, though, the change had to come from within in order to establish Linux here and show its strengths against systems such as MS Windows.
When I arrived in the Department, it became apparent that I would need a reliable way of wading through the enormous amounts of data I was required to process. I became particularly frustrated with the Microsoft Windows environment, which crashed frequently and gave everyone headaches. In our research we can't afford to have our programs crashing, particularly since they take so long to run. In those days you could perhaps process one image in an hour, if your top-of-the-line 486 was up to it, so frequent multiple crashes could easily eat up much of the day. Our single SPARCstation never had these kinds of problems, of course, so there was fierce competition between colleagues to use it. At first we tried using the machine remotely from our PCs via Vista Exceed. This proved very successful initially, but eventually the SPARCstation became so overloaded with users that it was quicker to go back to using Windows again.
Then one of our Mexican Ph.D. students, Miguel Tenorio (now at CICESE, Ensenada in Mexico), told us of an exciting OS called Linux, which could turn a humble PC into a powerful Unix workstation. I was very skeptical at the time and refused to believe that the Slackware version he had running on his 386 could possibly improve our efforts, especially since memory limitations prevented him from running X. However, he proved us all wrong later when he installed a full Slackware version with X on one of our better 486s, and since that time we have never looked back. Our group now has a suite of high-spec 486 and Pentium machines running Caldera Linux and Slackware Linux, harnessing the processing and data-analysis power of IDL. IDL is the software we use most for our data processing, although occasionally we have to write the odd C program or Unix script. IDL has been available on Unix and Windows for a long time and was recently fully ported to Linux, much to our relief.
Global warming is a phenomenon that may well affect us all one day. Using data from infrared radiometers in space, we can track how the average sea surface temperature (SST) changes over time, and so tell from the satellite image archives whether we are seeing changes in the Earth's climate. Because the ocean has such an immense thermal capacity, even a small change (e.g., 0.1 degrees C) in the average SST can imply a huge change in the Earth's heat budget. Unfortunately, the SST measured from a satellite reflects only the top millimeter of the ocean and does not always match the true sea surface temperature a few centimeters below, because of the cooling effects of wind and evaporation. This “skin temperature” of the ocean can be as much as half a degree cooler than the bulk temperature just a few centimeters down. It is therefore important that we understand how the skin of the ocean behaves under different meteorological conditions, so that we can correct the satellite-measured SST for the variability of the skin layer's temperature. After all, the skin temperature variability (up to 0.5 degrees C) is larger than the changes in SST we are trying to measure for global climate research (0.1 degrees C).
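To make the skin-layer correction concrete, here is a minimal sketch in Python of how such an adjustment might look. The functional form and the coefficients (a maximum deficit of 0.5 degrees C, decaying with a hypothetical 5 m/s wind-speed scale) are illustrative assumptions only, not the correction scheme our group actually derives from the radiometer data; real corrections are fitted against simultaneous ship and satellite measurements.

```python
import math

# Illustrative sketch only: the maximum deficit comes from the figure
# quoted in the text (up to 0.5 deg C); the wind-speed dependence and
# its 5 m/s e-folding scale are hypothetical placeholders.
def bulk_from_skin(skin_sst_c, wind_speed_ms):
    """Estimate bulk SST (deg C) from a satellite skin SST and wind speed.

    Assumes the cool-skin deficit shrinks as wind stirs the surface,
    since strong mixing brings the skin and bulk temperatures together.
    """
    max_deficit = 0.5  # deg C, calm-sea upper bound from the text
    deficit = max_deficit * math.exp(-wind_speed_ms / 5.0)
    return skin_sst_c + deficit
```

In this toy model a calm sea gives the full 0.5 degree correction, while at high wind speeds the correction vanishes and the skin temperature is taken as representative of the bulk.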
The biggest source of error in estimating SST from space is absorption by atmospheric water vapour. One method of dealing with this is employed by the Along Track Scanning Radiometer (ATSR) on board the ERS-2 remote sensing satellite. This radiometer views the sea at two different angles, 0 and 55 degrees from the vertical, so there are two images of every patch of sea. From the difference between these two images, taken through two different thicknesses of atmosphere, a correction factor can be calculated to adjust the images for atmospheric absorption.
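The dual-view idea can be sketched as a weighted combination of the two brightness temperatures. The linear form below and its coefficients are hypothetical illustrations only; operational ATSR retrievals use coefficients fitted with radiative-transfer models.

```python
# Sketch of a dual-view retrieval: the forward (55-degree) view passes
# through roughly twice as much atmosphere as the nadir view, so the
# nadir-forward difference measures the water-vapour absorption, and a
# weighted combination extrapolates back to the surface temperature.
# Coefficients a0, a1, a2 here are hypothetical placeholders.
def dual_view_sst(t_nadir_k, t_forward_k, a0=0.0, a1=2.5, a2=-1.5):
    """Toy linear dual-view SST retrieval (all temperatures in Kelvin)."""
    return a0 + a1 * t_nadir_k + a2 * t_forward_k
```

Note that the weights sum to one, so when the two views agree (a perfectly transparent atmosphere) the retrieval simply returns the observed temperature; the larger the nadir-forward difference, the larger the upward correction.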
To study the atmospheric effects and the skin effect on measuring SST from space, I use marine infrared radiometers to measure SST as the satellites pass overhead, thereby obtaining simultaneous SST measurements. The ship I use is a ferry, the MV Val de Loire (Figure 1), which sails regularly across the English Channel. Figure 2 shows a thermal infrared satellite image of the English Channel from January 1997, at which time the MV Val de Loire was approaching the French coast. The ship's radiometer and meteorological data are currently being used to evaluate the effect of the cool ocean skin on remotely sensed SST under a wide range of environmental conditions.
Using IDL under Linux I have found I can process all my data simultaneously rather than running batch jobs that eat up the whole DOS machine. One of the future ideas for this project is to have the ship data telemetered back to base here in Southampton, so that it can be processed and archived in real time rather than collected by hand. However, we have to await further funding for that to happen.