Satellite Remote Sensing of the Oceans
The remote sensing group at Southampton University Department of Oceanography (SUDO) works on many different aspects of Earth observation for applications such as climate monitoring, pollution control and general oceanography. Linux first appeared in the Oceanography Department in 1994 and has steadily become the workhorse OS.
Earth observation is now a familiar topic to almost everyone. Many of us see satellite images of the Earth every day, whether in the news, TV documentaries or the national weather bulletin, and more Earth observation satellites are being launched all the time. Here at SUDO we deal primarily with observation of the world's oceans, and we use many types of satellite data in studying various surface features. Satellites produce vast amounts of data, which makes large satellite images difficult to work with. The days when a low-powered PC could churn through a satellite dataset are long gone: the number of images we are expected to process grows as quickly as the hardware that processes them.
We use IDL (Interactive Data Language) to process all our satellite and meteorological data. The package lets us read, warp and manipulate images easily, and it provides many useful built-in math functions for detailed analysis of our data. IDL (http://www.rsinc.com/) is now on its fifth release and is used extensively by the satellite community all over the world. It places heavy demands on the system, so on anything less than the quickest 486 it runs incredibly slowly. These days IDL is available for almost every type of platform.
Sooner or later, when a machine is pushed to its limits, you have to start worrying about the operating system (OS) you are using. When you are running jobs that take hours or even days, the last thing you need is for your system to crash or hang. Similarly, when your system isn't fully loaded, you would like to run several jobs at once to make more efficient use of your resources. A few years ago we discovered that Linux let us accomplish all of this. As usual, though, the change had to come from within in order to establish Linux here and show its strengths against systems such as MS Windows.
When I arrived in the Department, it became apparent that I would need a reliable method of wading through the enormous amounts of data I was required to process. I became particularly frustrated with the Microsoft Windows environment, which crashed frequently and gave everyone headaches. In our research we can't afford to have our programs crashing, particularly since they take so long to run. In those days you could perhaps process one image in an hour, if your top-of-the-line 486 was up to it, so a few crashes could easily eat up much of the day. Our single SPARCstation never had these problems, of course, so there was fierce competition between colleagues to use it. We tried using the machine remotely from our PCs with Vista Exceed. This worked well initially, but eventually the SPARCstation became so overloaded with users that it was quicker to go back to Windows.
Then one of our Mexican Ph.D. students, Miguel Tenorio (now at CICESE, Ensenada, in Mexico), told us of an exciting OS called Linux, which could turn a humble PC into a powerful Unix workstation. I was very skeptical at the time and refused to believe that the Slackware version he had running on his 386 could possibly improve our efforts, especially since he was unable to run X because of memory limitations. However, he proved us all wrong when he installed a full Slackware system with X on one of our better 486s, and since then we have never looked back. Our group now has a suite of high-spec 486 and Pentium machines running Caldera Linux and Slackware Linux and utilising the powerful processing and data analysis power of IDL. IDL is the software we use most for our data processing, although occasionally we have to write the odd C program or Unix script. IDL has been around on Unix and Windows for a long time and was recently fully ported to Linux, much to our relief.
Global warming is a phenomenon that might very well affect us all one day. By using data from infrared radiometers in space, we can see how the average sea surface temperature (SST) changes over time so we can tell from the satellite image archives whether or not we are seeing changes in the Earth's climate. Because the ocean has such an immense thermal capacity, even a small change (e.g., 0.1 degrees C) in the average SST can imply a huge change in the Earth's heat budget. Unfortunately, the SST as measured from a satellite is a measure of the temperature of only the top millimeter of the ocean and does not always reflect the true sea surface temperature a few centimeters below because of the cooling effects of the wind and evaporation. Sometimes the “skin temperature” of the ocean can be as much as half a degree cooler than that of the bulk temperature just a few centimeters below. It's therefore important that we understand how the skin of the ocean behaves under different meteorological conditions, so that we can apply a correction to the satellite measured SST to account for the variability of the temperature of the ocean skin layer. After all, the skin temperature variability (up to 0.5 degrees C) is larger than the sort of changes in SST we are looking to measure for global climate research (0.1 degrees C).
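To make the idea concrete, a skin-to-bulk correction can be sketched as a wind-dependent adjustment to the satellite-measured skin temperature. The model form and the decay constant below are purely illustrative assumptions (the real correction is what the ship radiometer campaign is designed to determine), but they show the shape of the problem: a deficit of up to about 0.5 degrees C in calm conditions that shrinks as the wind mixes the surface layer.

```python
# Illustrative sketch only: the functional form and the decay constant
# are assumptions, not the correction actually derived at SUDO.

def bulk_sst(skin_sst_c, wind_speed_ms):
    """Estimate the bulk SST (deg C) a few centimetres down from the
    satellite-measured skin SST, given the wind speed in m/s."""
    max_deficit_c = 0.5  # calm-water skin effect quoted in the text
    # Assumed: the skin-temperature deficit decays with wind speed
    # as mixing erodes the cool skin layer
    deficit_c = max_deficit_c / (1.0 + 0.5 * wind_speed_ms)
    return skin_sst_c + deficit_c

print(bulk_sst(17.0, 0.0))   # calm sea: full 0.5 deg C correction -> 17.5
print(bulk_sst(17.0, 10.0))  # windy sea: much smaller correction
```

Whatever its exact form, any such correction matters because the quantity being removed (up to 0.5 degrees C) is several times larger than the climate signal being sought (0.1 degrees C).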
The biggest source of error in estimating SST from space is atmospheric absorption by water vapour. One method of dealing with this is employed by the Along Track Scanning Radiometer (ATSR) on board the ERS-2 remote sensing satellite. This radiometer views the sea at two different angles, 0 and 55 degrees to the vertical, so that there are two images of every patch of sea. Because the two images are taken through different thicknesses of atmosphere, the difference between them can be used to calculate a correction factor that adjusts the images for atmospheric absorption.
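Retrievals of this kind are commonly expressed as a linear combination of the two brightness temperatures. The sketch below illustrates the principle with invented coefficients; operational ATSR coefficients are derived from radiative transfer modelling, not from this toy example.

```python
# Sketch of a dual-view SST retrieval in the spirit of ATSR.
# The coefficient values are invented for illustration only.

def dual_view_sst(t_nadir_k, t_forward_k, a0=0.0, a1=2.8, a2=-1.8):
    """Combine nadir (0 deg) and forward (55 deg) brightness
    temperatures in kelvin.  The forward view crosses a longer
    atmospheric path and so appears colder; weighting the two views
    extrapolates the water-vapour absorption back out."""
    return a0 + a1 * t_nadir_k + a2 * t_forward_k

# The retrieved SST lies above both brightness temperatures, since
# both views are cooled by atmospheric absorption.
print(dual_view_sst(288.0, 286.5))  # roughly 290.7 K
```

Note that the coefficients sum to one, so a hypothetical absorption-free scene, in which both views measure the same temperature, would be returned unchanged.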
To study the atmospheric effects and the skin effect on measuring SST from space, I use marine infrared radiometers to measure SST at the same time as the satellites pass overhead, thereby obtaining simultaneous SST measurements. The ship I use is a ferry vessel, the MV Val de Loire (Figure 1), which sails regularly across the English Channel. Figure 2 shows a thermal infrared satellite image of the English Channel for January 1997, at which time the MV Val de Loire was approaching the French coast. The ship's radiometer and meteorological data are currently being used to evaluate the effect of the cool ocean skin on remotely sensed SST under a wide range of environmental conditions.
Using IDL under Linux I have found I can process all my data simultaneously rather than running batch jobs that eat up the whole DOS machine. One of the future ideas for this project is to have the ship data telemetered back to base here in Southampton, so that it can be processed and archived in real time rather than collected by hand. However, we have to await further funding for that to happen.