Scientific Visualizations with POV-Ray

With a little work, the Persistence of Vision Raytracer (POV-Ray) can be adapted to create stunning three-dimensional imagery from floating-point scientific data files.
Density Files

In the sphere example, a mathematical function was used to calculate the isosurface value. My thunderstorm numerical model data cannot be represented as a mathematical function; instead, it is represented as three-dimensional floating-point arrays containing model variables such as temperature, wind speed and cloud concentration at each grid location (Figure 2).

Figure 2. An example of multiple isosurfaces, focusing on a region of the supercell called the wall cloud. The yellow isosurfaces in the foreground, which are located below the wall cloud, represent where tornado-like swirling motion is occurring.

POV-Ray 3.5 has a feature called a density file that allows for the mapping of functions represented as gridpoint values. The POV-Ray documentation describes a density file as follows: “The density_file pattern is a 3-D bitmap pattern that occupies a unit cube from location <0,0,0> to <1,1,1>. The data file is a raw binary file format created for POV-Ray called df3 format.”

Density files can be used as functions passed as an argument to the isosurface object. Here is an example of a density file being used for isosurface rendering:

#declare DENSFUNC=function
{
    pattern
    {
        density_file df3 "cloud.df3"
        interpolate 1
    }
}
isosurface
{
    function { 0.1 - DENSFUNC(x,y,z) }
    contained_by { box { <0,0,0>, <1,1,1> } } // the density file's unit cube
}

In the above example, an isosurface with value 0.1 would be created from the cloud.df3 file using a trilinear interpolation scheme (more on interpolation below).

The density file format is strict: the data values are represented as 8-bit unsigned integers (bytes ranging from 0 to 255) scaled internally to range from 0.0 to 1.0. Because my thunderstorm data is 32-bit floating-point data, it is not feasible to use the density file format with the stock POV-Ray 3.5.
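To make the limitation concrete, here is a minimal sketch of what a float-to-df3 converter must do; the function names and calling convention are mine, not part of POV-Ray or the model code. A df3 file is a six-byte header of three big-endian 16-bit grid dimensions followed by one byte per gridpoint, so the full dynamic range of 32-bit data has to be squeezed into 256 levels:

/* df3write.c: minimal sketch of a float-to-df3 converter.
 * The df3 header is three big-endian 16-bit grid dimensions,
 * followed by one unsigned byte per gridpoint. Quantizing to
 * 256 levels is exactly the precision loss discussed above. */
#include <stdio.h>
#include <stdlib.h>

static void put_be16(FILE *f, unsigned v)
{
    fputc((v >> 8) & 0xff, f);   /* high byte first (big-endian) */
    fputc(v & 0xff, f);
}

int write_df3(const char *path, const float *grid,
              unsigned nx, unsigned ny, unsigned nz,
              float lo, float hi)
{
    FILE *f = fopen(path, "wb");
    if (f == NULL)
        return -1;

    put_be16(f, nx);
    put_be16(f, ny);
    put_be16(f, nz);

    /* Scale [lo,hi] onto [0,255]; POV-Ray maps the bytes back to 0.0-1.0. */
    for (size_t i = 0; i < (size_t)nx * ny * nz; i++) {
        float s = (grid[i] - lo) / (hi - lo);
        if (s < 0.0f) s = 0.0f;
        if (s > 1.0f) s = 1.0f;
        fputc((int)(s * 255.0f + 0.5f), f);
    }
    return fclose(f);
}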

Enter Ryouichi Suzuki, who has been providing POV-Ray with unofficial add-on code since 1996. His patches to POV-Ray 3.0 introduced the isosurface object, now a standard object in version 3.5. Suzuki's code, available in the Zip file referenced above, includes routines that expand the functionality of POV-Ray density files, including the option of rendering floating-point density file data.

When using density files as functions, one must consider that a mathematical function is continuous: it is defined for any floating-point value of the x, y and z spatial coordinates. A density file, by contrast, is a discrete set of data referenced by integer array indices, so interpolation is required to evaluate locations between gridpoints when rendering a scene. The two available interpolation methods are trilinear and tricubic spline. Trilinear interpolation is fast but usually does not produce as smooth a result as tricubic spline interpolation.
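For reference, this is roughly what trilinear interpolation computes; the sketch below is my own illustration, assuming an nx-by-ny-by-nz float grid stored x-fastest, and is not POV-Ray's internal code. The value at a fractional position is a distance-weighted blend of the eight surrounding gridpoints:

/* trilinear.c: sketch of trilinear interpolation on an nx*ny*nz
 * float grid stored x-fastest. It illustrates what "interpolate 1"
 * does between gridpoints; POV-Ray's own code differs in detail. */
#include <stddef.h>
#include <math.h>

static float at(const float *g, int nx, int ny, int x, int y, int z)
{
    return g[(size_t)z * ny * nx + (size_t)y * nx + x];
}

/* px,py,pz are in gridpoint units, e.g. 0 <= px <= nx-1 */
float trilinear(const float *g, int nx, int ny, int nz,
                float px, float py, float pz)
{
    int x0 = (int)floorf(px), y0 = (int)floorf(py), z0 = (int)floorf(pz);
    int x1 = x0 + 1 < nx ? x0 + 1 : x0;   /* clamp at the boundary */
    int y1 = y0 + 1 < ny ? y0 + 1 : y0;
    int z1 = z0 + 1 < nz ? z0 + 1 : z0;
    float fx = px - x0, fy = py - y0, fz = pz - z0;

    /* blend along x, then y, then z */
    float c00 = at(g,nx,ny,x0,y0,z0)*(1-fx) + at(g,nx,ny,x1,y0,z0)*fx;
    float c10 = at(g,nx,ny,x0,y1,z0)*(1-fx) + at(g,nx,ny,x1,y1,z0)*fx;
    float c01 = at(g,nx,ny,x0,y0,z1)*(1-fx) + at(g,nx,ny,x1,y0,z1)*fx;
    float c11 = at(g,nx,ny,x0,y1,z1)*(1-fx) + at(g,nx,ny,x1,y1,z1)*fx;
    float c0  = c00*(1-fy) + c10*fy;
    float c1  = c01*(1-fy) + c11*fy;
    return c0*(1-fz) + c1*fz;
}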

Getting Model Data into POV-Ray

After patching the POV-Ray 3.5 code with Suzuki's density file code, you can render floating-point isosurfaces if you adhere to the df3 format or Suzuki's extended format. In my case, I had hundreds of gigabytes of HDF (Hierarchical Data Format) model data, a format well suited to numerical model output. Because I am interested not only in producing a few isosurface images but in making animations from hundreds, sometimes thousands, of these images, converting each of these files from HDF to df3 was not a viable option. Hence, I started looking closely at the POV-Ray routines that handle density file data, with the hope that I could modify the code to read HDF data directly.

It was important to me that the modifications I made to POV-Ray would not cause a loss of functionality or break compatibility with the unpatched version. I achieved this goal by adding some new objects, referenced in the scene description file, that could be parsed and rendered by my patched version, while leaving all other objects alone.

The main piece of code I modified is found in pattern.cpp, which contains the Read_Density_File routine. This routine, as you might have guessed, reads density file data into a three-dimensional array. Using it as a template, I created a new routine, Read_Hdf_File, to read my history file data into POV-Ray. This is where most of the modification needs to be made to adapt POV-Ray to your own data format. An abbreviated version of Read_Hdf_File is shown in Listing 1.

Read_Hdf_File reads HDF floating-point data into mapF, a 3-D array of floats, where it is then ready to be manipulated as density file data. I wrote a separate piece of code called history.c, which contains all of the HDF I/O routines referenced in pattern.cpp. Your data file format will require its own format-specific code to read your 3-D data into POV-Ray.
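Since Listing 1 is not reproduced here, the sketch below shows the general shape of such a reader. To stay self-contained it assumes a raw file of 32-bit native-endian floats with known dimensions instead of making real HDF library calls; all names, including the mapF-style array, are illustrative:

/* readgrid.c: illustrative stand-in for a Read_Hdf_File-style
 * routine. Real code would call the HDF library here; this sketch
 * assumes a raw file of nx*ny*nz native-endian 32-bit floats so
 * it stays self-contained. */
#include <stdio.h>
#include <stdlib.h>

float ***alloc3d(int nx, int ny, int nz)
{
    float ***a = malloc(nz * sizeof *a);
    for (int z = 0; z < nz; z++) {
        a[z] = malloc(ny * sizeof *a[z]);
        for (int y = 0; y < ny; y++)
            a[z][y] = malloc(nx * sizeof *a[z][y]);
    }
    return a;
}

/* Fill mapF[z][y][x] from a raw float32 file; returns 0 on success. */
int read_grid_file(const char *path, float ***mapF,
                   int nx, int ny, int nz)
{
    FILE *f = fopen(path, "rb");
    if (f == NULL)
        return -1;
    for (int z = 0; z < nz; z++)
        for (int y = 0; y < ny; y++)
            if (fread(mapF[z][y], sizeof(float), nx, f) != (size_t)nx) {
                fclose(f);
                return -1;
            }
    return fclose(f);
}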

A few more files were modified in order for POV-Ray to recognize the new HDF file format natively and to allow for the rendering of more than one model variable per scene. Table 1 contains a list of the files modified and a brief description of what was done to each file.

Table 1. The files modified to bring model data into POV-Ray, with a brief description of what was done to each.

File           Modification
pattern.cpp    Read_Hdf_File routine added, which reads model data into memory.
pattern.h      Add declaration of Read_Hdf_File.
parstxtr.cpp   Add case statement block for HDF_TOKEN.
tokenize.cpp   Add HDF_TOKEN to Reserved_Words array.
frame.h        Add char *Var1 to Density_file_Data_Struct structure.
parse.h        Add HDF_TOKEN to TOKEN_IDS.
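For orientation, the frame.h change amounts to one new member. The sketch below assumes a simplified layout for Density_file_Data_Struct; the real structure carries more fields than shown:

/* frame.h (sketch): a simplified view of the modified structure.
 * The actual Density_file_Data_Struct has additional members;
 * only the added field matters here. */
typedef struct Density_file_Data_Struct {
    char *Name;        /* data file name                         */
    int   Sx, Sy, Sz;  /* grid dimensions                        */
    char *Var1;        /* added: model variable to read from HDF */
    /* ... remaining members unchanged ... */
} DENSITY_FILE_DATA;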

The HDF file format, unlike the df3 format, allows more than one variable to be stored in each file. In my case, each HDF file is a snapshot of the model state at a given time and contains a dozen 3-D variables. It often is illustrative to look at multiple variables, such as cloud, rain, hail and snow, together in one scene. I achieved this by creating a new token representing the HDF file format, called HDF_TOKEN (distinct from DF3_TOKEN, which represents the original df3 format), and adding a new character array called Var1 to the structure Density_file_Data_Struct. Var1 is assigned in the scene description file and is passed as an argument to the HDF routines to specify which model variable to select. In order to parse the variable name (represented as a character string), I added an additional case statement to the Parse_PatternFunction routine in parstxtr.cpp (Listing 2). Notice the addition of Parse_Comma and Parse_C_String, which grab the variable to be read.
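Listing 2 itself is not shown here, but the flow it implements is easy to mimic. The standalone demo below, with made-up input and helper names, mirrors what Parse_C_String and Parse_Comma accomplish in the real parser: pull the quoted file name, step past the comma, then pull the quoted variable name:

/* parsedemo.c: standalone illustration of the HDF_TOKEN parsing flow.
 * POV-Ray's parser does this with its own tokenizer (Parse_C_String,
 * Parse_Comma); this demo just mirrors the sequence on a plain string,
 * skipping the comma implicitly. */
#include <stdio.h>
#include <string.h>

/* Copy the next "quoted" substring of *s into out; advance *s past it. */
static int next_quoted(const char **s, char *out, size_t outlen)
{
    const char *open = strchr(*s, '"');
    if (open == NULL)
        return -1;
    const char *end = strchr(open + 1, '"');
    if (end == NULL || (size_t)(end - open - 1) >= outlen)
        return -1;
    memcpy(out, open + 1, end - open - 1);
    out[end - open - 1] = '\0';
    *s = end + 1;
    return 0;
}

int main(void)
{
    /* Hypothetical scene-file fragment for the patched reader. */
    const char *spec = "density_file hdf \"storm0420.hdf\", \"qc\"";
    char name[64], var[64];

    if (next_quoted(&spec, name, sizeof name) == 0 &&  /* file name */
        next_quoted(&spec, var, sizeof var) == 0)      /* variable  */
        printf("file=%s variable=%s\n", name, var);
    return 0;
}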
