Although GIMP is useful, it's showing its age, and the time is coming soon when either it will adapt or it will fall by the wayside, displaced by more capable tools. For the photographer working in Linux, as well as for the high-bit-depth CGI artist, both Krita and CinePaint are welcome tools. Each is strong where the other is weak. As both programs continue to develop, we can expect great things from them—they have both proven themselves to be well-designed packages with deliberate and capable teams behind them. The odd mix of Darwinian competition and cooperation has given us a new generation of these tools, and they're ready to be used. Enjoy!
HDRI: Faking Film's High Bit Depth
High Dynamic Range Imaging was originally developed for lighting 3-D scenes, as a way to capture the real-world range of luminance, and it has been a boon to realistic 3-D lighting work for many years.
But it has another use. By tone mapping the image, HDRI's wide contrast range can be represented in 8-bit space to stunning effect—preserving details in the shadows and minimizing clipping in the highlights. As this aesthetic became popular, several techniques were developed to create HDRIs from digital snapshots and then convert them for display on monitors or in print.
Creating HDR images from digital photographs requires bracketed exposure—taking a set of photos at different exposure settings to give a wider collective latitude than the camera natively allows (Figure A). Afterward, the bracketed images are combined into a single HDRI. Although this can be done in the terminal, it's far easier with CinePaint's self-explanatory Bracket to HDR plugin (included in the package). Once created, the HDRI either can be turned into a light probe (for lighting a 3-D scene) or tone mapped for display and/or printing (Figure B).
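To give a feel for what such a merge involves, here is a minimal sketch of Debevec-style exposure merging for a single pixel position. It assumes a linear sensor response (real tools, including CinePaint's plugin, are more sophisticated and may recover the camera's response curve first); the function name and the hat-shaped weighting are illustrative choices, not CinePaint's actual implementation.

```python
def merge_bracketed(pixels_per_exposure, exposure_times):
    """Estimate scene radiance at one pixel position from a bracketed set.

    Each per-exposure estimate (pixel value / exposure time) is weighted
    by a hat function, so trustworthy mid-range values dominate over
    pixels that are nearly clipped at either end.
    """
    def weight(z):
        # Hat function over 0-255: peaks mid-range, zero at the extremes.
        return z if z <= 127 else 255 - z

    num = den = 0.0
    for z, t in zip(pixels_per_exposure, exposure_times):
        w = weight(z)
        num += w * (z / t)   # radiance estimate from this exposure
        den += w
    return num / den if den else 0.0

# The same pixel captured at 1/4s, 1s and 4s: the nearly clipped
# 250 contributes little, the mid-range 128 dominates.
radiance = merge_bracketed([40, 128, 250], [0.25, 1.0, 4.0])
```

The weighted average is what lets the merged result exceed the latitude of any single frame: dark frames supply usable highlight data, bright frames supply usable shadow data.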
Tone mapping interpolates an HDRI into 8-bit space without clipping the high and low ends—it compresses the image nonlinearly to preserve the details otherwise lost. The result is a much richer image than could normally be captured by 8-bit equipment. At the moment, tone mapping isn't available in CinePaint or Krita (although it is on Krita's to-do list). Instead, pfstools, a command-line suite of algorithms for configuring the interpolation curves, does the job.
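As a toy illustration of that nonlinear compression (not one of pfstools' actual algorithms), Reinhard's global operator maps any luminance L to L/(1+L), squeezing an unbounded range smoothly into [0, 1) for 8-bit display:

```python
def reinhard_tonemap(luminances):
    """Map HDR luminance values to 8-bit levels without hard clipping:
    extreme highlights compress smoothly instead of saturating."""
    return [round(255 * L / (1.0 + L)) for L in luminances]

# Shadows keep their separation while a luminance three orders of
# magnitude brighter still fits under the 8-bit ceiling.
levels = reinhard_tonemap([0.05, 0.5, 1.0, 10.0, 1000.0])
```

The curve is steep near zero (preserving shadow detail) and flattens toward one (rolling off highlights), which is exactly the behavior a linear scale-and-clip conversion lacks.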
Fortunately, for those of us who don't like experimenting blindly in the terminal, a thoughtful soul has written a GUI that offers the full range of options available at the command line, with a preview window. The program, Qpfstools, along with an introduction and tutorial, can be found here: theplaceofdeadroads.blogspot.com/2006/07/qpfstmo-hdr-tone-mapping-gui-for-linux_04.html.
Dan Sawyer is the founder of ArtisticWhispers Productions (www.artisticwhispers.com), a small audio/video studio in the San Francisco Bay Area. He has been an enthusiastic advocate for free and open-source software since the late 1990s, when he founded the Blenderwars filmmaking community (www.blenderwars.com). Current projects include the independent SF feature Hunting Kestral and The Sophia Project, a fine-art photography book centering on strong women in myth.