I run a small multimedia studio and often do a number of jobs simultaneously, from sound engineer to producer. But, when the client calls go quiet, and no projects are pressing, I take time to indulge in my escapist passion: photography.
I keep Photoshop at the ready in case of emergencies, but I don't use it if I don't have to. Most of my machines run Linux, and I prefer it that way. I love the variety of CLI tools for batch processing, such as ImageMagick and PanoTools (though I doubt I'll ever master all their capabilities). I love the UNIX philosophy of creating larger applications by knitting together a suite of modular programs that each do one thing and do it well. I love that RAW image processing is simple and efficient with UFRaw, which saves to high bit-depth formats and so preserves one of the great advantages of using RAW for texture and HDRI work. Most of all, I love the pixel pushers—those end-user programs for editing raster graphics.
For years now, I've been using The GIMP for most of my postprocessing work. GIMP is frequently maligned for its un-Photoshop-like interface and its utilitarian approach to filters, but I've grown to love it for precisely those reasons.
During its 2.x release cycle, GIMP has outgrown a lot of its early awkwardness. It is now more memory-efficient, and its new features, such as improved font handling, keep it looking fresh and chipper. But, under the hood, it truly is becoming gimpy, because its core is hobbled by design.
The problem started as a political one. Once upon a time, Rhythm and Hues submitted a set of patches that gave GIMP high bit-depth color capability (a necessity for retouching movies). But the GIMP project, then still in its 1.x release series, didn't know what to do with them and rejected the patches out of hand. The patches were primitive, and the feature didn't seem important anyway. After all, Photoshop didn't support such images then either, and no one really needed it.
That decision proved remarkably short-sighted. There have since been a number of abortive attempts to replace GIMP's color engine with GEGL to handle higher bit depths, but so far GEGL has been vaporware.
In the intervening years, computing power comfortably chugged along the path of Moore's Law toward Kurzweil's Singularity, and some startling changes happened. Consumer equipment outgrew GIMP.
First, GIMP can handle only 8 bits of contrast per channel. That means 256 potential levels of brightness per channel and about 16.7 million possible colors. Although this looks wonderful on a computer screen compared to what we once saw, and although the sharpness and resolution of modern flat panels mean they often look better than old CRTs or televisions, the fact remains that a contrast ratio of 255:1 is small, particularly compared to the thousands of gray shades that film reproduces and the hundreds of thousands that our eyes perceive. To put it bluntly, even at its best, color in the digital world has pretty much always sucked.
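The arithmetic behind those figures is easy to check. A quick Python sketch of how brightness levels and total colors grow with per-channel bit depth:

```python
# Brightness levels per channel and total RGB colors at common bit depths.
for bits in (8, 10, 16):
    levels = 2 ** bits        # distinct values per channel
    colors = levels ** 3      # three channels: red, green, blue
    print(f"{bits}-bit: {levels:,} levels/channel, {colors:,} colors")
# 8-bit gives 256 levels and 16,777,216 colors; 16-bit gives
# 65,536 levels per channel.
```

Going from 8 to 16 bits multiplies the gradations per channel by 256, and those extra gradations are exactly where shadow detail lives.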
That is changing. High-def video formats have a wider contrast range, using a 10-bit floating-point rather than an 8-bit linear color format; one high-def format, HDV, is priced to sell to more spendy consumers in the form of camcorders from all the major manufacturers, starting at around $1,000.
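The float-versus-linear distinction matters as much as the raw bit count: a linear encoding spends its codes evenly across the range, while a float encoding scales its step size with the value, keeping dark tones proportionally precise. A rough illustration in Python, using IEEE 16-bit half floats as a stand-in (Python can't express the 10-bit video formats directly):

```python
import struct

def half(x):
    """Round-trip a value through IEEE 754 half precision (16-bit float)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# An 8-bit linear encoding has one fixed step size across the whole range.
linear_step = 1 / 255

# A float encoding's rounding error shrinks with the value, so a deep
# shadow tone like 0.01 is stored far more precisely than the fixed
# linear step allows.
for value in (0.9, 0.01):
    err = abs(half(value) - value)
    print(f"value {value}: float error {err:.1e} vs linear step {linear_step:.1e}")
```

Both values round-trip through the float with far less error than the 1/255 linear step, and the shadow value fares proportionally best.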
In the world of film and photography, high-quality film scanning has pushed the contrast resolution of a good drum scan higher still, into the realm of 16-bit float or 32-bit linear. Although that may seem irrelevant to most end users, the corollary is not. Nikon, Canon and most of the other major manufacturers have priced Digital SLRs below $800, and almost all of these cameras allow users to shoot in RAW format.
RAW formats are essentially raw sensor dumps: the color data from the CCD isn't interpolated, blended or processed the way you would normally expect. That work is left to users to do on their computers when they offload the images, which easily can run more than 10MB each. Most of the time, people shoot RAW and then process the images to ordinary JPEGs under the mistaken presumption that they're getting more bang for their buck, when in fact they're just creating more work for themselves. JPEG compression is, after all, JPEG compression. JPEG is a lossy, 8-bit format, period. JPEGs can look stunning, and most of the time they are perfectly adequate, even for some print jobs. But they do not preserve the advantages of shooting RAW, which are twofold:
- No lossy compression.
- Higher bit depths.
How much higher varies by camera, from 10-bit float to 16-bit linear. These higher depths are desirable for shooting light maps or textures for use in 3-D programs. The broader contrast range of these images means far subtler color reproduction, smoother exposure curves, more detail in the shadows and less blowout in the highlights than ever before. But, in order to use this extra detail, you have to preserve it, and in order to preserve it, you have to work with high-depth file formats.
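A toy sketch of why the detail can't be recovered once it's thrown away: quantize two nearby 16-bit shadow values through an 8-bit intermediate (real converters add dithering, but the lost codes still don't come back):

```python
def to_8bit(v16):
    """Quantize a 16-bit sample (0..65535) down to 8 bits (0..255)."""
    return v16 >> 8

def to_16bit(v8):
    """Re-expand an 8-bit sample to the 16-bit range (0 -> 0, 255 -> 65535)."""
    return v8 * 257

# Two shadow tones that a 16-bit file keeps distinct...
a, b = 1000, 1020
# ...collapse to the same 8-bit code, and re-expanding can't tell them apart:
print(to_8bit(a), to_8bit(b))    # both 3
print(to_16bit(to_8bit(a)))      # 771, for either input
```

Once both tones land on code 3, no later processing step can know whether the original was 1000, 1020 or anything else in that bucket: the editor has to stay high-depth end to end.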
So, GIMP won't do. Thankfully, there are some excellent alternatives.