I run a small multimedia studio and often fill a number of roles simultaneously, from sound engineer to producer. But when the client calls go quiet and no projects are pressing, I take time to indulge in my escapist passion: photography.
I keep Photoshop at the ready in case of emergencies, but I don't use it if I don't have to. Most of my machines run Linux, and I prefer it that way. I love the variety of CLI tools for batch processing in ImageMagick and PanoTools (though I doubt I'll ever master all their capabilities). I love the UNIX philosophy of creating larger applications by knitting together a suite of modular programs that each do one thing and do it well. I love that RAW image processing is simple and efficient with UFRaw, which saves to high bit-depth formats and so preserves one of the great advantages of using RAW for texture and HDRI work. Most of all, I love the pixel pushers: those end-user programs for editing raster graphics.
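As an illustration of that knit-together approach, here's a minimal sketch of the kind of batch job I mean: it shells out to ufraw-batch to develop a folder of RAW files into 16-bit TIFFs, then to ImageMagick's mogrify to downscale the results for proofing. The folder path and file extension are placeholders, and ufraw-batch's flag spellings vary a little between versions, so check them against ufraw-batch --help on your own install.

```python
#!/usr/bin/env python3
"""Sketch: develop RAW files to 16-bit TIFF, then batch-resize proofs.

Assumes ufraw-batch and ImageMagick's mogrify are on the PATH;
flag spellings may differ between versions of either tool.
"""
import subprocess
from pathlib import Path

raw_dir = Path("~/photos/incoming").expanduser()  # hypothetical folder

for raw in sorted(raw_dir.glob("*.nef")):  # Nikon RAW; adjust for your camera
    # Develop each RAW file to a 16-bit TIFF alongside the original.
    subprocess.run(
        ["ufraw-batch", "--out-type=tiff", "--out-depth=16",
         f"--output={raw.with_suffix('.tif')}", str(raw)],
        check=True,
    )

# Downscale all the developed TIFFs in one pass for web proofs.
tiffs = [str(p) for p in sorted(raw_dir.glob("*.tif"))]
if tiffs:
    subprocess.run(["mogrify", "-resize", "25%", *tiffs], check=True)
```

Each piece stays replaceable: swap ufraw-batch for dcraw, or mogrify for convert, and the rest of the pipeline doesn't care.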
For years now, I've been using The GIMP for most of my postprocessing work. GIMP is frequently maligned for its un-Photoshop-like interface and its utilitarian approach to filters, but I've grown to love it precisely for those reasons.
During its 2.x release cycle, GIMP has outgrown a lot of its early awkwardness. It is now more memory-efficient, and its new features, such as improved font handling, keep it looking fresh and chipper. But, under the hood, it truly is becoming gimpy, because its core is hobbled by design.
The problem started as a political one. Once upon a time, Rhythm and Hues submitted a set of patches to GIMP that gave it high color-depth capability (a necessity for retouching movies). But the GIMP project, still in the 1.x release series, didn't know what to do with them and rejected the patches out of hand. The patches were primitive and didn't seem important anyway; after all, Photoshop didn't support such images then either, and no one really needed them.
That decision proved remarkably short-sighted. There have since been a number of abortive attempts to replace GIMP's color engine with GEGL to handle higher bit depths, but so far GEGL has remained vaporware.
In the intervening years, computing power comfortably chugged along the path of Moore's Law toward Kurzweil's Singularity, and some startling changes happened. Consumer equipment outgrew GIMP.
First, GIMP can handle only 8 bits of color per channel. That works out to roughly 16.7 million possible colors but only 256 potential levels of brightness per channel. Although this looks wonderful on a computer screen compared to what we once saw, and although the sharpness and resolution of modern flat panels mean that they often look better than old CRTs or televisions, the fact remains that a contrast ratio of 255:1 is small, particularly compared to the thousands of gray shades that film reproduces and the hundreds of thousands that our eyes perceive. To put it bluntly, even at its best, color in the digital world has pretty much always sucked.
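To make the arithmetic concrete, here is a quick sketch in plain Python (nothing beyond the standard library) of how the number of representable levels grows with bit depth:

```python
# An n-bit linear channel represents 2**n distinct values, so its
# best-case contrast ratio is (2**n - 1):1.
for bits in (8, 10, 12, 16):
    levels = 2 ** bits
    print(f"{bits:>2}-bit linear: {levels:>6} levels per channel, "
          f"contrast up to {levels - 1}:1")
```

At 8 bits that's 256 levels; at 16 bits it's 65,536, which is why high-depth images hold up so much better under aggressive curve and exposure adjustments.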
That is changing. High-def video formats have a wider contrast ratio, using a 10-bit floating-point encoding rather than an 8-bit linear color format; one of the high-def formats, HDV, is priced to sell to free-spending consumers in the form of camcorders from all the major manufacturers, starting at around $1,000.
In the world of film and photography, high-quality film scanning has pushed the contrast resolution of a good drum scan higher still, into the realm of 16-bit float or 32-bit linear. Although that may seem irrelevant to most end users, the corollary is not: Nikon, Canon and most of the other major manufacturers have priced digital SLRs below $800, and almost all of these cameras let users shoot in RAW format.
RAW formats are CCD data dumps: the color data from the sensor isn't interpolated, blended or processed the way you would normally expect. That work is left to users to do on their computers when they offload the images, which easily can run more than 10MB each. Most of the time, people shoot RAW and then process the images straight down to ordinary JPEGs under the mistaken presumption that they're getting more bang for their buck, when in fact they're just creating more work for themselves. JPEG compression is, after all, JPEG compression: a lossy, 8-bit format, period. JPEGs can look stunning, and most of the time they are perfectly adequate, even for some print jobs. But they do not preserve the advantages of shooting RAW, which are twofold:
No lossy compression.
Higher bit depths.
How much higher the bit depth goes varies by camera, ranging from 10-bit float to 16-bit linear. These higher depths are desirable for shooting light maps or textures for use in 3-D programs. The broader contrast range of these images means far subtler color reproduction, smoother exposure curves, more detail in the shadows and less blowout in the highlights than ever before. But in order to use this extra detail, you have to preserve it, and in order to preserve it, you have to work with high-depth file formats.
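A quick way to confirm that a pipeline is actually preserving that depth is to ask ImageMagick what it sees. This sketch uses identify's %z escape, which reports the image's bit depth, to flag any file that has silently fallen back to 8 bits; it assumes single-frame files, and the directory argument is a placeholder:

```python
import subprocess
import sys
from pathlib import Path

def channel_depth(image: Path) -> int:
    """Ask ImageMagick's identify for the image's bit depth (the %z escape)."""
    result = subprocess.run(
        ["identify", "-format", "%z", str(image)],
        check=True, capture_output=True, text=True,
    )
    return int(result.stdout.strip())

# Usage: python check_depth.py /path/to/tiffs
for path in sorted(Path(sys.argv[1]).glob("*.tif")):
    depth = channel_depth(path)
    note = "" if depth > 8 else "  <- high-depth detail lost!"
    print(f"{path.name}: {depth}-bit{note}")
```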
So, GIMP won't do. Thankfully, there are some excellent alternatives.