Automated GIMP Processing of Web Site Images
The GIMP is a great interactive image editor, but it can also be used to automate the mundane tasks involved in creating Web sites. Commercial Web sites that are designed to sell products rather than deliver technical text content tend to rely heavily on images to make users stick around. This article is about using The GIMP to create the visuals for such image-laden sites. The sites targeted by this article generally do not use Flash and are expected to work cross-browser, which means they must work within Internet Explorer's limited support for alpha masks.
In this article, I ignore HTML and stylesheet specifics and concentrate on batch image processing with The GIMP. The two main image targets that minimally are expected to work are 8-bit GIFs with a single-bit alpha mask and JPEGs with no auxiliary alpha support at all. Unfortunately, image formats such as PNG, which offers a proper 8-bit alpha mask, cannot be used for cross-browser Web sites with nontechnical audiences. Many people still use whatever browser came by default with their operating system, and businesses tend not to want to ignore potential customers.
As an example, consider a site that wants to have a tiled background image, oval buttons and a side panel with a drop shadow, along with images that are not confined to sitting evenly inside the side panel. Part of the Web site is about showing images of various products, which should sit nicely shadowed and/or antialiased on top of the background or at an arbitrary pixel offset elsewhere on the site.
A single-bit alpha mask is not useful for displaying many types of images on top of a complex background image, because the edges of such images need to be blended explicitly to the color of the background in order not to look jagged. With a single-bit alpha mask, the background color needs to be nearly uniform around the entire perimeter of the image for the result to look acceptable.
As an example, I show here how easy it is to create compositions such as the one shown in Figure 1; the final image, without the layer perspective, appears in Figure 2. This example is of a cake image, placed half on the side panel and half on the background of the page, as though the user has haphazardly left the cake slightly aside for now. The layers shown could be stored in different xcf files for easy maintenance and composited properly with GIMP scripting into a caramel-cake.jpg, which can be included by the Web site.
Unfortunately, some distributions no longer package gimp-perl. It is available for GIMP 2.x from ftp.gimp.org, and once perl-PDL and perl-ExtUtils-Depends are installed along with GIMP, gimp-devel and the Perl bindings for GTK+ 2.x, the module itself can be installed using the classic CPAN trio shown in Listing 1.
Listing 1. Compile and install gimp-perl.
perl Makefile.PL && make
su -c 'make install'
Start The GIMP without a GUI, ready to process Perl commands with the following:
gimp -i -b '(extension-perl-server 0 0 0)' &
The scripts shown in this article tend to follow the pattern of taking a single image as input and generating a single output image, mimicking Linux command-line pipes. Unfortunately, the scripts can't be piped together directly; they rely on temporary image files to save each image modification. The input image is passed in using the -inputimage parameter, and the -outputimage parameter names where the image is saved for the next script to process.
Note that both images should be specified using full paths to ensure that The GIMP puts them where you expect.
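To make this pattern concrete, here is a minimal sketch of one such script written with Gimp::Fu (the gimp-perl scripting interface). The script name, the half-size scaling step and the parameter descriptions are illustrative assumptions rather than code from this article:

#!/usr/bin/perl
# halve-image.pl: one image in, one image out.
# A sketch of the single-input/single-output script pattern;
# the half-size scaling is just a placeholder operation.
use Gimp qw(:auto);
use Gimp::Fu;

register(
    "halve_image",                       # procedure name (assumed)
    "Scale an image to half size",       # blurb
    "Loads -inputimage, scales it, saves -outputimage",
    "Your Name", "Your Name", "2005",
    "<None>",                            # no menu entry; batch use only
    "",                                  # does not act on an open image
    [
      [PF_STRING, "inputimage",  "Full path of image to load", ""],
      [PF_STRING, "outputimage", "Full path to save result",   ""],
    ],
    sub {
        my ($input, $output) = @_;
        my $img = gimp_file_load($input, $input);
        gimp_image_scale($img, $img->width / 2, $img->height / 2);
        gimp_image_flatten($img);
        gimp_file_save($img, $img->get_active_drawable,
                       $output, $output);
        gimp_image_delete($img);
        return;
    });

exit main;

With the Perl server from above running, the script can be invoked directly:

perl halve-image.pl -inputimage /tmp/in.xcf -outputimage /tmp/out.png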
More complex translations can be streamlined into a single Perl script. Although using many smaller scripts is slower because of the extra image save/load cycles, the overhead matters less when batch-processing images. The upside is that once the little scripts are known about, they can be strung together quickly from both the command line and Makefiles. Common utility functions have been moved into the MonkeyIQGIMP module so that many little image-editing scripts can be created quickly.
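As an illustration of stringing the scripts together, a chain such as the following could be run from the command line; the compose-layers.pl and save-as-jpeg.pl names are hypothetical stand-ins for scripts like those developed in this article, and note the full paths for the images as advised above:

compose-layers.pl -inputimage $PWD/caramel-cake.xcf -outputimage /tmp/cake-flat.png && \
halve-image.pl -inputimage /tmp/cake-flat.png -outputimage /tmp/cake-small.png && \
save-as-jpeg.pl -inputimage /tmp/cake-small.png -outputimage $PWD/caramel-cake.jpg

The same commands drop straight into a Makefile rule, with caramel-cake.jpg as the target and the xcf file as its prerequisite, so the composite is rebuilt only when a source image changes.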