Linux Apprentice: Linux Tools for the Web
I run a small consulting business in Michigan, and one of the services I provide to my clients is web site creation. I had been using my Windows machine for this work, but was not satisfied with the tools I was using. They lacked flexibility, and I had occasional lockups and other problems. Since I used Linux for all my day-to-day tasks (browsing, e-mail, etc.), I searched for Linux tools to do my web design work.
I wanted tools for editing HTML, updating client web sites easily and keeping track of revisions. I tried to select programs that had GPL or BSD licenses or an equivalent. Also, I wanted tools with flexibility, preferring to use several instead of one monster tool that tried to do everything. Finally, I wanted tools that would compile and run on my SuSE 6.1 system without requiring me to run GNOME or KDE.
I found a number of tools, and have been using them for a month now on several client sites: some I imported from my Windows environment and some I created under Linux. The tools available under Linux turned out to be very capable and easy to use, and I was quickly productive with them. Right now, the only thing I cannot do under Linux is use my scanner, but I expect that to change shortly. [All you need is a scanner with Linux support and the program xvscan. —Editor]
I prefer to use non-WYSIWYG (what you see is what you get) editors for the bulk of my HTML work, so I have been using Bluefish and August. While neither of these editors has reached version 1.0 yet (Bluefish is at 0.3.5 and August at 0.52 beta), I have found that both run reliably and work well.
Both of these editors allow you to work on multiple files at the same time. They both provide buttons to insert basic HTML tags such as headings, lists and text attributes. Bluefish uses a tabbed menu, which allows more tag selections via pushbuttons (see Figure 1), while August has a single push-button menu and uses drop-down menus for selecting other functions (see Figure 2). August's approach makes for a cleaner interface and quicker selection of basic HTML tags. Both of these editors let you preview web pages using Netscape, and August also allows you to preview your page using Lynx.
Bluefish allows you to group your pages in project files. The next time you want to edit them, all you have to do is open the project from the Project menu. Bluefish also offers basic wizards for such things as creating new pages and tables.
August does not provide a project menu, but you can open all the HTML pages in a directory using a command-line switch when you start it. August also allows you to create templates and your own tag combinations accessible via the user menu. I found the user-defined tag feature to have a few problems.
Bluefish requires the GTK libraries to compile. You also need the imlib libraries for the “Insert Picture” functions to work. August is a Tcl/Tk program and therefore does not have to be compiled.
Despite being pre-1.0 versions, both of these editors have worked well for me. I prefer Bluefish so far, but not for any specific reason.
Keeping remote web sites up to date manually can be a chore. Fortunately, there are tools that help automate this process. The two tools I have been comparing are weex and sitecopy. weex is currently at version 2.3.0, and sitecopy is at version 0.9.5.
To use them, first create a .rc file that contains site information such as the remote server (ftp.here.com), your username, password, source (local) directory and the remote directory to update. When the programs are run, they compare the contents of the local directory to the remote directory, then try to make the remote directory match the local directory.
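As an illustration, a site definition in sitecopy's ~/.sitecopyrc file looks roughly like this (the site name, server, user name and paths here are invented examples; check the man page for your version's exact keywords):

```
site myclient
  server ftp.here.com
  username webuser
  password secret
  local /home/me/sites/myclient/
  remote /public_html/
```

weex keeps the equivalent information in its own configuration file with its own keywords, but the idea is the same: name a site once, then refer to it by name on the command line.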
weex offers very precise control over the files and directories that can be updated or deleted. For example, you can tell it to ignore specific directories on the local file system and other directories on the remote file system. You can provide it with a list of file names to ignore or use wild cards. When a file is ignored, it is not copied or deleted.
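weex's own ignore-list syntax is documented in its man page; as a rough sketch of how shell-style wildcard matching of file names works, here is the general idea in Python (the patterns are invented examples, not weex's actual code):

```python
from fnmatch import fnmatch

# Hypothetical ignore list: shell-style wildcard patterns.
IGNORE = ["*.bak", "notes.txt", "tmp/*"]

def ignored(path):
    """Return True if path matches any ignore pattern."""
    return any(fnmatch(path, pattern) for pattern in IGNORE)
```

A file for which `ignored()` returns True would be neither copied nor deleted.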
While sitecopy's file control is not as precise, it offers some features that weex does not. For example, sitecopy can determine whether a file needs updating by checking its timestamp or by calculating its MD5 checksum. The checksum is handy when a file's timestamp changes but its contents do not, for example, because you checked it out of a revision control system.
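The checksum technique can be sketched in a few lines of Python; this is only an illustration of the idea, not sitecopy's actual implementation:

```python
import hashlib

def md5sum(path):
    """Return the MD5 hex digest of a file's contents."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large files do not fill memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def needs_upload(path, stored_digest):
    """True if the file differs from the copy whose digest we stored."""
    return md5sum(path) != stored_digest
```

Because the decision depends only on the contents, touching a file or checking it out again does not trigger a needless upload.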
sitecopy can also notify you if someone has changed files on the remote site after you have uploaded them. When this “Safe Mode” is active, sitecopy stores the modification time from the server when you upload a file. The next time sitecopy is run, it compares that stored time to the current modification time on the server. If the times do not match, sitecopy displays a warning message and will not overwrite the remote copy, allowing you to see what changes have been made. I have not tried this function yet.
The console output of weex is more colorful and provides a bit more detail than that of sitecopy (see Figures 3 and 4). The weex command interface is also simpler: all you do is issue the weex command followed by the name of the site you wish to update. To run sitecopy, you issue the sitecopy command, followed by the action you wish to take (update, fetch, synchronize), followed by the site you wish to use. If you do not include an action command, sitecopy displays the files that are different between the local and remote directories.
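In practice, a session looks something like this (the site name is invented; see each tool's man page for the exact option spellings in your version):

```
weex myclient                # update the site named "myclient"
sitecopy --update myclient   # upload files that have changed
sitecopy --fetch myclient    # fetch the list of files on the server
sitecopy myclient            # no action: just list the differences
```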
I have found the best way to use these tools is to create the directory structure on the remote machine before running them. It is also best to make a complete backup of your remote directory before running either of these programs, so you do not accidentally erase important administration or cgi-bin files.
sitecopy provides a nodelete option that prevents it from deleting files on the remote file system, and weex offers a test option that does something similar. I recommend you use these options the first few times you run the programs.
While I prefer the simpler command line and more colorful console output of weex, it does not always keep track of the files it uploads to subdirectories on the remote site. Consequently, it uploads some files again and again, even though they have not changed. It also seems to have problems when the remote system is running Windows NT, but after a bit of tweaking, I have gotten it to work.
sitecopy runs into a problem if it tries to create a directory that already exists on the remote machine. If the remote directory exists, sitecopy quits with an error message. Running sitecopy in fetch mode before update mode takes care of this.
For copying files manually or creating the initial remote directory structure, I use WXftp. This program provides a graphical front end to the ftp command (see Figure 5). It also stores user information for sites, so you do not have to enter addresses, usernames and passwords manually (see Figure 6). You can also configure it to switch to specific local and remote directories for each site you maintain. In addition to copying files, you can create and remove directories, delete files and execute commands on the remote system. I have been using version 0.4.4 and had no problems with it.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
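The log-searching combination described above fits on a single command line (the search string "ERROR" is just an example):

```shell
# Find every .log file under /home and search each for "ERROR";
# -H prefixes each match with the name of the file containing it.
# grep exits nonzero when nothing matches, so ignore that here.
find /home -name '*.log' -exec grep -H 'ERROR' {} + || true
```

Using `-exec ... +` passes many file names to a single grep invocation instead of starting one grep per file.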
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high-availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here, just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.