In the January 2006 Get Your Game On column, Dee-Ann LeBlanc wrote: “The closest I got this time was pulling out my Classic Text Adventure Masterpieces CD with old Infocom games.”
You and many of your readers probably didn't know that you can already play most of these under Linux very smoothly. It turns out that the Infocom games were written for a virtual machine (sort of like Java); they come in two pieces, the actual game and the virtual machine interpreter. Several replacement virtual machine interpreters have been written that run on Linux and are under the GPL. One is called Frotz, and the current version is 2.43. It's available from www.cs.csubak.edu/~dgriffi/proj/frotz; it's also packaged for Debian and probably other distributions. There are many others.
You have to experiment a bit to find out where the actual game file is on the CD, since they've been put in different places for each game. For Zork I, it's in pc/zork1/data/zork1.dat, so run frotz pc/zork1/data/zork1.dat in a console and you're off. Some of the game files have a .zip extension rather than a .dat extension (that's what Infocom called its virtual machine format, back in the days before PKZIP).
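A quick way to do that experimenting is to let find do the hunting. This is a sketch; /media/cdrom is an assumed mount point and will vary by system:

```shell
# The CD mount point varies by system; adjust to taste.
CD=/media/cdrom

# The story file sits in a different directory for each game, so list
# the candidates first (Infocom used both .dat and .zip extensions):
find "$CD" -iname '*.dat' -o -iname '*.zip' 2>/dev/null

# Then hand the one you want to frotz, e.g. Zork I:
frotz "$CD/pc/zork1/data/zork1.dat"
```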
Frotz doesn't behave properly with the games that have graphics, like Zork Zero; support for those is in a derivative GPL program that runs only on Windows, called WindowsFrotz2002, which will hopefully be ported back to the original someday. In the meantime, WindowsFrotz2002 runs under Wine, but it requires pictures translated into the Blorb format, which you'll have to poke around the Net for.
Dee-Ann LeBlanc replies: Thank you very much! I think I might have known this at some point along the way many moons ago and then forgot. When I get home from being on the road, I just might pull out that Infocom CD again and cover this.
I have been a very delighted Linux Journal subscriber for several years. For the last year or so, though, I felt I had several sources of information besides LJ, so I decided to let my subscription expire and just receive the remaining copies until they stopped. Then along came Mr Lerner's article about Ruby on Rails, which was very informative and inspiring [LJ, October 2005]. It changed my mind about not renewing, and I just subscribed for another two years.
Even though I've been a (very happy) FreeBSD user for just about as many years as I've subscribed to LJ, many articles in LJ are just as relevant to FreeBSD and other open-source OSes, which I believe is the true force of open source.
Open source is the ability to borrow and grow ideas and principles from one another. Many articles are well written, but I do still wish that some of them weren't so Linux-centric and would take a broader view.
Remember that in the quest to grab a larger share of the server and desktop segments dominated by one large company, Linux advocates should be careful not to use the same rhetoric as that vendor does. One idea you could try is a BSD corner (or one for other open-source OSes, for that matter) once in a while, so Linux users are exposed to other approaches.
Regarding the frivolous little bit “Might Be Just Right” [LJ, December 2005, page 94]: does the word lagom apply to Linux? I think it might, but maybe another word that applies to Linux is Gaia, in that Linux consists of many organisms working together as a single organism. Linus Torvalds did not create an operating system; he created a kernel that was made into a successful operating system through the cooperation of an array of GNU utilities, which in turn inspired many computer geeks around the world to use the fledgling and somewhat useful UNIX-like system, fix and enhance the system and its utilities, adapt other software to work with Linux, write new software from scratch and so on. You get the picture. There is no real master organism, but there are many organisms spontaneously working together toward the one common good.
Another definition: Linux is an example of a near-perfect anarchy for all the same reasons that make it a Gaia—all work together for the advancement of their common good without a government or owner directing their activities. Alexander Berkman would be proud.
Are those black helicopters I hear coming this way?
I'd encourage Maestro Taylor to make a point of reminding the readers of his Work the Shell column that his work environment is specific to the Bourne shell, and some users, for any number of legitimate reasons, may be confronted with a C-shell environment. As a nicety to these readers, maybe a quick side-track into how to get their own Bourne shell environment? [See Dave Taylor's Work the Shell column, beginning in the December 2005 issue of LJ.]
Thank you all for your hard work!
Michael C. Tiernan
Dave Taylor replies: I knew we couldn't go too long without someone bringing up the great religious war of the shell-scripting world: which shell to use. I tried to highlight in my first column that I would be writing for the Bourne Again Shell (though almost everything will work with any modern Bourne shell too, especially a POSIX-compliant one), but just to clarify, I don't think it really matters which shell you opt to use. With just a few relatively minor syntactic changes, the basic concepts of scripting, and of how you use Linux commands to accomplish extraordinarily difficult programming tasks in just 5-10 lines, remain the same.
Also, you can certainly have the C Shell or one of its variants as your command-line interpreter / login shell and still use Bash as your scripting environment of choice.
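For readers curious what those minor syntactic differences look like, here is a minimal side-by-side sketch (the csh lines are shown as comments, so the snippet itself runs under sh or bash):

```shell
# Bourne-family syntax (sh/bash), as used in the column:
greeting="Hello"                  # assignment: no spaces around the =
echo "$greeting from sh"          # prints: Hello from sh

# The csh/tcsh equivalents, for comparison:
#   set greeting = "Hello"
#   echo "$greeting from csh"
```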
To find out what shell you're currently running, simply type the following command: ps -p $$. It'll say sh, csh, bash, tcsh or similar. The syntax used throughout the shell-scripting column is for sh or bash.
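Run interactively, that check looks something like this (the PID and TTY values below are illustrative):

```shell
# Ask ps to report on the current shell's process ID ($$):
ps -p $$
# Sample output; the CMD column names the shell:
#   PID TTY          TIME CMD
#  4242 pts/0    00:00:00 bash
```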
Matthew Hoskins' article “UNIX: Old School” in the December 2005 issue was a gem. I found it intriguing that a vintage 1974 release of UNIX could be booted under a PDP-11 simulator and experienced firsthand.
Just had to drop a quick note to say excellent issue. I'm thrilled with the new column, Work the Shell. This was by far my favorite issue in the past few years. I eagerly await the next issue to see if it can hold up!
On Patents
Roger Wolff's response to Don Marti [see Letters in the December 2005 issue of LJ] displays a weak understanding of patents. He uses the descriptions of the patents to decide what they cover. That doesn't work. To understand what the patent covers you must read the claims section. Everything else is little more than window dressing. Unfortunately, the claims section is written in lawyer-ese. Reading the rest of the patent can be helpful for setting context for the claims. But that is all that it is.
For his example “method for coding an audio signal”, the claims could be very narrow or could be broad. Much depends on how careful the patent examiner was. But you simply cannot tell from the description.
Disclaimer: I am not a lawyer. But I do have some patents.
I found it interesting to read the “Archives, Patents” letter by Kari Laine, and the answer from LJ [November 2005 issue, page 10]. The answer totally avoids the issues involved when you form a symbiosis with something you're trying to fight. Kari's letter gives the impression that patents are evil, and yet Kari suggests a model where part of the Open Source community relies on patents to fill the bank account. How can you wholeheartedly fight something that puts food on your table?
Now, I'm not saying that the Patent Commons Project is bad because it forms a symbiosis with the patent system—open-source purists will take that discussion if needed.
Maybe LJ can clarify the implications of forming a symbiosis with what you're trying to fight, by having Doc Searls write about it, all the way from putting hooks for binary-only modules into the kernel (pwc/pwcx), through emulation of “the deselected OS” (Wine), to making money on patents when you actually don't like them?
Martin A. Bogelund
I have a few comments to share about the article “Bringing Usability to Open Source” in the January 2006 issue of LJ.
First off, I must thank Nat and Linux Journal for printing the article. I have to stress that usability should be one of the number-one goals of all software creators, open source or otherwise. There is no substitute for observing users, because as Nat points out, they often have a different mindset from that of the developer. Furthermore, developers often consider a feature finished when it “works”, without realizing how normal users flow through the processes they use on a daily basis. More times than I can count, I have sat down at my software with users and realized immediately that I missed the usability boat entirely, as I watched them go through several painstaking and frustrating steps I had never envisioned to accomplish their task. In most of these cases, it wasn't a case of “bad user”; it was a case of “bad software”, which is hard for a guy like me to admit.
However, I think one has to be careful when making software more “usable” not to be equally blinded by the actions of a few users.
As an exercise, imagine you took a young child out to your vehicle, set up a camera, and asked the child to “take me to the store”. While this is a very simple task, you might imagine your young friend would grasp the wheel, step on a couple of pedals, turn the key (with the clutch out), smash into the garage door (oops, that wasn't right), and then give up and admit that he or she doesn't know how to do it.
To my mind, this in no way implies that the car is “broken” from a usability perspective. A well-meaning engineer might take this feedback to indicate that the vehicle should be altered in some way to ensure that this novice would succeed next time. The problem is that any individual with driving experience would get into this “fixed” vehicle and become instantly annoyed at having to step through some new sequence of steps, most likely taking more time, and ultimately find it much less usable.
Regarding Nat's specific example of the New button being wrong, I would have to disagree respectfully. First of all, the New button is used quite consistently in many software packages to begin a new action. New may be too terse, but conceptually it's correct. In my opinion, Send would be incorrect, because the option doesn't send anything; it only creates a new message. If you want to send the message, you need to press Send. This is my interpretation, of course, which doesn't imply it is correct.
In this case, I might recommend that the button say New Email (for English speakers). This is clearer yet remains true to the action it represents. Perhaps this user would not have stumbled with this minor change in place.
The point is not to disagree with the article or Nat's intentions. The point is to ensure that software engineers and developers understand the need to take usability very seriously and look at a problem from several angles before making a decision. Because writing bad software is much easier than writing good software, and there are more wrong ways than right ways, selecting the best way is a huge challenge.
There are a lot of factors that go into usable software. I have always believed that some key elements are:

1) Consistency, both internally and with other applications.

2) Clarity: language should fit the action precisely and be backed up by verbose tooltips that users can rely on for clarification. (Tooltips should always be present, as a matter of consistency.)

3) Efficiency: displays should present all relevant information, organized comprehensibly. Help should be available and abundant, and should never restate the question (help like “The Sort button will sort your results” is not really very helpful). The number of keystrokes and mouse clicks needed to accomplish tasks should be minimized, repetitive tasks should have shortcuts, and process flow should always be taken into consideration. Things like allowing multi-select rather than single select can mean the difference between an hour of work and five seconds of work. Always automate where appropriate, but give options. Never lock the user into a process, but ensure that the most basic processes are well designed, flow well and have a minimum of “speed bumps”. Allow the user to turn off or skip features designed to assist a novice. You can try to anticipate what a user wants, but don't insist.
I can easily give examples of “bad” usability that has been coded into software. Just for fun, consider some features that I have experienced while running Windows. Every time I put in a new CD the OS tries to “guess” what I want to do with it, wasting my time when I could have already been doing it. And I really love it when Windows tries to “help” me clean up the icons on my desktop. Yeah, I really want to delete those links, thanks for asking me (every day). This does not improve my experience, and it's frustrating how difficult it is to disable these behaviours, even for a relative expert without a lot of time to go hunting down the method.
And yes, I could give examples in Linux as well.
I'm sure I could write on the subject for days. Though I don't consider myself an expert at interface design, I am fairly passionate about the need for usability in software, and I've made my share of mistakes. I've also been subjected to my share of bad software as I go about my own daily tasks, as I'm sure most readers have.
It is my hope that the article and the work of the Novell Usability Labs will help spur more work on the subject. Success of the Linux desktop has more to do with mindshare than anything else at this point. The platform is ready.

We are close, folks! Let's keep going!
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
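That log-searching example can be composed in a single line. The sketch below builds a throwaway directory so it is self-contained; with real data you would point find at /home and substitute your own search string:

```shell
# Build a tiny stand-in for /home:
dir=$(mktemp -d)
echo "ERROR: disk full" > "$dir/app.log"
echo "all quiet"        > "$dir/web.log"

# Compose the tools: find locates the .log files, grep searches each one,
# and -l lists only the files that contain the entry:
find "$dir" -name '*.log' -exec grep -l 'ERROR' {} +

rm -rf "$dir"
```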
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, briefly describing how to know when it might be time to upgrade your job-scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
- SUSE LLC's SUSE Manager
- Managing Linux Using Puppet
- Tech Tip: Really Simple HTTP Server with Python
- Returning Values from Bash Functions
- Murat Yener and Onur Dundar's Expert Android Studio (Wrox)
- My +1 Sword of Productivity
- Non-Linux FOSS: Caffeine!
- Rogue Wave Software's Zend Server
- Doing for User Space What We Did for Kernel Space
- Parsing an RSS News Feed with a Bash Script
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.