- Bringing Usability to Open Source—And Vice Versa
- They Said It
- HLA Adventure:
- Nigel McFarlane
- Ten Years Ago in LJ: October 1995
- On the Web
- Might Be Just Right
- diff -u: What's New in Kernel Development
Bringing Usability to Open Source—And Vice Versa
“Usability” is one of the big raps on the reputation of open-source code. Fixing that reputation at the source(s) is the idea behind a series of FLOSS (Free/Libre Open Source Software) Usability Sprints. The first was held in San Francisco in February 2005. By the time you read this, the second will have happened in July 2005. Others will follow.
At Digital ID World in May 2005, I met with Eugene Eric Kim of Blue Oxen and interviewed him briefly about what the whole thing was about.
DS: What were your motivations for the first Sprint?
EK: The motivation for the event was multilayered. We wanted to figure out a way to make open-source software more usable. Both Blue Oxen Associates and Aspiration believe strongly in open source and use it ourselves, so we had a strong stake in the outcome. Both of us also care strongly about nonprofit organizations and believe that poor usability is one of the impediments to wider adoption of open source among nonprofits.
DS: Why a “sprint”?
EK: A sprint is a hands-on and collaborative clinic where everybody works hard for three days to get real stuff done and learn new things in the process.
There's a knowledge gap between the usability and the open-source communities, but the problem is actually that the two communities weren't previously motivated to learn about each other and to close that gap. It's a bit ironic, because these two communities share a lot of values: iterative improvement, user-centrism, a desire to improve things. They should have been collaborating with each other, but they weren't. Our goal was to build shared understanding between these two communities and then to see what happened.
DS: How did you select participants?
EK: We started with the largest and most diverse possible group: seven open-source projects (Activist Mobilization Platform, Chandler, CivicSpace, FotoNotes, Identity Commons, OpenACS and WordPress), 12 usability specialists (from OSAF, Sun, Adobe, Adaptive Path, openusability.org, Stanford and Oracle) and a mix of developers and project managers—for a total of 40 people.
Our stated goal was straightforward and concrete—for each group to improve the usability of some aspect of their projects in three days. On the second day, we brought in about 15 users to help the process.
Our unstated goal was to catalyze ongoing collaboration between these two communities. The event was highly interactive and was expertly facilitated by the good folks at Aspiration (who also do Penguin Day and other excellent events). We emphasized concrete outcomes, reporting out and self-documenting on our Wiki.
DS: What did you get out of it?
EK: We accomplished both of our goals and then some. Each project improved in concrete ways. Everyone learned something. Open-source developers learned practical usability techniques, but more importantly, they were motivated to learn more. This wasn't training. It was learning by doing. The difference is critical. A follow-up bootcamp will be much more effective if the developers are motivated and if they have a context from which to understand these techniques.
DS: What did the two constituencies learn from each other?
EK: The usability folks learned a lot about open-source development. There were many misconceptions, both ways. Working closely together in such an intense, high-energy, accelerated and, most importantly, fun environment built shared understanding. Curiously, the usability folks hungered for an opportunity like this. For one thing, most usability practitioners never collaborate with each other. They have good professional groups, such as CHI and UXnet, but they rarely work together in practice. A lot of folks approached me afterward and said how much they enjoyed working with other usability people. These usability folks also hungered to do some good and make an impact. One problem with usability in corporate environments is that you can't talk about your work. Maybe you can point to a final product, but you can't point to the analysis and methodologies that got you there. You also can't show how crappy the project was before you came along.
So, a huge benefit for usability practitioners to work with open-source projects is simply the ability to talk about it afterward.
Both sides learned that usability and open source have an inherent tension. The usability mantra is, “You are not your own user.” The open-source mantra is, “Scratch your own itch.” On the surface, these two philosophies seem contradictory. But, if you dig deeper into both of these, you'll find that it's not that simple. We gained confidence that collaboration will help resolve these tensions.
DS: Did any new projects come out of the event?
EK: Two interesting and unexpected concepts and projects emerged from the event. The first was the notion of “extreme usability”. See, usability is like security; it's not something you can just tack on at the end. The best route to usable software is to incorporate it into the process from the very beginning. As all of us explored this notion further, we converged on this analog to pair programming, where you partner a usability person with a developer. This resonated strongly with a lot of folks.
The project I'm most excited about is called OpenWebGUI (openwebgui.sourceforge.net), because it shows the kind of unique and powerful collaboration that open source enables. We had three CMSes there, and throughout the weekend, they realized that they were faced with similar usability problems. For example, all of them wanted more usable administrative systems. Even though they are all different systems, the functionality and work flow are essentially the same. These projects realized—independently of any of us—that because they were faced with the same problem, they were better off working together than in silos. They started a project called OpenWebGUI, which will focus initially on CMS administration GUIs. They will do an analysis, develop some HTML prototypes and test those. The tested and usable HTML files will be released under an open-source license, so that any CMS can incorporate and customize them as it sees fit. I love this outcome because it vividly illustrates the exponential returns that open source enables. We spent a bit of money to improve these projects, and yet the outcome will very likely improve the entire ecosystem of related projects. Now that's ROI!
DS: How else do you expect this to play out in the long run?
EK: Many folks involved with CHI are interested in being more involved with open source, and we are actively working to facilitate this. Jan Muehlig, a usability specialist who flew in from Germany to participate, has an outstanding project called OpenUsability.org, where he's trying to connect usability folks with open-source projects. Jan did some excellent work with KDE, and we're actively trying to promote his stuff. And, we've got a long list of projects and people who want to participate in the next sprint, at the end of July. We'll incorporate what we learned from this past sprint to improve the next one, and we'll also experiment with some new ideas. And we'll continue to engage with the community as a whole. These events are meant to perturb the ecosystem and catalyze the community. We encourage everybody to participate.
To find out how, visit www.flossusability.org.
They Said It
At Parrs Wood, OSS is seen not merely as a way of saving money, but rather as a way of spending it more effectively.
—BBC News: news.bbc.co.uk/1/hi/education/4642461.stm
When I switched from Windows to GNU/Linux (Redhat/Fedora/Debian mostly) about 5 years ago, I found a vast developer's playground. It was like the old days of Compuserve, which was a candy aisle of freeware. Free software is still like that for me; there's lots of it to explore, and I can see the source code without significant restriction. I can use the source, and I can share the source...which is something geeks love to do. The Windows world by the mid-1990s was very closed (still is mostly), something that's really restrictive as a developer.
—Anonymous, on IT Garage: www.itgarage.com/?q=node/617#comment
You have reached the pinnacle of success as soon as you become uninterested in money, compliments or publicity.
—Thomas Wolfe, The Sun, July 2005
There is no such thing as a “personal” blog if you are employed.
HLA Adventure
When Zork appeared on the scene in the late 1970s, computer enthusiasts from around the world were instantly hooked on the interactive fiction genre known fondly as the Text Adventure game.
HLA Adventure is the latest in a long line of public domain and free software text adventures released by people all over the world. It combines elements from MUDs, Advanced Dungeons & Dragons and J.R.R. Tolkien's famous The Lord of the Rings.
Using verbs and nouns to communicate with the game world, the player moves about HLA Adventure with but a simple goal in mind: slay the menacing dragon at the end of a large expanse of caves. While solving this main quest, the player is also presented with nine other unique quests, which allow the player to find items and equip weapons, armor and a brightly lit lantern. Even a magical flute plays a role—useful in putting magical beasts to sleep.
Players will encounter hellhounds, werewolves, vampires, hobbits, ghosts, barbarians and demigorgons. Talk to creatures in the game with the TALK TO command. Once you have acquired the necessary armament and passed the requisite number of quests, you can then enter into the cave and slay the dragon for good.
Despite some bugs, HLA Adventure is a solid open-source adventure game. It was written in Randall Hyde's High-Level Assembly (HLA) programming language.
Nigel McFarlane
With the sudden death of Nigel McFarlane, the Web Development and Open Source Software communities, both in Australia and around the world, have lost one of their most well-known authors, consultants and pundits.
Although in many ways a very private person, Nigel had a professional and personal network that spanned the globe and included such on-line luminaries as Ben Goodger, lead engineer for the open-source browser Firefox, and countless others in the Open Source, Web Development and Linux communities. Since his passing, many community sites, in a number of languages, have expressed their sorrow, a testament to Nigel's influence.
A real Melbourne boy, describing the city proudly as “the World's most liveable”, Nigel had science degrees from both the University of Melbourne and La Trobe University. Even when speaking in Sydney, he was always keen to get home as soon as possible, where he would bushwalk and ramble, swim and surf.
Nigel's writing extended to the columns “Searching for Substance” for InformIT, and articles for such publications as Linux Journal, DevX, Builder.com, CNet, The Age and the Sydney Morning Herald. Nigel was an entertaining speaker as well as a writer. I particularly recall chairing a conference session that Nigel presented late last year. Often conference-goers are anxious to get early places in the meal queue, but although we had gone overtime for lunch, Nigel captivated the room. When offered the opportunity to break, the entire room turned it down, glued as they were to Nigel's presentation.
Generous with his time, energies and knowledge, Nigel contributed to mailing lists, newsgroups and forums, as well as speaking to audiences large and small at conferences and for user groups. His reach went far beyond Australia, as tributes in recent days from developers and members of the Open Source and Web Development communities around the world testify.
Nigel's passing is a sad loss for these industries still in their infancy.
Ten Years Ago in LJ: October 1995
The October 1995 issue covered “Text Processing”, and feature articles introduced groff, LaTeX and Linuxdoc-SGML, which was an early document format at the Linux Documentation Project. All three document formats are still in use today.
Making the transition to 64 bits is IT news today, but it was a hot topic for us ten years ago. Jon “maddog” Hall, then still at Digital, covered Linux on Alpha and its advantages for computer science education:
Over time, this meant that to get all the sources to our Unix products, 15 separate licenses were necessary, at a cost of thousands of dollars, and even then the sources were restricted to a “need to know” basis and were not for consumption by curious students.
Publisher Phil Hughes, in a “Stop the Presses” item, pointed out that Microsoft Windows 95 overwrites a PC's Master Boot Record—the first Windows version to do so and a FAQ for dual-booters ever since. More help came in the form of an ad for the “System Commander” boot manager, which offered an easy solution for multi-OS systems, with the bonus feature of fixing boot sector virus infections.
On the Web
To go along with this month's theme of Personal Desktop, here are some articles from the Linux Journal Web site that will help you find your way through OpenOffice.org, try out some Linux audio software and rescue data from a hosed USB device:
Do you want to move to OpenOffice.org but aren't sure what to expect? Are you trying to convince friends and/or family members to give OOo a try, but they want to know about the learning curve? If so, Bruce Byfield's article “OOo Off the Wall: What New Users Need to Know About OpenOffice.org” (www.linuxjournal.com/article/8443) is suggested reading. Bruce sheds some light on OOo's “interface shortcomings” and “the limits of its on-line help”, as well as the “logic of its interface design and the importance of styles and templates in an efficient work flow”.
Audio for Linux has come a long way in the past couple of years, and Dave Phillips continues his tour of what's new for musicians and engineers, whether full-time or part-time. In recent months, he's introduced us to FreeWheeling, “a powerful loop-based performance tool” (www.linuxjournal.com/article/8445), as well as QSynth and QJackCtl, GUI front ends that “make Linux audio tasks easier and faster, letting you get straight to the music” (www.linuxjournal.com/article/8354).
Finally, Collin Park shares his story of “How a Corrupted USB Drive Was Saved by GNU/Linux” (www.linuxjournal.com/article/8366), offering hope to those of us who have lost important data and will lose it again.
Might Be Just Right
At LinuxWorld in Boston earlier this year, I got together with an old Swedish friend. She's a nurse, not a technologist, but she was curious about my work and the conference that brought me to town. Somewhere in the midst of my explanation of Linux and its virtues, she said, “Ah, Linux is lagom”. She explained that lagom is a Swedish term that conveys a sense of balance, proportion and appropriateness. “Not too much, not too little...just right.”
When I told her that Linus Torvalds' first language and surname were both Swedish, she said, “Well of course. There you go.” (I'm half-Swedish myself, though I'm not sure that matters.)
So I put the question “Is Linux logom?” to The Man Himself in an e-mail. He debugged my spelling and declined to commit:
Lagom, with an “a”.
And yes, it means “just right”, in the sense of “not too much, not too little”. See en.wikipedia.org/wiki/Lagom.
Then he added, in a following e-mail:
They still end up confusing “lagom” with finding the “optimal” amount. That's pretty much missing the point. It's not that something is “lagom” because it's the best possible or “optimal”. Quite the reverse. Something being “lagom” very much involves not caring too much about what the optimal amount even is. Or possibly questions where “optimal” simply doesn't make sense.
So I began checking other sources. The best I found was from “In Other Words”, published on AskOxford by the Oxford English Dictionary (www.askoxford.com/worldofwords/wordfrom/otherwords). It lists lagom among a handful of “the most insightful, intriguing, and satisfying expressions on the planet—for which there are no English equivalents”. It says:
Swedish commentator Dr Bengt Gustavsson argued that the lagom mentality can be seen as the trait that gives Swedish society its characteristic stability and yet an openness to external influences. The word alludes subconsciously to the avoidance of both conspicuous success and humiliating failure, which is deeply ingrained in the Swedish psyche. It is the inclination among Swedes to shun ostentation, accept modest rewards, be good team players—to fly beneath the radar.
Beneath the Radar was also the title of Bob Young's book about starting and guiding Red Hat to success. Coincidence?
Perhaps characteristically, Linus adds these final words to the matter: “but whether that applies to Linux I have no idea.”
diff -u: What's New in Kernel Development
After a long and difficult life, DevFS is finally being removed from the Linux kernel. Created by Richard Gooch, DevFS has been around for years, and it represented a serious attempt to cure the runaway /dev directory. Developing DevFS was an uphill battle against many detractors, but Richard did succeed in creating a very useful tool. In the end, however, critics of DevFS won out, citing “unfixable races” and other problems, and Richard vanished from kernel development completely. Greg Kroah-Hartman and others then developed udev as a replacement for DevFS. Some lingering sense of the 2.6 kernel as a stable tree has made this decision slightly controversial even now, but almost certainly it's not enough to influence the outcome. Farewell DevFS—it was a valiant effort.
Recently, various folks have reported compilation problems when trying to compile the 2.4 kernel with GCC version 4, and some developers have posted patches to address these issues; however, Marcelo Tosatti has stated that it is simply too late in the day for these sorts of patches to make it into the 2.4 tree. Unlike the 2.6 maintainers, the maintainers of the 2.4, 2.2 and 2.0 trees have not abandoned the idea that their trees must aim for stability. Marcelo has been trying to rein in 2.4 development ever since the first 2.6 kernel came out, but he has still allowed large IDE changes, new hardware support and other patches whose invasiveness would typically fly in the face of a push for stability. And with 2.6 development showing no sign of slowing down, Marcelo has been under constant pressure to incorporate new features into 2.4, to be available to folks who need 2.4's stability. With the advent of the w.x.y.z tree, however, some of this pressure has undoubtedly flagged, and Marcelo has been able to tighten the restrictions on what can and cannot get into 2.4 at this late date.
The git versioning system continues to grow and strengthen. Andrew Morton's -mm tree will be available as a git repository, although Andrew himself has no plans to use any versioning tool for actual development. The ALSA Project has migrated development to git, as has libata. Marcelo Tosatti's 2.4 tree also will use git for ongoing development. Linus Torvalds is still very strongly involved with the project, and although mailing-list traffic has tapered off somewhat from its frantic early weeks, much of this is explained by the fact that folks now understand the basics of the tool, and the fundamental concepts no longer need to be explained to newcomers.
In the midst of all the version-control upheaval, it's hard to know for certain if the new w.x.y.z stable kernels are working out. But several kernel folks, including Jeff Garzik and Alan Cox, feel that this tree successfully provides a stable kernel to supplement the 2.6 tree's ongoing large-scale development. Greg Kroah-Hartman and Chris Wright, the primary maintainers of the w.x.y.z tree, do seem to be doing a rigorous job, not only collecting and applying patches, but adhering to Linus Torvalds' strict guidelines on what patches may be applied, and how and when they may be accepted. A number of aspects make this project less appealing than doing real development work, but Chris and Greg seem to be bearing up nicely, and the rest of us are the beneficiaries.
Martin J. Bligh has put together a set of automatic testing scripts that compile and boot all official kernel releases (including the w.x.y.z kernels) and several prominent branches like the -mm tree, within 15 minutes of their release. If a kernel boots successfully, Martin's scripts hit it with a variety of benchmarks. Compilation and boot results are recorded, benchmark results are graphed and everything is made available as a set of ongoing kernel.org Web pages. This is the sort of project that will not solve all bugs, but it will identify many trivial bugs, track performance problems across multiple kernel releases and may identify hard-to-find bugs that regular users would not normally see.
The relatively recent introduction of Signed-Off-By tags in kernel patch submissions has made a huge difference in providing a trail of authorship, so that if anything like the SCO lawsuit occurs again, it will be easy to prove who wrote any disputed source code. This was, in fact, Linus Torvalds' stated purpose in introducing the Signed-Off-By header. When first introduced, the idea was quite amorphous, with few details settled. Since then, various wrinkles have been introduced to improve its usefulness. One of the most recent of these is the addition of a From header as the first line of the body of patch e-mails. This header identifies the true author of a given patch. Before this wrinkle, the true author was assumed to be the person with the bottom-most Signed-Off-By header. This, however, became confusing and was not always adhered to. The From header is intended to leave no doubt as to the original authorship of a given patch.
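As an illustration of the convention described above, a patch e-mail combining the two headers might look like the following sketch. The names, addresses and patch subject here are hypothetical; the point is the layout: a From line as the first line of the body naming the true author, and the Signed-Off-By chain at the end recording everyone who handled the patch.

```
From: Jane Hacker <jane@example.org>

[PATCH] fix example driver initialization

Description of what the patch changes and why it is needed.

Signed-off-by: Jane Hacker <jane@example.org>
Signed-off-by: Maintainer Name <maintainer@example.org>
```

With this layout, the leading From line identifies the original author even when the patch is forwarded by a maintainer, whose own Signed-off-by then simply appears last in the chain.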
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
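The find-plus-grep combination described above can be strung together in a single line. The sketch below builds a small throwaway directory tree first (the paths are made up for the example) so it can be run safely anywhere; against a real system you would point find at /home itself:

```shell
# Build a tiny sandbox to search (hypothetical paths, safe to delete).
mkdir -p /tmp/logdemo/home/alice /tmp/logdemo/home/bob
echo "ERROR: disk full"   > /tmp/logdemo/home/alice/app.log
echo "everything is fine" > /tmp/logdemo/home/bob/app.log
echo "ERROR: also here"   > /tmp/logdemo/home/bob/notes.txt  # not a .log file

# Find every .log file under the tree and list those containing "ERROR".
find /tmp/logdemo/home -name '*.log' -exec grep -l 'ERROR' {} +
```

Only alice's app.log is printed: notes.txt contains the search string but fails the *.log name filter, and bob's app.log matches the name but not the pattern. That is the erector-set idea in miniature, with each tool contributing one narrow test.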
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job-scheduling infrastructure. The second part presents an actual planning and implementation framework.
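For reference, the kind of scheduling being weighed here lives in a crontab: five time fields (minute, hour, day of month, month, day of week) followed by a command, one job per line. The script paths in this sketch are hypothetical:

```
# min  hour  dom  month  dow  command
0      2     *    *      *    /usr/local/bin/nightly-backup.sh
*/5    *     *    *      *    /usr/local/bin/check-queue.sh
```

The first entry runs a backup at 2:00am every day; the second runs a queue check every five minutes. Each line is an independent job with no built-in notion of dependencies between jobs, retries or centralized monitoring, which is the sort of gap the "is it enough?" question is really about.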
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high-availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.