Letters to the Editor
Thank you, and thanks to Mick Bauer, for the articles on rsync
[LJ, March and April 2003]. rsync's ability to
“pull” as well as “push” solved a long-standing
problem of how to synchronize through a firewall without too much hassle.
rsync is also more intuitive than rdist.
Pascal and C arrived on the scene about the same time.
As far as it went, Pascal was a better idea than C, but
Pascal did not go far enough, and C won. Pascal inspired Ada,
which was awful. C grew into C++, even more half-baked than
C. It took Niklaus Wirth two decades to get Pascal fully grown
into a finished product of the quality of the original idea.
The result is Oberon-2. Unfortunately, it arrived after the
train had left the station. For more see www.waltzballs.org/other/prog.html.
I want to set up something to regulate my wireless
network so the person has to log on to the domain.
NoCatAuth, nocat.net, does what you need. We plan to cover it next issue. —Ed.
In response to a letter in the June 2003 issue, I was motivated to
clear up a mistake. Kathy claims that “Debit on the left,
Credit on the right” in double-entry accounting can be
explained by the Latin for left and right. The Latin words for
left and right are sinister and
dexter, respectively. Credit actually
comes from credo, to trust, and debit
comes from debeo meaning to owe.
I have an ongoing debate with my son on the correct
pronunciation of Linux. I say
that it is pronounced either as in finish or as
in helix, and my son says it is pronounced as
in Lennox or tennis. Who is correct?
Listen to Linus Torvalds say it at www.kernel.org/pub/linux/kernel/SillySounds. —Ed.
I've been a fan and a subscriber of your magazine for some time now.
However, I was recently disappointed to see you running a review
of SCO's Linux product, in light of their lawsuit against IBM
[LJ, June 2003]. To
get directly to the point, given SCO's position, and now their
terror tactics toward Linux customers, publishing a SCO product review
is in bad taste and insulting to the entire open-source development community.
SCO pulled its Linux distribution from the market, and made sweeping accusations that Linux contains code copied from SCO UnixWare, after we had already gone to press. But our timing wasn't so bad after all. Because we still have access to the SCO update service, we can confirm that SCO continues to offer Linux source code under the GPL: linuxjournal.com/article/6899. SCO's position on Linux, and the responses from developers and companies, are changing too rapidly to cover in a monthly publication. Check our web site for the latest. —Ed.
I generally agree with the opinion from Dr Mark Alford on GNOME 2 [Letters, LJ, June 2003], but I believe some comments will help him and other readers.
You can still use sawfish on Red Hat 8 and configure it from the Extras→Preferences menu, including the keyboard shortcuts and window layout. You need to add the following lines to your .bash_profile file:
WINDOW_MANAGER=/usr/bin/sawfish
export WINDOW_MANAGER
See the gnome-wm script in /usr/bin.
You can recover some of the chopped-off functionality of GNOME 2 by writing a Nautilus script. For example, if you want to see the list of files contained in an RPM file, write an rpm command in a script:
rpm -qlp "$@" > /tmp/rpmlist.txt
gview /tmp/rpmlist.txt
Many script examples are at g-scripts.sourceforge.net.
Although my employer is very supportive of Linux (it's our primary operating
system) and open-source development in general, our legal department
has become quite sensitive to the contents of open-source licenses.
Specifically, they are concerned about some of the licenses that require
any modifications to code to be distributed back to the original
authors—even if it is not otherwise publicly redistributed. They
discovered some odd parts to licenses, such as “you can use this software,
but you have to buy me a beer if you're ever in Boston.” The net
result is that we now have to obtain legal approval before downloading,
installing, using and especially modifying any open-source software.
(They're equally restrictive about proprietary licenses, but that seems
justified.) Although I appreciate the levity of some open-source
“licenses”, it seems we may need better standards in licensing
to encourage open-source adoption in litigation-heavy corporate environments.
Once you add terms like the ones you mention, the license is no longer a Free Software license or an Open Source one. These issues do not affect the standard free software licenses such as the GNU GPL and the Apache license. Can you get your legal department to approve the standard licenses, so you don't need a program-by-program review? —Ed.
In the June 2003 issue of Linux Journal, a
letter from John was run under the title “Freedom Threatens Some
Companies”. In this letter, John wrote that he felt the integrated
library system Koha (www.koha.org) and other
free software projects represented a threat to small- to medium-sized
commercial software companies. He argued that it seemed likely that
a single free software project eventually could dominate a particular
market niche and thus drive out the commercial competition. I think he's
wrong, both in the case of Koha and in the larger case of free software in
general. I can't give specific information about other projects, but I can
see how Koha can help, not hurt, the library automation marketplace. Koha
is a thriving little free software project. Several mailing lists are
devoted to it, nearly 40 people have CVS access, and a growing number of
sites use it. Currently, at least three commercial ventures work on Koha
actively, and several others offer commercial support for the platform.
Any library automation vendor is perfectly able to pick up Koha and create
an offering built around it—in fact, I'd encourage them to do so. If
Koha doesn't meet their needs, perhaps one of the other free library
systems will. Libraries that adopt Koha take on a level of commitment
that results in their giving back to the larger Koha community. In some cases
this means developing new features or fixing bugs in Koha themselves,
in other cases they might hire someone to do so. Some libraries invest
in Koha by reporting bugs, writing documentation, answering questions on
the mailing list or explaining library-specific knowledge to developers
without library backgrounds. This same kind of commitment is seen from
many other users of free software. Not everyone gives back, but enough
do to keep the community viable.
kaitiaki/manager, the Koha Project
For every line of code we write, every class or function that we
make, there must be documentation. My professors all have commented on
my documentation or the lack of it. As a self-taught programmer, I see
coding as more of an art form than a how-to guide. I had written hundreds of
lines of code and projects that do amazing things, but no one could follow
my source code. Now that I am helping to develop open-source projects,
using documentation is imperative. The Open Source community
has brought me into the light. With documentation we can all make
beautiful software together. Thanks to Linux Journal for making me
a comprehensive programmer.
Student at Southeastern Louisiana University
Being an apprentice DIY chef and a passionate Linux worshiper, I always
enjoy reading Cooking with Linux. Some days ago I was preparing some
cookies that reminded me of you wearing a chef hat. I couldn't resist
taking some photos and sending them to you! Hope you enjoy them.
I've got to comment on the article “Introducing the 2.6 Kernel”
by Robert Love [LJ, May 2003]. Having trained strictly as a network
engineer for 14 years, I always found programming jargon over my
head. Mr Love kept the developer's
jargon in sync with the rest of us out here in the network field, which made it
the most pleasurable article I've read to date. I actually understood
what the kernel was doing and as such, now have a greater appreciation
for what's happening on the inside. I hope you can convince Mr Love
to write ALL future kernel release articles for Linux Journal.
Intel Solutions Center Network Engineer
Keep watching Kernel Korner for more from Robert Love. —Ed.
Bravo on the C++ editorial. The clock is ticking, and die-hard C/procedural programmers are running out of excuses to eschew C++ as a suitable language for small- to mid-sized projects. The year 2003 is upon us; it's time to dispel some old rumors (or, in some cases, review some old facts):
“C++ is slow”: modern compilers are closing the C/C++ speed gap. Some of C++'s fabled performance lags may be due to proper (and automatic) calls of constructors and destructors, the sort of initialization and cleanup that you already should have in your C code anyway. To close the remaining gap, we can replace certain object hierarchies with template programming, in which we perform decision making at compile time rather than runtime.
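The compile-time decision making mentioned above can be sketched with the classic recursive-template idiom. This is an illustrative example, not code from the editorial: the work (here, a factorial) is done entirely by the compiler, while a runtime loop is shown for comparison.

```cpp
#include <cstddef>

// Compile-time computation: the "loop" is unrolled by template
// instantiation, so Factorial<5>::value is a plain constant at runtime.
template <std::size_t N>
struct Factorial {
    static const std::size_t value = N * Factorial<N - 1>::value;
};

// Base case ends the recursion.
template <>
struct Factorial<0> {
    static const std::size_t value = 1;
};

// The equivalent runtime version, for comparison: the decision making
// (the loop condition) is evaluated while the program runs.
std::size_t factorial(std::size_t n) {
    std::size_t result = 1;
    for (std::size_t i = 2; i <= n; ++i)
        result *= i;
    return result;
}
```

Both produce the same answer; the template version simply moves the cost from runtime to compile time.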
“Compiled C++ libraries are incompatible with one another”: compilers are stepping up to the line drawn by The Standard, which means this soon will be a thing of the past.
“C++ is the worst of both worlds”: this is purely an issue of perspective: imagine the cleanliness of encapsulation, constructors/destructors, exceptions and inheritance when needed, plus the speed of pointers when wanted.
“C++ is fat and complex”: is this really C++ or just the OO paradigm? It takes some exercise to get one's mind around it and to write true object-oriented software, but once you understand it you'll never go back. For those thorough procedural programmers, imagine being able to wrap those cleanup-style calls into a function that is automatically (or automagically) called at scope exit.
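The scope-exit cleanup described above is what C++ programmers call RAII. A minimal sketch (the class name and file path are illustrative, not from the letter): the destructor closes the file automatically when the object leaves scope, so no cleanup call can be forgotten, even on an early return.

```cpp
#include <cstdio>

// Wraps a FILE* so the destructor runs the cleanup automatically.
class ScopedFile {
public:
    ScopedFile(const char *path, const char *mode)
        : fp_(std::fopen(path, mode)) {}
    ~ScopedFile() { if (fp_) std::fclose(fp_); }  // runs at scope exit
    std::FILE *get() const { return fp_; }
private:
    std::FILE *fp_;
    ScopedFile(const ScopedFile &);            // copying would double-close;
    ScopedFile &operator=(const ScopedFile &); // forbid it (pre-C++11 style)
};

bool writeGreeting() {
    ScopedFile f("/tmp/raii_demo.txt", "w");
    if (!f.get())
        return false;                 // file (if any) still closed below
    std::fputs("hello\n", f.get());
    return true;
}  // f's destructor closes the file here, on every exit path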
“Not everyone knows C++, and a common language is key for community-style projects”: although this is true in a certain sense, the only way it will change is if people buckle down and experiment with C++. At one point in its lifetime, every computer language was new and therefore not as well known as some others.
With those complaints out of the way, developers can now make a more
educated decision as to what language should be used for a given project.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
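The find-plus-grep example above can be sketched as a short pipeline. This runs against a throwaway temporary directory instead of /home, and the file names and log contents are made up for illustration:

```shell
# Build a small sample tree of logs in a temp directory.
logdir=$(mktemp -d)
mkdir -p "$logdir/alice" "$logdir/bob"
echo "ERROR: disk full" > "$logdir/alice/app.log"
echo "all quiet"        > "$logdir/bob/app.log"
echo "ERROR: not a log" > "$logdir/bob/notes.txt"   # .txt: should be skipped

# Find every .log file, then search each one for a particular entry;
# grep -l prints only the names of files that match.
matches=$(find "$logdir" -name '*.log' -exec grep -l 'ERROR' {} +)
echo "$matches"

rm -rf "$logdir"
```

Only alice/app.log is reported: notes.txt never reaches grep because find filters on the .log suffix first, which is exactly the tool-chaining the paragraph describes.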
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
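For reference, a traditional cron job is just one line in a crontab: five time fields followed by a command. A sketch (the script path is hypothetical), running a cleanup script at 2:30 every morning:

```
# min  hour  day-of-month  month  day-of-week  command
30     2     *             *      *            /usr/local/bin/cleanup.sh
```

The question the webinar raises is what to do when your scheduling needs outgrow this format, such as jobs that depend on other jobs or must run across machines.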
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high-availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide