The distance calculation in Dave Taylor's December 2009 Work the Shell is
in error because he's not using enough
significant digits. Six digits usually are not sufficient to define
the position of a place or object accurately in lat/lon coordinates. In my
experience, it typically takes nine digits to get sub-meter accuracy.
Dave Taylor replies: Ahh, I'll have to check that out. Thanks for the tip!
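The nine-digit claim is easy to sanity-check with a quick sketch. This assumes roughly 111,320 m per degree of latitude (a standard approximation; longitude spacing shrinks toward the poles):

```python
# Rough resolution of a latitude coordinate at a given number of
# decimal places, assuming ~111,320 m per degree of latitude.
METERS_PER_DEGREE = 111_320.0

for decimals in range(3, 8):
    resolution_m = METERS_PER_DEGREE / 10 ** decimals
    print(f"{decimals} decimal places -> {resolution_m:.3f} m")
```

Six decimal places is the first level to break the one-meter barrier, and once the integer part of a longitude runs to three digits, that works out to nine significant digits in total.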
I like your National Debt figure tally, which used to appear in
the LJ Index. It is a healthy reminder of
the country's economic burden. However, it is difficult to grasp how much
money it is. Here is my favorite method to “get it”. A $100 bill is
approximately 0.1mm thick. So in a cm, you have 100 bills = $10,000. In a
meter, you can stack a million dollars, or 1.6 billion dollars in a mile. So
a National Debt of 2.54 trillion dollars is a stack of 1,587 miles of $100
bills! If you split it in two and put it on a road, you will have almost
800 miles of $100 bills. If you drive your car on it, you will run out
of gas before reaching the end!
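The stacking arithmetic above can be spelled out in a few lines, using the letter's own assumptions (a $100 bill is 0.1 mm thick, and a mile is rounded to 1.6 km):

```python
# The letter's National Debt stacking arithmetic, spelled out.
# Assumptions from the letter: $100 bill = 0.1 mm thick; 1 mile ~ 1.6 km.
debt_dollars = 2.54e12
bill_value = 100
bill_thickness_mm = 0.1

bills = debt_dollars / bill_value           # 2.54e10 bills
stack_km = bills * bill_thickness_mm / 1e6  # mm -> km: 2,540 km
stack_miles = stack_km / 1.6
print(int(stack_miles))  # -> 1587
```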
I devoured Mick Bauer's Paranoid Penguin column “Linux Security Challenges
2010” in the January 2010 issue. How I wish the people where I work had
one-tenth of his insightful common-sense view! But I think one challenge
is missing: user education. Although the IT department must bear the ultimate
responsibility for security, I think end users should be held accountable.
Computer basics should be on the hiring criteria for any position that
requires the use of a computer, and basic training should be compulsory for
new hires, as should annual “what's new” sessions. My company offers free
training in just about everything except computer know-how, and it
considers including its computer/e-mail policy statement in the new-hire
packet satisfactory. As a result, many users don't know what copy/paste
does, but they use company laptops for everything from on-line shopping to
browsing porn sites. They have no motivation to change, because their
behavior has no consequences. But if doing something stupid costs
properly trained employees bonus points, they might think twice before
doing it.
If you ask me why I keep a Windows desktop, the answer is still three letters, DRM. I have two desktops: my Ubuntu 9.10 (which is very good and does just about everything) and my W7 desktop (which does everything with DRM). Even with the CrossOver Office Plugin, I can't run Amazon Unbox, which allows me to watch my favorite TV shows over and over.
I have been a Linux user since RH 6.2, and I love the way things have come along. There have been few DRM issues, but again, until we make some kind of headway with all the major distributors, I will have to keep a Windoze desktop of some kind. I now run W7. It's half the bloated warthog of its predecessor, but it's still not my beloved GNOME desktop. (Yes, I really love GNOME.)
The day I can run the Amazon Unbox client, or the day Amazon ports Unbox over to Linux (and, of course, somebody works out a deal with Netflix) is the day I give up this machine of great power (Intel i7) and make it my new, true desktop.
My Linux desktop is highly functional and a
flawless media server and print server for the house. Now, if we could only
get the major streaming media people to help us out.
I know exactly what you mean. Linux itself is quite functional; it's just all the services from others that are causing the problem. I have the same issue with DRM, and while I try diligently to avoid it, there are times when I'm forced into such things. Netflix, for example, runs great on the Linux-based Roku device, but for those of us with Linux desktops, we can't watch streaming video! So, yes, I feel your pain.—Ed.
Recently, I decided to download Google Chrome for Ubuntu after the almost stealthy release last month. I was blown away by the many great features and other hoopla we've been eagerly awaiting. One surprise feature, which is of extreme interest to me as a Web developer, is that Google Chrome has an App Mode where it runs a Web site in its own application window, à la Mozilla's Prism.
There's one problem though. The .desktop files it creates to do so are garbage and won't install into /usr/share/applications without hacking permissions on the folder, which will lead to constant messages indicating the change in permissions when applications are installed/updated. So, not to be defeated by such a simple problem, I came up with what I'd like to think is an elegant solution—a Python app I call chromify.
It's a simple application that makes up for Google Chrome's shortcoming in creating clean .desktop files. For now, it's just a CLI program, but it's still very feature-rich and uses a Python module so that other interfaces can be used/developed (that is, CGI or a GUI) to create Google Chrome Web apps.
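For reference, a well-formed launcher of the sort described follows the freedesktop.org Desktop Entry format; a minimal example might look like this (the name, icon and URL here are purely illustrative):

```ini
[Desktop Entry]
Version=1.0
Type=Application
Name=Example Web App
Comment=Chrome app-mode launcher (illustrative)
Exec=google-chrome --app=http://example.com/
Icon=example-webapp
Categories=Network;WebBrowser;
```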
Downloads, installation guides and a wiki are available at code.google.com/p/chromify.
Note: I'm not trying to advertise here. This was something of an
extreme annoyance for me, and I know it likely is for others. This is more
of an “I thought I'd share” letter than anything else.
G. John Schutte
Whoa, that's awesome. Prism with Firefox has been rather wonky of late. I do like Chrome's App Mode, and although I haven't had a chance to play with it enough to discover the issues with its .desktop files, it sounds like it's pretty annoying! Thanks for not only fixing the problem, but also for sharing your solution. Hooray for open source!—Ed.
I loved Mick Bauer's “Linux VPNs with OpenVPN” article in the
February 2010 issue, and I think it is long overdue. OpenVPN is a great tool.
I'm just wondering if Mick meant VPN where VLAN is listed on page 28, as
these are two completely
different types of network technology.
This also brings up a good idea—any chance on doing an article describing
how to configure Linux for use with VLANs?
Mick Bauer replies: You're quite right, both instances of the term VLAN in this article (in the same paragraph) were misprints; I meant to say VPN. I hope this didn't cause too much confusion!
You're also right that VLANs (Virtual LANs) would make a good topic. Actually, Paul Frieden covered VLANs in 2004 on the LJ Web site, in his article “VLANs on Linux” (www.linuxjournal.com/article/7268), including its use with iptables. But, I've been thinking of covering this topic myself in the context of a larger article on building Linux broadband routers using OpenWRT—stay tuned for the (probable) upcoming series. Thanks both for the correction and the suggestion!
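For readers who want a head start on the VLAN question, a minimal 802.1Q VLAN can be set up with the iproute2 tools; the interface name, VLAN id and address below are illustrative, and the commands require root:

```shell
# Create VLAN 100 on top of eth0, give it an address, bring it up.
ip link add link eth0 name eth0.100 type vlan id 100
ip addr add 192.168.100.1/24 dev eth0.100
ip link set dev eth0.100 up
```

Older guides accomplish the same thing with the vconfig utility, but the ip link approach is the one that has stuck around.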
Joey Bernard's review of diff and its relatives on page 16 of the February 2010 issue was useful, but he didn't mention sdiff, which is the basis of my favorite way of comparing two text files.
The problem with diff -y is that the side-by-side format isn't very convenient. I prefer to use the following:
sdiff -s -w156 file1 file2 | more
This places the lines of file1 above those of file2, if they differ. Small
differences really stand out in this format.
The -w156 makes corresponding columns line up, allowing close to a full
80-column text line for each file.
Andrew T. Young
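The suggested command can be tried on a throwaway pair of files; with -s, only the lines that differ are printed side by side:

```shell
# Two files differing on one line (file names are illustrative).
printf 'alpha\nbeta\ngamma\n' > file1
printf 'alpha\nBETA\ngamma\n' > file2
# -s suppresses common lines; -w80 sets the total output width.
sdiff -s -w80 file1 file2
```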
Thanks for the wonderful article “Running Remote Applications” by Michael J. Hammel in the February 2010 issue of Linux Journal.
In the article, it says “Because VNC is based on the tile architecture, where rectangles of frame buffer memory are resent if they have been updated, any compression that improves the transfer of tiles will have serious performance implications.” I am confused. Are you saying that if tile compression of the frame buffer segments is implemented that the performance would be worse as opposed to non-tile compression? One would think that frame buffer compression would improve bandwidth usage but may, in fact, increase latency—what do you think?
By the way, it would be great if a follow-up article could be done on remote
multimedia in concert with remote desktop. I was thinking about UPNP AV and DLNA.
Michael J. Hammel replies: I'm not completely sure about this. I think this is information I found during my research and not in my experimentation, but I thought it worthwhile to include in the article. Compression, in general, should increase latency while reducing bandwidth. However, my interpretation of my own writing is that the client side potentially sees tile compression from the server as raw pixel changes and, therefore, asks for all tiles repeatedly. If this is true, compression potentially increases bandwidth by resending more tiles than what was actually updated. I'm not sure this is actually the case, however. My recent use of VNC using virt-viewer and friends doesn't seem to show this, although I can't verify (at the moment) if compression is being used. In essence, I can't find my notes that say where this came from. At best, it is accurate based on specific implementations. My apologies for not being able to back it up further.
As far as a follow-up article on UPNP AV and DLNA, I'm not familiar with either at the moment, but I'll certainly look into them. I love to find new technologies to play with, and if I can speak intelligently about them, I'll certainly propose it to LJ (if I can just remember to hang onto my research notes—sigh). I believe Red Hat's recent announcement on SPICE is related to this problem space, and I'm planning on investigating that as well. Thanks for the feedback!
Linux has provided my desktop environment 99.99% of the time since somewhere around the days of Red Hat v3 (that's mid-1990s) at work (college computer technology instructor, retired for about five years now) and at home. I also used it on my laptop and several lab servers at work.
Applixware was the first office suite I used. Then came StarOffice and currently OpenOffice.org. Regarding Bruce Byfield's article “OpenOffice.org vs. Microsoft Office” in the February 2010 issue, he makes the statement: “Similarly, Impress lacks the ability to use the pointer to draw on the screen during a presentation.” This feature has been present for some years before I retired. To use it, one simply must enable it in the Slide Show→Slide Show Settings menu. The three-pane window of Impress was never an issue either. The best solution for me was a dual-head system. Any geek worthy of the name should be able to scrounge up a second monitor and a dual-head video card. The extra screen space is very useful.
My only gripe with Linux is that somewhere between Fedora 8 and 9 the dual head, aka multiple monitor, facility was broken for my video card, the Matrox G450. I have reported the bug as new releases come along, but it is still broken. I continue to use the latest release of Fedora for some things, as I am right now, but when I need to be productive (for example, programming, image editing, Web site development), it's back to my Fedora 8 system.
I encourage others to try OpenOffice.org whenever the opportunity arises.
It has served my needs well. What the future will bring to the desktop,
only time will tell. But as for me and my data and apps, I do
not want them
blended with the stuff out on the Net. Eat local, compute local. If it
comes from far away, I want to know.
Thanks for the correction on OpenOffice.org. As far as the Matrox G450 goes, I find it amazingly ironic that it doesn't work well, considering the Matrox cards used to be the only video cards a Linux user could rely on! Hopefully, things will straighten out for your trusty G450 soon.—Ed.
I liked Michael J. Hammel's article “Running Remote Applications” in the February 2010 issue, but there's an easier way to get a remote desktop under Linux. I use FreeNX (https://help.ubuntu.com/community/FreeNX is a good place to start) provided by NoMachine (www.nomachine.com). It's faster than VNC and secure. I connect to my desktop computer at home from my workplace, and it's almost like sitting in front of the computer. Better still, it allows you to suspend a session and reconnect to it later, so when I come back to work the next day, I can resume the session, and all the windows I had open are still ready to go, including whatever pages I had open in Firefox.
I sound like I work for NoMachine, but I just like it that much. It is that good.
Michael J. Hammel replies: I'd heard of FreeNX and had looked at NoMachine some time back, but I simply didn't remember it before working on the article. Being an old-timer, I was using XDMCP. Even VNC was new to me when I started into the article. FreeNX certainly sounds like it's worth investigating in the future.
Regarding Bruce Byfield's “OpenOffice.org vs. Microsoft Office” in the February 2010 issue of Linux Journal, the article compares the features of both pieces of software, but it fails to compare the bugs. With most professionally produced pieces of software, that would not be an issue, but it is with OpenOffice.org.
I am in love with Linux and the Open Source movement, and I have been running Linux in place of Windows on my home machine for a number of months now. Generally, I am willing to trade flawless functionality for knowing that what is installed on my computer is owned by me.
Usually, I don't need to make that trade, but with OpenOffice.org's Writer, I have had to. Writer (and this is as true of the latest version as with earlier versions I have used) randomly reformats and/or deletes text for me. Sometimes, I save a document I have worked on, and when I open it up, it is reformatted. Writer is, in my judgment, unreliable.
Whether or not this extends to the rest of the suite, I do not know; however, this is not acceptable behavior from a piece of software that purports to provide an essential piece of business functionality. For my own home use, it is good enough, but the nonprofits I work with choose not to use it, even though it is free, because they know that if they trust their business with it, they will get burned.
I am a new entrant to the Open Source/Linux community and I look to Linux Journal, as a kind of de facto authority to provide a clear and seasoned perspective that I can rely on. Failing to address the bugginess (when it exists) of open-source software supports a communally held fantasy of the purity, nobility and infallibility of open-source software, which, in my opinion, does not serve the community, and it certainly does not serve me.
Let's get real about this.
Thanks for a great magazine.
Bruce Byfield replies: I am sorry that you have had such a frustrating experience with OpenOffice.org Writer. If the problems you describe were widespread, I certainly would have mentioned them in the article. However, I have used Writer for more than nine years and follow a number of mailing lists about OpenOffice.org, and I have never encountered the problems you describe. Nor, so far as I can remember, have I ever heard of anyone else having them.
Have you tried discussing your problems on the OpenOffice.org Users list? If not, you can subscribe at www.openoffice.org/mail_list.html.
Have a photo you'd like to share with LJ readers? Send your submission to firstname.lastname@example.org. If we run yours in the magazine, we'll send you a free T-shirt.