The OpenDocument Format (ODF) just keeps getting stronger. It is now an official ISO standard; numerous applications support it, with varying degrees of fidelity, including Google's online word processor and spreadsheet; there is an official Microsoft-funded plug-in that lets Microsoft Office open and save ODF files, as well as a program that converts between ODF and the Chinese UOF XML office format; and the ODF community has largely sorted out the accessibility issues that threatened to derail its adoption by Massachusetts.
At the same time, Microsoft is clearly beginning to feel the pressure. Its attempts to ram its own XML format through as an ISO standard, and the unseemly haste with which its 6000 pages of documentation were approved as an ECMA standard, are an indication that it is playing catch-up in this sector, even if it remains the dominant player.
ODF's strength comes at a time when Microsoft's focus is elsewhere. The recent launch of Vista has not caught the public's imagination in the way that Windows 95 did. Back in 1995, there was no doubt that this was a defining moment that would radically change the computing landscape; with Vista, on the other hand, even Bill Gates seems to be struggling to articulate why anybody would bother upgrading from Windows XP:
NEWSWEEK: If one of our readers confronted you in a CompUSA and said, “Bill, why upgrade to Vista?”