European companies often get the jump on their North American counterparts when it comes to adding Linux compatibility. A fine example is Italy's Memopal, which now offers a Linux version of its online backup utility. Memopal provides automatic, continuous backup to a remote server over a secure Internet connection, a service that has been lacking in the Linux space. The company claims that its Memopal Global File System archiving technology provides a distributed filesystem that supports up to 100 million terabytes of storage, transparent read-write compression, hot-add scalability and more. In beta at the time of this writing, Memopal for Linux supports Ubuntu 8.04 and Debian Etch.
While other laptops hog the mainstream media glory, ASUSTeK's Linux-based Eee PC is the underdog "little PC that could". To get to know this new darling of the Linux community, get your hands on William Lawrence's Using the Eee PC from Que. The book covers everything from turning on the machine and connecting it to the Internet to upgrading and updating it. The machine-book combination will help you convert your loved ones to Linux while keeping their after-hours tech-support calls to you at a minimum.
Please send information about releases of Linux-related products to firstname.lastname@example.org or New Products c/o Linux Journal, 1752 NW Market Street, #200, Seattle, WA 98107. Submissions are edited for length and content.
James Gray is Products Editor for Linux Journal.