The Ultimate Linux Box

We packed unbelievable power in a tank case and added all the trimmings for less than $4,000.

We chose Patriot eXtreme Performance 4GB (2 x 2GB modules) 240-pin DDR2 SDRAM, which sells for about $250. Linux can use all 4GB of RAM whether you run a 32-bit x86 or a 64-bit x86_64 kernel (on 32-bit, make sure your kernel is built with high-memory/PAE support, or it will see only a fraction of it), but most people won't really need 4GB of RAM even with four CPU cores. If you want to cut back on the total price, you can try the CORSAIR XMS2 2GB (2 x 1GB modules) 240-pin DDR2 SDRAM, which runs about $160. The latency on the CORSAIR modules is actually better than on the Patriot eXtreme: the CORSAIR timings are 4-4-4-12, and the Patriot's are 5-5-5-12. We've run benchmarks showing that lower latency helps performance, especially with AMD processors, but if you can tell the difference in actual everyday use, let us know. We can't.
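Whichever modules you install, it takes only a second to confirm how much memory the kernel actually sees. A quick sanity check from the shell (nothing here is specific to this board):

```shell
# Total memory the kernel sees, in kB; for 4GB it should be close
# to 4194304 (a 32-bit non-PAE kernel, or chipset address shadowing,
# will report less).
grep MemTotal /proc/meminfo

# The same figure in megabytes, along with current usage.
free -m
```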

Regardless of the memory modules you choose, make sure you insert them into the correct slots for dual-channel operation. In the case of the ASUS Striker Extreme, those are the first and third slots (every other memory slot; the paired slots share a color).


The ASUS motherboard has what it calls integrated RAID onboard, as is true of most motherboards available today. This is really a misnomer. There is no hardware RAID controller on this or most other motherboards. The onboard RAID is really just a multichannel SATA controller. We configured Kubuntu 7.04 to run RAID 0+1 (also known as RAID 10) using four drives attached to the onboard SATA. It worked fine, but it was much more trouble than it was worth, so we do not recommend that approach. It is an especially bad idea if you intend to run more than one version of Linux on the same machine. It was hard enough going through the procedure once. We wouldn't want to repeat the process for every distribution we tried.
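For anyone who wants to try it anyway, the setup is done with mdadm. Here is a minimal sketch of the shape it takes; the device names /dev/sda1 through /dev/sdd1 are placeholders, so substitute your own partitions:

```shell
# Create a four-disk software RAID 10 array from existing partitions.
# WARNING: this destroys any data on the listed partitions.
mdadm --create /dev/md0 --level=10 --raid-devices=4 \
    /dev/sda1 /dev/sdb1 /dev/sdc1 /dev/sdd1

# Watch the initial sync and confirm the array comes up healthy.
cat /proc/mdstat

# Record the array so it is assembled automatically at boot.
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
```

None of this changes our conclusion: it works, but a real controller card is far less hassle.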

If you really want to use RAID, take our advice and buy a real RAID controller card. We chose the 3ware 9650SE-4LPML PCI Express controller. We connected four drives and configured them in RAID 10, which provides the best performance and safety at the cost of disk space. It stripes two sets of two drives, mirrored. The striping gives you the performance. The mirroring gives you safety, because you can replace a failed drive without losing any data. However, because two drives are redundant, you get half the disk space of your four drives. Our four 320GB drives gave us about 640GB of disk space.

If you want more storage space than our setup provides, you can buy larger drives, or you can sacrifice some performance and configure your array as RAID 5. RAID 5 trades write performance for more storage space.
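The arithmetic behind that trade-off is simple: RAID 10 keeps half the raw capacity, while RAID 5 gives up only one drive's worth of space to parity. With our four 320GB drives:

```shell
drives=4
size_gb=320

# RAID 10: every block is mirrored, so half the raw space is usable.
echo "RAID 10: $(( drives / 2 * size_gb ))GB"

# RAID 5: one drive's worth of space goes to parity.
echo "RAID 5:  $(( (drives - 1) * size_gb ))GB"
```

That works out to the 640GB we measured, versus 960GB if you accept RAID 5's slower writes.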

The 3ware controller is superb. It delivers excellent performance and it is very easy to set up. You press Alt-3 to activate the setup screen at boot time. This utility allows you to create storage specifically for booting operating systems, but you don't need to use this feature. It may be necessary for other operating systems, but you can boot fine from the RAID array with Linux by using normal RAID partitions.

You can add a battery backup unit to the RAID controller so that you are less likely to lose data if you experience a power outage. We didn't include the battery backup as part of our ultimate box though.

You shouldn't need to add drives to your system via the onboard NVIDIA SATA controller if you use this RAID card. If you do add drives to the onboard SATA, however, be warned that some Linux distributions may get confused about the order of drives in your system. We tried adding a drive and did not experience this problem with this particular combination of components, but this problem has reared its ugly head with other similar configurations, so we assume it's still possible.
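If the drive order does shift on you, the standard defense is to identify filesystems by UUID or label instead of by device name, so a mount no longer depends on whether the kernel enumerates a disk as sda or sdb. A sketch (the UUID shown is a made-up placeholder; use the values blkid prints on your system):

```shell
# Print each partition's filesystem UUID (run as root).
blkid

# Then refer to filesystems by UUID in /etc/fstab, for example:
#
#   UUID=0a1b2c3d-feed-beef-cafe-0123456789ab  /data  ext3  defaults  0  2
#
# instead of the order-dependent /dev/sdb1.
```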

You may see an on-screen message at boot time that says the controller is not compatible with your BIOS. It goes by so quickly that you may miss it. If that concerns you, there are a variety of other RAID cards from which to choose. However, despite this warning, our 3ware card has performed without a hitch for weeks, and we love the fact that the Linux kernel has great support for the card by default.

The card is PCI Express x4, which works fine in the middle PCI Express slot of the motherboard. If you go with our recommendation, make sure you plug the card into the center PCI Express slot, not the second PCI Express slot intended for SLI video. You will experience lockups and problems if you choose the latter slot.


Some might consider a RAID cage frivolous. But this is the Ultimate Box, after all, so we're including the 3ware RAID cage, which lets you hot-swap four drives in the space of three 5.25" drive bays. Aside from easy drive replacement, the RAID cage has two advantages over mounting the drives normally. First, the 3.5" hard drive cage in the case fits only two drives if you want good air circulation; you would have to mount at least two of the drives in the 5.25" area and then add drive fans to keep them cool. The RAID cage packs all the drives into one small space and comes with its own set of fans. Second, the cage requires only two power connections, instead of one for each drive.

Figure 4. The 3ware RAID Cage with the Door Closed

Figure 5. Pop open the RAID cage door, unlock the drives and pull them out for easy hot-swapping.





Anonymous

I used to own that keyboard (MSNEK 4000), and it is the only keyboard that has ever failed on me. It is not a very good keyboard for the price. The keys are mushy, and the leg stands broke after a few months. When it first came out, it was also a pain to get most of the keys working. I could never get the zoom slider to work with any software. It also has a Windows (Super) key, which on a Linux computer is analogous to putting a Hyundai emblem on a BMW. However, if you insist on getting a keyboard with the natural layout, go for the older Natural Elite.

No mention of RAID compatible drives...

Anonymous

Hmmm, recommending RAID but failing to mention the problem with hardware RAID and some hard drives...did you check 3ware's site to see whether you are using drives approved by 3ware? One of your recent articles, the one about the 12TB or 16TB or whatever high-TB backup server build, detailed the issue of RAID and jackass hard drive manufacturers with their crappy firmware, and the necessity of flashing new firmware on some hard drives to get them to work correctly with hardware RAID...

Having been burned by Maxtor (IIRC) drives dropping out of a RAID 5 array (kind of ironic, and it would be funny too, until you try to wrap your head around redundancy, drives dropping out of RAID 5 with a hot spare, and the reasons for spending A LOT of money on RAID 5-capable hardware RAID and all the drives...) because firmware put the drives to "sleep" during "inactivity", I know the drill: move all the data back off the RAID array so you can shut down the workstation or server, remove the drives from the array, plug them into the motherboard ports, flash them individually, reinstall them on the RAID card, re-initialize the array over another half day to a full day, then reinstall the OS and the data. Finding out about all this after you install the OS and migrate the data to the array...priceless. And one way for a hard drive manufacturer to be placed on the permanent sh*t list...

If you're going to be showcasing a RAID card in your builds, when you KNOW that some of your readers are going to follow most or all of the build as their spec, you have a duty to call out the hard drive/RAID compatibility issue, whether you're beholden to your advertisers/potential advertisers or not. Otherwise you're just as guilty as the tech review sites that continued to use Deathstars in their review rigs without mentioning the Deathstar issue, when most other tech sites were reporting on it and covering the subsequent class-action lawsuit and IBM's later exit from the hard drive business. The fact that other sites were covering the issue didn't mitigate the fact that some sites continued to use (and list) the Deathstars in their test rigs during their performance testing and reporting. Nor is it a mitigating factor that a tech review site can't afford to buy new drives because of a recent purchase -- one of the excuses I was given when I called them on the issue in an email; nor, in their response to me, did they address the issue of not mentioning the Deathstar problem somewhere in each performance shootout of other hardware.

Finding out that you could have avoided the firmware-flashing chore before you purchased the drives doesn't exactly make for a happy camper. Neither does finding out that a manufacturer's drives are to be avoided at all, or that the return period ran out while you waited for other hardware to be delivered or back-ordered, or that you'll have to pay a restocking fee, or that the vendor doesn't accept returns of OEM drives...all because there wasn't a heads-up in the article about the "approved drive" list or the bad-firmware issue. That matters especially for a spec that isn't really a server, that may well be someone's first experience with hardware RAID, and whose builder will in all probability be ordering the hard drives and RAID card at the same time...

Just a minor observation... ;-p

Linuxjournal redefines RAID levels...

Anonymous

"We configured Kubuntu 7.04 to run RAID 0+1 (also known as RAID 10)"


An observation I'd expect from PC Magazine or other Ziff Davis publications, not Linuxjournal.

"RAID 0+1 is NOT to be confused with RAID 10. A single drive failure will cause the whole array to become, in essence, a RAID Level 0 array"

Perhaps a RAID refresher, especially the part about fault tolerance between 0+1 and 10 would be appropriate?

Interesting article. A setup with an AMD Phenom quad-core CPU would be nice, and if a dual Phenom quad-core motherboard exists, a comparison would be great as well. I thought you needed specific dual or 8-way versions of Opterons for multi-socket boards, hence the existence of 2xxx and 8xxx Opterons in addition to the 1xxx versions. (I'm aware that there are other specific advantages to the multi-socket Opterons, such as better communication between the CPUs, but I also remember AMD Durons being used in dual-socket boards when AMD never intended that and tried to prevent it by locking the CPUs...)

I guess I'll have to check whether they're still around, and whether other dual-AMD sites still exist, to see what my options are before I spec out a new computer. I hope there are lower-cost options out there for home workstations without moving up to a SuperMicro board or other workstation/server board intended for business processing. And even on non-servers, some of us are able to put 4/8 cores to use, as well as 4GB and more of memory, even though we aren't compiling anything. So don't assume...

How about an almost-ultimate desktop box that doesn't require a gaming-class graphics card or a breaker-tripping CPU/power supply? Say, a low-power Phenom quad core, or the highest-clock-speed Phenom in the 95-watt class rather than the 125-watt class, 4GB of memory, what motherboard to use, sane cooling, etc.

How about a desktop cooled via ducting, duct tape (if necessary, don't discount it), a custom-built manifold made out of cardboard, and a 20" box fan located in another room for silent cooling in one of your buildups?

How about a 4-port Areca card instead of a 3ware card, where RAID 5 and 6 (they're not the same, ;-p btw) become possible, or RAID 5 with a hot spare? I'd personally go with RAID 10 on one of these builds, with at least one hot spare (having suffered catastrophic Deathstar-related data loss when two mirrored drives failed within hours of each other, which taught me a good lesson on the value of hardware RAID, appropriate RAID levels and hot spares, plus the subsequent lesson Adaptec burnished into me on fake "hardware" RAID and Linux support). The problem with RAID 10 plus a hot spare, however, is that it requires an Areca or 3ware card with more than four ports, which means jumping up to an 8-port card and the 150%-plus extra cost that entails.

Since you're running Drupal, reCaptcha is also an option. I'd guess your current Captcha solution is too easy to break, isn't it? Also, how about enabling the blockquote tag? I see it isn't working...

Cover info in error

Jay Griffin

On the cover of the September issue, the bullet info for the Ultimate Linux Box at the bottom had an error. The cost showed "$4,0000" and was most likely meant to be "$4,000". I believe this was overlooked by the proof staff in their review before release. I would think that the cover, from a marketing standpoint, should be the most verified and looked-over part of the development stages prior to entering the internal editorial stage. I am sure it was just an oversight by the delivery team. In short, it was kind of an eye-catcher for me. I wonder if anyone else caught it?

Keep up the Great Linux coverage.

Nah mate ... perhaps you

Anonymous

Nah mate ... perhaps you should get out more ..

That junky keyboard is bent.

Anonymous

That junky keyboard is bent. You're missing several buttons on your mouse. That screen is too small. That box has too much excess trim. It lags...