Best of Technical Support

Our experts answer your technical questions.
Choosing the Right Commands

I'm trying to configure my Red Hat 6.0 system to allow clients to access CD-ROM images stored on my Linux server's hard drives. After looking at file-sharing systems such as Samba and NFS and commands such as MAKEDEV, vnconfig, mount and smbmount, I'm getting confused as to which combination of commands to use. —Mark J. Foucht,

Red Hat 6.0 uses knfsd, which works somewhat differently from the old user-space NFS server. One big difference is that you have to export each mounted file system individually in order for clients to see it (with the old server, you could simply export /, and clients would have a view of all your file systems).
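For example, if you had a second file system mounted at /home (a hypothetical mount point for illustration) that you also wanted clients to reach, each mount point would need its own line in /etc/exports:

```
/mnt/cdrom  (ro)
/home       (rw)
```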

In your case, if your CD-ROM is mounted under /mnt/cdrom, put the following in your /etc/exports file:

/mnt/cdrom (ro)

Then, type the following to migrate the entry to /var/lib/nfs:

moremagic:~# exportfs -av
exporting :/mnt/cdrom

To see if it worked, type:

moremagic:~# showmount -e localhost
Export list for localhost:
/mnt/cdrom (everyone)

To mount from another machine, type:

mkdir /mnt/remotecd
mount remotemachinename:/mnt/cdrom /mnt/remotecd
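If the client should mount the export automatically at boot, a line along these lines (using the same names as above) can go in the client's /etc/fstab:

```
remotemachinename:/mnt/cdrom  /mnt/remotecd  nfs  ro  0 0
```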

—Marc Merlin,

If your CD-ROM will be used by Windows computers, you should use Samba. Here is an entry you can add to your /etc/smb.conf file (every share needs a bracketed section name; the name is your choice):

[cdrom]
comment = CDROM
path = /mnt/cdrom
read only = yes
guest ok = yes
case sensitive = no
mangle case = yes
preserve case = yes

You should restart Samba after modifying the file. Just type as root:

/etc/rc.d/init.d/smb restart
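Before restarting, you can sanity-check the edited file with testparm, which ships with Samba, and afterward confirm the new share is visible (the -N flag skips the password prompt):

```shell
# Parse the configuration file and report any errors in the share definitions
testparm -s /etc/smb.conf
# After the restart, list the shares the server offers, without a password
smbclient -N -L localhost
```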

If you want to make the CD-ROM accessible to NFS users (UNIX computers), add the line /mnt/cdrom to your /etc/exports and restart the NFS daemon with /etc/rc.d/init.d/nfs restart. —Pierre Ficheux,

Memory Error, Parallel Computing

I am currently running Caldera OpenLinux 1.3 on a Compaq Presario CDS 526 (486 66MHz). Believe it or not, I had no trouble getting it loaded on my machine. I do have a problem with my RAM. When I run the free command, it shows I have only 15MB of memory, when I actually have 36MB. Why is this? Is it a problem that has been corrected in a more current kernel, or is it more of a hardware problem?

My next question concerns the world of parallel computing. I have a new computer on order (P3 500MHz), and when I get it, I will be installing Linux on it as well as the one mentioned above. I am interested in hobbying in the world of parallel computing, and I wondered if it would do any good trying to run parallel with a 500MHz machine and a 66MHz machine, or will the whole thing run slower? Thanks for your help. —John,

The 36MB you've mentioned is a rather “non-standard” amount of memory. Please use the dmesg command to see how much memory the kernel finds at boot time. —Mario Bittencourt,
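One way to check, assuming a Linux system (the exact boot-log wording varies by kernel version):

```shell
# The boot log records how much RAM the kernel detected at startup
dmesg | grep -i memory
# The same total is available at any time from /proc/meminfo
grep MemTotal /proc/meminfo
```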

There is a kernel option for limiting the memory to 16MB; maybe it is activated in your current kernel. You should recompile a new kernel without this option, found in “General Setup”:

Limit memory to low 16MB (CONFIG_MAX_16M) [N/y/?] N

—Pierre Ficheux,
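Alternatively, if the kernel is simply not detecting the RAM above 16MB (a common BIOS limitation on machines of that vintage), you can tell it the real amount at boot time with the kernel's mem= parameter rather than recompiling. With LILO, add an append line to the image section of /etc/lilo.conf (paths shown are typical, yours may differ) and rerun lilo:

```
image=/boot/vmlinuz
    label=linux
    append="mem=36M"
```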

On your second question, it will depend on how you do it. Think of a job jar, representing a problem decomposed into independent jobs. Each CPU grabs a job out of the jar when it's finished with the previous job. With good choices of job sizes, you win. If the jobs are too small, the extra communication and coordination overhead negates the gains from the slower CPU. If too large, the faster CPU will finish first and have to wait for the slower one to finish—and it may end up waiting longer than if it had done all the work itself. You may have to experiment to find good job sizes, though the obvious computation based on the two systems' relative speeds (here, roughly 500/(500+66), so about 88% of the work to the faster machine) should get you in the right neighborhood.
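The job-jar idea can be sketched with a toy shell script (an illustration of the scheduling pattern only, not of PVM): each job is a file in a directory, and a worker claims a job atomically with mv, so faster workers naturally take more jobs and no job is done twice.

```shell
#!/bin/sh
# Toy job jar: six jobs, two workers of different speeds.
mkdir -p jar done
for i in 1 2 3 4 5 6; do echo "job $i" > "jar/job$i"; done

worker() {       # $1 = worker name, $2 = seconds of simulated work per job
  for job in jar/*; do
    # mv is atomic: exactly one worker can claim a given job file
    mv "$job" "done/$(basename "$job").$1" 2>/dev/null || continue
    sleep "$2"
  done
}

worker fast 0 &      # stands in for the 500MHz machine
worker slow 0.2 &    # stands in for the 66MHz machine
wait
ls done              # each job appears once, tagged with who did it
```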

I'd recommend you start by looking into PVM, the Parallel Virtual Machine system. Find PVM at —Scott Maxwell,


