Fabric: a System Administrator's Best Friend

Do you routinely make changes to more than a dozen machines at a time? Read this article to find out about a tool to make that task much easier.

I'll be honest. Even though this library is fully five years old, I hadn't heard of Fabric until about six months ago. Now I can't imagine not having it in my digital tool belt. Fabric is a Python library/tool that is designed to use SSH to execute system administration and deployment tasks on one or more remote machines. No more running the same task, machine by machine, to make one change across the board. It is a simple fire-and-forget tool that will make your life so much easier. Not only can you run simple tasks via SSH on multiple machines, but since you're using Python code to execute items, you can combine it with any arbitrary Python code to make robust, complex, elegant applications for deployment or administration tasks.

Installation

Fabric requires Python 2.5 or later, the setuptools packaging/installation library, the ssh Python library, and SSH and its dependencies. For the most part, you won't have to worry about any of this, because Fabric can be installed easily through various package managers. The easiest and most common way to install Fabric is with pip (or easy_install). On most systems, you also can use your system's package manager (apt-get, yum, and so on) to install it (the package will be either fabric or python-fabric). If you're feeling froggy, you can check out the git repository and hack away at the source code.

Once installed, you will have access to the fab script from the command line.
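
To give a feel for how the pieces fit together, here is a minimal sketch of a fabfile, written against the Fabric 1.x API covered in this article; the host names and the task name are just placeholders:

from fabric.api import env, run

# Placeholder hosts; they also can be supplied on the command line with -H.
env.hosts = ['web1.example.com', 'web2.example.com']

def uptime():
    """Report how long each remote host has been up."""
    run('uptime')

Save that as fabfile.py and run fab uptime: Fabric connects to each host over SSH in turn and executes the command (fab -H some.other.host uptime overrides the host list).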

Operations

The Fabric library is composed of nine separate operations that can be used in conjunction to achieve your desired effect. Simply insert these functions into your fabfile and off you go (a sketch of a fabfile that combines several of them follows this list):

  • get(remote_path, local_path=None) — get allows you to pull files from the remote machine to your local machine. This is like using rsync or scp to copy a file or files from many machines. This is super effective for systematically collecting log files or backups in a central location. The remote path is the path of the file on the remote machine that you are grabbing, and the local path is the path to which you want to save the file on the local machine. If the local path is omitted, Fabric assumes you are saving the file to the working directory.

  • local(command, capture=False) — the local function allows you to take action on the local host in a similar fashion to the Python subprocess module (in fact, local is a simplistic wrapper that sits on top of the subprocess module). Simply supply the command to run and, if needed, whether you want to capture the output. If you specify capture=True, the output will be returned as a string from local; otherwise, it will be sent to STDOUT.

  • open_shell(command=None) — this function is mostly for debugging purposes. It opens an interactive shell on the remote end, allowing you to run any number of commands. This is particularly helpful if you are running a series of complex commands and they don't seem to be working on some of your machines.

  • prompt(text, key=None, default='', validate=None) — in cases where you need to supply a value but don't want to specify it on the command line for whatever reason, prompt is the ideal way to do it. I have a fabfile I use to add/remove/check the status of software on all of the servers I maintain, and I use this in the script for when I forget to specify what software I'm working on. This prompt will appear for each host you specify, so make sure you account for that!

  • put(local_path, remote_path, use_sudo=False, mirror_local_mode=False, mode=None) — this is the opposite of get, although you are given more options when putting to a remote system than when getting. The local path can be a relative or absolute file path, or it can be an actual file object. If either local_path or remote_path is left blank, the working directory will be used. If use_sudo=True is specified, Fabric will put the file in a temporary location on the remote machine, then use sudo to move it from the temporary location to the specified location. This is particularly handy when moving system files, such as /etc/resolv.conf, that can't be modified by a standard user when you have root login turned off in SSH. If you want the file mode preserved through the copy, use mirror_local_mode=True; otherwise, you can set the mode using mode.

  • reboot(wait=120) — reboot does exactly what it says: reboots the remote machine. By default, reboot will wait 120 seconds before attempting to reconnect to the machine to continue executing any following commands.

  • require(*keys, **kwargs) — require forces the specified keys to be present in the shared environment dict in order to continue execution. If these keys are not present, Fabric will abort. Optionally, you can specify used_for to indicate what the key is used for in this particular context.

  • run(command, shell=True, pty=True, combine_stderr=True, quiet=False, warn_only=False, stdout=None, stderr=None) — this and sudo are the two most-used functions in Fabric, because they actually execute commands on the remote host (which is the whole point of Fabric). With run, you execute the specified command as the given user. run returns the output of the command as a string, which can be checked for failed, succeeded and return_code attributes. shell controls whether a shell interpreter is created for the command; if it is turned off, characters will not be escaped automatically in the command. Passing pty=False prevents a pseudo-terminal from being created for the command; this can help if the command you are running has issues interacting with a pseudo-terminal, but otherwise, one is created by default. If you want stderr from the command to be parsable separately from stdout, pass combine_stderr=False. quiet=True causes the command to run silently, sending no output to the screen while executing. Normally, when a command fails, Fabric aborts the script and says so; you can tell Fabric not to abort on a failing command by using the warn_only argument. Finally, you can redirect the remote command's stdout and stderr on the local side. For instance, if you want the remote stderr piped to stdout on the local end, you could indicate that with stderr=sys.stdout.

  • sudo(command, shell=True, pty=True, combine_stderr=True, user=None, quiet=False, warn_only=False, stdout=None, stderr=None, group=None) — sudo works precisely like run, except that it will elevate privileges prior to executing the command. It basically works the same as if you had run the command with run but prepended sudo to it. sudo also takes user and group arguments, allowing you to specify which user or group to run the command as. As long as the original user has permission to escalate to that particular user/group for that command, you are good to go.
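
To tie these operations together, here is a sketch of a fabfile that exercises several of them. It assumes the Fabric 1.x API described above, and the host names, file paths and package names are only placeholders:

from fabric.api import (env, get, local, open_shell, prompt, put,
                        reboot, require, run, sudo)

# Placeholder hosts.
env.hosts = ['web1.example.com', 'web2.example.com']

def collect_logs():
    """Pull each host's syslog into a local, per-host file."""
    # Fabric expands %(host)s in the local path to the current host name.
    get('/var/log/syslog', 'logs/%(host)s/syslog')

def push_resolv_conf():
    """Copy a new resolv.conf into place, using sudo for the final move."""
    put('resolv.conf', '/etc/resolv.conf', use_sudo=True, mirror_local_mode=True)

def install(package=None):
    """Install a package, prompting for its name if it wasn't given."""
    if package is None:
        package = prompt('Package to install: ')
    # Assumes a Debian/Ubuntu-style apt-get; adjust for your distribution.
    sudo('apt-get -y install %s' % package)

def kernel_upgrade():
    """Upgrade the kernel, reboot, and record the running version."""
    require('hosts', used_for='running remote commands')
    sudo('apt-get -y install linux-image-generic')
    reboot(wait=180)
    result = run('uname -r', warn_only=True)
    if result.failed:
        print('Could not read the kernel version on %s' % env.host_string)
    else:
        # Append to a local file, just to show local() in action.
        local('echo "%s is running kernel %s" >> reboot-report.txt'
              % (env.host_string, result))

def debug_shell():
    """Drop into an interactive shell on each host when a task misbehaves."""
    open_shell()

Each function becomes a task you can invoke by name, for example fab collect_logs or fab install:package=htop, and Fabric runs it against every host in env.hosts.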

______________________

Comments

Running arbitrary command trick

Athmane's picture

Fabric can be used to run an arbitrary command with:

fab -u username -H hostname -- 'uname -a'

I usually just do something

jbowen7's picture

I usually just do something like the following:

for i in $SERVER1 $SERVER2 $SERVER3; do
    scp myTasks.sh root@$i:/tmp
    ssh root@$i '/bin/bash /tmp/myTasks.sh ; [[ $? != 0 ]] && echo "stuff broke" || rm -f /tmp/myTasks.sh'
done

A good alternative

Anonymous's picture

I've started using 'Salt' for this purpose. It handles all the RSA key authentication automatically, allows for remote execution of scripts, and also offers a higher-level configuration management system.
http://saltstack.com/community.html

Perl alternative

Anonymous's picture

There is a similar Perl alternative around, which I prefer for such tasks, as the Perl syntax is just simpler. It's called (R)?ex or just "Rex".

http://rexify.org/

Nice, thanks for the tip.

anti ddos's picture

Nice, thanks for the tip. Looks interesting.

Another Approach To This Problem

A. Coder's picture

I wrote something along these lines some time ago, but packaged it as a turnkey utility:

http://www.tundraware.com/Software/tsshbatch

pssh!

JonnoN's picture

Been using pssh for quite a while to do just this!

http://code.google.com/p/parallel-ssh/

And since I'm using Bash to execute items, I can combine it with any arbitrary Bash code to make robust, complex, elegant applications for deployment or administration tasks. :)

Better tool available

Anonymous's picture

I have found CSSH (Cluster SSH) to do this task quite well. Plus, it seems a lot simpler to use.

?

Anonymous's picture

Hello there.

I believe that almost every Linux admin has built something like that tool set at some point in his or her career, either with a python, ruby, or (k,z,c,ba,)sh script...

But what I find good about this is that it's not being kept closed in a "box". :)

Good job, and thank you for doing this.

Rgds,

Me:)
