Modeling the Brain with NCS and Brainlab

Beowulf Linux clusters and Python toolkits team up to help scientists understand the human brain.
Brainlab

My first thought for a scripting language was MATLAB, given its prominence in our lab. But repeated licensing server failures during critical periods had soured me on MATLAB. I considered Octave, an excellent open-source MATLAB work-alike that employed the same powerful vector processing approach. I generally liked what I saw and even ported a few MATLAB applications to work in Octave in a pinch. I was pleased to find that the conversions were relatively painless, complicated only by MATLAB's loose language specification. But I found Octave's syntax awkward, which was no surprise, because it was largely inherited from MATLAB. My previous Tcl/Tk experiences had been positive, but there didn't seem to be much of a scientific community using it. I had done a few projects in Perl over the years, but I found the code hard to read and easy to forget.

Then I started working with Python on a few small projects. Python's clean syntax, powerful and well-designed object-oriented capabilities and large user community with extensive libraries and scientific toolkits made it a joy to use. Reading Python code was so easy and natural that I could leave a project for a few months and pick it up again with barely any delay in figuring out where I had left off. So I created the first version of Brainlab using Python.

In Brainlab, a brain model starts as a Python object of the class BRAIN:

from brainlab import *
brain=BRAIN()

This brain object initially contains a default library of cell types, synapse types, ion channel types and other types of objects used to build brain models. For example, the built-in ion channel types are stored in a field of the BRAIN class named chantypes. This field is actually a Python dictionary indexed by channel name, and it can be viewed simply by printing it:

print brain.chantypes
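
An individual entry can be examined by indexing the dictionary with a channel name; for example, this prints the built-in type named ahp-2:

print brain.chantypes['ahp-2']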

A new channel type named ahp-3, based on the standard type named ahp-2, could be created, modified and then viewed like this:

nc=brain.Copy(brain.chantypes, 'ahp-2', 'ahp-3')
nc.parms['STRENGTH']="0.4 0.04"
print brain.chantypes['ahp-3']

To build a real network, the brain must contain some instances of these structures and not only type profiles. In NCS, every cell belongs to a structure called a cortical column. We can create an instance of a simple column and add it to our brain object like this:

c1=brain.Standard1CellColumn()
brain.AddColumn(c1)

This column object comes with a set of default ion channel instances and other structures that we can easily adjust if necessary. Most often, we have a group of columns that we want to create and interconnect. The following example creates a two-dimensional grid of columns in a loop and then connects the columns randomly:

from random import randint, random   # standard-library RNG (if brainlab doesn't already provide these)

cols={}
size=10
# create the columns and store them in cols{}
for i in range(size):
    for j in range(size):
        c=brain.Standard1CellColumn()
        brain.AddColumn(c)
        cols[i,j]=c
# now connect each column to another random column
# (using a default synapse)
for i in range(size):
    for j in range(size):
        ti=randint(0, size-1)
        tj=randint(0, size-1)
        fc=cols[i,j]; tc=cols[ti,tj]
        brain.AddConnect(fc, tc)

Our brain won't do much unless it gets some stimulus, so we can define a set of randomly spaced stimulus spike times in a Python list and apply it to the first row of our column grid like this:

# build a list of 20 randomly spaced spike times
t=0.0
stim=[]
for s in range(20):
    t+=random()*10.0
    stim.append(t)
# apply the stimulus to each column in the first row of the grid
for i in range(size):
    brain.AddStim(stim, cols[i,0])

Simulating the Models

So far, our brain model exists only as a Python object. In order to run it in an NCS simulation, we have to convert it to the text input file that NCS demands. Brainlab takes care of this conversion; simply printing the brain object creates the corresponding NCS input text for that model. The command print brain prints more than 3,000 lines of NCS input file text, even for the relatively simple example shown here. More complicated models result in even longer input files for NCS, but the program version of the model remains quite compact.

By changing only a few parameters in the script, we can create a radically different NCS input file. The experimenter can save this text to a file and then invoke the NCS simulator on that file from the command line. Better yet, he or she can simulate the model directly within the Brainlab environment without even bothering to look at the intermediate text, like this: brain.Run(nprocs=16).
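
For the save-to-file route, a minimal sketch is to capture the text that print brain displays with Python's str() and write it out (the file name testbrain.in here is just an example):

# capture the NCS input text (the same text that "print brain" displays)
ncstext=str(brain)
# write it to a file that can then be handed to the NCS simulator
f=open('testbrain.in', 'w')
f.write(ncstext)
f.close()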

The Run() method invokes the brain model on the Beowulf cluster using the indicated number of processor nodes. Most often, an experiment is not simply a single simulation of an individual brain model. Real experiments almost always consist of dozens or hundreds of simulation runs of related brain models, with slightly different parameters or stimuli for each run. This is where Brainlab really shines: creating a model, simulating it, adjusting the model and then simulating it again and again, all in one integrated environment. If we wanted to run an experiment ten times, varying the synapse conduction strength on each run and assigning each run a different job number so that we could examine all the reports later, we might do something like this:

for r in range(10):  # r is run number
    s=brain.syntypes['C.strong']
    s.parms['MAX_CONDUCT']=.01+.005*r
    brain.parms['JOB']='testbrain%d'%r
    brain.Run(nprocs=16)
