The Gemcutter's Workshop: Many Developments in the Ruby Community

Recapping two busy weeks' worth of news, events and releases in the Ruby world.


The past couple of weeks have been huge in the Ruby world. Several
popular Ruby packages saw major releases, and a number of interesting
posts appeared on blogs and the Ruby mailing list. Let's take a quick
look at the two weeks that were.

One package to keep an eye on is Mongrel, a much faster replacement for
WEBrick. Zed Shaw, who's also working on SCGI as an FCGI replacement,
seems to be picking up steam with Mongrel. He's added sendfile support
to an already zippy little Web server, along with stability improvements
and other bug fixes. It also looks as though IOWA might be supported by
Mongrel in the near future, joining the already supported Nitro and
Rails. For more information about Mongrel, head on over to
mongrel.rubyforge.net.

Another big release to consider is JRuby, a Ruby implementation in Java.
This version features a working irb implementation and several other
enhancements. The JRuby team says that it's getting close to having
Rails run on the platform as well. It will be interesting to see how
this project affects YARV, metaruby and Ruby itself. JRuby's
home page is the
place to go for more information about the project.

Last time,
I wrote about ZenTest. Since then, zenspider and
Eric Hodel have released
version 3.1.0 of ZenTest. It fixes the bugs I mentioned in my last
article and adds multiruby and many other new features, including
automatic syncing with your SCM repository under several
different SCM systems. If you're already using ZenTest, go grab this
update. If you're not using it, why not? More information can be found
here.

The biggest news from the past two weeks, however, has to be the release of
Rails 1.1. The Rails development team has pushed out over 500 changes and enhancements. Some
of the big-ticket items are RJS templates (templates that generate
JavaScript from Ruby), ActiveRecord++, respond_to and integration tests. A number
of people have voiced concerns about backward compatibility, but
Rails 1.1 looks to be a solid release, offering a lot of reasons to follow the upgrade
path. More details about Rails are available on its
Web site, and
specific information about this release can be found
here.

Staying with the Rails theme, two blog posts worth looking at came
from Eric Hodel, a programmer and systems administrator at the
Robot Co-op, creators of
43 Things and other Rails/Web 2.0 goodness. Eric posted
a review of the
hardware
design
behind 43 Things and answered a ton of questions in the comments. He
also wrote about the software
behind the Web site. If you'd like a peek behind the curtains at a
successful, fairly high-traffic Rails site, go take a look.

Mauricio Fernandez also made some great posts recently. One that
really caught my eye discussed using some code analysis metrics to
estimate the value of Ruby and the libraries that people have developed for
it. Mauricio sets the value of Ruby at $20 million, with another
$100 million for additional libraries. You can read his article
here.
One thing that really stood out to me was the number of high-quality,
well-designed packages that showed a lower value, such as RubyInline. It seems that simple,
expressive code doesn't stand up well in traditional analysis. Maybe
there's room to look at how to better evaluate code going forward--any
takers?

Last time around, I mentioned Canada on Rails. This time, I'd like to
touch on another recently announced gathering,
the St. Louis
CodeCamp
to be held May 6-7. The Web site and registration system
were developed in Rails by David Holsclaw of the stlouis.rb user group. If you're going
to be anywhere near St. Louis in early May, you might want to get involved.

Hal Fulton announced his recently published article on metaprogramming with Ruby.
Available here,
the article got some great reviews on the mailing list. Go check it out. If you're interested
in metaprogramming and other "Higher Order" programming constructs, you
might want to take a look at
James Gray's Shades of Gray blog. The bulk of the content there is
a running commentary on Gray's reading of Higher Order Perl,
written by Mark Jason Dominus.

Over on
comp.lang.ruby
(gatewayed to/from the ruby-talk mailing list and at least one forum),
a post pointed to a
blog
about the forthcoming plethora of Ruby and Ruby on Rails books. Between
formally announced and informally announced books, it looks
as though we'll soon be carrying around a heavy load of books. Fortunately, many of
these books are, or will be, available as PDFs.

Adventures in Ruby Programming

Because the community-related information went a bit long this week, I'm
going to shoot for a slight change of pace. Instead of talking about a
tool, I thought I'd tell you about the adventure that Sean Carley and I
have been having while working on our "checkr" program--think Ruby Lint.

At this point we're working Test-First through a spike to learn more
about ParseTree, which promises to
be the backbone of our code analysis. We live a couple of thousand
miles apart and can't really pair-program, so we decided to try "ping-pong
programming". Sean writes a unit test, and I write the code
to make it pass. Once I've got a passing test, I refactor and then write a
failing test. Then, the code goes back to Sean to repeat the cycle. We
spend a lot of time using IM to communicate as we're writing code and
tests, exploring ideas, asking questions and giving advice.

One thing that's really been interesting is how much this helps us focus
on the basics--taking small bites, practicing YAGNI (Ya Ain't Gonna Need
It) and letting the tests guide our development. For example, when we
first sat down to write code, we knew that we wanted to use
ParseTree to do the heavy lifting in code analysis. It ended up taking a
couple of tests before we got to the point that we were using it. Then,
in only one more test, we'd dug a level deeper and found we needed
SexpProcessor, a component within ParseTree, to start working with the code.

By the way, working with ParseTree and SexpProcessor has given me a new
appreciation for unit_diff. Let me show you why. Here's a silly little
Ruby snippet:


def foo
  if a == 2
    b = 2
  end
end

ParseTree turns that code (wrapped here in an Example class and an
example method) into a "sexp" (an
S-expression),
like this:


[[:class,
  :Example,
  :Object,
  [:defn,
   :example,
   [:scope,
    [:block,
     [:args],
     [:defn,
      :foo,
      [:scope,
       [:block,
        [:args],
        [:if,
         [:call, [:vcall, :a], :==, [:array, [:lit, 2]]],
         [:lasgn, :b, [:lit, 2]],
         nil]]]]]]]]]

Once you start working with bigger expressions, trying to find the
difference between the expected and actual values from a unit test can
make your head explode.
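To make that concrete, here's a small, self-contained sketch in plain Ruby (no ParseTree or unit_diff required; first_sexp_diff is a hypothetical helper I made up for illustration, not part of either tool). It walks two sexps and reports the path to the first mismatch, which is roughly the favor unit_diff does for you on big assert_equal failures:

```ruby
# Hypothetical helper (not part of unit_diff): walk two nested arrays
# ("sexps") and report the path to the first element that differs.
def first_sexp_diff(expected, actual, path = [])
  return nil if expected == actual
  if expected.is_a?(Array) && actual.is_a?(Array)
    [expected.length, actual.length].max.times do |i|
      diff = first_sexp_diff(expected[i], actual[i], path + [i])
      return diff if diff
    end
  end
  { path: path, expected: expected, actual: actual }
end

# Two almost-identical :if sexps, differing only in one literal:
expected = [:if, [:call, [:vcall, :a], :==, [:array, [:lit, 2]]],
            [:lasgn, :b, [:lit, 2]], nil]
actual   = [:if, [:call, [:vcall, :a], :==, [:array, [:lit, 3]]],
            [:lasgn, :b, [:lit, 2]], nil]

diff = first_sexp_diff(expected, actual)
p diff[:path]  # => [1, 3, 1, 1]
```

Eyeballing two twenty-line nested arrays for the one literal that changed is exactly the kind of work a tool should do instead.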

Putting the pieces together has helped Sean and me understand our problem
domain a lot better, which means that checkr will be a better tool.
We're writing the code to throw away, but the quality of the tools has
made our disposable code pretty nice, too. Here's an example: the
methods we use to look for assignments instead of comparisons in the test
clause of "if" or "unless" statements:


class ParseTest < SexpProcessor
  # Process the test clause of a conditional, raising if an
  # assignment (:lasgn) was seen anywhere inside it.
  def assignment_in_conditional?(exp)
    @saw_lasgn = false
    test_result = process(exp.shift)
    raise CheckRAssignmentInConditional if @saw_lasgn
    test_result
  end

  # Record that we've seen a local assignment, then rebuild the node.
  def process_lasgn(exp)
    @saw_lasgn = true
    s(exp.shift,            # :lasgn
      exp.shift,            # the variable name
      process(exp.shift))   # the assigned value
  end

  # An :if sexp is [:if, test, then-branch, else-branch].
  def process_if(exp)
    s(exp.shift,                        # :if
      assignment_in_conditional?(exp),  # the test clause
      process(exp.shift),               # the then branch
      process(exp.shift))               # the else branch
  end
end

The assignment_in_conditional? and process_lasgn methods are used by a
number of other methods.
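For the curious, here's a rough, ParseTree-free sketch of the same idea, so you can see the check in action without installing anything. The method names here are illustrative only; checkr's real code drives SexpProcessor as shown above:

```ruby
# Recursively look for an :lasgn node anywhere inside a sexp.
def contains_lasgn?(sexp)
  return false unless sexp.is_a?(Array)
  sexp.first == :lasgn || sexp.any? { |e| contains_lasgn?(e) }
end

def assignment_in_test_clause?(if_sexp)
  # An :if sexp is [:if, test, then-branch, else-branch];
  # index 1 is the test clause we care about.
  contains_lasgn?(if_sexp[1])
end

# `if b = 2` parses with an :lasgn in the test position...
bad  = [:if, [:lasgn, :b, [:lit, 2]], [:lit, 1], nil]
# ...while `if a == 2; b = 2; end` keeps the :lasgn in the body:
good = [:if, [:call, [:vcall, :a], :==, [:array, [:lit, 2]]],
        [:lasgn, :b, [:lit, 2]], nil]

p assignment_in_test_clause?(bad)   # => true
p assignment_in_test_clause?(good)  # => false
```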

Sometimes we find ourselves moving back and forth over the same code.
At one point, I'd finished writing the code to handle
the first and simplest test for a while loop. I immediately smelled
code duplication, so I performed an "extract method" refactoring
on one of two parallel methods in order to share the code. I then
wrote a test and checked everything in. Sean picked up the code, played
with my test for five or ten minutes and realized that he needed to undo my
refactoring. He did, made the test pass and then saw that he could do
the same refactoring--although a bit less radically than I had.

We also have made some interesting discoveries. For example, once we'd
finished doing some work with if statements, I added a test for a
parallel unless statement. It turns out that unless statements are
converted into if statements under the covers, and we ended up getting
that functionality for free.
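A quick way to convince yourself of the equivalence in plain Ruby (the method names below are mine, purely for illustration): an unless behaves exactly like an if with a negated test, which is why the parser can hand both back as the same node type.

```ruby
# `unless` is sugar for `if` with the test negated (or, equivalently,
# an `if` with the branches swapped), so these two methods must agree.
def via_if(a)
  if !(a == 2) then :taken else :skipped end
end

def via_unless(a)
  unless a == 2 then :taken else :skipped end
end

# Indistinguishable from the outside:
agree = (1..3).all? { |n| via_if(n) == via_unless(n) }
p agree  # => true
```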

In addition, it wasn't only our code that we learned about. We discovered a bug in
the most recent ParseTree gem, too. This meant we had to spend a day or two
on an educational detour through the internals of ParseTree and Ruby
itself. But that's a story for another time and place.

______________________

--
-pate
http://on-ruby.blogspot.com

Comments


value != cost


great post, Pat.

here go some thoughts that spawned from the question you proposed:
"Maybe there's room to look at how to better evaluate code going forward--any takers?"

the zillion man-hours invested in creating and developing Ruby are actually the *cost* of building it.
the value is an entirely different thing.

right to the point you made, the expressiveness and beauty of Ruby is what makes it possible to create better software with a smaller investment.

the larger the gap between investment/cost and value the *better*, precisely what makes us believe ruby is the better language for many applications.

so, maybe we should reformulate this question and analyse whether we're talking about pure-ruby applications, computer-assisted ruby code (like that which is partly produced by a generator), or hand-written libraries in C. (it's probably obvious for us that all of these have very different ratios of cost/value)

as for measuring *value* it's probably a hard thing to do, we can probably do it by comparing a ruby application to an equivalent application written in another language. (the "tenth of java" claims come to mind) but these are mostly all subjective valuations, not exactly measurements per se.

there's always an update


This morning, shortly after this went up on the Web site, Zed Shaw announced an even newer version of Mongrel. Version 0.3.12.1 offers:

  • more testing (cool stuff)
  • more features (like handling X-Forwarded-For headers)
  • more digits in the version number
