At the Forge - Fixtures and Factories
One of the points of pride in the Ruby community is the degree to which developers are focused on testing. As I wrote last month, tests in a dynamic language can catch more errors and keep your code trim and functional better than even the best compilers. Rails developers are used to working with three different types of tests: unit (for database models), functional (for controller classes) and integration (for testing things from a user's perspective). Combined with coverage and analysis tools, such as the metric_fu gem I described last month, these tests can help ensure that your code is as solid as possible before it is seen by the general public.
Testing your code requires that you provide it with inputs and that you then match those inputs with expected outputs. When it comes to a Web application, those inputs most likely will come from either a relational database or from a user's form submission. Testing form submissions is not particularly difficult, especially in a framework such as Rails, which has extensive testing support built in. Testing data that comes from a database, however, can be a bit more challenging, because it means that you must somehow store the data in the database so that the tests can access it.
One possible solution, of course, is to pre-populate the database tables with test data directly. But, as simple and obvious as that solution might appear at first glance, it assumes that you have a source from which you can pre-populate the database. You could do it by hand, but then you'll find that any modifications your program makes to the database—creating, updating and deleting rows—either will stay in effect for the next test or will need to be reloaded from scratch from another source.
In other words, you need a way to put the test database into a known state before you begin your tests. If you know this beginning state, you can write tests that check subsequent states.
The question is, how do you create that initial state? From the time that Rails was first released, the answer was fixtures—text files containing YAML-formatted hand-crafted data. Fixtures are nice, but as a number of Rails developers have written over the years, they can be hard to write, hard to keep track of and generally brittle.
This month, I take a look at the current state of loading data into a test database. I start by examining fixtures, exploring some ways you still might be able to make them useful inside your tests. Then, I cover a newer approach to test data, known as factories, looking at the Factory Girl gem and then taking a quick peek at the Machinist gem, both of which are in widespread use among Rails developers and might be a better fit than plain-old fixtures for your project.
Fixtures, as I mentioned above, are YAML files containing data that can be loaded into a database. Rails actually allows you to put your fixture data in formats other than YAML, such as CSV. However, my guess is that CSV is mostly unused, and that YAML is the format used by almost everyone working with fixtures.
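For example, a fixture file for a people table (conventionally test/fixtures/people.yml) might contain something like the following. The record names and data here are hypothetical; each top-level key names one row, and its nested keys map to table columns:

```yaml
# test/fixtures/people.yml -- hypothetical fixture data
alice:
  first_name: Alice
  last_name: Anderson
  email: alice@example.com

bob:
  first_name: Bob
  last_name: Brown
  email: bob@example.com
```

Inside a test, Rails loads each named record into the table, and you can refer to a row by its fixture name (for example, people(:alice)).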
I created a simple Rails application (using SQLite) on my computer with:
rails --database=sqlite3 appointments
Then, I generated a RESTful resource for people:
./script/generate scaffold person \
    first_name:string last_name:string email:string
This not only created a model for working with people, but also a controller for handling the basic RESTful functions, views for all of those controller actions, a database migration that uses Ruby to describe my model and even some rudimentary tests. I can apply the database migration with:

rake db:migrate
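For reference, the migration that the scaffold generator wrote looks roughly like this sketch (the timestamped filename and exact contents vary with the Rails version):

```ruby
# db/migrate/xxx_create_people.rb -- sketch of the generated migration
class CreatePeople < ActiveRecord::Migration
  def self.up
    create_table :people do |t|
      t.string :first_name
      t.string :last_name
      t.string :email

      t.timestamps    # adds created_at and updated_at columns
    end
  end

  def self.down
    drop_table :people
  end
end
```

Running rake db:migrate executes the up method of each pending migration, bringing the database schema in line with the models.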
And, voilà! I now have a working application that allows me to add, delete, modify and list a bunch of people. You might have noticed that I named my Rails application appointments. My plan is to create a very simple appointment calendar, so that I can keep track of with whom I'll be meeting. So, I create another resource, named meetings:
./script/generate scaffold meeting \
    starting_at:timestamp ending_at:timestamp location:text
(It should go without saying that if I were creating this for real, I would not store the location as a text field, but rather as an ID pointing to another table of locations. Keeping data in such normalized form, so that the text appears in a single place and is referred to from elsewhere in the database using foreign keys, makes the application more robust, as well as more efficient.)
Finally, I create a third table, meeting_person, which allows one or more people to participate in a meeting. If I were willing to restrict appointments to a single participant (or two participants, if I include the person using this software), I simply could have a person_id field in the meetings table. Instead, I create a new model for the join table:
./script/generate model meeting_person \
    person_id:integer meeting_id:integer
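Because MeetingPerson joins the other two tables, the conventional Rails declarations for this kind of many-to-many relationship use has_many :through. The full models appear in Listing 1; a sketch of the association portion looks like this:

```ruby
# Sketch of the association declarations (see Listing 1 for the
# actual models): a many-to-many relationship through MeetingPerson.
class Person < ActiveRecord::Base
  has_many :meeting_people
  has_many :meetings, :through => :meeting_people
end

class Meeting < ActiveRecord::Base
  has_many :meeting_people
  has_many :people, :through => :meeting_people
end

class MeetingPerson < ActiveRecord::Base
  belongs_to :person
  belongs_to :meeting
end
```

With these declarations in place, person.meetings and meeting.people traverse the join table automatically.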
Now that the three models are in place, I can add associations—those declarations in the model classes that link them to one another. While I'm editing the model, I also will add some validations, which ensure that the data fits my standards. The final version of the models is shown in Listing 1. Perhaps the only particularly interesting part of the models is the custom validation that I placed in the Meeting model:
def validate
  if starting_at > ending_at
    errors.add_to_base("Starting time is later than ending time!")
  end
end
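The same start/end check can be exercised outside of Rails in plain Ruby, which makes the logic easy to see in isolation. MeetingTimes below is a hypothetical stand-in for the Meeting model, not part of the application:

```ruby
require 'time'

# Plain-Ruby sketch of the custom validation: a meeting is invalid
# if its starting time comes after its ending time.
class MeetingTimes
  attr_reader :errors

  def initialize(starting_at, ending_at)
    @starting_at = starting_at
    @ending_at   = ending_at
    @errors      = []
  end

  # Mirrors the validate method above: record an error message when
  # the meeting starts after it ends, then report overall validity.
  def valid?
    @errors.clear
    if @starting_at > @ending_at
      @errors << "Starting time is later than ending time!"
    end
    @errors.empty?
  end
end
```

In a real model, Rails invokes the validate method for you whenever you call valid? or save; here, the check must be triggered explicitly.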