Yehuda Katz is a member of the Ember.js, Ruby on Rails and jQuery Core Teams; he spends his daytime hours at the startup he founded, Tilde Inc. Yehuda is co-author of the best-selling jQuery in Action and Rails 3 in Action. He spends most of his time hacking on open source—his main projects, like Thor, Handlebars and Janus, along with others—or traveling the world doing evangelism work. He can be found on Twitter as @wycats.
May 2nd, 2009
One of the things I love the most about the Ruby community is how easy it is to try out small mutations in practices, which leads to very rapid evolution in best practices. Rather than having the community look toward authority to design, plan, and implement “best practices” (a la the JSR model), members of the Ruby community try different things and rapidly refine those practices over time.
It is natural to assume, looking from the outside, that the proliferation of practices is dangerous or fracturing. It is not. Instead, it functions more like biological evolution, where small mutations conspire over time to refine and improve the underlying organism. Consider the example of testing. There are a number of testing frameworks used by Rubyists, but they have largely converging feature-sets. As the feature-sets converge on superior solutions (e.g. Rails’ flavor of Test::Unit now comes with RSpec-style declarative tests), another round of differentiation occurs, allowing the community to zoom in on the now smaller differences and letting evolution take its course.
The analogy isn’t perfect, but the basic idea is sound. It’s tempting to find the consolidated practices of today and shout them from the rooftops in a more official form, so that those who haven’t caught up yet can immediately select the “winner” practice without doing a detailed investigation. Further, some have suggested that we should rank Ruby firms by how well they conform with the most popular practices of the moment. This would allow those looking for a firm to hire to determine whether or not their potential hires conform with those practices.
Unfortunately, while that might work for a given slice in time, it provides unwelcome and artificial inertia for the practices of today. Now, in addition to having to contend with the normal inertial forces that resist changes until they are proven (wise forces), firms that want to try out new practices will need to contend with the artificial inertia imposed by being moved down on a list of firms conforming with other practices.
In effect, it creates a chilling effect on experimentation and innovation, and a drag on natural evolution.
It makes perfect sense to create a forum for sharing and aggregating the practices that people are finding useful at the moment. What makes less sense is creating a ranked list of “popular” practices, with no obvious mechanism for mediating differences except pure popularity. And even worse is ranking firms by their aggregate level of conformance.
As Rubyists, we need to resist artificial attempts to enforce conformance and discourage innovation. Rails shops should find other ways to advertise the quality of their practices without falling back on appeals to the masses, and those in the market for Rails services should do their due diligence. Measuring the popularity of a practice as a replacement for due diligence is frankly a recipe for failure, and once real investigations have been done, hollow measures of popularity won’t add much.