The New York Times cited Terra Nova this past week, highlighting Dan's tongue-in-cheek poke, the 2006 in Review. Yes, the internet is a murky place. Don't understand the problem? Surely you must get the solution. Don't grasp the solution? Just worry about the community, right?
Admittedly, I've been sceptical of e-crowds of late [1.]. After all, who cares if the Mechanical Turk could be used as "a community substitute": if you can't aggregate enough wise friends to your site, just hire them (see also [2.]).
My grumble could end here were it not for another internet-driven excitement bothering me: Ruby on Rails (RoR; Fn1). Sounds arcane? Consider it a metaphor for Web 2.0 and its symbiosis of people and technology, and I will explain my squeak.
There is a programming language called Ruby. It exemplifies (IMO) much of what is right about a programming language: unpretentious, succinct, and above all flexible. Much like a community, it embraces an openness of ideas and membership that serves it well. In any case, who can argue with its increasing favor?
Now, somewhere along the way bright folks got the idea of bundling Ruby with a number of other components - a web server and a database, say - in a way that made it much easier to use in applications suited to the language and to how folks were using it. RoR was born, and its attention was geared to those interested in rapidly creating interactive websites, that sort of thing.
When it comes to RoR, it is hard to quibble with the practicality of packaging Ruby with its commonly associated components. What is not to like about Domain-Specific Languages and Model-View-Controller (Fn2)? Especially in this day and age, software development should be moving toward conceptualizing solutions in terms of a full-stack framework. Future productivity demands it.
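To make the DSL point concrete, here is a toy sketch in plain Ruby - not actual Rails internals, and `ToyRecord`, `Post`, and the bodies below are my own illustrative inventions - of how Ruby's class-level "macros" let a model declaration read like a little language:

```ruby
# Toy sketch of an ActiveRecord-style declaration; NOT Rails internals.
module ToyRecord
  # A class-level "macro": called in a class body, it defines instance
  # behavior, which is what makes the declaration read as a DSL.
  def has_many(name)
    define_method(name) do
      @collections ||= Hash.new { |h, k| h[k] = [] }
      @collections[name]
    end
  end
end

class Post
  extend ToyRecord
  has_many :comments   # reads like configuration, executes like code
end

post = Post.new
post.comments << "first!"
post.comments  # => ["first!"]
```

Real Rails does far more behind such a declaration (SQL, caching, validation), but the flavor - declarative surface, metaprogramming underneath - is the same.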
Yet after casually deconstructing Instant Rails one weekend, I was struck by how deeply the database - and the limitations of object-relational mapping - was ingrained in the RoR view of the world. Which leads to a question: given an efficient technical paradigm, does that paradigm, for better or for worse, eventually find its reflection in the application to the exclusion of alternative solutions?
Or, to state it with more subtlety: if the advantage of an RoR-type solution is that (to use the lingo) it pushes the complexity of software to the edge cases (in other words, it is really easy to do the usual stuff and really hard to do the unusual stuff), then at what point need one worry about what one is missing by clipping the edges?
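The "clipped edges" worry can be sketched in a few lines of Ruby. The mapping below is hypothetical - my own names and schema, not Rails code - but it shows the shape of the problem: the conventional case maps an object onto a row one attribute per column, while an only slightly unusual case (an item holdable by either a player or a chest) forces relational bookkeeping, a type-discriminator column, back into the domain model.

```ruby
# Sketch only: a hypothetical hand-rolled mapping, not a real ORM.

# The usual case: flat attributes map one-to-one onto columns.
def row_for_player(player)
  { "name" => player[:name], "score" => player[:score] }
end

# The edge case: polymorphic ownership. The object model has one
# reference (holder), but the relational side needs two columns to
# express it, and that schema decision now leaks into how the
# application models "holding" at all.
def row_for_item(item)
  {
    "name"        => item[:name],
    "holder_type" => item[:holder][:kind],  # e.g. "Player" or "Chest"
    "holder_id"   => item[:holder][:id]
  }
end

sword = { name: "sword", holder: { kind: "Player", id: 7 } }
row_for_item(sword)
# => {"name"=>"sword", "holder_type"=>"Player", "holder_id"=>7}
```

The framework makes the first function free and the second merely conventional; the designs it quietly discourages are the ones with no tidy row shape at all.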
I suspect a similar issue with many virtual-world implementations. For example, by relying upon the database for all of the persistence and transactional heavy lifting, the game/world becomes shaped to its way of thinking. How many times have you played an online game and thought: they structured an interaction or an object design that way because it was easier to map into the server scripting and, ultimately, the database?
Think about those edges you are missing. I'll think about the ruby slippers I wish I had.
[1.] "Where's the catch." (TN) On "Henry Jenkin. Collective Intelligence vs. The Wisdom of Crowds. "
[2.] "Games with a purpose." (TN) On Luis von Ahn, social algorithms.
[3.] "Guess my game." (TN) On the "Vietnam of Computer Science" and the "Grammar of Gameplay."
Fn1. See also rubyonrails.org.