It's been some time since I haunted the distinguished halls of TN, but after some tumultuous times that got me out of the habit of putting my working papers up on SSRN and pointing to them here (and at my own blog), I do have a piece that I wanted to share (and I'll be cross-posting this to Doubt is an Art, as I do with all game-related stuff). I'm sure my skin has grown thin from all this time away from the rough-and-tumble world of collaborative blogging. Be gentle. ;)
Last year I had the opportunity to give the keynote address in February at the Ray Browne Conference on Popular Culture at Bowling Green State University, as well as to participate in a symposium in April convened by the Potomac Center for the Study of Modernity on Modernity and Chance. Both venues seemed apt arenas for developing some ideas about game as a cultural form, one that we could place alongside ritual and bureaucracy in our understanding of institutions and the techniques for control at their disposal. The core question I'm asking is: What might we learn by examining the increasing use of games by modern institutions in the digital age as analogous to their longstanding and effective use of rituals and bureaucracy?
These are not new ideas in my work (and especially not in some posts I've done here), but I had long planned to set down my ideas about these cultural forms and their relationships to each other in one place, mostly for my own satisfaction (for that reason, too, I am sure that many of the examples will strike readers of my work as familiar ones). The paper is still in a working state, without citations and still set up for oral presentation (it includes many of the images from these occasions, which I do not plan on including in the final publication). You should be able to download the paper here. Here is the abstract:
The projects of governance at the heart of state and other institutional control under the context of modernity have been marked by a heavy reliance on two cultural forms, ritual and bureaucracy, each of which organizes action and meaning through distinctive invocations of order. The steady rise of liberal thought and practice, particularly in the economic realm (following, if partially, Adam Smith) has gradually challenged the efficacy of these cultural forms, with open-ended systems (more or less contrived – from elections to the “free” market) exerting more and more influence both on policy and in other areas of cultural production. It is in this context that games are becoming the potent site for new kinds of institutional projects today, whether in Google’s use for some time of its Image Labeler Game to bring text searchability to its image collection, or in the University of Washington’s successful deployment of the game Fold-It to find promising “folds” of proteins for research on anti-retroviral vaccines.
But even as they are so used, we can see how these contrived, open-ended mechanisms create new challenges to the structure of the very modern institutions which would seek to domesticate and deploy them. While a longstanding example would be Hitler’s unsuccessful use of the 1936 Berlin Olympic Games’ results as part of his project of political legitimization, digital networking technology is making new and more complex gambits of this sort possible today. Linden Lab, makers of the virtual world Second Life, found itself in a state of organizational contradiction as it sought to architect, from the top down, a game-like space premised (and sold) on a playful ideal of user freedom and control. Google’s recent and reluctant turn to curators for certain search terms also reflects the limits of its previous attempts to continually refine its algorithms so as to let search results reflect perfectly the aggregate actions of web users. In all of these cases we see that a turn toward open-ended, game-derived mechanisms (which often mirror the market) generates paradoxes for those who seek to leverage the potency of games for generating meaningful outcomes.
In this process digital technology has played an important role as well, making the use of games possible at a scale vast in both scope and complexity, while subtly changing what a useful conception of games would even be – one that could account for the game-like elements now proliferating in much of our increasingly digital lives. From this twenty-first century vantage point, what may we learn by setting the cultural form of game against these other cultural forms, with attention to their shared and distinctive features? By considering what has changed to make the domestication (as it were) of games possible, and also reflecting on how these other forms have been put to work by institutions, we can begin to chart the landscape ahead for games and institutions under the context of modernity and ask key questions about the issues of policy and ethics it raises.
Any comments welcome, of course. It feels good to be back.
Holy smokes, but someone did a real mapping test using SimCity and his hometown's traffic. It's not exactly rigorous science, but this is the sort of application I've been hoping to see since writing this thing.
Ha, got your attention, eh? But this is actually an important topic, of the 'reality is broken' variety. Like the fact that we're obsessed over sexting and other digital phenomena related to sex, yet we have done little to improve sex education in this country. In fact, we have vilified and cut funding to Planned Parenthood and other organizations that save people's lives by providing them critical information that affects them physically, emotionally, spiritually. I ranted about this on Quora recently:
Sexting isn't the issue. The lack of good, ongoing and honest sex education and support (from elementary school) is. I personally don't care what kids do in this regard, as long as they are well-informed and not succumbing to pressure from peers or romantic partners. And obviously this behavior is probably not appropriate in classrooms.
As an anthropologist, I will tell you that sex play in early pubescence and later is very, very normal, and in some cultures, very well tolerated with positive effect. We are extremely backwards in our proclivity to bury our heads in the sand.
I do, however, think all kids need to be educated on the potential ramifications of having a digital trail of activity like this, and what it can mean in terms of reputation (immediately) and career later. It's outrageous that kids learn most of what they know about sex from each other, tv/movies and the Internet. This leaves gaping holes in their knowledge and their judgment about something that can affect their lives in such profound ways, and can even lead to illness and death.
I've been thinking for some time that games could play an important role in helping to eliminate a lot of the misinformation that is spread among kids and teens. Typically this sort of thing is handled à la the serious game: take some existing curriculum, the sort of thing you'd see in a high school sex ed class (if a school is lucky enough to have one), and gamify it. 'Chocolate-covered broccoli': seldom about the realities of sex and the social and emotional contexts that surround it. And boring.
One of my favorite sites is Go Ask Alice!, a community effort from Columbia University that provides a forum for kids and teens to ask any question about sex, drugs, what have you, and get a truthful and reasonable answer. I think it's an incredible resource, but most people don't seem to know about it. So where are people getting their sex education?
I was really inspired by a TED Talk a lovely woman by the name of Cindy Gallop gave not long ago. You should watch it yourself, otherwise I might ruin it, but I will tell you that she makes some rather stunning points about how porn culture has distorted the way people think about sex. Clearly we need to figure out some better ways to communicate all of this, rather than ignoring the groundswell of sexual activity that is incredibly normal for our species.
There's rather a dearth of recreational, digital sex games, a fact that surprises me given the proclivity of clever porn mongers for hawking every kind of sex ware imaginable. They have, throughout history, seized on any available technology. It's well established, for instance, that early photography and film thrived on sexual innovations. And we certainly spent a lot of time discussing the ins and outs of cybersex back in the day, when everything digital was a novelty. Are we jaded? Or is it recession economics?
Well, it seems like a business opportunity to me. They appear to sell plenty of books and board games in those novelty sex shops. People could certainly use some variety in their sex lives. Yet the ecosystem somehow manages to eschew innovation, just like the video game industry. Microsoft, for instance, is blocking sexual uses of their Kinect device, citing 'unintended purposes' (imagine a mash-up of a Kinect device and teledildonics - long distance love, FTW!). I did find A-Chat, but it seems like a graphics-enhanced chat room app, and that's boring, too... I suppose there's the seedy underbelly that is Second Life's sex subculture, but it seems, well, seedy. And not terribly educational. But if people are into it, great. Let's just have some other options.
Most sex-and-videogames conjecture has been about either glorifying or bastardizing sexual content. There are few balanced perspectives: Brenda Brathwaite's work is very insightful, and Bonnie Ruberg has made quite a few contributions, too. That's sort of not the point I'm trying to make, though. Sure, we could be more mature about sex and sexual imagery in games. But I sort of don't care about that stuff. I want us to ask, yet again: how can this incredible platform for persuasion be used for the greater good? How can we inform people, encourage safe play and experimentation... delight with escapism, encourage fantasy and role-playing... do all the things that we know video games are so good for?
So, brilliant Terra Novans... what games would you design to solve this problem?
It's that time again... the persistent rush at the beginning of each new cycle of time to reflect and predict. Well, we like that sort of thing around here. Sometimes we're right, sometimes wrong. But we're always trying to draw out our inner oracles...
My 2011 (and onward) predictions:
- our small people will continue to overrun our Facebook accounts as they fiend for more and more digital bling, especially since Facebook apparently doesn't let kids under 13 have their own accounts. I will continue to shell out the credit card for $10 of 'presents' for my kid's best gamer friends. Perhaps this economic boom will fuel the 'maybe we will survive this media change!' mentality.
- the fantasy MMO reaches saturation levels except for the truly committed. This is not a lore problem, but a pattern matching one. Expect regeneration in 5-10 years or when the new LOTR movie comes out. Oh wait. Guild Wars 2. Does war count as fantasy?
- more 'brand-affirming' virtual worlds. Some might be good.
- more alternative/augmented reality and transmedia MMOs (mobile plus tv plus Kinect plus books plus movies plus 3D-everything). More and more exodus.
- more sci fi, speculative fiction, near term possibility exploration (simulation, as predicted by Ted eons ago)
- Is the MMO inside out yet? More and more I find myself gaming with people like my ex mother-in-law (lovely woman, not a gamer of any description tho!)
- More worlds, fewer games? (does Facebook count as a world?)
- The phrase 'casual gaming' will die as everyone begins to game, casually and otherwise. Already so in South Korea (I find it useful to consider parts of Asia as possible reflections of our future(s)).
- The gaming industry will more fully begin to fund and rely on research.
- Singularity?
There are far too many of my interests resurrected in this post. Please add your favorite memes and join me in documenting our predictions! (how will we otherwise remember?)
While wandering through Mass Effect 2, I was struck by the vitality of the world. Circa 2004, the main attraction of a multiplayer environment relative to single-player worlds was that single-player worlds felt dead. Multiplayer, on the other hand, had vitality but also the annoyances of dealing with other people and their inevitable failure to be perfect friends, or perfect foils. That problem can be reduced by Social Engineering (SE): designers use policy (sometimes enforced by code) to optimize an individual's experience when dealing with others. Judging from ME2, the problem of dead single-player worlds can be addressed successfully using a suite of tools involving digital storytelling, emotive animations, deep conversation scripts, and a strong responsiveness of the emotive/relational space of characters to the protagonist's actions. Altogether, let's call this bag of tricks "Artificial Emotion" or AE. It's not a new term; indeed, Professor Turkle has paved the way here, as before.
As the market for fantasy evolves, these two approaches to improving happiness seem to be facing off.
Would humans be happier in an environment constructed just for them, where every other being behaves just so, even though none of those beings are actually people? Or would they be happier to live with other real people in an optimally designed social environment? Is it easier to improve AE or SE? The issues extend far beyond the game industry. In this area as in others, the game industry is charting territory that business and governments will deal with soon enough. If developments in SE dominate those in AE, look for a future of massively-linked online communities whose policies produce much more happiness than offline communities. If AE wins, look for a future involving isolation pods. Most likely, we will have both. As for offline existence, SE advances might translate into better governance in the real world - better companies, better neighborhoods, better schools. AE advances seem less likely to help the offline world.
Life in the soon-to-be-launched Old Republic may combine the best SE and the best AE in one world.
The economy continues to move slowly and economists seem as uncertain as ever about the causes and what to do. Months ago, I began to wonder – could this possibly be the first “exodus recession”? In my first book I sketched out the idea. Suppose economic activity moves from the real world into the virtual world. Human happiness is unaffected or even goes up; however, the goods that produce the happiness are now produced and consumed in a virtual environment rather than the real one. Measurements of economic activity, being all based in the real economy, would begin to show weakness. I argued that contemporary political and economic control systems do not tolerate much weakness; thus, there might well be some sort of crisis in the real world, for no good reason, simply because production and consumption were going “off the books” and into virtual environments. One term for this would be an "exodus recession" - an economic downturn caused by the movement of human attention and energy into virtual environments.
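To make the mechanism concrete, here's a toy calculation. All of the figures are invented for the sketch; the point is only that if some share of wants migrates to much cheaper virtual substitutes that deliver the same enjoyment, measured spending falls even though happiness does not:

```python
# Toy illustration of an "exodus recession": consumption migrates from
# real goods to cheaper virtual substitutes yielding the same enjoyment.
# All figures below are invented for the example.
real_spending = 100.0       # baseline consumption, all in the real economy
migrating_share = 0.20      # fraction of wants now satisfied virtually
virtual_price_ratio = 0.05  # virtual substitute costs 5% of the real good

real_part = real_spending * (1 - migrating_share)
virtual_part = real_spending * migrating_share * virtual_price_ratio
measured = real_part + virtual_part

print(f"measured spending: {measured:.0f}")  # 81 - a 19% measured drop
print("enjoyment delivered: unchanged by assumption")
```

On these made-up numbers, the statistics register a 19% contraction while the consumers themselves are exactly as satisfied as before.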
Are we in an exodus recession right now?
First, let's consider the reasoning by which an interest in virtual things would cause a recession in real life. Why would a movement to digital living cause a recession? My point of view has to be preceded by a disclaimer. Despite years of training by macroeconomists, I have to confess that I don’t feel that we (they or I) understand the macroeconomy very well. For better or worse, I tend not to look at macroeconomics using the Keynesian and Monetarist models bequeathed to economics students today. Over the years I’ve become convinced that the macroeconomy is primarily a matter of mass psychology. If we believe that the economy will grow, it will: employers will invest and hire, workers will borrow and spend. If we believe that the economy will shrink, it will: employers will hang onto cash and lay off workers, workers will save their money. Log-rolling and self-confirming expectations rule the day, in much the same way that money’s value is a huge social convention. The safety of your bank deposit is likewise a convention, sustained by our collective insistence that the deposit is safe. Convention – think of the economy’s health as a social convention, an aspect of culture. After World War II, Germany and Japan – the most beaten-down nations in human history – suddenly became the most vibrant economically. Is it because their governments (and ours) gave them a stimulus? Or was it because their governments carefully preserved the real purchasing power of their currency? Probably not. We call these events “miracles” because the models don’t explain them at all. The economic miracles happened because Germans and Japanese decided, on a cultural level, to be vibrant economies. Workers threw themselves into consumption and work. Companies threw themselves into investing and hiring. The economies grew like crazy. Hope is the thing to hope for; fear, fear. The sensitivity of the state of the economy to our cultural understanding of the state of the economy is greater than ever.
We live today in a world where the health of the economy is a widely-reported and closely-followed pseudo-fact. Business people focus like laser beams on employment figures, GDP growth rates, and so on. If the GDP growth rate falls from 3% per year to 1% per year – Ye Gods! It’s the end of the world! The impact of such a change on the happiness of an individual person is minimal by itself. But when we read about such things, we all react. Hiring doesn’t happen. Purchases are not made. Lo and behold, the drop from 3% to 1% becomes a drop from 1% to -2%, unleashing another round of anguish. People start to lose jobs – which DOES affect their happiness, a great deal.
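The feedback loop described above can be caricatured in a few lines of code. This is a deliberately crude sketch of self-confirming expectations, not a real macroeconomic model; the adjustment rule and the size of the mood shock are invented for illustration:

```python
# Toy model of self-confirming growth expectations: reported growth
# tracks what firms and workers expect, and expectations in turn
# track the last reported figure. One persistent bout of pessimism
# drags a growing economy into contraction within a few reports.
def simulate(initial_growth, shock, periods=4):
    expected = initial_growth
    history = []
    for _ in range(periods):
        reported = expected + shock   # pessimism shaves each report
        history.append(round(reported, 1))
        expected = reported           # next period's mood follows the news
    return history

# A one-point hit to mood each period turns 3% growth negative.
print(simulate(3.0, shock=-1.0))  # [2.0, 1.0, 0.0, -1.0]
```

The numbers are arbitrary, but the shape of the output is the point: nothing "real" changed except what people believed the last report implied.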
The central government responds with stimulus and quantitative easing, none of which will work unless we all believe that a stimulus or QE is just the thing to make everyone believe that the economy is turning around. And if we all have some kind of pessimism about the long haul, if we believe that none of this is going to work, it just won’t – whatever Lord Keynes said.
In such an environment, even a little thing, if persistent, could touch off a prolonged mood of pessimism. Is it possible that the virtual economy is that thing?
George Will recently wrote about the increasing speed with which our experiences are going digital. Using data from Robert Weissenstein, chief investment officer at Credit Suisse, he notes that “In 2001, the iPod arrived. Less than a decade later, the number of employees of music stores has declined from about 80,000 to 20,000.” And “Three million iPods were sold in 2.5 years; 3 million Kindles were sold in two years; 3 million iPads were sold in 80 days; 3 million iPhones were sold in three weeks.”
Let’s construe the notion of “virtual economy” quite broadly: If you receive an experience by yourself through a machine that runs on digital technology, without doing or buying anything physical (other than pressing a few buttons), it’s virtual. To download a song and listen to it on your iPod is virtual; to go to a concert is real; to buy a CD and play it is real; to play your own instrument is real. The difference I want to highlight is in the physical nature of the economic transaction. The virtual transaction does not require the movement or alteration of anything physical. Not even physical money changes hands. The real transaction involves material being created, moved, consumed, all by human hands.
Using these concepts, there’s some evidence that an exodus from the real to the virtual is not only already underway (as I argued in my second book) but that it’s gotten big enough to affect our sense of whether the real economy is healthy or not. In support, here’s a series of rough judgments about the state of the real world.
TV viewing is down among 18-34 year old males, and movie attendance is flat. Meanwhile, more and more time is being spent online or playing videogames. If you want to get 80 hours of fun watching movies, you need $1000. You can get the same fun from a game for $50. Spending time online or playing videogames simply involves less expenditure in the real economy.
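The cost comparison above works out to a roughly twenty-fold per-hour gap. A quick back-of-the-envelope check, using only the post's own figures (which are themselves rough guesses, not market data):

```python
# Per-hour entertainment cost, using the post's rough figures:
# ~$1000 for 80 hours of movie-watching vs. ~$50 for 80 hours of a game.
movie_cost, game_cost, hours = 1000, 50, 80

movie_per_hour = movie_cost / hours  # 12.5 dollars/hour
game_per_hour = game_cost / hours    # 0.625 dollars/hour

print(f"movies: ${movie_per_hour:.2f}/hr, games: ${game_per_hour:.3f}/hr")
print(f"movies cost {movie_cost // game_cost}x as much for the same hours")
```

Even if the input numbers are off by a wide margin, the order-of-magnitude gap is what matters for the argument: the same hours of fun, a fraction of the real-economy spending.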
Human eyeballs see a lot fewer ads than they used to. As noted, some people are watching less TV. For most others, the TV they’re watching is increasingly DVR’d or Hulu’d, that is, stripped of ad content. On the internet, we avoid ads easily – they are usually in the periphery, and if not we can click them away, or surf to something else. Advertisers have built an industry on the presumption that ads make people buy things. If they are right, it follows that fewer ads would result in us buying less. Ads are less and less a part of our daily experience. AMC’s success with a show about evil advertisers is perhaps apt now, because we feel we finally have gotten the upper hand on these miscreants. The net result of our power over advertisers, according to their own model, would be a weakness in general real-world consumption.
Facebook is a great way for people to connect. In some FB games, you can buy someone else a beer. You can poke them, write on their wall, friend them. None of this causes anything in the real world to be moved or changed. There are 500m people on FB, hundreds of millions more on other, similar social networking sites. If you’re friending people on FB, you’re ever so slightly less likely to be sending them a real Hallmark card, ever so slightly less likely to write them a note on paper, ever so slightly less likely to give them a call. That’s probably not going to turn around, either. Our ability to socialize online puts a crimp in our general need to move stuff or change stuff in the real world.
People who spend time online don’t have to worry about what they are wearing. Suppose that some percentage of a given day can be spent in pajamas, and the rest must be spent in decent clothes. For decent clothes, you need a whole and varied wardrobe. For PJs, you need a few comfy ones. Now increase the amount of time that can be spent in PJs. The demand for decent clothes falls, if ever so slightly. The internet allows us to do all kinds of stuff in our PJs – so it must have an ever-so-slight dampening effect on the market for fashion.
One could go on. It is at least slightly possible that there’s a general weakness in consumer spending simply because, to get our social, emotional, and informational needs met, we just need fewer movies, fewer beers, fewer trips, fewer shoes, fewer things in general. What if the world of human beings suddenly became converted to the idea of consuming less stuff? Why, there’d be a recession, of course. Less buying means fewer jobs and less investment, which means economic contraction. It would mean a general pessimism about the prospects of business.
If our culture suddenly went Green, for example, we would have a recession but we would also understand its cause. We would know that a dissatisfaction with materialism led to economic weakness. But if this conversion to less consumption came from a different and more obscure source, how would we identify it? What if real world consumption refused to grow not because people were becoming hippies, but because they remained selfish materialists who had, however, come to enjoy virtual matter? If an exodus recession were underway, what would the world look like?
For one thing, the general pessimism about the economy in an exodus recession would not extend to the industries that produce virtual experience. Indeed, the video game and social networking industries are booming right now. Computer and digital entertainment hardware and software – doing quite well, thank you. Bold innovations in devices happen every year, and the number of apps is skyrocketing.
Another aspect of an exodus recession would be that consumers, in general, would not be expressing much pessimism about being consumers. There’d be no sign that people had given up on the idea of buying and selling things. They’d be as interested in money, the economy, and jobs as ever. However, they would consistently say that they’re slightly less interested in buying a washing machine than they were last year. You don’t have to do as much washing when you spend more time living through your avatar. They’re going to be slightly less interested in a car because they’re not going to go driving around quite as much. This has nothing to do with malaise or lack of government stimulus or the conversion of a culture to moderate spending. It begins with people buying digital stuff instead of real stuff. And indeed, we find in the recent US election that people are very interested in jobs and the economy. Yet collectively they seemed to react less powerfully than expected to efforts to stimulate their real-world spending. This would make sense if people are turning their consuming energy to mp3s, FB gadgets, and Xbox Live Achievements. A new road is not going to have much effect on an economy based on digital goods.
These are all conjectures, of course. It’s a what-if. Is it possible? I thought we would not see a real-world recession caused by the removal of consumption energy into virtual environments until sometime in the far future. But I didn’t think about the possibility that the term “virtual environment,” in its economic meaning, might expand to environments as diverse as Hulu and Facebook. Are people now spending enough time fiddling around with digital stuff that their interest in physical stuff has weakened to the point that it catalyzes an ongoing cycle of economic pessimism? Perhaps not. But some trends certainly point in that direction. Even if this is not the first exodus recession, one wonders how far off that first one may be.
Mark and I have collaborated on a number of things these last few years, and now that he has finished his doctoral work (woot!), I invited him to guest author and summarize his work for us. Mark is another virtual worlds researcher who relies on traditional anthropological methods to understand social dynamics and learning. He's a great researcher and an extremely passionate gamer.
Using ethnographic methods, Mark Chen focuses on teamwork, communication, and expertise development in situated gaming cultures. Currently, Mark is a post-doctoral scholar at the UW Institute for Science and Mathematics Education, working with computer science folks to study player learning through science and math games that take advantage of massive amounts of computational and human power. He received his Ph.D. from the University of Washington, College of Education, looking at the practices of a group of gamers in the online game World of Warcraft. Prior to doctoral work, Mark was the webmaster and a web game developer for the Oregon Museum of Science and Industry in Portland, OR. He holds a B.A. in Studio Art from Reed College and grew up in the Bay Area as a child of the 80s. You can read more about Mark on his blog...
This past weekend, I saw the film Coraline, which I found terrific in many respects. Among other things, I think it has a lot to say that applies very strongly to virtual worlds – about why, though we may all complain about bad pick-up groups, griefers, loot farmers, and Barrens chat, virtual worlds are not a demonstration that hell is other people. Quite the opposite: virtual worlds live (and sometimes die) on whether they infuse authentic sociality into everything we do within them.
In Coraline (modest spoilers ahead), the title character is frustrated with her parents' quirks and lack of attentiveness to her. Drawn into a magical world that exists inside her new home, she is at first enchanted by her Other Parents, who have marvelous talents, live surrounded by wonder, and are utterly devoted to Coraline herself. I don't suppose I'm giving away much when I say that there's a big catch to all this, and the last portion of the film is about how wonder gives way to horror.
The only seeming clue that Coraline's Other Mother is anything but perfection is that her eyes (and the eyes of almost everything else in the magical world) are made from buttons. And yet there is another clue right from the outset, in some ways a much more unsettling sign of just how wrong this world is. Everyone and everything in it exists only for Coraline. They have no apparent interests of their own, no desires apart from hers, nothing to do but please and delight Coraline.
Virtual worlds occasionally toy with treating each player like Coraline. In World of Warcraft, my character is greeted with delight by guards and non-player characters who allude to his past adventures. This lasts only until I begin a new round of quests in a new zone, whereupon my famous achievements are forgotten and I am merely one more anonymous grunt. When my character seems to make momentous choices that should hang about him forever, those too disappear into the haze. I am torturer one moment, and the next a saint who searches all across the world for a cure which will save the life of a hero faced with enslavement to the Lich King. In my most important adventures, I exist inside wholly private instanced worlds with a small number of friends or allies. That world, too, exists only for me.
What keeps World of Warcraft or any other virtual world from being as ultimately empty and terrifying as Coraline's magical hideaway is that these worlds are full of people who do not exist for my own pleasure. They may be people I know and like, people I tolerate, people I find pathetic, people who infuriate or disgust me. But they mean that the world is not merely my mirror.
Now I think that virtual worlds themselves could function more that way: they could react to my actions (or the actions of many players together) in much more dynamic and autonomous ways. Reading Jim Rossignol describing the latest astonishing developments in the long-running war between BoB and Goonfleet in EVE Online makes that very clear. The underlying world in EVE does not exist in a one-to-one relationship to individual players, and its basic economic and political infrastructure transforms in relationship to collective action in some striking ways. A world which is itself a dynamic presence in play need not be as harsh or treacherous as EVE's world is, but the basic principle is an important one.
Until we have a fuller range of dynamic worlds, though, other people, acting in the most unmanaged and unfiltered ways possible, are the only thing that keeps virtual worlds from total sterility. Sometimes we all feel like Coraline: we'd like a world which exists only to delight us, full of cheering throngs and valiant allies. But like Coraline, we'd be better off knowing from the first moment of that desire that we're really chasing something horrible rather than something pleasant.
I've just posted a piece to SSRN about play. In the past I have focused on games as a culturally-shaped activity (what we anthropologists would call a "cultural form"), and in the course of that I have made explicit efforts to decouple games from the concept of play (see here, for example). I argued that it is not very useful to see play as an activity, with games as a subset of it, and suggested that play more usefully denotes a disposition, a way of approaching the world.
In doing that I wasn't trying to argue that games and play are not related to each other, but rather that we need to move beyond seeing them as intrinsically linked (where the question of, for example, whether something is a game boils down to whether it brings about a playful experience). The primary motivation was to make room for an approach to games on their own terms, but the issue of play has been simmering with me for a long time. The posted essay is the result – a long-planned attempt to articulate play as a disposition.
In the piece I look at how anthropology as a discipline stumbled a bit in thinking about play, but simultaneously managed to develop a useful approach to ritual. This approach avoided making the litmus test of a ritual whether it brought about religious experience, and therein is a lesson for those of us studying games and play. Pushing further in this direction, I assert that the ideas of William James and the pragmatist philosophers in general may hold the key to moving forward in our understanding of games and play.
Here is an excerpt (the many footnotes excised here, for convenience):
Huizinga set the tone for much of the inquiry into games and society in the latter half of the twentieth century with his book Homo Ludens. This book bears much responsibility for fostering the unfortunate view, developed more rigidly still by Caillois, that games are culturally sequestered and consequence-free activities. Still, here as in many such midcentury works of cultural history, illuminating contradictions abound. As Huizinga’s argument develops, near the end of his text he focuses on something quite different: “Civilization is, in its earliest phases, played. It does not come from play…it arises in and as play, and never leaves it.” Huizinga is much more enlightening when he speaks of the “play-element” (just the type of experience or disposition that interests us here), rather than of “play” as a (separable, safe) activity. For him the play-element -- marked by an interest in uncertainty and the challenge to perform that arises in competition, by the legitimacy of improvisation and innovation that the premise of indeterminate circumstances encourages -- is opposed above all to utilitarianism and the drive for efficiency. Caillois likewise, despite his misleading claim that games are occasions of “pure waste,” recognizes the centrality of contingency in games. Huizinga felt that the play element had been on the wane in western civilization since the eighteenth century, threatened by the drive for efficiency and the routinization of experience it brought.
These tantalizing recognitions of the contingent nature of experience in the world direct us to sources and analogues in philosophical thought. American pragmatist philosophers broke from the Western tradition in their rejection of an ultimately ordered universe: for them the universe was, as Louis Menand put it, “shot through with contingency.” The pragmatists were not alone in this insight. The phenomenologists also gestured toward it, notably in Martin Heidegger’s concept of “thrownness” (which was developed in anthropology by Michael Jackson). The ideas of “practice theory,” as Ortner described it, are also consistent with this picture of the world as an ongoing and open-ended process: Pierre Bourdieu, Marshall Sahlins, Michel de Certeau, and Anthony Giddens each have sought in different ways to overcome determinative pictures of the world. Although the scope of this essay allows only a broad description of these connections, I suggest that we are at a point where, in recognizing these commonalities, we can begin to forge a useful concept of play that will inform our understanding of experience in an uncertain world.
What are the features of play as a disposition toward the world in all its possibility? First, it is an attitude that is totalizing in the sense that it reflects an acknowledgment of how events, however seemingly patterned or routinized, can never be cordoned off from contingency entirely. As the scientist James Clerk Maxwell put it, the “metaphysical doctrine that from the same antecedents follow the same consequents... is not of much use in a world like this, in which the same antecedents never again concur, and nothing ever happens twice.” The earthier popular sentiment in American English, “Shit happens,” signals the same conviction. Second, the disposition of play is marked by a readiness to improvise, a quality captured by Bourdieu in his development of Mauss’ concept of the habitus. To be practically equipped to act, successfully or not, amid novel circumstances is the condition of being a social actor at all, Bourdieu argues. One can also note Dewey’s argument that uncertainty is inherent in practice, and that it is in contrast to this practical open-endedness that theoretical claims to certainty seek to marginalize and denigrate practical knowledge. Finally, play is a disposition that makes the actor an agent within social processes, albeit in an importantly restrained way; the actor may affect events, but this agency is not confined to the actor’s intent, or measured by it. Rather, it allows for unintended consequences of action. This is consistent with Oliver Wendell Holmes' “bettabilitarianism,” his answer to utilitarianism; every time we act, we effectively make a bet with the universe which may or may not pay off.
I look forward to any comments.
P.S: I am moved to post this by the kick in the pants given to us here at TN by Keith Ellis, who has reminded us to continue to involve TN in our thinking through of these issues, even as our changing circumstances tend to sap our time and pull our attention elsewhere. Many thanks, Keith.
[UPDATE: As with some other posts on TN, the comments for this post have become borked, and are not showing up properly. My apologies to those who have tried to post, Chris most recently. -TMM]
[UPDATE II: Comments are fixed! TN has now incorporated the code that allows you to navigate through multiple pages of comments. See the "Next" link at the end of their first page (after 25 comments). Huge thank you to Greg for sorting this out this week! -TMM]
From time to time here on TN I've delved into methodological territory, and in my last effort, quite some time ago, I focused on the charges of "anecdotalism" that qualitative research in the social sciences sometimes faces, and argued that generalizable claims can be generated out of such methods. But, in retrospect, that piece did not confront the root of the problem directly, given the degree to which I did not there question generalizability itself as the core aim of scientific inquiry.[fn 1] As research on virtual worlds continues to increase, and as the different parts of the academy ramp up their efforts to fight for their funding (and perhaps thereby seek to discredit other approaches), it seems worthwhile (and consistent with the ecumenical spirit that largely characterizes TN) to consider how scientific the pursuit of kinds of claims other than general ones is.[fn 2] And that's where James Clerk Maxwell comes in...
When it came to generalizability, Maxwell (yes, that Maxwell) was ready to wield a not-so-subtle hammer against those he saw as seeking to hitch science to a positivist view of the world. He said (in a speech the text of which is available here):
It is a metaphysical doctrine that from the same antecedents follow the same consequents...[I]t is not of much use in a world like this, in which the same antecedents never again concur, and nothing ever happens twice.
By highlighting the irreducibly contingent nature of the world, Maxwell joined Charles Darwin in a view of scientific inquiry that saw its provisionality as perfectly consistent with a world that was not, in the last analysis, law-driven and ordered. Instead, they argued that the proper aim of science was to explore the processes that are in place under different conditions, with an awareness that those conditions never perfectly reproduce themselves (for Maxwell, this anti-positivism was also tied to his religious views).[fn 3]
In a sense, all academic research is based on critical observation of such situated events and circumstances. It may be concerned, yes, with making provisional comparisons across them when possible, but it is just as often concerned with understanding the specific processes in place that led to unique outcomes not generalizable elsewhere. For this reason attempts to trumpet generalizability as the primary (or exclusive) aim of the social sciences (where I see it happening quite often) not only marginalize particularist work by historians, sociologists, anthropologists, economists, and others, but (ironically, to me) thereby also seek to exclude a vast swath of the natural sciences (such as much work in paleontology, geology, and biology, to name a few).
As work in the sociology of science has shown, expert critical evaluation (usually associated with the humanities), observation, and hypothesis-testing are all used by all branches of the natural sciences. Efforts to claim special status for the natural sciences (or any field) by pointing to hypothesis-testing ignore not only this, but also the fact that, as Maxwell suggests, hypothesis-testing in the absolute sense does not, in fact, exist (what you have instead are very very very very close approximations of it, and this is only possible for certain kinds of conditions).
What this means for research on virtual worlds is that we must be wary of how the drive to fight for resources may prompt researchers to claim that a certain kind of project (generalization, particularization), or a certain kind of methodology is "scientific" (or, one might imagine, "humanistic," although the comparative lack of money makes this more of a localized danger!), while others are not. A broad view of science, in all its variety, and, ultimately, of academic inquiry, should inoculate us from this kind of divisive maneuvering. Critical observation, exploratory research, and hypothesis-driven work are all going to be vital components of understanding what virtual worlds are all about.
[fn 1] Alert TN reader "Rex" (aka Alex Golub) pointed out this issue in the comments on that post, and I have long wanted to give that observation a proper response.
[fn 2] I am also moved to write this because there is something of an ongoing conversation about scientific "truth" and methodologies here on TN (one example).
[fn 3] For further critical discussions of the limitations of generalizability see the Preface of Anthony Giddens' The Constitution of Society, and Chapter 8 of Alasdair MacIntyre's After Virtue.
Steven "Play No Evil" Davis, in a great comment on Mark Wallace's thread, asked the following question:
Is griefing simply emergent play that some folks don't like?
I think this is an interesting question to pursue, and I'm going to take a somewhat provocative stance and answer "no," partly to explore some territory and partly because I think there's a case to be made against griefing that doesn't founder on a libertarian objection (i.e., that if some people do something in a low-consequence environment, then it must be fun to them/their choice, and therefore must be okay).
I should state at the outset that studying cheating, griefing, and similar topics is not a principal part of my research, and there are several esteemed folks around here that do it, so I hope to learn from them if they'd like to weigh in. Here, I'm just following through on some ideas that have been percolating on meaning and games, and how they might help us answer Steven's question.
To begin this speculation, the first thing I'm going to do is narrow the topic a fair bit. Rather than discuss "griefing" in the broad sense, I'm going to focus on one activity in MMOGs that is often seen as griefing: ganking. Very specifically, I'm talking about a human player, piloting a higher-level/better geared toon, attacking a toon that is much lower level, without any other circumstances (game objectives and narratives), histories (they, or their guilds, know each other or similar), or players (on either side) involved. This is simply the killing (frequently, one-shotting) of another toon by a vastly more powerful toon. I'm drawing my sense of this phenomenon from the open PvP servers of World of Warcraft -- other games/server types may vary considerably and interestingly.
What I would like to suggest is that this kind of PvP is meaningless. Or, perhaps more precisely, that the meaning it has is so narrow, rationalized, and impoverished that it is outside of, or rejects, the game in which it is situated. Games, as ends in and of themselves, are things that can generate new meanings and experiences. For the ganker, however, ganking is a means to other ends ("Personal best crit!"), not a potentially generative new experience. (And, by the way, please keep in mind that I am not talking about all PvP -- there are many other kinds, both institutionally designed by the developer and emergent, which would not fit with the argument I'm making here.)
I'm speculating that ganking happens when a player who does not want to be challenged to play a game (i.e., encounters where the outcome is contingent), instead opts to do something where the outcome is a foregone conclusion: kill a player that is vastly lower in capabilities. If meaning is found at the meeting point of inherited systems of interpretation (cultural expectations) and the performative demands of singular circumstances (something I talked about here), then ganking is a denial of that meaning. It is a retreat from the demands of the new, and it signals a disposition that does not want to be performatively challenged. Ganking lower level players is, then, a somewhat pathetic attempt to feel, well, something. But that something is not the meaning that participating in a challenging game would create -- it is removed from that. If there is no contingency, it follows that there is no meaning -- all you have left is an impoverished environment where pointless negative reciprocity (I was ganked at L24, so I’ll gank at L60) reigns.
It might be argued against this that an environment of open PvP, rather than erasing contingency, actually spawns it, generating a wide open landscape of ganking possibility for the lower level players. This would be a way to argue that there is still a game, on a broader level, and it is a cat-and-mouse game. The difference in capabilities once the battle is joined is not in question -- the cat wins -- but the game is actually about avoiding that encounter (thanks to David Simkins for voicing this argument to me). This is an interesting way to go, and I agree that it can turn out this way, under certain game design conditions. I would argue, however (again, I'm being provocative to see where this leads), that in WoW this doesn't hold, because the architecture of the game is not very flexible about alternative places to go to accomplish objectives. The quests for any given level are in a small set of vastly distributed places, and the transportation costs (in time) for low level characters are high. This means that if someone is trying to get quests done in Stranglethorn Vale, there is not a viable game in avoiding the gankers -- they have every advantage also in the "meta" game of cat-and-mouse. For most players, this means that the ganking feels, again, like a foregone conclusion, it is only the question of when it will happen that is utterly contingent (that is, too contingent). In neither aspect is there a performative challenge for the gankee or the ganker. One is left with either too much determination, or too much chaos; either way leads to a loss of meaning.
So why does it happen at all, if it's so meaningless? To answer this, one would have to make a normative, critical claim (and goodness knows those are popular around here). One would have to say that what happens is that the game objectives get replaced by utterly personal objectives, individualistic and empty goals that are the simulacra of actual (new) meaning. Gankers, this argument would say, are getting their jollies in an endless circle of confirming their own expectations, mistaking the increasing number of notches on their belt for actual personal development. In fact, this line of reasoning would argue, they are each stuck in an iron cage of false objectives.
Now, I can spin this argument out, and understand how to get from point A to point B, and it's consistent with my experience and preferences. But, on the other hand, I have lots of friends who enjoy open PvP, even the random but inevitable ganking part of it, so I hesitate. I'm also certainly one to be wary of normative claims about other people's experiences ("Yes, yes -- you say you're having a good time, but you're really just deluding yourself").
On the other hand, the argument that if people choose to do something in these domains it is just a different "style of gameplay," and therefore morally unassailable, also rubs me the wrong way. It seems to rest not only on a separation of play from real experience (and I have a whole set of strong empirical objections to that view), but also on a modernist, individualistic ethic -- it's all about the individual experience, this seems to say, and that should be our final arbiter of all matters ethical.
I don't have any real answers here, but I'm quite taken with the notion that ganking is, effectively, not a game, and with thinking through the consequences for meaning and experience that follow from this. To what extent this could be extended to other kinds of griefing, I'm not sure, but it does seem to me that quite a few players out there actually don't seem to want to play a game at all.
So I've been having my usual beginning-of-the-semester chats with my graduate students about their projects and progress. I enjoy these, and I think they do too (they almost never complain about the thumbscrews, or -- more of a shock -- having to read Habermas). One of them, Krista-Lee Malone, is a master's student and long-time gamer who is completing an excellent thesis about hardcore raiding guilds. During our chat she said something about how these raiding guilds went about preparing her to participate in their activities, and it prompted me to follow up on some ideas from here. It's about Foucault, bodies, institutions, and whether the relationship between developers and guilds is changing in important ways.
Krista-Lee plays a priest (one with more purples than I'll ever see for my druid, I'm sure), and what she said was (paraphrasing), "I can healbot Molten Core in my sleep, but if I'm thrown into a new situation, I can't heal at all." While that's probably an overstatement, it suggests something about the nature of raiding guild discipline -- at least, pre-TBC. It turns out, and this is not unusual, that the guild power-leveled her toon and then taught her to follow a very specific and detailed script for the instances they were running, starting with UBRS and then through Naxx.
Michel Foucault famously argued that the power of modern institutions is driven, at root, by the ability to discipline people, or, more directly, to discipline their bodies -- to mold those bodies and order their actions in ways that allow groups to achieve institutional objectives effectively. To do this, they draw on practical techniques developed first in places like early Christian monasteries and the Roman legions. Bodies are organized, regimented, taught to sit, to stand, to kneel, to match their singular shapes to the demands of regularity -- no pinky out of place, the leg held just so. The effect of this "bio-power", as he most convincingly shows in Discipline & Punish: The Birth of the Prison, is not only effective institutional control over otherwise unruly subjects, but in fact a re-shaping of their selves. They come to see this discipline as constitutive of who they are, as shaping their very desires. The classic (and idealized -- practice is messier) example is the panopticon, where prisoners are architecturally situated in view of an invisible and authoritative observer. The guard watches from a darkened room while they are laid out in a brightly-lit Cartesian grid. It comes to matter little if the guard is there at all, as the prisoners internalize the surveillance.
I'm not saying that Krista-Lee was a prisoner of her guild. Um, exactly. Foucault argues (in later works) that this disciplining of bodies is something taking place all around us, particularly as we learn to act within highly-regulated contexts, like schools, the military, hospitals, and airports. And, like the prisoners, he asserts that we come to accept and even celebrate the kind of self the institutions have made of us.
All of this is to get us thinking about to what extent hardcore raiding guilds should be seen in a similar light. The essence of disciplined bodies is that they are malleable; they can be shaped to perform in lock-step (literally) under a command hierarchy. The tension, of course, is that this strategic control always involves a tradeoff with the tactical, the ability of a group to respond on the fly, to emergent situations. For Krista-Lee, this effect was directly discernible -- while she enjoys soloing and quest-grouping, she felt lost in new instances, when there wasn't an explicit script to follow.
As I've pointed out, for WoW, this had -- before the expansion -- created a mutually constructive relationship between the 5(10)-person instancing and the large-scale raiding. While small-scale grouping not only allows for but depends upon tactical rethinking on the fly, large-scale groups narrow and leverage the set of available class skills (maybe hunters begin to leave pets behind, druids get pushed into healing, only one hemo rogue is called for) into more strictly-defined roles. The small-scale was, perhaps like boot camp in the military, an intense and necessary part of inculcating a set of competencies (what is a pull, sheeping, aggro), but one that ultimately is left behind, smaller in comparison to the institutional ambitions which these competent bodies now serve to realize. Rationalized systems of resource distribution, like DKP, along with political structures and communications tools, play a role as well for these institutions, harnessing individual desire into organizational discipline, to get the 40 people needed together all at one time, ready to down Onyxia, or tackle a world boss.
The reason I think this is particularly interesting for us to think about now is the set of recent changes that both WoW and Second Life have undergone. The downsizing of endgame instances in WoW, the availability of soloable loot roughly on a par with Tier 1+ in Outland, and (to my unsystematic eye) the prevalence of small group quests there with excellent rewards, all suggest that Blizzard is moving away from supporting the emerging institutions (guilds) of its creation, ones which had dominated server culture for pretty much the whole game. This is an interesting contrast with past TN conversations, like the one here.
By contrast, the revamped estate tools in SL (which I'm sure many folks out there know more intimately than I), increase the amount of governance by island owners not only over a piece of property, but also over a group of people, and in fact these tools have thereby become deeply intertwined. To my eye, this enables the generation of institutional players on the SL landscape that LL has never had to deal with before. I'm not thinking first of the existing external institutions with a "presence" in SL, but rather of those entities that until recently we could somewhat reliably continue to think of as individuals, but which are now better understood as institutions. While the relationship of LL to some of its major content creators has been undoubtedly cozy, one can't help but wonder how long that will last -- institutions are competitive. The interesting thing about Second Life is the extent to which Linden Lab has had a "free ride" for a long time, effectively being the only large institutional player in the arena. Social convention was emergent from the users, and was (is) something with which to contend -- a lot of time at Linden is devoted to this "community management". But architecture, the market, and "law" (other modes of governance, as I see it) were all firmly in Linden's hands. That's changing now, and the question is whether Second Life will fly apart at the seams once these other institutionalized interests find their footing.
All this is really just to wonder whether we're entering an era where the relationships between virtual world makers and the people involved in them are changing. It is probably wise for us to get in the habit of thinking just as readily about developer/(in-world) institution relationships as we do about developer/individual player relationships. I actually think the old habit will be hard to break -- the idea of the game maker/game player relationship as primarily institution-to-individual is just one instance of the engrained tendency for those in industrialized societies to think about social institutions primarily as they relate to individuals.
WoW and SL both demonstrate, at a very broad level, different solutions to the emergence of institutions within their creations, an emergence that was, I believe, inevitable once resources began accumulating within these persistent and contingent domains. Foucault, like Weber, thought that people banding together to accomplish something was fine, but was wary of what happens next. Once any nascent institution begins looking for something else to accomplish, its primary raison d'etre has already changed. At that point, it's more interested in its own reproduction than in its original aims or purview. Once that happens, look out.
[Addendum: Ever-alert Julian Dibbell points to ShaunConnery's Rapwing Lair. Surely the script in Krista-Lee's guild never sounded so good.]
The recent flurry of attention to SL and its numbers (here, here, here, and, most recently, here) leads me to think that folks might be interested in having a chance to chew through some methodological stuff, along the lines of the "Methodologies and Metrics" panel on which Nic, Dmitri, and I served at the State of Play/Terra Nova Symposium early this month. Below the fold, some tweaked ideas from some emails I circulated among the panelists in preparation for the panel. While I'm not discussing virtual worlds and the methodologies we'd use to understand them specifically, I hope this will be helpful background for such a discussion.
It is hard to get away from a common conception, both within and outside academia, that numbers are the one, true path to understanding. This is part of a set of cultural expectations that are reproduced precisely because they are so rarely challenged. Most commonly, one hears that claims with numbers are "grounded" or otherwise true in a way that other kinds of claims (such as the ones based on the kind of research that Tim talked about here), are not. Claims based primarily on those other kinds of research, particularly on interviews and participant observation, often get branded as "anecdotes", with the suggestion that they hold no real value as reliable claims. Here I would like to push against this association, and help clarify our understanding of what qualitative social science research methods (ethnographic research ones in particular) bring to the table. In short, they are not "anecdotes", and they can form the basis of reliable claims, even without numbers, although as Dmitri and I never tire of saying, having both is better than having just one.
No social scientist, of course, would want to "generalize from anecdotes," but the problem is that often we do not really understand what that means; or perhaps it is more accurate to say that across the academy many scholars (not to mention the public at large and policy makers) do not know enough about methodology (this is true of both qualitative and quantitative methods, and more broadly about exploratory versus experimental research), and therefore these charges are in essence a political move meant to marginalize the other side's research that can succeed because of that lack of broad grounding. From my conversations with everyone involved with TN I have never felt that we (as a group of authors) were particularly prone to make these errors, but there is no question that it finds its way into the discussions on TN, as in the recent threads.
The goal of all social science is "generalization" in a sense, but the legacy of positivist thinking about society (that it is governed by discoverable and universal laws) has left us in the habit of thinking that the only generalization that counts is universal. It is always interesting to me how some work (especially that done by the more publicly-legitimized fields, such as economics) can proclaim itself to be about the universal despite the fact that only a moment's thinking reveals the application of the ideas to be narrow (to industrialized, capitalist contexts, etc). The strange thing is that this doesn't end up being a problem for those already-legitimate fields; instead, it is largely ignored -- this is what being well situated on the landscape of policy and academic relations of power gets you (to be Foucauldian for a moment).
But of course generalization, in the more limited sense of seeking a bridgehead of understanding across times and spaces, has long been the hallmark of history (the first social science, in a way). The strange thing is how difficult it seems to be for those who would like to criticize methods such as participant observation and interviewing to see the projects those methods support in a similar light to history and its efforts. There is nothing inherently problematic with such claims; they are just as able to inform policy as universal ones, and have the benefit of incorporating more nuance.
So then what is an anecdote? It is a description of an event isolated from its broader context, so no wonder all of us would like to shy away from the suggestion that we are drawing our conclusions in isolation of the broader context. But ethnography (meaning principally participant observation, along with interviewing, surveying, and other methods), to speak of that relevant methodology most familiar to me, quite distinctly does not treat these events in isolation. Brief descriptions are often presented in the course of ethnographic writing in order to illustrate a point concretely, but the point made is only as sound as the degree to which we trust the author's command of the broad array of processes ongoing in the context at hand. How is this credibility established? Through a complex of many, many, many techniques of writing, thick description, peer review (always including experts in that period or place), solid reasoning itself, track record of previous research, etc, etc. This form of generating reliable claims is not somehow "less" viable than other ones, and its strengths and weaknesses are of similar scope (though differing in their particulars).
So one of the tropes that one finds in the recent spate of posts about SL and its numbers is the suggestion that only when numbers that we trust are present do we feel that the claims authors make are "grounded". This is not true. As anyone with much experience with statistics knows, the numbers say nothing without the ability to interpret them provided by other kinds of interpretive research. In fact, given the above, if any research has a claim to being "grounded" it is the first-hand research of participant observation.
Even when this kind of contribution from qualitative research methods is acknowledged, however, there is still a tendency to see the claims of work based on them as always and severely limited to a "niche", at least until numbers come along. But a social history or ethnography of a place and time is not this narrow. They are able to make general claims at the level of locale, region, or even nation, and they often do (when done well). The idea for ethnographies is that the ethnographic research method, at root, inculcates in the researcher a degree of cultural competence such that he or she can act capably (and sensibly) as a member of that culture. Supported by observation, archival research, surveys, or interviews (usually some combination), as well as (possibly) prior work, this learned disposition informs an account of the shared disposition of the actors on the ground, and is laid out in the published work (as best one can in writing) as representative of a worldview from a particular time and place. Thus, my claims about gambling in Greece were made beyond the level of the city where I did my research, and I argued for the existence of a cultural disposition that characterizes Greek attitudes toward contingency at something like the national level (without holding too much to hard boundaries).
Of course, these claims are further bolstered by the broadening of one's research methods, whether through surveys, demographic data, archival research, media studies, or any other means that support the big picture. Relatedly, there is nothing about quantitative methods that dictates that they must "stay big." They can be productively focused and narrow as well.
This is not to say that there isn't a limit to the level of generalization for qualitative research that is exceeded by quantitative methods. So, for example, while an ethnography could make reliable claims about Greek culture, I don't think it could about American culture. The reason connects to what culture is -- a set of shared expectations, based on shared experience and continually re-made through shared practices -- which is far too fragmented and varied across the US for an ethnography to make such claims. But while this is true, the important point is that the claims of qualitative methods are not as particularist as they are sometimes made out to be.
I become, I confess, a bit sad whenever I encounter this kind of marginalization in action (for me, it most often happens on interdisciplinary fellowship review panels and the like), because at root it bespeaks a lack of trust across the academy. There is little doubt that there have been excesses across the gamut of methodologies and theories that the social sciences use (reductions to representation, or materiality, or power all come to mind), and perhaps this accounts for the parochialism and suspicion. But let's hope that we don't fall prey to what are, in my view, not battles over the nature of sound inquiry so much as gambits meant to direct or redirect institutional resources.
I must confess: I like PvP, a lot. Perhaps this has something to do with the inordinate amount of time I spent playing Counter-Strike with my friends in college, or just a basic need to show off my 3l33t skills. But the bottom line is that I've been spending a lot of time in World of Warcraft's Battlegrounds lately, trying to climb up the ladder and equip my avatar with a nice armor set.
Here is my problem though: I am at rank 8 (Legionnaire) and I'm stuck. In fact I've been stuck there for weeks. This is really a problem since I cannot complete my coveted armor set until I reach rank 10! But I also have the luxury of some fresh data about PvP play patterns that Nick, Eric and I computed recently - why not look at some numbers and see if they could help?
Using our data we can approximate how many hours of weekly play time it takes to move from one PvP rank to the next (see previous link for the full analysis, and this other post for a more complete overview of the limitations of the data). It turns out that reaching rank 5 (Sergeant Major / First Sergeant) requires spending about 20 hours per week in the game - about the average weekly play time for most MMORPG gamers, and pretty close to the maximum I can personally invest in the game. But then the curve ramps up steeply: Legionnaire requires almost a 30 hours/week commitment, and Field Marshal / Warlord (rank 13) almost 80 hours/week!
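For the curious, that ramp can be sketched numerically. Here is a minimal sketch that fits an exponential curve through the three approximate figures quoted above (roughly 20, 30, and 80 hours/week at ranks 5, 8, and 13) and extrapolates to rank 10. The exponential form and the rounded data points are my illustration here, not the actual curve from our full analysis:

```python
import math

# Approximate (rank, hours/week) readings quoted in the post.
points = [(5, 20.0), (8, 30.0), (13, 80.0)]

def fit_exponential(points):
    """Least-squares fit of hours = a * exp(b * rank), via a
    log-linear regression of ln(hours) against rank."""
    xs = [rank for rank, _ in points]
    ys = [math.log(hours) for _, hours in points]
    n = len(points)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

a, b = fit_exponential(points)
hours_at_rank_10 = a * math.exp(b * 10)
print(f"Estimated commitment at rank 10: {hours_at_rank_10:.0f} hours/week")
```

On this (assumed) fit, rank 10 comes out somewhere in the mid-40s of hours per week - well beyond the 20 or so I can actually manage.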
So much for my shiny new armor... I simply cannot compete with this level of commitment. I guess I will have to turn to raiding now (or not). Or maybe I should just wait for the expansion: Blizzard apparently decided to rework the PvP system entirely - a recognition that the current system is broken, perhaps?
In a recent musing on VWs as petri dishes, Ted Castronova commented on the impact that creating four new auction houses in WoW's Azeroth has had on the populations of the capital cities. Wondering what exactly happened, I went scuttling back to the PlayOn data.
Here are the pretty pictures. Explanations of the data are left as an exercise for the reader.
The data was collected on five different servers over different periods. The date of the auction house change is January 3, 2006.