
Aug 16, 2006



Actually, has anyone ever done that? The MRI thing?

Maybe the biological question is less messy than the cultural one.


Tim said, 'I think it's not wrong to say that some games scholarship has gotten entangled in relatively profitless debates....'

Oh, the irony of that statement at the end of a long-winded diatribe.


I wonder what the software/game design guys mean when they have 'immersiveness' as one of their goals. They must have some identifiable qualities they think are required to provide the opportunity for immersion.

Why worry about academics when it's the architects who would actually test this stuff?


One good diatribe deserves another, Dave.

I think John is pointing at one thing that academics could provide, if one was willing to be tolerant of the nature of academic inquiry. It's not at all clear what different people, especially gamers themselves, mean when they say "immersion". (Which is one reason the MRI study would run into trouble.)

Some people mean that they're involved in "co-creation", e.g., that the game sucks them in and gets them to imagine their own narrative versions of the gameplay. I can think of some games of Alpha Centauri that were like that, where I was doing a sort of parallel form of elaborate story-telling in my own imagination that was enhancing the gameplay.

Some people mean an intense quality of psychological projection into a first-person perspective, a sense of merging oneself into the avatar. (This, incidentally, has been studied quite a bit--just not by the people Varney talks about.)

Some people mean a sense of obsessive focus on the game, that the game somehow creates a total environment that makes the player forget or ignore his real-world context. Some people mean something rather like what makes Lord of the Rings appealing for many, the sense that a gameworld is a "total" environment that can be explored comprehensively and has a sense of stable, consistent fictional reality to it. (Say, Oblivion).

Some people mean several or all of these things when they say they're immersed.

It's also not entirely clear that immersion is always what sells a game or makes it popular, or that all or most gamers crave immersion. A lot of games that are much more social, casual, and non-immersive sell very, very well.


It might be unreasonable to expect a nice, neat definition of "immersiveness" from academics. But it doesn't sound like that's what Allen is asking for -- what he actually said was, "Does any of this bring us closer to an understanding of immersion?"

I don't believe it's fair to read "closer to" as "solve this hard problem for us."

But the larger issue is even more interesting. Other than the focus on immersion in games, what's so unusual about Allen's request that academics justify their work? Everyone else (except artists) has to -- why should humanities scholars be exempt?

"Why does [Allen] want anything substantive from scholars in the humanities ... ?" I would guess it's because he feels that scholars are capable of generating practical, applicable knowledge, and, since they can, they should. I'm not seeing why that's so threatening a view that it needs to be swatted with extreme prejudice.

Maybe it comes down to this: Although games may be both art and artifact, what about game scholarship itself? As a process, should it be considered primarily art or artisanship? Is it inherently worthy of existence, or must it produce something of value to others to be considered worthy of existence?

It would be fine for humanities scholarship to be considered art. Art reveals something of the meaning of being human -- that makes it worthwhile in and of itself; art doesn't have to justify itself to anyone. So academics are free to align themselves with artists... but then they don't get to call what they do a "discipline."

Otherwise, if scholarship in the humanities is to be considered a productive field that exists for something more than its own sake, if academics are artisans (albeit of a slightly more abstract kind than bricklaying), then it's not unreasonable to expect academics -- like all artisans -- to provide something of substantive value to the culture that subsidizes them.

So what's game scholarship for?



Bart: game scholarship is, or ought to be, for helping people to understand complex phenomena and experiences that they cannot understand purely on their own. I mean, read a group of gamers talking about immersion, and often they themselves will recognize that they arrive quickly at the borders of their capacity to understand or describe the phenomenon. Same for developers. That's my thought, Bart, about Allen's piece. If immersion is simple, or Allen already knows what it is in a very certain way, then what's he complaining about? If it's complicated, intractable, difficult, then he shouldn't expect diagrammatic, "scientific" answers, and should be more patient with the roundabout twists and turns of scholarly discourse.

I also do think that he's really looking in the wrong place to find answers about what immersion is: the ludology-narratology debate doesn't pretend to be centrally engaged with that question, so I don't think you can hold it accountable for failing to deal with it centrally. The debate is really more about, "What is a game, and how should we interpret it?" Immersion may turn out to be an important subset of that question, nestled somewhere under the discussion of whether "interactivity" is a central property of games that distinguishes them from all other cultural experiences or texts. But if I was concerned mostly about immersion, I think I'd want to bypass people worried about grand definitional problems and move on to people doing more fine-grained interpretative work on games, or to people doing ethnographic work, and so on (of which there are some). Or just look for scholars centrally concerned with immersion as a concept (of which there are some).


"In order to achieve respectability in scholarly circles, many academics working on popular culture feel an implicit need to distance themselves from the subject of their study."

I actually applaud that distance. If one is truly concerned with study, it's important to keep one's distance in the actual work of research. However, it's equally important to have a broad and intimate knowledge of one's subject. One way to accomplish that with games is, naturally, to be regularly engaged with them, including enjoying them outside the realm of research.

The danger of intimacy is the risk of becoming the subject of one's study, rather than simply its author.


Gosh, this is dumb. Asking "what is immersion" is an awful lot like asking "what is an orgasm". You either have it, or you don't. Now, scholars can then take up a critical stance. What is the subject's *attitude toward* or *relationship with* or *expectations about* or *means of achieving* or 100 other nerdy wonky egghead things like that. It's like Jauani Wu recklessly trying to establish a "Builder's Canon" in SL that would "define immersion" -- and of course that leads to Zhdanovism, i.e. oppressive cultural norms tied to some putative social good.

There are some simple recipes for determining "what is immersion". One is called "Time flies when you're having fun."

Or as Kermit, the inventor of Tringo in Second Life, says, "Time's fun when you're having flies."


There are people for whom questions like, "What is fun?", "What is happiness?", "What is ecstasy?", "What is anger?", "What is epiphany?" and so on are obvious, intuitive questions. Such people should go write novels, or maybe even just go and feel and be and contemplate and not bother much with the rest of us.

And there are people who think about these things, and puzzle over them, and complicate them.

The people who just know may have a tough time telling anyone else how to do what it is that they just know, though. So next time you have to tell a team of 100 people, "make an immersive game", or teach a class of 100 people about what an immersive game is, or write about what it means to be immersed (even in a novel) you may be a bit stymied if you really just know what that is.


There is in this debate, like in so many others, a kind of "bell curve," I think, of reasonable discussion and, eventually, total hogswaller.

On the left-hand side of the curve, we have what Prokofy would say: "immersion is." You get it or you don't. Which is always going to be fine. You cannot argue with taste, with preference, with humor, with enjoyment, with value judgements... with definitions of art. You cannot say to me, "You are not immersed in this." Only *I* can say whether or not I am, and whether or not I am more or less so than in another experience. It's like food: I like my mama's meatloaf best. You can argue your mama's meatloaf with me, but I'm still gonna like mine.

In the middle of the bell curve, you'll have an increasing number of "directional" arguments, becoming more "scientific" and "learned" as we move to the right along the X-axis. You can poll people: "Do you find the use of a mouse or joystick more immersive?" You can measure time spent between bathroom breaks. You can watch for pupil dilation. All kinds of (I imagine) scientific and kinda-scientific "stuff." And we'll find out some obvious things -- like that bad '70s Hammond organ music in a fantasy role-playing game hurts immersiveness -- and we'll find out some not-so-obvious stuff.

And on the extreme right of the curve, we'll have folks who find out stuff, or try to, that's so esoteric it becomes un-operational-izable (sorry, it's midnight and I can't think of the real word).

The point is, to those seeking a point, that defining and then working with the definition will help us understand how to do it *better.*

Now, to Prokofy's very valid point... definitions should never prohibit behavior or limit creativity, but can be useful in terms of guiding thought or keeping us from tripping over the same weeds again and again. We notice, for example, when bad authors or film makers often have not taken the classes or read the books that outline many of the basic definitions and points about what makes for the equivalents of good "immersion" in those media.


As for the question about what humanities games scholarship is for -- I think it is unfair to demand that each and every scholarly "buffleheaded pedantry" paper written on games be of use to game designers and gamers in terms of "understanding what a game is". This tends to be the crux of the problem with the hostility between the sciences/industry and the humanities. Many academics in the humanities who work on games have little interest in "helping" the industry design better games; what they care about, as has been said above, is understanding the cultural and aesthetic meanings of games.

This doesn't mean that scholarship in the humanities is pointless. More than anything, I think the debates about games, over time, will result in legitimacy for games as a "proper" and important cultural object, or even (whisper it) art... (Yes, we already know it is, but let's face it, the majority of people think of videogames as an unimportant and potentially dangerous activity.) It took cinema over 50 years to be appreciated as the art form it is today; only when Truffaut and Godard etc. started writing about directors as artists with visions and ideas did film begin to be seen as a potentially culturally important medium.
I think gamers and designers should appreciate the debates in the humanities, as with increased legitimacy should come increased funding opportunities outside the publishers, more willingness to take risks, etc. -- generally a richer and more interesting spectrum of games than what we are seeing right now.


The problem with "immersion" is that the term has different meanings. Being immersed in a book is not the same as being immersed in a 3D sensaround environment is not the same as being immersed in a game of Civilization IV is not the same as being immersed in a game of WoW. They may all use the word "immersion", but they're different.

Academics who write about immersion using studies in which they get people to wear goggles and manipulate 3D graphical objects have a different idea of what it means to academics who study flow, and for whom immersion in a computer game is just the same as immersion in making model jet fighters from kits.

The immersion that players often cite as a factor in their enjoyment of virtual worlds is different from all of these. Yes, a "seems as if you're really there" graphical image can help people become immersed, but it isn't itself immersion. My own take is that virtual world immersion is a measure of how closely you identify with your character: you're "fully immersed" when your character is you. Therefore, in order to enhance immersion (if that's your goal), you need to help players be/become themselves, which in my theory at least means buying into the Hero's Journey thing.

I didn't find Allen Varney's article all that out of tune, really. Many academics really do look at things so remotely that they miss important details; you really do need to be able to say "come back when you're level 60 and tell me that" to some of them. However, not all academics are that way, and Allen's article seems a little unfair in tarring them all with the same brush.



I'm an academic (grad students count, right?), though not of gaming.

I have to say that I think that 'the definition problem' and the idea of 'critical distance' are overblown issues. From my, admittedly limited, teaching experience over the past three years, the academic's role is that of a mediator between some specific topic and the larger culture. We are the go-between, and we are responsible for explaining specialized material to the average person.

As translators, we do deal with the meaning and significance of particular elements of our topic, but this occurs in the form of a report. These translations of meaning and significance are always imperfect, but what's important about them is that they try to move the terms from one discourse community to another. The fact that no one definition fits every meaning of a term is a red herring. Specific definitions are designed to suit the combination of the reporter's intent, the user's understanding, and the reporter's audience's understanding.

The distance issue arises naturally from the academic's role as an interpreter. If they are really trying to bridge the gap between some specific subject and the learner, they are necessarily a member of neither community.

Anyway, that's my take. I don't think the academic work has anything to do with making better games (their audience is not the subject of the study) but rather with making some specialized field of knowledge and endeavour comprehensible to a larger audience. As a religious studies instructor, I rarely expect my work to result in better religions, and frequently the terms I present to students do not have agreed-upon meanings. There's always a dynamic between my understanding, the religions' understanding, and my students' understanding.


So what's game scholarship for?

I'm in the academy as well. I study games the way a medical researcher studies rats, or a sociologist studies singles bars. The medical researcher is not focused on rats qua rats, but rather on rat physiology as a field in which to examine another phenomenon. And I don't think the bar owner expects to get a useful payoff from the sociologist's study. I study games to understand informal learning in community, to observe how interface designs support that (or not). That might actually be useful to game devs who view community as a selling point in their product and hope to do a great job of supporting an active, vital, dynamic surround for their ingame community. However, my main interest is not in game development as a field. I think a lot of us "studying" games and writing about them consider them a locale where something happens of greater interest than the game itself. And by the way, I'm not interested in defining immersion. It's a moving target.


I know I'm coming to this a bit late, so my apologies.

Anyway, I have a few questions that this discussion has stirred up, at least for me.

Specifically: am I wrong in asserting that the preponderance of game scholars are filtering in from a certain subset of the humanities disciplines -- media studies, literary theory, and certain kinds of sociology? Or is that an incorrect assertion? I'm just basing this on my gut experience of trying to keep up with current game scholarship in my off time.

I have to say, as someone who has been in and out of the commercial game industry over the last decade and who has been making an earnest attempt to understand games, that the fields that see the most written about them seem... well, like odd choices to me - all this assuming that my gut read of the scholarship is correct. I would by no means say that literary theory, for example, has nothing to say about games. But at least in my own experience, I feel like math (especially graph theory), computer science (especially artificial intelligence), economics, psychology, architecture, music theory, and many other disciplines besides have at least as much to offer, depending on the game or games in question.

And perhaps this is why this escapist article makes references to the work on "flow"; if you're inclined to see game scholarship as being heavily defined by work in the disciplines I first mentioned, "flow" might very well stand as a stark example of the type of concept current game scholarship isn't focusing enough on.

I suppose, realistically, that most of these issues are simply social in nature; it seems, up to this point, that scholars from a handful of humanities fields have made the most compelling cases for their game-related work to be respected or at least tolerated by their existing disciplines (which is of course essential to staying afloat as an academic). And, as a result, the terminology used to discuss games (and existing meaningful scholarship) skews heavier and heavier toward those disciplines. For academics, being able to participate in new discussions requires being conversant with old discussions, and most old discussions seem to be in the inherited jargon of a specific handful of fields.

I see the claims every so often, "Why should game scholarship have to be useful to game makers or players? What connection does there have to be? Our job is to understand games, not to tell designers how to do their job." I guess I'm sympathetic to that position in certain ways. Obviously games are fantastically complicated, and there are many different ways to understand and appreciate games.

I think it has to be more complex than this, though. It's sometimes hard to shake the feeling, coming from an in-the-trenches game designing perspective, that an alien group has snuck in, planted their flag, and claimed ownership before anyone had a chance to catch his or her breath and really assay what topics are actually essential to understanding games. In short, it can feel very much like game discussion, at least on an academic level, has been colonized. I'm sure that's not the intent, of course, of academics who are, after all, just trying to make interesting new contributions to the discussion, the participants of which are a fluid and evolving group.

But as the articles and discussions accumulate, and as more and more pre-existing terminology, concepts, and references from certain fields are added to those discussions, it becomes very hard as an outsider to extract any meaning from the conversation, and it becomes very hard for an outsider (who, in many cases, will have a vastly larger background in actual games) to recognize the discussions as being about games at all. It requires an astonishing amount of research just to reach the point where an outsider could safely say, "Oh, right, this definitely doesn't help me understand games in a way that I value. This is the wrong lens, and the wrong focus, and the wrong set of paradigms for me."

Obviously, this complexity of references and terminology seems to be the inevitable fate of any academic discipline, given enough time. Less obvious, at least to me, is what happens when a discipline, such as game making, seems like it straddles so many other disparate disciplines. Can a person say important things about games, intellectually, without knowing about literary theory? Without knowing about graph theory from math? Without knowing deeply about artificial intelligence? Without knowing the current trends in existing academic game discussion? What about when existing academic game discussion relies so heavily on some of those disciplines and not others, for largely historical reasons?


I apologize for hopping in on this conversation late as well. I'm a Californian law school student and avid gamer who happens to also enjoy these sorts of conversations. After rummaging through the internet for a while for articles on gaming theory, I stumbled across this blog entry and felt the need to contribute. With that said, I'm not an academic in the field of game theory and I'm open to the notion that I'm talking completely out of my ass.

First and foremost, I think you need to look at Varney's position in order to understand his critique. He designs games for a living. Dare I say it, but he "immerses" himself in the guts of these things. His relationship with, and frustration with, gaming theorists is not unlike the relationship between a practicing physician and a professor of biology, or, in my case, between a practicing lawyer and a professor of law.

If you've ever been to a professional school, you can probably understand Varney's criticism. The image is all too common in my field: that of a professor, a guy who is practically considered a national treasure in the area of law he teaches, who has written treatise after treatise and tome after tome on his particular area of study, but has never actually practiced it.

I could imagine that sort of thing would be pretty frustrating at times to an attorney. How would you feel if you worked very hard on a particular case, had it published, then had an alleged "expert," who has never even been a lawyer, parse and gut it to fit their own agenda?

Perhaps you'd be honored. Perhaps you'd also feel like the professor was being disingenuous. Personally, I think that intellectualization can only go so far. If you're taking a class on sexuality and your instructor is a eunuch, it's going to affect the nature of the class. At a certain point, you *have* to be involved in order to provide something of value.

As if the image didn't appear pompous enough already, the whole endeavor reminds me of the old subjective/objective debate in anthropology. Back in the day, we had people running around cataloging native cultures like they were plants. When we decided that maybe it wasn't a good idea to treat them in such a way, we graduated from cataloging to observing them as if they were fish in a tank, never stopping to think that maybe the fish realized they were being watched and it altered their behavior.

The lesson learned? There is no such thing as pure objectivism. Your very presence affects the outcome of a study. If you try to deny this when it involves people and their stuff, you will probably piss them off.

Video gaming is a difficult case because it's such a new artistic medium, and Western culture has always greeted these things with a heavy degree of skepticism. Rock 'n' Roll, Jazz, Hip Hop, television, and film all initially started as "trivial" forms of art that took years to become socially accepted. In fact, I'm not even sure Hip Hop is there yet, and gaming most definitely is not. The stigma attached to it exacerbates and legitimizes the desire of those studying it to distance themselves from it. This is why a lot of "gaming theory" reads like 19th-century anthropology to me, and perhaps it is one of the reasons Varney is frustrated. You have to remember that he has a strong sense of ownership over the material that is being theorized.

I do think that Varney thinks game theory has value, or at least has the potential to have value. The standards Varney holds theorists to suggest that, at the very least, he thinks there is genuine potential in this area of study. From his perspective, it seems like he just thinks the field needs to get rid of a few sacred cows.

Speaking for myself, it's pretty hard to deny that there is a lot of interesting material there that says a lot about who we are. I consider game theory legitimate intellectual material in and of itself and not something that needs to or should support the gaming industry. An article about Hip-Hop need not further the agenda of Hip-Hop, if there is one. Every self-examined society should be looking at things like this and if I didn't find this subject so fascinating, I wouldn't be posting here.
