
Oct 29, 2006

Comments

1.

Are you sure the audience noticed the humour...?

*sighs*

2.

Nate>In 1993 Chip Morningstar penned "How to Deconstruct Almost Anything, My Postmodern Adventure".

This is one of my all-time favourite papers. I've read many, many fitting-virtual-worlds-to-some-pet-theory writings in which I understand every word individually but together they make no sense whatsoever, and Chip's attack on them still holds as strongly today as it did then. This is because, sadly, such ivory tower thinking is still as prevalent now as it ever was.

It was in response to a similar wash of up-its-own-backside rhetoric that I wrote A Post-Modernist Essay. To save you the madness of reading it, it consists of sentences and phrases extracted from the proceedings of the 2003 SELFWARE conference, sewn together at random so as to make syntactic and grammatical sense while being devoid of any actual content (which is pretty well how the originals read, too). Every time you read the essay, it generates a new version of itself. This is itself post-modern, and reflects my understanding of where post-modernism must ultimately go; thus, what began as a poke at post-modernism's pomposity ironically became a genuine critique of it. The first line, "This is serious", is actually true.
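For the curious, the mechanism Richard describes can be sketched in a few lines. This is a hypothetical toy, not his actual generator; the phrase pools and the function name are invented stand-ins for fragments mined from the 2003 SELFWARE proceedings:

```python
import random

# Invented phrase pools standing in for sentences and phrases
# harvested from conference proceedings. Recombining them at random
# yields text that is syntactically sound but content-free.
SUBJECTS = [
    "The ludic paradigm",
    "Embodied intersubjectivity",
    "The post-structural avatar",
]
VERBS = ["problematizes", "recontextualizes", "destabilizes"]
OBJECTS = [
    "the hegemonic discourse of play",
    "the authorial gaze",
    "late-capitalist simulation",
]

def generate_essay(n_sentences=3, seed=None):
    """Return a freshly stitched 'essay'; a new version on each call."""
    rng = random.Random(seed)  # seedable only for reproducibility
    sentences = ["This is serious."]  # the one true statement
    for _ in range(n_sentences):
        sentences.append(
            f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)}."
        )
    return " ".join(sentences)

print(generate_essay())
```

Each call (absent a fixed seed) produces a different "version" of the essay, which is the self-regenerating property described above.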

I'll stop now, in the knowledge that most of the previous paragraph is just garbled nonsense...

Richard

3.

"Are you sure the audience noticed the humour...?"

What?! Are you suggesting the Emperor is wearing no clothes?

As a recent escapee from that sort of blather, I am sure that most of the audience did NOT notice any humor.

4.

@Richard:

On a blog where we frequently deconstruct the living shite out of various esoteric gaming concepts, it's odd to be bashing another academic discipline, let alone one that has proven -- in many cases -- incredibly useful for the understanding of everything from politics to marketing to economics to fashion.

Yes, postmodernism and deconstructionism can be taken too far. In those extremes, examples range from the oblique to the hysterical to the terrifying. But so too can almost any "-ism" be taken too far if not leavened with many grains of salt.

Chip himself says in his essay:

"Buried in the muck, however, are a set of important and interesting ideas: that in reading a work it is illuminating to consider the contrast between what is said and what is not said, between what is explicit and what is assumed, and that popular notions of truth and value depend to a disturbingly high degree on the reader's credulity and willingness to accept the text's own claims as to its validity."

Well, that's one layman's decent boil-down of deconstruction, sure. But then to go on to claim that the entire field of post-modern literary, social, psychological and historical criticism...

"...would make a worthy topic for some bright graduate student's Ph.D. dissertation but has instead spawned an entire subfield. Ideas that would merit a good solid evening or afternoon of argument and debate and perhaps a paper or two instead become the focus of entire careers."

... is like someone coming into your back yard and saying, "Well, geez, Richard. Why do we need more than one fantasy MMORPG? It's all just a bunch of elves, dwarves, and dudes in tights, ain't it? What's the diff between Everquest and WoW anyway? And how come we're still arguing at all about avatars vs. text? Text is just dumb, right? We're so over that. And it's all just a bunch of games anyway, isn't it?"

Before postmodernism and deconstruction, even the *idea* of a discussion about the importance of gaming or popular culture or entertainment within society would have been laughed off by academia. Without postmodernism, what you do doesn't count, Richard. Neither do I, as I'm in marketing. Without deconstruction, the idea of the importance of the content of our VW/MMO worlds and the contributions of the art of such things as animation, comic books, anime, advertising, manga, machinima, popular music... even whole genres like sci-fi, fantasy, mystery, romance, rock-and-roll, blues, musical theater... even whole media like film, radio and TV... they would still be written off as unworthy of study.

Yes, there is bulldookey in the weeds, of course. You get that in any field. Not just in academia, but in the commercial world (we call it PR), and politics (spin) and religion (dogma). But, as movements go, deconstructionism is pretty important to how we all view the current universe of content, art and participation.

I've heard it said again and again, here and in other gaming arenas, for example, that the "particulars" of an MMO/VW aren't, for example, the most important thing when it comes to their participatory nature. The idea that working together, guilding, grouping, etc. builds new skill sets is what is truly important. We've had posts here about dating in MMOs. About how skills acquired online can transfer to real world managerial activities. About political issues transferring into games. These are all "postmodern" discussions; deconstructions of the game-space as more than just "when an orc meets an elf, going through the rye."

You can't have it both ways. Either postmodernism is very important -- in which case, VWs and MMOs get to be more than "just games where people pretend to be goofy fantasy critters banging on each other (WoW) or bugger each other senseless in a visual chat room (SL)" -- or it's not, in which case we revert back to the previous methods of literary and social criticism, which say, basically, "A thing is what it is. Period."

Is the Horde evil? Do people "fall in love" online? Are 2/3 of the questions posed here worth discussing at all? If so, face facts... you're a postmodern deconstructionist in your own field. So stop throwing stones.

@Nate: I'm not sure what you're asking.

5.

Andy: The idea that working together, guilding, grouping, etc. builds new skill sets is what is truly important.

Hardly a breakthrough for humankind... It is more interesting that so many who spend so much time in them end up feeling dissatisfied. Let's face it, they would have been better off collaborating on Wikipedia and open source software.

That said, virtual worlds are a gift to the postmodernists...

6.

@ Andy

Well done! I can't believe though that you actually have to explain these things to people on this blog.

7.
'I don't know what you mean by "glory,"' Alice said.

Humpty Dumpty smiled contemptuously. `Of course you don't -- till I tell you. I meant "there's a nice knock-down argument for you!"'

`But "glory" doesn't mean "a nice knock-down argument,"' Alice objected.

`When I use a word,' Humpty Dumpty said in rather a scornful tone, `it means just what I choose it to mean -- neither more nor less.'

`The question is,' said Alice, `whether you can make words mean so many different things.'

`The question is,' said Humpty Dumpty, `which is to be master - - that's all.'

'Nuff said.

8.

@Bill:

Yes, yes. We've already admitted that, at its extremes, postmodernism and deconstructionism and lots of other -isms can be laughably troublesome.

Does that mean that pure science is always right? Always best? Always and only what we need to get by?

The way of thinking you are, I guess, "'nuff saying" would completely eliminate, oh... let's imagine... metaphor. Right? Because a metaphor compares two things that aren't logically related. We would also get rid of cultural context and any level of moral relativity, too. Because "a thing is only the one thing what it is."

A banana is only and ever just a banana, Anna, eh?

The first time I read "The Red Badge of Courage," I was in the ninth grade. I knew nothing about it besides I had to read it. OK. No big deal. I read it, took the "comprehension test" to make sure I'd read it, and that was that. Years later, I had to teach it. So I went back and read it again, this time with the help of several study guides and a "teacher's guide" at hand.

What a difference, eh? Crane buries all kinds of interesting metaphor and reference and allusions in that book -- he uses color descriptions, for instance, to mirror the main character's mood. It is also very interesting to understand the historical context of the time in which he wrote the book; the fact that most war stories of the time were distinctly "glory focused," and the idea of a novel that featured an anti-hero was decades ahead of its time. Although I couldn't bring in "Apocalypse Now" to watch in the public school where I taught, I compared the book to that film; a very gritty, realistic depiction of war that much of the public was not ready for.

What I just did there was "postmodern deconstruction." A mild form, to be sure, but the form just the same. What's the big deal? Without deconstruction, you only get to say, "It's a war story about a guy that ran away in a battle, and then didn't."

Why isn't it OK to look at a piece of art or poetry or a film or even (gasp) a game from the point of view of another academic discipline? Say... women's studies? Or sexual roles? I think we've done that here before, haven't we? We've talked a bunch of times about cross-gender avatars and why we choose that or not. Ding! Deconstruction.

Yeah, you can put one over on a crew of self-involved, highly specialized folks who have their own inbred, linguistic foibles...

Here's a scary question for y'all, though... Anybody here want to bet that a good scammer from the other side of the fence couldn't get a fake article published in a game magazine by cutting and pasting a bunch of highfalutin' phrases in the same way Sokal did?

9.

Andy Havens>Before postmodernism and deconstruction, even the *idea* of a discussion about the importance of gaming or popular culture or entertainment within society would have been laughed off by academia.

It still is laughed off by academia. I speak as someone whose planned Online Games degree was recently canned, with one of the excuses offered being "we don't want people to be put off coming to study telecommunications when they see we have a degree in online games".

>Without postmodernism, what you do doesn't count, Richard.

What I do counts regardless of whether someone wants to deconstruct it to oblivion or not. It doesn't need the approval of a deconstructionist point of view to count, any more than I need such approval to think.

>Without deconstruction, the idea of the importance of the content of our VW/MMO worlds ... would still be written off as unworthy of study.

With deconstruction, they still are.

Richard

10.

HTML fix. Check.

11.

Ah, Michael reconstructed the deconstructing HTML! Way to go!

Richard> It still is laughed off by academia.

Well, if not laughed off... they do get a very blank facial expression and change the subject. And I don't blame them.

12.

I'm currently reading a pretty good book: David Lodge's Small World, published back in 1984, which nicely skewers structuralism and academic conferences in an amusing way. The Sokal Affair is probably the best-known clash between the sciences and the jargonizing wing of the humanities.

That said, personally, I find this whole "Mr. Smith goes to the École Normale Supérieure" genre pretty tiresome. As Garber says in Academic Instincts, sometimes jargon is obfuscatory, yes, but sometimes it's just shorthand: a streamlined communicative form for material that ought to be streamlined. Done well, academic jargon is analogous to Perl.

13.

@ Andy
here is an interesting view on the Sokal affair that may be interesting to you, given that you seem to seek a balanced approach:
http://www.mathematik.uni-muenchen.de/~bohmmech/BohmHome/sokalhoax.html

@ Richard
So whose fault is it that game studies have such a weak reputation? Maybe the people who laugh at Foucault & Derrida in ignorance are the same as the ones laughing at you?

14.

Glad you stepped in there, Andy. I'll bet nobody reads "The Red Badge of Courage" anymore, let alone "Lord Jim".

Small wonder tekkies don't appreciate the humanities and ridicule them, never read novels or philosophical essays, scorn the great books or ideas, and only have brief "serious brushes" with them. What passes for "the humanities" these days in universities is a post-modernist word-salad, self-referential and impoverished. You'd have to reconstruct the humanities and re-present them to tekkies to get them to re-engage.

Oh, wait, somebody did that -- in all those elaborate legends and quest narratives in online games.

15.

greglas: Done well, academic jargon is analogous to Perl.

I guess that explains why Perl is renowned for its poor syntax...

Richard Dawkins has his say on this too of course, in his review published in Nature...

16.

Yes, and it's probably no coincidence that I find Richard Dawkins pretty tiresome too.

17.

It's funny... Because it boils down to "What's good is good, and what's crap is crap," doesn't it?

It's just that some people really, really like crap. Really good crap, anyway. ; )

Richard: I hope you know I meant no offense; I don't think anybody needs any permission, overview, oversight, etc. to do good work -- and yours certainly qualifies in my book.

But I didn't mean that a postmodern review or deconstructed reading of games and their play is necessary in order for them to be taken seriously by any modern scholar in any particular respect. I meant that without the historical precedent of the movement itself, it is likely that we would not have anything approaching fields where popular culture could be studied in the ways it is today. And that includes any kind of support (limited though it may be) for game theory. I think it's a damn shame that there isn't more support for game studies in academia.

But throwing stones at postmodernism is probably not going to get grant money for an endowed MMO/VW chair... since the idea of formally studying the social, economic, art, literary, psychological and sexual integration of games in our society is going to have (if successful) LOTS of fans in the postmodernist movement.

Saying that you want to study these things in games, but don't want to do so with the postmodernists (for whatever reasons) would be like somebody from another (let's say literary) field coming to you and saying, "I want to study everything about MMOs... but leave out all the silly Tolkien-esque fantasy crap and all the sci-fi garbage and giant robots. Because all of that is based on childish, genre fiction that isn't taken seriously by any real, mature authors. So... besides those MMOs... what've we got to talk about?"

It's a real force in the academic world. And things like "The Hero's Journey" that we're all fond of talking about in games are also incredibly prevalent in deconstructionist talk. In fact, arguments about the role of THJ in games are a type of deconstruction; an anthropological form.

It's not all bull-dookey. Taken to an extreme? Sure. But then again, playing WoW for 80 hours a month is pretty kookoo-bananas. We get really tired of reading the mainstream media reports of that one guy who played so long that he died of dehydration, and the "Everquest Widows" and the kids who reenact D&D scenarios and hit each other with flails... they are the aberrations. The low end of the Bell Curve. Right?

I hope.

Same for postmodernism. I spent four years as a lit student "deconstructing" poetry and fiction and essays at a pretty good college. Never ONCE in that time did I come across a prof who fed me something so deep in the hoopla that I went, "Geez. That's just bug nuts." Some stuff I disagreed with, sure. But that's part of the fun of scholarly discourse, right?

We do that here, too, eh?

18.

Andy:It's funny... Because it boils down to "What's good is good, and what's crap is crap," doesn't it?

No, I don't think so. To quote Wikipedia: The analytic tradition values clarity of expression, whereas Heidegger thought that "making itself intelligible was suicide for philosophy."

Now, there are at least three reasons for being obscure:

1. The author is struggling with a concept that feels important, but is hard to crack open. Such as Heidegger's topic "being".

2. The author is exploring the border of a field, but somehow forgets that he is actually using metaphors as a device, creating an analytical mess for the reader.

3. The author wants to get published and has to work in the right references, even if they are completely superfluous. Unfortunately this obscures the interesting points he actually has, but that's the way of life.

Type (1) obscurity is a necessity, but (2) and (3) are not. What is unfortunate is that type (1) rhetoric paves the way for the acceptance of type (2)/(3) obscurity.

Someone above pointed to an essay which stated that physicists themselves pointed out analogies between QM and society. Sorry, but that is to be expected. If you look hard enough you are going to find "evidence" for anything. That professionals think about the world in terms of their own work, which they master, is typical. That is insufficient to undermine Sokal's critique.

I also reject the idea that technologists per se are mysteriously incapable of appreciating postmodern ideas. Proof: Nate Combs is without doubt Terra Nova's most abusive postmodernist. He keeps on wreaking havoc on Computer Science in every other post... (I assume he is a technologist, as I doubt he would be able to wreck it so thoroughly if he wasn't.)

Of course, postmodern ideas are important for society. A pure modernist comprehension of the world would, for instance, define homosexuality as a fault, a flaw... The role of a programmer requires a modernist mindset, but programmers can be artists too...

19.

Ola: No, I don't think so. To quote Wikipedia: The analytic tradition values clarity of expression, whereas Heidegger thought that "making itself intelligible was suicide for philosophy."

Making itself intelligible may well be suicide for wikipedia. Speed the day!

Of course it's bogus. Everything to do with leisure is bogus and that includes art, literature, games, TV, and pleasant walks beneath an azure sky. The thing is, though, bogosity in these cases doesn't imply worthlessness even to the strictly modernist outlook, since all of them are productive in that they become commodities and generate wealth (hah, literature is a form of economics, people!).

As an engineer, it's only by accepting that bogus concepts can have value in and of themselves that I can ever live with my lit-crit-loving wife.

20.

?>So whose fault is it that game studies have such a weak reputation?

It has to be someone's fault?

There are a number of reasons that game studies have such a weak reputation, one of which being the simple fact that we haven't studied them for long enough to build up a corpus of knowledge that says anything of consequence to any discipline beyond our own. However, there are many other reasons, two of which especially hurt:
1) Scientists (physical and social alike) see games as tools for helping them advance areas of knowledge within their own field, but don't see games as anything intrinsically worthy of study. A sociologist might get a PhD and a dozen books out of studying game players, but whether what they say would help a game designer make a better game is not a concern to them.
2) Researchers in comparative studies see games as being just another aspect of their own field, whether that be narrative literature, performance, oral storytelling or whatever. Games aren't of interest to them per se because they see them as a mere special case. Again, this may help them validate their theories but it doesn't directly make for better games. I'd fit post-modernism in here: yes, it can be used to explain virtual worlds, but can it do so in a way that makes any contribution to our getting better games?

For me, as a game designer, I want better games. I'm prepared to listen to arguments about what "better" means, and what "games" means, but ultimately it's the games I'm interested in. Can I write a better novel having studied narrative structure? Almost certainly. Can I write a better novel having studied post-modernism? Only marginally so (unless the novel is about post-modernism). It's the same with games: I want to see better games, not better explanations of how games fit into a one-size-fits-all theory.

I suspect, however, that the reason games studies has a weak reputation is nothing to do with games studies and everything to do with how people view games. Games are "for fun" and "trivial". The very fact we have a Serious Games Initiative suggests that other uses of games are not serious. If they're not serious, why would any self-respecting academic wish to be associated with them? That's our problem.

>Maybe the people who laugh at Foucault & Derrida in ignorance are the same as the ones laughing at you?

Maybe they are, maybe they're not. What difference does it make?

Richard

21.

Andy Havens>Richard: I hope you know I meant no offense; I don't think anybody needs any permission, overview, oversite, etc. to do good work

No offence was taken.

>I meant that without the historical precedent of the movement itself, it is likely that we would not have anything approaching fields where popular culture could be studied in the ways it is today.

I agree. That's not to say we couldn't have studied popular culture in some other way. For example, had people developed theories of game a couple of centuries ago, we could be looking at all popular culture now in terms of games and games-within-games. That may or may not have been a useful way to go about it, and the same can be said of post-modernism.

>I think it's a damn shame that there isn't more support for game studies in academia.

I'm ambivalent. I want to see games studied, but I'd prefer it to be by people who understand games, not people who understand some other subject. Too many times have I seen flocks of academics descend on games, prognosticate on them, then fly away as if the whole field were now solved.

What I want are better games. The more people study games, the greater the chance we'll get better games, however if some bandwagon starts rolling that takes us off in some wild and less-than-useful direction then what I mean by "games studies" could bear little resemblance to what "games studies" has become.

>But throwing stones at postmodernism is probably not going to get grant money for an endowed MMO/VW chair...

Nothing is going to get such money, therefore I can throw stones as much as I like. It's like saying I shouldn't say rude things about Sarah Michelle Gellar because then she won't have my babies: it's not going to happen either way.

To be honest, what annoys me about post-modernism isn't so much the theory itself as the sheer smugness of many of its proponents (present company excepted). It's as if someone has come up with the concept of colour and whenever you make a pot or a shed or a hang glider they point out that oh look, it has colour. Yes, I KNOW it has colour, but how does that help me make it? I KNOW I can deconstruct games - I knew that before I'd heard of post-modernism - but how does that help me make games? It's not the fact that I CAN deconstruct them that's important, it's how I DO deconstruct them. It's not the fact that a pot has colour which is important, it's what the colour IS.

>In fact, arguments about the role of THJ in games are a type of deconstruction; an anthropological form.

Yes they are, but so what? I can call you a collection of atoms and be absolutely correct, but that doesn't mean I know what you feel about toothpaste.

>I spent four years as a lit student "deconstructing" poetry and fiction and essays at a pretty good college.

You spent four years analysing it, looking for symbols and levels of meaning and a great deal more. You could have done that without someone slapping the "deconstruction" label on it, just as people did before post-modernism came into vogue. Or are people now able to say things about poetry that they couldn't say about it before post-modernism's arrival? And, particularly, are people writing better poetry as a result of their acquaintance with the theory?

Richard

22.

Making itself intelligible may well be suicide for wikipedia. Speed the day!

:-) Actually, some of the sections are unbelievably good and up-to-date for an open collaborative effort. Correctness? You shouldn't trust what you read, no matter what the source is.

all of them are productive in that they become commodities and generate wealth (hah, literature is a form of economics, people!).

You ARE American.

23.

Richard> And, particularly, are people writing better poetry as a result of their acquaintance with the theory?

My answer to that is a tentative "yes" -- see Ken Koch. But was Koch postmodern? Depends who you ask.

"Postmodernism" is a big word, in fact so big that it is well nigh useless except as a catch-all for many interesting theories that post-dated modernism and tried to get past its universal theories and towering figures. And I think the current meaning of "deconstruction" is more closely associated with "cultural relativism" than it is with its original origins as a matter of hermeneutics and semiotics.

But that said, when I think of the broad range of visual artists and authors who might fit under the "postmodern" heading (and again, that category is much too broad to be useful), I do see a lot of "better" stuff there, Richard. And actually, I think we can pretty easily trace the influences of those folks into the domain of games, in terms of some basic anti-modernist markers (e.g. pastiche, eclectic visual symbols, disjointed narrative structure, anti-foundationalist tendencies.) In fact, my personal reading is that artists like Duchamp and Warhol -- heck, even "modern" authors like Melville and Conrad -- had sensed out the structure of postmodern theory before the critics got there.

The best critique of Derrida & Foucault that I've heard is that they overstated the newness of their theories and arguments. I actually think that's a fair critique as applied to some discussions of virtual worlds.

24.

greglas I think we can pretty easily trace the influences of those folks into the domain of games, in terms of some basic anti-modernist markers (e.g. pastiche, eclectic visual symbols, disjointed narrative structure, anti-foundationalist tendencies.)

Then I think you should do the tracing if it is so easy... Which classical games would not have existed if their authors hadn't been inspired by postmodern readings? Walking in the park is a disjointed narrative structure too, does that make it postmodern?

In fact, my personal reading is that artists like Duchamp and Warhol -- heck, even "modern" authors like Melville and Conrad -- had sensed out the structure of postmodern theory before the critics got there.

Hey, if you include Duchamp, you may as well include all of dadaism and surrealism, and by virtue of pictorial resemblance, Hieronymus Bosch. Before you know it, postmodernism predated Aristotle.

25.

And while we are at it, why not include Jesus Christ and George Berkeley in the fold of early postmodern thinkers?

26.

Ignoring your sarcasm, that's my point. Postmodernism is not well defined (see the link above that lists Koch among the postmodernists). So attacking postmodernism is simply attacking a catchall phrase for various responses to modernism, many of which incorporated important insights from both modernism and pre-modern figures.

27.

But it isn't difficult to define Postmodernism for a particular context. Sherry Turkle did, IMO, a good job in relation to computer software in "Life on the Screen". Her take on it made sense and was useful, even for designers!

The trouble starts when Postmodernism is used to distinguish between us-and-the-others as an eclectic academic symbol. You obviously end up with something messy then... I am sure many artists are inspired by postmodern ideas, but when I look at work from those artists I see more modernism than not: minimalism, cubism, colourism... This is where things go wrong classification-wise: creative influence does not imply membership for the output.

I might be inspired by Mozart when I create pop music, but that doesn't make my output classical music.

28.

Ola: "I might be inspired by Mozart when I create pop music, but that doesn't make my output classical music."

Yes, but if that's true for you, then to say, "Classical music is crap," would be disingenuous, wouldn't it? Or to say, "All those people who spend all their musical lives listening to, studying and playing classical music are idiots. Only pop music is truly worthy!"

I've got absolutely no problem -- zero, niente, nada -- with the statement(s), "Postmodernism and deconstruction can be taken too far, can be made to be ridiculous, can be faked, can be absurd and are, at the extreme, goofy." Not a problem in the world. Go for it. You can not only argue those statements, you can probably prove them.

But I'm also going to argue the statement, "Postmodern and deconstructive ideas have provided more interesting and artistic fodder for creative growth in the last century than many other cultural and social theories."

Ezra Pound, one of my favorite poets and editors, wrote some absolutely gorgeous "classically modern" poetry. Beautiful, accessible stuff (if you spend some time with it, especially). He also got up on tables at poetry readings and ate dried flowers as performance art.

The latter was meant to be a self-deconstructing act. It was, in its time, incredibly important. He was a well-known, well-respected writer. The shock value of doing something "ridiculous" and "meaningless" in a forum that had been reserved for a medium in which he was a recognized expert was a statement that, he believed, said something about the nature of a world that had changed radically, largely due to the First World War. That's a gross oversimplification... but there you go.

If I got up in Starbucks and ate dried flowers? Not so much. And today, the idea that a "poetry reading" is reserved for quiet, contemplative, tea-drinking aristocrats is not a socially imprinted ideal.

That's part of the point of deconstructionism; that the context is at least as important as the content. McLuhan said it another way: "the medium is the message." He wasn't talking about specific communications, but about societal movements as a whole.

Could many of these ideas, as Richard says, have taken place without the Postmodernist Movement per se? Of course. And many elements of it were taking place for thousands of years before people said, "Hey, let's call this crap deconstruction."

It is not, in many ways, ridiculous, to call Christ a deconstructionist in relation to His review of the Old Testament laws. He read them in new ways, applied them in a changed context -- with Himself as the center of their fulfilment and nexus of their new interpretation -- and gave His followers a New Testament; a new reading of them all. That's a pretty close metaphor for deconstruction; breaking apart old texts that have ossified and reassembling them based on more appropriate, meaningful constructs.

"So what?" I can almost hear Richard saying. If remixing meaning based on new ideas is what's at the heart of deconstruction, why even call it a movement or a field of study? Isn't that just another way of defining history? Because isn't history a series of steps where that happens again and again? What's so important that justifies an endowed chair? ; )

What's important is that in almost every phase in history, the idea of "this is what is right, what is normal" ends up being enshrined; that is the heart of most movements -- the old definition of "IS" passes away, and the new definition becomes king. Postmodernism does not enshrine any one definition, but the ideas of transition, context, flow, perspective, etc.

So when I say something as simple as, "Andy really enjoys playing computer games as his main entertainment," someone might reasonably go on to ask questions like, "How old is Andy?" or "What kind of games?" or "In what year is this statement made?" or "Is Andy male or female?" before making the judgment call, "Therefore, Andy is a giant geekoid."

I'm not, by the way, a fanatical, giant fan of postmodernism as a "movement." I just like some of the tools in the set. It's like, for me, psychoanalysis. There's some good stuff to learn there, but I don't want to set up house. The tagline for my blog is, in fact, a paean to an era of pre-postmodernism: "Creative flux for our heap of broken images." The idea that it might be important to know how to put s**t BACK together, not just pull it apart. The "tinker" in "TinkerX," the name of my blog, is a reference to the old craft of the tinker... someone who fixes broken things and makes useful "stuff" from broken, useless-seeming bits.

So... while I am a fan of deconstruction as a way of observing the universe... I'm not always a fan of the results. But it is also not... useful... to deny that it has had, and still has, a major influence on our world.

29.

I was going to just jump on in here, but I see that Richard has (as is so often the case) already done a lot of the heavy lifting.

I'll just add that the core of my beef with the academic humanities is not that they necessarily produce a higher proportion of foolishness than any other activity (they may, but that's beside the point). It's that they've been far too successful at establishing institutional arrangements to insulate themselves from the darwinian workings of reality. Consequently what value they produce is buried much deeper in crap than it is in my world. This hurts us all.

30.

Chip's paper about 2Cyberconf is significantly superior to my archived thread on the topic of "Litcrit vs. Engineering" as I saw it then, but it is definitely fodder for this group.

The paper/thread is archived on ibiblio as:

The Second International Conference on Cyberspace:
Literary Criticism Collides With Software Engineering

by F. Randall Farmer

I'm a bit older now, and more politic - but the issues were enough for me to withdraw from the xCyberconf community of conferences and look for more folks that would be helpful as things were actually being built.

31.

*nods at Andy* Well, to be honest I find Mozart's music rather boring... but it doesn't really matter. You are right about the tool-approach, because really, even if Freud is 100% wrong, one might walk down new avenues and see new perspective by trying to do a Freudian analysis. It doesn't really matter how you got there, the fact that you got there and found new perspectives is what counts. It's when Postmodernism is posed as the One-True-Way and a Revelation it becomes tedious... The Church of Postmodernism, so to speak.

I've been trying to think of Postmodernism in classic games, but ended up looking for surreal aspects... and I don't think these game creators were inspired by Postmodern ideals?

Case 1: My favourite MUD entrance is that of MUD2 (and partially MUD1). You arrive in an English Tearoom and I think you then entered the world by.. ehm... Sipping Tea! It didn't work that way, when I tested it yesterday, so either my memory is wrong or perhaps it has been changed.

Case 2: Jeff Minter's shoot'em'up Attack of The Mutant Camels, which is silly in a rather British manner.

Not the best examples... I hope someone can point out the Real Postmodern games for me?

(I did pick Jesus and Berkely because they opposed zealous "objectivists" in their own time, but as you point out that is just the pendulum of history doing the swinging. Hard objectivism is easily transformed into anti-human ideals, creating openings for heroes ready to oppose that.)

32.

@Chip: By day, I'm a marketing guy. I've done it in retail, B2B, advertising, direct marketing, merchandising... a bunch of areas. It's as "darwinian [a] working of reality" as you can get. Survival not just of the fittest, but the fastest, meanest, orneriest SOBs in the market. Bare-knuckle capitalist art at its best/worst. There is no talking yourself out of a shoddy sales-cycle. There is no BSing a bad campaign. You can't talk your way out of your numbers. The only definition of *good* in marketing that I accept is "did it work?" If it worked, it's good.

But...

By night, I'm a poet. Where metaphor is king. Where the sound of words matters, as well as their meaning. Where meter and rhythm are important and where the choice of your next word depends on the last and the next for many, many reasons.

And, guess what? Those worlds actually have a lot in common. Because the *path* to useful, good marketing that works often runs straight through the garden of metaphor and poetry. Understanding one is incredibly helpful at knowing the other.

The reason some of the "less darwinian" humanities studies are a bit more protected within the confines of academia in many cultures is because they are *not* supported much (or at all) in other environments -- such as the economy. But there is an understanding that an overall support of blended knowledge and art is good for the complete health of the culture.

Nobody will ever pay me for my poetry. I understand that perfectly and completely. And it doesn't bother me in the least. Yet I *protect* this "non-darwinian" activity of mine jealously, because it will make me a much better hunter-killer overall.

Those engineers who take the time to get to know a couple crit-lit types as undergrads... who befriend psychologists, sociologists, or poets... who take a few courses outside their majors... who go to conferences and expand their horizons and even accept that maybe a freaky-ass world view, while not their own, may be just as valid as theirs... these engineers may come out *better* at what they do.

Because we (as a whole) are the *environment* not an individual species. Virtual worlds are, I think, big enough to be important to lots of different people for lots of different reasons.

Are some of the lit-crit types full of their kind of crap? Yes. Are some of the software engineers only interested in the software and none of the social implications? Yes. Are some of the VC types only interested in making a buck? Yes. Are some of the Red-Light types only interested in having sex online? Yes. Vegas-types only want to gamble in the VWs? Yes. Tolkienites only want to get their elf on? Yes.

Lots of views, eh?

Writing and talking about how they all intersect is fascinating... just fascinating... ; )

33.

Randy --

Thanks for that slice of virtual worlds circa 1991. It sounds like that conference was pretty awful.

I was going to write something in response to Chip's statement about "the darwinian workings of reality," but the point of it seems to be promoting disengagement with those of us who find substantial value in the "academic humanities."

I'll just give up on this thread...

34.

Richard Bartle wrote:

It was in response to a similar wash of up-its-own-backside rhetoric that I wrote A Post-Modernist Essay.

That is the funniest thing I have read in a long time.

--matt

35.

Everyone, Greg >

Well.

I was hoping that this discussion would have gone a different route (see the 2nd half of post). However, bested by Chip's words from over a decade and a half ago...

I'm guessing one wouldn't have to dig very deep in the TN archive to guess where my instincts lie here. Yet to Andy I give my largest handshake: what a marvelously spirited and eloquent defense of the alternative viewpoint. Perhaps he's mad, perhaps I am. I don't think that is the point. Played out in his sporting defense comes a rambling though intricate exchange of ideas. I'm sorry Greg is exhausted by it.

I often think that what goes on on TN, at its best, is like what goes on at game developer conferences.

IMHO, game developer conferences are the last best example of enlightened amateurism left in the professional sphere. Seemingly they fly against the forces governing every other endeavor, academic or professional, where talent and energy have become increasingly specialized, forming stove-pipes of conversation. A modern world requires specialization, and perhaps these forces, too, will eventually drive game and virtual world development...

The level of discussion in game dev circles, relative to their referential established discipline, has by and large seemed to me amateurish, even if gloriously so. I've been to many presentations where I thought them, at least on some level, a thin but exuberant gruel of some sort of pop - pop-psychology, pop-economics, pop-engineering, whatever, mysteriously mixed.

Yet, together they can sometimes form a brilliant and illuminating synthesis.

However a synthesis requires more than one part, more than one participant, more than one language. Better this blog, this discussion for them all.

I believe, I guess, the heart of a game is the heart of an amateur. From such comes creativity. Chew that one over.

Have fun!

36.

Nate, I've got no problem with anything you've said. I, too, am generally delighted by listening to game designers (and other artists) pull in ideas of all shapes and sizes to explain what they think they are doing or need to do. I'm often a passionate amateur myself in a variety of areas. Indeed, as Garber points out in her book, we're all amateurs when we're at our most interesting.

What I'm getting from Chip and Randy (and to a lesser extent, Richard), though, is hardly the endorsement of an interdisciplinary eclectic or amateur approach to learning. Instead, it's pretty much the smug pillorying of the "word salad" of "postmodernists" and "deconstructionists" (in Chip's case, the whole of the "academic humanities") by those who consider themselves more adept at "the darwinian workings of reality."

Yes, actually, it is tiresome trying to respond to that kind of thing. Should I start by defending the enterprise of the "academic humanities"? I don't have time for it.

37.

Hats off to Andy for an incredibly well-voiced statement showing, as I see it, how our stance toward all of these perspectives must ultimately be pragmatic rather than dogmatic.

I continue to be surprised at the excesses one finds on all sides, some replicated here. No one approach to accounting for the phenomena of our world has a corner on the market; their incompleteness is inescapable. (Sadly, hubris is also quite widely in evidence.) Yes, the play of meaning in discourse is consequential in human affairs, but it's not all there is, so it is not always determinative. Yes, the material conditions of the world are consequential in human affairs, but they, too, do not account for all we see. And yes, the strategies and tactics of power relations (surveillance, biopower, discipline) are consequential in human affairs, but they, finally, cannot account for all we see. Why this leads some to think we are stuck in a postmodern dilemma of hand-wringing is beyond me; we can still develop sound arguments, marshal evidence (broadly defined), and make reliable claims (though the ways of doing so will not be unitary). So we don't have a warm blanket of transcendent certainty to wrap ourselves in -- too bad. That doesn't mean we can't say anything.

And as for Darwin, he is the perfect person to invoke here, but not for the reason you think, Chip. It is as unfair to him as it is to Adam Smith to hold them up as trumpeting some version of truth through unfettered competition. Darwin saw no inherent progress in evolution, no crucible of competition which necessarily yields the better. He saw what we so easily forget as we reach for the comforts of certainties: that the world is a messy, unpredictable place where there is no grand plan, and which will never be completely tractable to human control. Does that mean he was a postmodernist? No (though perhaps a poststructuralist, I guess). It means he was humble about the capacities of human knowledge. Everyone is well served to read him again from time to time.

I agree, Nate, that Terra Nova is as promising a place as anywhere for the productive discomfort that ideally appears when divergent opinions but shared goodwill meet, but at times I'm left wondering about that optimistic view. Perhaps Jake was right when he replied, "Isn't it pretty to think so?"

38.

I hope that nobody is arguing against the humanities in principle. Clearly human culture produces art, literature, music, and myriad other forms of communication, just like culture has produced mathematics, logic, and science.

Confusion, defensiveness, and anger arises when attempts are made to assert that different cultural tools are equally useful (or valid, effective, correct, etc) in all domains. Given the contentious nature of this thread, let me try to keep things civil by choosing a safe, calming example.

So, let's talk about religion. Specifically, old books.

Given that we have various examples of religious writing going back thousands of years, clearly there are a host of tools that might apply to studying the old books. Linguistic analysis could help understand where and when they were written, how they have changed over time, how transcription errors were impacted by printing, or the impact of changing cultural mores on translation choices. Carbon dating could help date specific examples. Studying old books as literature could help demonstrate how they influenced (or failed to influence) other writers throughout history. Cognitive science and neurobiology could help explain why so many people find these old books compelling. Art history could help track how stories from the old books spread across cultures. Anthropology could understand the impact of that spreading. All of these approaches generate testable and falsifiable assertions about the real world. In each of these cases, appropriate tools are applied to maximize understanding.

Because understanding is what scholarship is all about. Despite its well known liberal bias, the real world does allow for falsifiable assertions to be tested. Pronouncements without testing simply become dogma, which does nothing to advance understanding, and problems come about when someone insists that it does. It would be like relying on the old books to explain events in the real world rather than biology, physics, mathematics, sociology, or economics! The tool kits offered by those fields do a demonstrably superior job of explaining the real world, so it shouldn't come as a big surprise when proponents of old books respond with anger and derision, and attempt to avoid the discussion. Defenders of old books who generate the least coherent arguments may even achieve a local maximum of defensibility.

To return to the topic at hand, it is important not to choose dogma in other aspects of human understanding. Music is unlikely to be the best tool for understanding embryonic stem cell development, but the really cool thing is that if for some particular case it is, the process of testing those assertions will surface that fact. If biology consistently provides better tools than music, that is measurable and the musician shouldn't get bent out of shape when biologists default to their tools. Nobody is shocked when genetic music (http://whozoo.org/mac/Music/Sources.htm) fails to set the world on fire. Nor should the musicians complain when the neuroscientists start to explore why humans enjoy music -- even if their theories go against music theory -- since the assertions will be testable, falsifiable.

Yes, the world is a complicated place, but throwing up our hands and saying "well crap, that's unknowable" has been demonstrated time and time again to be wrong. The most empowering, exciting, and spiritual view of the human condition is that we're good at figuring stuff out! This isn't saying that all human effort should go towards explaining the real world or that we'll ever know everything, but the trend line is towards greater understanding, despite the FUD and dogma that so often oppose this trend.

So, I hope that while we can celebrate all forms of knowledge creation and exploration, we should be comfortable accepting that different tools can be more and less effective in different domains and that domains differ in what we mean by "effective." More specifically, certain domains require approaches be tested against each other in order for progress to be made. If an approach or field moves into a domain where this requirement exists, it is foolish to claim "No, no, don't judge the efficacy using the tools of science. I'm special and get to play by different rules."

39.

Thomas> Darwin saw no inherent progress in evolution, no crucible of competition which necessarily yields the better.

Sure, but differential survival of replicators will result in replicators that are well adapted to replicate within their environment (yes, where "environment" includes other replicators, the changes the replicators themselves make to the environment, the mechanisms the replicator uses for replication, etc etc etc). There is no need to invoke a value-laden term like better.

If you are trying to understand billiards, you could write out all the equations, spend a lot of time in pool halls playing, or watch "The Color of Money." Depending on what you were trying to understand, you could define better in a variety of ways and any of those approaches would be best.

That does not mean that watching "The Color of Money" is the best approach to building a physical simulation of a game of billiards, nor that tensor analysis would help you decide when your opponent is going to throw a game because of a bet. But, in both cases, you could pick an approach that led to greater understanding of the domain you were curious about. In both cases, certain approaches get to be smugly superior to others, which is ok, because none are superior in all.

And, you could devise ways to test whether they really were superior or not. Hell, you could even devise a way to test whether my assertion about their superiority is correct, which is very cool.

40.

Here's the deal.

Chip and I didn't start this thread here today - we started it 15 years ago. We didn't ask to resurrect it here or now. The original comments were not aimed at anyone who participates here (that I am aware of.)

My point in including the original discussion thread was to set context and to show *where* the collision started, not to suggest the current state of affairs.

2Cyberconf took a turn into the 'less-than useful' for us. It introduced a two-worlds collision to us that, apparently from this thread, has still not sorted out much, which I agree is a shame.

[I did see a glimpse of hope for the future of this dialog in the form of a certain Ludium attendee, who was attempting to live in "both worlds" and was being somewhat rejected by the humanities side as a result...]

But, I will not accept a summary that says that we didn't at least try to deal with this rift. If you read my original position statement carefully, you will see that the contrary is true - I called for dialog and did get a bit of it in the form of the thread that I archived. Unfortunately, the xCyberconf community, as it existed at the time, did not have the will and/or skills to bridge the gap. So it goes.

So - I'd like to turn this positive and make one of the same calls I made back then: Could someone recommend something to read from the deconstructionist circles that 1) would help with understanding how to build online communities and/or virtual worlds and 2) would be readable by someone outside the discipline?

I have been able to find great texts in sociology, economics, cultural anthropology, architecture, and media, etc. that meet these requirements and that have had significant effects on my virtual communities/worlds. I look forward to reading (and recommending) more.

Randy

41.

Exactly, Cory -- no argument here (I wasn't the one implying that a Darwinian process meant the production of something better, or more true).

And, as the grandson of one of the best pool-players in the depression-era Ohio River valley, I love the pool example. :-)

42.

Cory, thanks for trying to put out a fire with gasoline. Strange that you never mention reading and interpreting these old books as a useful tool. ;-)

Cory> More specifically, certain domains require approaches be tested against each other in order for progress to be made. If an approach or field moves into a domain where this requirement exists, it is foolish to claim "No, no, don't judge the efficacy using the tools of science. I'm special and get to play by different rules."

Correspondingly, I suppose you are saying that if certain domains do not require approaches to be tested against each other -- or if certain domains make the notion of "progress" or "verification" inherently problematic -- it is, in those cases, foolish to rely solely on the tools of science?

Cory> "I'm special and get to play by different rules."

This is actually Richard's argument for virtual worlds, I think. :-)

43.

@Randy: Well, finding a way to articulate what the humanities can contribute to "knowledge work" and information technology is the explicit aim of Alan Liu in The Laws of Cool, but I haven't had a chance to crack it. A brief glimpse suggests that it's not exactly written for a wide audience, I'm afraid (but it's so on point, or at least it's trying to be, that I thought I should mention it).

44.

@Greg: Here and there in Cory's post he acknowledges that "testing" may not apply as the standard for verification for all knowledge claims (which is a damn good thing, since not everything can be replicated; take most of evolutionary biology's purview, for one, or parts of geology for another, and that's just to consider a couple of sciences). I chalked this up to different languages for talking about this stuff, which to a certain degree we must be flexible about if we're ever going to move forward. The pool example is on point, although I note that we have little reason to believe in the universality not only of testing but also of measurement (as suggested at the end). In the end, it becomes, again, reliability or usefulness (not in a utilitarian way!) which lie at the heart of what we believe to be true (not to be a broken record or anything).

45.

(A few comments slipped in while I was writing this)

Quite the straw man being erected and bashed over Chip's comments on Darwin. He said:

I'll just add that the core of my beef with the academic humanities is not that they necessarily produce a higher proportion of foolishness than any other activity (they may, but that's beside the point). It's that they've been far too successful at establishing institutional arrangements to insulate themselves from the darwinian workings of reality. Consequently what value they produce is buried much deeper in crap than it is in my world. [Emphasis added]

This was not a comment about Darwin himself or his outlook (postmodern or poststructuralist or anything else). Chip's point, I believe, is that the intellectual insularity provided by the impenetrable jargon of what postmodernism has become has kept at bay the acid tests of good ideas.

When you can say anything you like with tortuous language, whether it makes sense or not (and I do not agree with Andy Havens that this is only a problem at some imagined extreme ends of academia), then your ideas go untested. There are no consequences, no meaningful exploration or evolution of significant ideas. Authority and group-think rule the day, fed by an intellectual swamp of ever more ambiguous and gordian language. No one wants to 'fess up that they don't understand a word of it or point out that it makes no sense, that the postmodernist Emperor has no pedagogical clothes (no one except maybe the odd physics professor who also becomes fed up with this sort of intellectual dissembling).

On the other side of the coin we have the narrow-minded linear-thinking developers who remain willfully ignorant, from the POV of at least some academics (as evidenced here), of everything not immediately applicable to their next project. To these philistines, considerations of the manifold intellectual underpinnings of virtual worlds in how they are designed and experienced are not only irrelevant, they're nonsensical.

There's a lot of crap on this side too, just different in kind, not amount, from what Chip noted in academia. Here the problem is that the darwinian forces are too active: developers don't have time to protect seedling ideas that might grow into something useful, much less coddle them in swaths of logorrheic academic papers. Good ideas are mowed under with the bad; developers don't wax poetic about the essential paradigms of cyberspace because they're too busy trying to get something, anything, actually done.

Terra Nova is a rare place where these two worlds can sometimes come together. When they reinforce each other it works wonderfully; both sides learn from the other. When they interfere -- when ungrounded postmodernist thought meets with disdain for anything not immediately and technically useful -- then we see the gap between academia and industry at its widest.

Myself, I think Chip's original essay should be required reading by academics and developers alike (and not just because it pays homage to Danny Kaye, though recognizing that has its own value). Personally I have as little use for postmodernist "sound and fury, signifying nothing" as I do for excessively dogmatic technical views of what virtual worlds are and will thus always be. Neither view is as uncommon as we might like to think. But there has got to be a way of synthesizing academic and technical views without giving in to the blindspots of either.

46.

Thomas, do you do anything that doesn't involve gambling?!

Also, as someone trained in science and engineering who spent a lot of years in the "science == truth" camp, I would like to thank Jim Gee for writing books (and Constance for giving me the reading list) that helped me to change my thinking on the topic. I am pretty sure that it is possible to create accurate models of what happens in the world and I often still default to the position that the physical modeling of the game of pool is the "true" nature of the game. But as I tried to explain, there are other valid ways to get at the nature of pool that might have nothing to do with the physics.

But -- and this underscores Randy's comment -- I didn't change my mental model by being told that I just didn't understand my Derrida or Lacan. Instead, Jim writes incredibly clear prose that covered the terrain in a way that I found completely approachable. Where's the equivalent book for Randy's deconstructionist list?

47.

Randy,

OK, truce. I haven't read it yet, but I think Ian Bogost has a book that might fit the bill. Ian learned from Derrida, who coined the term you're interested in. Alex Galloway's book Gaming also has some discussion of Derrida in the context of play. No promises that they will help you design, but I'm sure they have some interesting ideas.

Btw, it sounds like that '91 cyberconf was not just a clash between comp sci types and lit crit (broadly) types, it sounds like it was a clash between presentation cultures -- slides vs. MLA-style "paper readings." Getting anything meaningful from the latter is really an acquired skill (that I have not yet acquired). Also, I'll concede to having seen a fair share of pointlessly obscurantist jargon in lit crit circles. As an undergrad, I studied Shakespeare with Harold Bloom and lit crit with Paul Fry back in the late 80's. It was very dense stuff, but I loved it, and it taught me ways of thinking that I use every day.

But... I'll never forget a grad student replying to a point made by another grad student by saying "Let me clarify that my prior statement was not made in contradistinction!" Apparently, what this promising scholar of the English language meant to say was "I agree."

48.

@Mike: Of course there is; we agree, I think. Though I would ask for a bit of consideration of the fact that the language of academics is also a technical language, one that uses specific concepts with hard-won nuance to convey complex and often counter-intuitive ideas. Of course there's a lot of bad, obfuscatory writing out there in academia, but I find the reflex of assuming any important idea must be possible to put in plain speech (especially on the fly, in not-to-be revised posts to a blog) to be a marker of arrogance, not sharp criticism. Cantankerous and not intellectually humble physics professors can be cases in point, not sages of good writing. Dismissing ideas in this way is rarely done for anything but strategic purposes. Besides, the best of that kind of writing is usually honed over years of presentations, papers, and research, as Jim Gee's writing shows (perfect example, Cory); to expect the juggling of these new thoughts here to be perfect prose denies the advantage of the medium -- the back and forth that we're having here.

@Cory: I'm still thinking about the right text; but since I fall more on the social practice rather than representation/discourse side, I'm not sure I'll think of it, to be honest (and it may not have been written yet). On gaming specifically, Bogost's and Galloway's, which I've looked at (I like the Bogost in particular), are certainly worth a try, Randy. The Galloway is more in the Hardt/Negri decentralized control school, which isn't really deconstruction; it's more poststructuralist (to the extent these are useful terms at all; I'd just rather not go on for five paragraphs).

49.

@Mike: One other thing, btw. I don't think I was misreading Chip, although I would be happy to hear otherwise. It's pretty clear to me that the suggestion is that the competition of the free market winnows away a lot of crap, leaving a higher proportion of better, valuable ideas, whereas humanities departments, insulated from the market (this is ridiculous in its own way; there is an economy there, as everywhere, it's just not always about market capital), produce a lower proportion of valid ideas to crap. It was this suggestion that merited, imho, the response about Darwin.

51.

Thomas> Here and there in Cory's post he acknowledges that "testing" may not apply as the standard for verification for all knowledge claims

We need to be really careful about what we mean by "knowledge claims." I don't see a lot of reason to test a sonnet -- although you could certainly do a study to see which sonnet is best to read prior to sleep by middle-aged game developers -- unless someone is making a claim about the sonnet. If you tell me that reading Shakespeare's Sonnet 18 (always a personal favorite) twice a day will cause me to lose weight, that is a testable assertion. Absent testing, I'm not sure how useful the statement is since it's just a claim. In my pool example, your goal would allow us to test the different approaches against each other, even with fuzzy concepts like "I'd like to enjoy talking about pool with my grandfather more", right? Perhaps you could give me an example of a knowledge claim that isn't testable but that is still an assertion?

Greg> it is, in those case, foolish to rely solely on the tools of science?

What tools exist outside of science, exactly? I tried to make this point with the art and stem cell example, but maybe we could use another. To return to old books, there are people who claim that old books provide a useful guide to morals. Now, this seems like a testable claim. Do people who read old books act in a more moral way? Do they live longer? Shorter? Are they happier? This is science and it is not -- in any way -- a value judgement of old books. I'd actually expect the folks who like old books to really be pushing for these studies since that would be an incredibly interesting result, don't you think? Merely making the claim without testing, again, makes no sense to me.

I strongly feel that giving up -- this plea of "we don't understand it now, so we'll never understand it" -- is a horrible position to take. I'm all about the fact that there is all kinds of stuff we don't know, but I take this as inspiration for us to learn more.

Greg> Strange that you never mention reading and interpreting these old books as a useful tool

Sorry, I thought that I did. But see previous comment. Why not measure the impact of reading and interpreting them? After all, we have these great cohorts that do and don't. How many in both groups are in jail? Married? Tall? President? In all of these cases, we have lots of tools -- both quantitative and qualitative -- to analyze the distinction. Merely saying "this is an old book and reading it is important" doesn't generate a lot of knowledge. Being able to say "people who read this book are happier" might be the start of a very interesting discussion -- and would probably spin off more interesting research, right?

Thomas> "take most of evolutionary biology's purview"

Be very careful with this statement. Despite what gets constantly spewed in the US (but not in the rest of the world), evolutionary biology and the modern Darwinian synthesis have proven to have remarkable predictive prowess (http://www.livescience.com/animalworld/060713_darwin_finch.html, http://www.botany.org/newsite/announcements/evolution.php).

52.

Cory wrote:

Perhaps you could give me an example of a knowledge claim that isn't testable but that is still an assertion?

I may be misunderstanding you, but here are a few, off the top of my head (importantly, while not testable, some have come to be seen as wrong):

-The protestant ethic played a central role in the explosion of capitalism. (Max Weber)

-The modern prison is not a result of explicit debates that were very much in evidence in the 18th and 19th centuries, but instead the product of a set of reliable techniques for disciplining human bodies, with roots in monastic orders and Roman military training. (Michel Foucault)

-The shift from hunter/gatherer society to pastoralism was one of necessity, not innovation. (many writers)

-The works of J. R. R. Tolkien are best understood as a meditation on evil in the wake of his WWI experiences. (Tom Shippey)

-Changes in Ivy League admissions practices in the early 20th century, broadening criteria for entry beyond narrow academic achievement, were in large part an attempt to stem the rise in Jewish admittees. (Jerome Karabel)

-The rise of sugar as a crop of colonialism was only possible because of the extent to which it served as a cheap source of calories for an emerging labor class. (Sidney Mintz)

-Reciprocal violence in Pakistan is not intrinsic to their "culture," but rather the product of a number of factors, including developing infrastructure, increased exposure to fundamentalist versions of Islam, and the legacy of Nawab rule. (Lincoln Keiser)

Mostly, these are historical and anthropological claims, the ones with which I am most familiar, but one could add examples from geology, biology (see Gould's writings), and more from literature, certainly. (By the way, I'm a huge fan of Darwin, if that wasn't immediately obvious, and prediction is not the same as testing, but that would be another conversation.) The point is, much of even science is not about testing, replication, and prediction. Those that deal with the unrecoverable past (and present), as well as the play of meaning in works of art and elsewhere, must necessarily reason differently, and they do. (Now, if only Tim were to show up right about now; he'd have dozens of examples, and put it all much better too, I'm sure.)

53.

Erk. Instead of "reason differently" that should probably be "support their claims differently". Oh, for an edit button...

54.

Despite what gets constantly spewed in the US (but not in the rest of the world) evolutionary biology and the modern Darwinian synthesis have proven to have remarkable predictive prowess

And so? This hardly explains everything there is to know or understand or intuit about the human mind, soul, heart, and their interactions in the universe, the multiverse, the metaverse.

Cory, do you imagine your world of Second Life which you helped create merely to be a platform someday to attach neural networks, and not a world?

I find this extraordinarily literalist logical positivism to be profoundly troubling. Zealously clinging to such scientific belief systems can be as fanatical as extreme religious belief.

You could never make your way in the world if you couldn't assert the kinds of anthropological and social claims that Thomas has indicated here.

The hallmark of an open society as Karl Popper explained it was the right to mount an unsupportable thesis. If I'm always to be chased around and asked whether a statement is a testable claim where its opposite can be posited or not posited, I will never be able to imagine.

55.

OK, so let's look at this:

-The rise of sugar as a crop of colonialism was only possible because of the extent to which it served as a cheap source of calories for an emerging labor class. (Sidney Mintz)

First off, this may still be falsifiable. After all, we can look at human caloric intake while working, estimated calories in sugar cane at that time, and alternate caloric options in the area. This is completely normal -- after all, you don't need to redo all experiments back to Galileo to work on quantum mechanics. But, if good data didn't exist on how many calories you burn harvesting sugar versus caloric density, we could devise ways to get those numbers. If they worked, we'd be more trusting of the assertion. The statement itself is supported (or not) by a scaffold of additional information. At some point, that scaffold is either really a scaffold (caloric numbers add up, no earlier examples of expanding crops into new areas that worked, etc.), it's falsified (sugar alone can't support the labor to harvest it), or nobody knows anything about sugar at the time of colonization so we can't say anything about the statement.
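The caloric argument above amounts to a back-of-envelope comparison. Here is a toy sketch of that comparison; every figure is an invented placeholder, not historical data, and the variable names are my own:

```python
# Hypothetical back-of-envelope check of the falsifiability argument.
# All numbers are invented placeholders for illustration only.
CALORIES_PER_KG_SUGAR = 3870     # placeholder caloric density of raw sugar
KG_HARVESTED_PER_DAY = 50        # placeholder yield per laborer per day
CALORIES_BURNED_PER_DAY = 4000   # placeholder energy cost of the labor

calories_produced = CALORIES_PER_KG_SUGAR * KG_HARVESTED_PER_DAY
surplus = calories_produced - CALORIES_BURNED_PER_DAY

# A negative surplus on real numbers would mean "sugar alone can't support
# the labor to harvest it" -- i.e., the claim would be falsified.
print(surplus)
```

The point is only that the claim has a shape that real data could, in principle, confirm or refute.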

In addition, consider what happens when a competing assertion comes along:

-The rise of sugar as a crop of colonialism was only possible because of the extent to which it generated sufficient profits to enable land owners to purchase food from local naugahide trappers. (Foobar T. McTivish)

How we choose to compare and contrast these two assertions still comes down to reasoning and science, even when we can't be certain, right? If we have no ability to weigh the relative merits of the two assertions, we really should let them both flow forward in as neutral a way as possible, so that as more data becomes available, later generations can make a more informed choice.

Human knowledge is able to build on itself really, really quickly -- in fact, many of the folks on this blog are part of the wave that is changing just how quickly that knowledge can build. If we don't know what the confidence is in our knowledge -- either through experimental data or supporting evidence -- how do we know whether what we're building is going to hold together?

56.

Prok> This hardly explains everything there is to know or understand or intuit about the human mind, soul, heart, and their interactions in the universe, the multiverse, the metaverse.

Interesting way of putting it. So, do you disagree that evolution is sufficient to explain the mechanisms of our physiology, including the brain? If so, what exactly is the special sauce evolution is missing?

However, explaining the source of the complexity is completely different from talking about how to study the results. As I tried to make clear with the billiards example, even if the interplay of billiard balls is governed by physics, it is very possible that you will learn more about the game by operating at a very different level -- including watching a fictional movie about the game.

As for the heart, again I am happy to both understand that it pumps blood and has limited independent impact on cognitive function as well as talking about the heart as a metaphor for relationships. Hell, I referenced the "Shall I compare thee to a summer's day?" sonet.

As for soul, all evidence points to a purely metaphorical concept. Useful in communication and writing? Sure. Explanatory power for how humans interact? About as much as the heart as metaphor. Building social systems in Second Life based on the medieval concept of a soul would make about as much sense as building the server infrastructure on an Aristotelean concept of 4 elements. It would be silly. Of course, it would be equally silly to build based purely on Darwinian evolution.

Once again, we're at my original point: decide what your metrics are and pick the right tool kit for those metrics! Why do you think we've had Thomas in our building helping us change how Linden Lab grows as a company?

57.

Ola Fosheim Grøstad>You arrive in an English Tearoom and I think you then entered the world by.. ehm... Sipping Tea! It didn't work that way, when I tested it yesterday, so either my memory is wrong or perhaps it has been changed.

You can't leave the Elizabethan Tearoom with 0 experience points. Sipping the tea gets you 1 point, which means you can then leave. On subsequent occasions, you usually have 1 or more points, so you don't need to sip the tea.

Richard

PS: If you SIP DARJEELING rather than SIP TEA, you get 200 points.
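For what it's worth, the tearoom rule Richard describes could be sketched like this; a toy illustration only, not actual MUD source, with function names of my own invention:

```python
# Toy sketch of the Elizabethan Tearoom rule as described above --
# not actual MUD code.
def sip(drink: str, points: int) -> int:
    """Sipping earns points; the DARJEELING variant is worth far more."""
    return points + (200 if drink.upper() == "DARJEELING" else 1)

def can_leave_tearoom(points: int) -> bool:
    """A character with 0 experience points cannot leave until they sip."""
    return points > 0
```

On subsequent visits a character usually already has 1 or more points, so the sip is unnecessary, exactly as Richard notes.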

58.

May I humbly suggest Peter Drucker: Shaping the Managerial Mind by John E. Flaherty as a worthy book to read and consider?

It’s a good summary of Drucker’s lifelong pursuit of understanding corporations as the new form of social institution and managers as its new leaders. The first section talks about his formative years and the establishment of his fundamental thoughts through his three early books: The End of Economic Man, The Future of Industrial Man, and Concept of the Corporation. The second section focuses on his assertions, dogmas, and practical advice on managing the entrepreneurial enterprise, ever focused on results rather than “profits” in the three time dimensions of now, present future, and future. The last section focuses on effectiveness in a pragmatic rather than prescriptive manner.

Mr. Flaherty is well suited to write about Peter Drucker, given their long friendship and academic (perhaps postmodern, deconstruction) perspective.

Key takeaways are:
1. Drucker tackles this subject neither from the corporate nor from the academic perspective, but from an integrative perspective based on his framework of systems theory and social ecology. Flaherty states “systems theory was an attempt to create a mental model of unity out of diversity, to view institutions from a holistic rather than a fragmented perspective, and to perceive organizational reality in terms of dynamic disequilibrium, whether the unit under consideration was a tribe, nation state, city, region, family or given.” In Drucker’s multidisciplinary approach, the social ecology perspective was used to improve understanding.

2. Whereas Drucker studied corporations as the new social institution of the 20th century, many are studying virtual worlds as the new social institution of the 21st century. His lauded approach and insights as proof of results give weight to the idea that his approach can be applied to the study of virtual worlds.

3. Drucker is as likely to offer dogma as he is to offer superb analysis and rational conclusions. It’s all about the results. In the case of uncertainty, it's about managing the process of dealing with uncertainty in a way that produces results. So in Cory's sugar case, Drucker probably wouldn't care whether either of the two assertions (knowledge claims) is provable; he would only care about which one produces results.

Frank

59.

greglas>What I'm getting from Chip and Randy (and to a lesser extent, Richard), though, is hardly the endorsement of an interdisciplinary eclectic or amateur approach to learning.

I've looked at many, many disciplines - it's hard not to when you want to create worlds - and I'll scavenge anything I can of use from them. If I were somehow reluctant to engage in an interdisciplinary approach, I wouldn't keep harping on about the Hero's Journey, for example. The reason I don't have a great deal of use for postmodernism is because of what postmodernism offers, not because I have some kind of academic insularity.

>Instead, it's pretty much the smug pillorying of the "word salad" of "postmodernists" and "deconstructionists" (in Chip's case, the whole of the "academic humanities") by those who consider themselves more adept at "the darwinian workings of reality."

And here was me thinking it was the postmodernists who were smug. After all, they follow an untouchable theory of theories which, when you attack it on any theoretical grounds, can absorb the attack as an example of the theory in action. The only grounds you can attack it on are practical grounds, those being that the theory has no actual use. It's a self-referential edifice that can explain but can't predict.

Andy Havens>Postmodernism and deconstruction can be taken too far, can be made to be ridiculous, can be faked, can be absurd and are, at the extreme, goofy.

The problem is that once you apply the everything-can-be-deconstructed theory, there's no way of stopping it. Ultimately, it renders everything meaningless, which, while perhaps a true objective reflection of reality, isn't a great deal of use to people (who are necessarily subjective at their core). So you wind up looking at something objectively and deconstructing it through layer after layer until eventually your subjectivity kicks in. The theory, though, just keeps rolling ever onwards, leading to the kind of goofiness that you identify.

Another way of looking at things is to start with the subjective and build layer upon layer of objectivity on that until you match what it is you were trying to understand. This has a natural end point, so you don't get caught in an indefinite loop applying the theory until there's nothing left to apply it to but itself. I don't know what the philosophical term for this is (constructivism?) but I do know what the mathematical term is: combinatory logic.

I've no problems with the basic premise that if you want to know how something works, you take it apart and have a look. What frustrates me with post-modernism is that while you're taking something apart, you have to take apart your process of taking it apart, and so on indefinitely, until there's nothing left, and then when you finally get back to taking your original thing apart you do exactly what you were doing anyway. In practical terms, it looks vacuous.

Richard

60.

Cory> "I'm special and get to play by different rules."

greglas>This is actually Richard's argument for virtual worlds, I think. :-)

What do you mean, Greg?

Richard

61.

>Interesting way of putting it. So, do you disagree that evolution is sufficient to explain the mechanisms of our physiology, including the brain? If so, what exactly is the special sauce evolution is missing?

Sure, what thinking person wouldn't? Evolution can't explain everything; you don't have to be some fundamentalist nutter or some zealous intelligent design freak to be curious about the obvious symmetry in the universe and to know a simple thing about it: you didn't create it, and you don't know where it comes from.

"That than which no greater can be conceived." It comes down to whether you believe in God or not; I take it you don't. So it will always be a rancorous sort of conversation to have. You'll dismiss me as some backward opiated benighted creature, and I'll score you as dependent solely on the clearly limited and selfish creature of man. But there it is -- it comes down to whether you believe there is something higher than yourself and others or not.

As for that special sauce, if you have no apparatus within you to taste the special sauce not only of the universe but the metaverse, Cory, I don't know what dish to serve you to put it on. Probably nothing will work.

>However, explaing the source of the complexity is completely different from talking about how to study the results. As I tried to make clear with the billiards example, even if the interplay of billiard balls is governed by physics, it is very possible that you will learn more about the game by operating at a very different level -- including watching a fictional movie about the game.

You don't attempt to explain the source of the complexity; there is such a thing as awe.

>As for the heart, again I am happy to both understand that it pumps blood and has limited independent impact on cognitive function as well as talking about the heart as a metaphor for relationships. Hell, I referenced the "Shall I compare thee to a summer's day?" sonet.

Sonnet. And that's a plusrate for you, but if you first discuss and experience the heart as some pumping organ, again, I don't know whether to start the conversation. A sonnet is merely an artifice you paste on your scientific notion just to prettify it, and not a plane of understanding of its own.

>As for soul, all evidence points to a purely metaphorical concept. Useful in communication and writing? Sure. Explanatory power for how humans interact? About as much as the heart as metaphor. Building social systems in Second Life based on the medieval concept of a soul would make about as much sense as building the server infrastructure on an Aristotelean concept of 4 elements. It would be silly. Of course, it would be equally silly to build based purely on Darwinian evolution.

All evidence? Whose evidence? Your evidence? The evidence of limited and literalist scientific minds? But who says they get to decide what a soul is lol? Thousands of years have been devoted to this question in all the great religions and volumes have been written and libraries filled and you're going to come along and suddenly declare the soul a metaphor? Because...why? Because you're a game-god controlling the imaginations and dreams of millions?

>Once again, we're at my original point: decide what your metrics are and pick the right tool kit for those metrics! Why do you think we've had Thomas in our building helping us change how Linden Lab grows as a company?

? I have no idea what that's all about because I can't fathom the work that Thomas might do for LL "in the building".

I can only suggest that it might become more relevant if it were "in the world" for a significant portion of time every day.

62.

Frank, weren't all those same thoughts posited in the 30s and 40s and 50s in various books and in the phenomenon of the company town?

63.

Richard said: "I've no problems with the basic premise that if you want to know how something works, you take it apart and have a look. What frustrates me with post-modernism is that while you're taking something apart, you have to take apart your process of taking it apart, and so on indefinitely, until there's nothing left, and then when you finally get back to taking your original thing apart you do exactly what you were doing anyway. In practical terms, it looks vacuous."

Indeed.

What frustrates Creationists is the idea that you can look at a single-celled organism, or a bat, a cat or a ring-tailed lemur and take it apart, on a genetic level, and work backwards through some insane, inhuman amount of time (billions of years? c'mon... people have only been playing football for 120...), finally merging into a point where "we was all together" in a primordial soup. What frustrates them further is the idea that a universe that has cars and ferns and fish that swim in schools and 12-part harmony and pin-ball machines and submarines and cable TV came out of one giant explosion (that still can't be explained) another couple billion years before that.

What's frustrating is that the matrix is complicated and we're inside it.

What's beautiful is that the matrix is complicated and we're inside it.

I can take a hammer and build a good house. I can take a hammer and build a bad house. I can take a hammer and kill your goat. I can take a hammer and claim it is a hook and hold it in my hand and pretend to be a pirate and have some fun chasing my 7-year-old around the front yard yelling, "Avast!"

What is that thing? It is a hammer. Right. What does it do? It depends entirely on my process.

Before postmodernism, *THE THING* was all important. We go to war because we are *THE FRENCH* or *THE BRITISH* or *CHRISTIAN.* To protect *TRUTH* or *COMMERCE* or *HONOR.* All very Germanic, with lots of capitalized nouns. Art was only good if it was of a particular style; literature needed to be correct. There was a bit to do with imperialism and colonialism in the mix, too.

So many things changed so quickly, and so badly, during the early part of the 20th century, that the postmodernists questioned the very assertion that *A THING* can truly be understood to even be "a thing" without understanding the process in which it is brought to bear.

Is *A NATION* a meaningful concept, when the rulers are willing to cede large (really, really large) chunks of its population into a meat-grinder, and give up pieces of important historical real-estate in order to fuel their own economic and familial interests? Is *MONEY* tangible, when "money in the bank" goes away because of shaky foreign investments that go south, drying up 1/3 of the banks in your country? Is *NEWS* real, when the same battle can be reported three different ways by three different papers in the same country? Is even *SCIENCE* reliable, when it begins to come out with theories that have names like "the uncertainty principle" and question assertions about the steadfastness of time and space? Are *THOUGHTS* my own, when my sub-conscious can be said to "think" things that I don't even know about?

I'm not saying, myself, these things are all completely accurate. But this was the state-of-mind for many thinkers at the time. A world that had been "modern," and reliable... now, to a degree, shown to be somewhat mad.

What do you use as a theory of reality when all previous theories have been shown to lead to chaos, war, depression, mayhem and despair? When previous theorists have been shown to be masters of either deception, chicanery or foolishness?

You go with a theory that says, "We must know your bias. We must understand, at all times, context before content. The quality of the mirror is more important than what you are holding up to it. Because you will say 'Socialism' today, and 'Democracy' tomorrow then 'Capitalism' then 'Newton' then 'Einstein' then 'Heisenberg' then 'RTS' then 'RPGs' then 'MMOs' then 'VWs' and then and then and then... But we all have been fooled before. So we will work on the mirror while you work on whatever it is you want to hold up.

"Because maybe it is not so bad a thing for you to be frustrated."

Again... there is, for sure, a limit to the usefulness of the mirror. And when you just hold it up to itself, well... as the song says, "Masturbation can be fun." But it don't often pay the rent.

The basic theory, though, is, IMHO, quite sound: you need to examine context, bias and not just the "what" of your tools, but the "why" and "how" of your process.

Otherwise, you don't know if I'm a murderer, chasing a kid around with a blunt object, intending to cause severe cranial trauma... or a scurvy pirate's mate... taking orders from my 7-year-old captain. Aaargh!

64.

MM... But Heidegger was a Nazi Party member. Ding an sich or not.

65.

Cory> "I'm special and get to play by different rules."

greglas>This is actually Richard's argument for virtual worlds, I think. :-)

Richard> What do you mean, Greg?

Not a joke, just an observation -- see:

Richard> With MUD, I knew that people might break the unwritten rules that protected its virtual world from the real one. Some indeed did so. Individually, they were usually easy to deal with; I would speak to them and explain the problem: it was unfair to the other players if they behaved however they were behaving, and please would they
stop. Most understood and obliged. Those that did not were reminded that I had my finger on the off switch for their character and that I could therefore obliterate them entirely if I so chose. Some, very few, I did obliterate entirely.

Which is fine -- really. I was just pointing out that Cory, in describing the putative fallacy of different rule sets for some disciplines, was criticizing a statement that sounded a heck of a lot like your argument for magic circles.

And Richard, I've seen Chapter 6 -- I know your reading on VWs is as admirably interdisciplinary as one can get. And you're not alone in saying that exclusive reliance on anti-foundationalist interpretive methods would seem to lead inevitably to relativism, nihilism, or existentialism. Which is why these methods need to be understood as interpretive tools, not ends in themselves. (Reminds me of a teacher of mine who was fond of saying that even though existentialism might have been correct on the merits, it was impossible for him as a lifestyle.)

Cory -- I'm not opposed to using any empirical tools w/r/t the old books. Sorry if I gave that impression. I'm all for it, and some people (as I'm sure you know) are doing it. What I'm concerned by is exclusive reliance on the tools, or the faith that we'd all be better off by focusing on proving and disproving our way to utopia. The hammer of empiricism doesn't turn the world into one big nail. You say you get this about your sonnet, but then you start up with the talk about nails again. Give the sonnet a chance to speak to you. :-)

66.

Prok> You don't attempt to explain the source of the complexity; there is such a thing as awe.

We're going to have to agree to disagree here. There is a nearly infinite list of knowledge that started with fire and is currently passing through the human genome that demonstrates what happens when we *do* attempt to explain the source of complexity. If you are going to take a moment for awe, why not focus on the amazing capability of groups of people to solve apparently insurmountable problems. That is something worth appreciating. Of celebrating.

And of investigating and understanding how we keep doing it.

Greg> Give the sonnet a chance to speak to you. :-)

Of course the Sonnet speaks to me, but that doesn't stop me from wanting to know why and how it speaks to me. Understanding those aspects -- for me -- would make the Sonnet's voice all the more beautiful and interesting. A flower isn't less beautiful because it is in an arms race to create the most attractive ultraviolet beacon for pollinating insects.

Greg> What I'm concerned by is exclusive reliance on the tools, or the faith that we'd all be better off by focusing on proving and disproving our way to utopia. The hammer of empiricism doesn't turn the world into one big nail.

When someone says "don't use empiricism, it reduces your understanding" I'm baffled. We have Thomas' list of knowledge that is difficult to test, but even in all of those cases we can explore how and why we have chosen to build or not-build upon those pieces of knowledge.

One can choose to agree with a piece of knowledge for one of several reasons, which break down into:
1) Empirical evidence in support of the assertion
2) Agreeing because a large group of people agree with it
3) Agreeing because one person you trust agrees with it
4) Agreeing with it because you just want to

Throughout history we have uncountable examples of numbers 2, 3, and 4 proving to be less effective reasons to agree with something. This doesn't mean that, because current evidence supports something, we won't later find more or different evidence to change our position -- this is perhaps the most important component of our process of building on each other's knowledge!

Ironically, even if one were to rely on the most "non-empirical" choice and go with option 2, 3 or 4 for determining whether you agree with an assertion or not, one would still be leveraging tools to make the choice. Whether it's your own observational data ("wearing this magnet bracelet makes my joints feel better"), allowing someone else to determine your opinion ("President Bush says that activist judges are destroying the family"), or deciding to trust what most people think ("They wouldn't let us use those voting machines if they weren't tamper-proof"), you are using data you collect about the world to make decisions.

Given that, you can either choose to understand why people make those choices or not. Again, I'm not saying that mathematics or physics is the right tool for every question. I have specifically said the opposite several times. But I do see it as worthwhile to understand why we reduce dissonance, are so likely to believe figures in authority, or have such a difficult time breaking free of the majority opinion. These aren't simple questions -- and we certainly don't have all the answers -- but exploring them makes the world -- and our place in it -- more amazing and interesting, not less so.

67.

How this got turned into a creationism vs. evolution contrast escapes me. It was Darwin (among others) who got us to realize that the world is irreducibly complex, and that our theories for understanding it must turn to the identification of processes (adaptation, exaptation, proaptation [iirc]), rather than taxonomies of things (Andy's nouns).

Again, this does not mean that we can't say anything. Think of testing and "metrics" as just part of our toolbox for verifying claims. A humanities approach to literary criticism ("postmodern" or not!) relies on the privileged position of a critical expert, who, say, reads all of an author's works, examines his or her life and times, etc, and arrives at a series of claims about how best to understand a specific novel or similar. This is a knowledge claim. It is not testable. There is no standardized metric for knowing which critic's account is "right." But they are compared, and one account may win the day.

A social science approach (say, anthropology or history) relies on the evidence of exploratory research, gathering thick evidence from the flow of social processes to build an account of how best to understand a particular phenomenon. Evidence might include: archival research and claims (as Tim discussed in "Datamining the Forums"), surveys, interviews, observant participation, media study, etc. Again, this is not "testable" in the original sense of replicating conditions.

Often, one hears the idea that you could test these claims about people by, say, asking them questions (are you happier because you read old books?). The problem is that people are interactive, and there is an enormous gulf between what they say and what they do. So, perhaps we could observe their behavior and see if those who read old books act more morally. But, how to measure that?

The point is, those aren't the only ways to make claims. I believe Sidney Mintz's account of the history of sugar because he makes his case in a convincing way. He marshals evidence, builds his argument, and makes his claims. I don't need to try to imagine how to test it (when the conditions can never be replicated), because we have a claim about it by him that is done well and that, imho, we should believe, at least until something better comes along. So what's the problem? This isn't a choice between hard-headed hypothesis-testing empiricism (that's not the only kind of empiricism!) and intelligent design. It is merely about broadening our appreciation of the ways we have, across the academy, developed means to make reliable claims. Some of those rest on privileged critical opinion, some rely on exploratory rather than experimental verification, and some rely on hypothesis testing and precise measurement. It's a big tent.

68.

Richard: PS: If you SIP DARJEELING rather than SIP TEA, you get 200 points.

Lovely! Details like this are what make virtual worlds come alive.

Cory: One can choose to agree with a piece of knowledge for one of several reasons, which break down into:
1) Empirical evidence in support of the assertion
2) Agreeing because a large group of people agree with it
3) Agreeing because one person you trust agrees with it
4) Agreeing with it because you just want to

5) Because it is part of an axiomatic system.

What's wrong about questioning the axioms of a system, and the statements made about such axioms? What is wrong about transforming it into a different system? If that isn't wrong, then what is wrong with postmodernism, in principle?

69.

Thomas> So, perhaps we could observe their behavior and see if those who read old books act more morally. But, how to measure that?

How about crime rates versus old book sales? You could gather data in a bunch of ways. If we want to build on the supposition, gathering data and testing is one part of the toolkit, and ignoring it is just as silly as relying only on it.

Thomas> I don't need to try to imagine how to test it (when the conditions can never be replicated), because we have a claim about it by him that is done well and that, imho, we should believe, at least until something better comes along.

I think we're (mostly) loudly agreeing here, since a) you're relying on supporting evidence to the claim, and b) you are allowing that a competing claim could displace it. I bet, if we drilled down, you even know what conditions would lead you towards choosing an alternate claim, right? Among those are the reputation and experience of the person making the argument.

For example, I tend to default to agreeing with Thomas on issues around knowledge claims because I know that he is far more versed in the literature and history of the subject than I ever will be. Again, this is how we build on each other's knowledge. But, as important a tool as that default position is, I try to balance it with other tools at my disposal.

Thomas> It was Darwin (among others) who got us to realize that the world is irreducibly complex, and that our theories for understanding it must turn to the identification of processes.

Yup, agree completely! Where I think we don't agree is about whether there is a choice between "privileged critical opinion, some rely on exploratory rather than experimental verification, and some rely on hypothesis testing and precise measurement." Here, it seems to me, we can make statements about the relative merits of different approaches *in different applications*.

Again, to return to billiards, we can agree that the equations of motion are a better choice if you are building a simulation engine, right? Since we can measure the simulation against observed reality, we want to choose the option that leads us to most accurate reproduction of those observations. Similarly, we can agree that if you want to learn to play billiards, hanging out in the pool hall and playing with experts is the way to go.

It's like studying the Dutch Tulip bubble. Economics, social science, art, game theory, biology, and marketing all provide crucial insights into why the bubble happened, and attempting to explain it with only one would be foolish. But these different disciplines are better and worse at explaining different components of the bubble, and it is OK (in fact, important) to understand these relative merits.

70.

Ola, thank you for pointing that one out! I don't think I've said that there is anything wrong with questioning systems -- in fact, I've tried to be pretty strenuous in the opposite direction, since solving currently unsolved problems is all about innovation and questioning the system! Where we need to be careful, I think, is that when we move between systems we can't be careless about the impacts of that change.

71.

Cory, I agree that we're mostly agreeing. But where we're not is where you're tending to slip, now and then, into something resembling the position that only testable and empirically verifiable knowledge is worthwhile knowledge. You say the flower is beautiful, but you *really* want to talk about falsifiable hypotheses concerning the cause of the structure. That's fine, as a taste preference. But as Thomas says, that turns out to be only one little subset of real knowledge about beauty.

And sure, you could take a Shakespearean sonnet and measure the effects of reading it on happiness or crime rates. Go ahead, have fun! But at the end of the day, that is not going to tell you that much about the sonnet.

By the same token, staying within the bounds of lit crit, you could criticize poetry by focusing on meter, counting dactyl & trochee patterns to your heart's content. And it's actually fascinating to do that for a while, because good poets are adept at the craft of meter. But you can't mistake understanding the metric structure for understanding the poem -- the meter informs the poem, but it doesn't suffice to define the poem.

The unifying theme of this whole thread, I think, is the dangers of reductionism and of walling off one "discipline" from others. Randy and Chip say the lit crits walled themselves off from logical, plain-speaking engineers. Richard says the "deconstructionist" singular emphasis on fragmentation of meaning prevents rational discourse. Cory says (I'm not sure to whom) that religious dogma should not wall itself off from empirical research. And -- please note -- the postmodernists (again, too broad a term, but I'll use it) are united simply by their resistance to the totalizing "discourse" of modernism.

But all this can quite easily be flipped on the science uber alles set -- not saying anyone here fits that bill, but I have met some (and befriended some) who do. Engineers who shun the humanities as meaningless fluff often have their own mode of reductionist and totalizing discourse, characterized by its own smugness.

And, not to start a new topic, but 15 years after the 1991 conference, if you look at the budgets of major universities today and where priorities are placed, it's hard to cast the pursuit of scientific knowledge as being victimized by the agendas of smug and triumphal deconstructionists.

72.

Nate Combs: I was hoping that this discussion would have gone a different route (see the 2nd half of the post).

Probably because it didn't show up on the front-page?

Nate Combs: WoW versus Eve-Online versus Second Life is interesting but perhaps much less illuminating than WoW versus MySpace versus ?

Well, I have looked at Anarchy-Online and a friendship/dating site in parallel. It is interesting, and you may learn a lot about design from it, but it is also very difficult to boil down theories into something that is going to be convincing to others, as the users are likely to approach those systems differently. Apples and oranges make arguments painful. Or maybe I am a very difficult person to convince?

I wouldn't recommend it to anyone working alone, based on my own experiences. Maybe if you are part of an engaging virtual-worlds research group (if those exist).

73.

Thomas said, The point is, much of even science is not about testing, replication, and prediction. Those that deal with the unrecoverable past (and present), as well as the play of meaning in works of art and elsewhere, must necessarily [support their claims] differently, and they do.

Not all knowledge claims are scientific -- such as those that deal with meaning in art (or religion, but that's a different discussion). The unrecoverable past is still often open to falsifiable and predictive claims.

I don’t think it’s a stretch to say that the fundamental tenets of science are predictive falsifiable hypotheses coupled with observation and leading to theory-building. There are many knowledge-based fields that employ these techniques in some cases (I would put anthropology here, but you might differ, Thomas), and others that rarely if ever use these techniques (literary criticism for example). The further you move away from the hard-nosed observational and falsifiable areas of science, the more you move into reason and opinion – and from there it’s a short step to fashion and authority-based reasoning (I'm going to ignore the Kuhnian meta-effects of authority on science for now). Such methods can easily become separated from any external consequences; the edifices of reason and opinion become internally self-consistent and compelling, but unresponsive to external input. As Chip said, these become insulated from the Darwinian selection effects of actually having to exist in an environment consisting of something other than self-sustaining authority and opinion. Much of the criticism of postmodernism here has been from that tack, though of course this criticism goes much further back, all the way to the Aristotelian concept of the universe and how it can be known - by reason, rather than by observation. I had thought this had been sorted out a few hundred years ago, but maybe not.

You later say: A humanities approach to literary criticism ("postmodern" or not!) relies on the privileged position of a critical expert, who, say, reads all of an author's works, examines his or her life and times, etc, and arrives at a series of claims about how best to understand a specific novel or similar. This is a knowledge claim. It is not testable.

Correct. It is also opinion. Informed, considered, well-reasoned opinion, but opinion nonetheless. It is not by any means objective, falsifiable, or scientific. The process you describe above is common in the humanities; it is anathema in science.

Andy Havens: Before postmodernism, *THE THING* was all important. We go to war because we are *THE FRENCH* or *THE BRITISH* or *CHRISTIAN.* To protect *TRUTH* or *COMMERCE* or *HONOR.* All very Germanic, with lots of capitalized nouns. Art was only good if it was of a particular style; literature needed to be correct. There was a bit to do with imperialism and colonialism in the mix, too.

That strikes me as incredibly narrow-minded. Or maybe it’s just that there were lots and lots of postmodernists before anyone knew they were there. If I cite Jefferson, Shakespeare (looking into the essence of a rose or a man, not the thing itself), Spinoza, or heck, Plato and Pythagoras with their realities-within-realities, you’d either have to ignore them or shoe-horn them in, saying “well, they were postmodernists too.” At which point everything good is postmodernist, and everything bad is imperialist-European-male. This is backward reasoning of the worst kind.

Thomas: It was Darwin (among others) who got us to realize that the world is irreducibly complex

Be very careful here. The phrase ‘irreducibly complex’ is used by Intelligent Design and Creationist proponents differently from how you appear to be using it here. The assertion encompassed in this phrase is that no naturally occurring process can explain the complexity we see around us. They emphatically do not mean it to indicate that “our theories for understanding it must turn to the identification of processes rather than taxonomies of things.” Connecting Darwin to the phrase ‘irreducible complexity’ in any positive way only sows confusion – and makes you likely to be unintentionally cited by IDers as a scientific expert who agrees with them.


Trying (possibly in vain) to wrench this back to something like the original topic, I think a lot of this illustrates the difficulties in trying to discuss “the essential paradigm of cyberspace.” From some POVs that phrase is meaningless, irrelevant, ambiguous, or worthy of nothing more than eye-rolling. If we were somehow able to identify an “essential paradigm” for some thing we might define as “cyberspace” what would this get us? Or is that question from another POV equally irrelevant, missing the point, and worthy only of eye-rolling?

74.

Mike: It is not by any means objective, falsifiable, or scientific. The process you describe above is common in the humanities; it is anathema in science.

Well, what is special for the humanities is that the researcher's experience of a work is viewed as more "credible" (and why shouldn't it be?). Science is more than hypothesis testing: design research (computer science), interpretative approaches etc. The shared ideal in science is to exclude the researcher from the equation, but that isn't possible since the chosen research question and method in itself carries a message...?

75.

Ack, I can't believe that little sleep caused me to miss the "irreducibly" -- thank you Mike! I agree with how I think Thomas meant it, not the ID bozos.

I think my position can be summed up pretty simply:
1) There are myriad methods of generating knowledge and information
2) Different domains have differing methods for using this knowledge/information
3) Within a given domain, different methods are more or less useful, and which method is more or less useful changes over time
4) When it is possible to use empirical methods to test an assertion, failing to use those methods leaves potentially useful information on the table

So, yes, I do feel that when making a falsifiable assertion it is morally bankrupt not to apply appropriate methods to test it. Sort of like choosing the Horde. But seriously, if we have the opportunity to generate more knowledge about us or our universe, why wouldn't we?

76.

Mike wrote:

...makes you likely to be unintentionally cited by IDers as a scientific expert who agrees with them.

Be careful? On the contrary. The last thing I want to do is tiptoe around and avoid this, because that only plays into the hands of IDers. I guess I'm not that worried if they mistake a claim about complexity itself as necessarily entailing the next step they take, that *therefore* only a divinity could explain it. Those two opinions are worlds apart, and seeing them as necessarily following one another is the result only of incredibly poor reasoning (dogma), reasoning which is in fact very easy to point out.

Your comment about postmodernists before postmodernism reminds me of Bruno Latour's We Have Never Been Modern. That's precisely one of his points: the excesses of modernism never reigned totally across academic thought, just as the excesses of postmodernism (dogma in the other direction) do not reign totally across the humanities.

I continue, like Greg, to worry that the subtext of your and Cory's posts is that hypothesis-tested knowledge is the sine qua non of knowledge; the gold standard. I prefer to see these multiple approaches pragmatically, as ontologically on a par with one another and, as Cory correctly noted, good for different kinds of questions. But one of the excesses of positivism is precisely the barring of non-positivist approaches tout court, when in fact they might have something to offer (Kuhn is appropriate to include, as you note, Mike, and I think in particular of Stanley Milgram's "experiments"). Instead of feeling confident that we can judge ahead of time which kinds of arenas or domains should receive which kind of inquiry, I prefer to see if the proof is in the pudding. Reading them on their own terms, critically in each appropriate way (and, at times, reading critically across methodologies), can enlighten us all.

77.

Missed your post while writing, Cory; very nice to hear you put it that way.

78.

Thomas said: I continue, like Greg, to worry that the subtext of your and Cory's posts is that hypothesis-tested knowledge is the sine qua non of knowledge; the gold standard. I prefer to see these multiple approaches pragmatically, as ontologically on a par with one another and, as Cory correctly noted, good for different kinds of questions.

Actually I agree with you; there are significant areas of knowledge that are just not amenable to analysis by falsifiable hypothesis (per Cory's sonnet example earlier) -- they just aren't science and shouldn't be labeled as such. It doesn't make other means of knowing the world or ourselves lesser, just separate (and bound by different analytical limitations than found in science). In general I hold to a variant of Gould's non-overlapping magisteria not only for science and religious belief, but for other matters of opinion that involve learning, consideration, and critique (it's curious to me that Cory seems to want to apply scientific analysis to religious belief, but that's yet another different topic).

Instead of feeling confident that we can judge ahead of time which kinds of arenas or domains should receive which kind of inquiry, I prefer to see if the proof is in the pudding. Reading them on their own terms, critically in each appropriate way (and, at times, reading critically across methodologies), can enlighten us all.

Yes, but... we're on the raggedy edge of one domain or another here. How do we tell what is an "appropriate way" to discuss and analyze virtual worlds, for example? What sort of "proof" in the pudding do we allow or dismiss? Is it appropriate or worthwhile to read World of Warcraft from a vegetarian perspective, taking into account the significant noble bovine presence contrasted with the clearly repugnant cannibalistic carnivorous undead, notwithstanding the many meat-oriented recipes in the world -- or is such a deconstruction purely eye-rolling territory?

I don't think there's really a singular answer (some readings work, some don't, but there's no gold standard). This leaves us -- as in the humanities in general -- in the unfortunate position that, as Chip said, external impersonal selection methods are all but inoperable: anyone with enough clout can by their opinion alone turn a nonsensical reading into one suddenly considered profound.

79.

Mike> In general I hold to a variant of Gould's non-overlapping magisteria not only for science and religious belief

. . . and here I thought I was going to have to agree with Mike all day. So, I don't understand this position at all. It seems like a cop out from the scientific community to avoid arguments. The domain of religious belief fits neatly into my approach to knowledge. So long as it sits off as literature or art it makes no sense to apply empirical methods (well, except in secondary ways such as linguistic analysis). But, once it attempts to move into descriptions of how the real world works, why shouldn't the falsifiable assertions within it be tested? Leaving aside the mistranslation, what would a virgin birth mean to biology? If old books define morality, do people who spend less time reading them spend more time in jail? Do cultures that tie themselves to old books live longer, innovate more, or create more art?

Note that this isn't a criticism of old books or religion per se -- it is only when dogma is applied to testable assertions that failing to test these assertions leaves potential knowledge behind. This is why I'm all for rigorous prayer studies. I would think that the cohort that believes in the teachings of old books would be on the front lines of testing these assertions. After all, if believers are already picking and choosing what aspects of the old books they are rejecting, why not apply all the tools at our disposal to help that process?

80.

Moving seriously off-topic...

Cory: Do cultures that tie themselves to old books live longer, innovate more, or create more art?

In some cases, yes. There is solid empirical evidence for the efficacy of religious belief and lifestyle (lifespan, disease rates, subjective satisfaction, etc.; no citations handy - hit me up later), and even some interesting ones for the efficacy of various forms of meditation and prayer.

On the converse side, there is no falsifiable hypothesis to decide the fundamental question of God. Everyone from Aristotle to Anselm to Russell and Gould has poked at this. I don't expect to see it on the cover of Science any time soon.

81.

Eh, seems like Popper is poison.

Where are the two of you placing the areas that belong to neither the humanities nor the natural sciences?

Hint: most research doesn't!

Academia does investigate the historical Jesus, claims about authenticity, etc... Why raise all these moot points?

82.

"all of them are productive in that they become commodities and generate wealth (hah, literature is a form of economics, people!).

You ARE american."
-Second line by Ola


I thought this was Marxist?

83.

@Mike: So, by reserving "science" for hypothesis-based methodologies, you are comfortable excluding a vast chunk of the social sciences, as well as those parts of the natural sciences that rely on observation and exploration (astronomy, paleontology, geology) for at least a part of their knowledge claims? I see no reason to reduce scientific inquiry in that way. This is one of the fundamental misunderstandings here -- a myopia about science itself, to say nothing of the humanities.

84.

@Mike (with thanks to Thomas): Before we had a word for psychoanalysis, we had concepts and ideas that, hundreds and even thousands of years earlier, would be best described by that term. So... yeah. I've got no problem ascribing deconstructionist tendencies to folks who lived prior to the 20th century.

But the importance of the ideal as ideal per se is somewhat unique to the period. And the idea of doing it, even to one's own work, and doing it in an extreme, not just as an exercise... but as a fundamental requirement of (my words) keeping trampling, maddening, murderous, insane law at bay... that, too, hasn't been seen in history much before.

Yes, there have been many times when a new order questions the old. And ideas of impermanence or fluidity or chaos are held up as being important or worthy of study. But deconstructionism has, at its core, the idea of "You can *not* understand a whole thing." There is no such thing as put-together. As complete. To those who believe in its importance, deconstruction is, frankly, an imperative; a requirement; a calisthenic, not a nice "also good to do."

Now... I don't think that way. I believe that there are core ideals and truths, many of which are scientific in nature. [Note... "Areas of knowledge that are amenable to analysis by falsifiable hypothesis..." And we're calling lit-crit types obtuse?] Some of which ain't. I believe in the Big Bang and evolution and in God and in Shakespeare. So sue me.

Because, in the end, I do think that you need to have solid constructions if you want to deconstruct them.

85.

Mike: "...as in the humanities in general -- in the unfortunate position that, as Chip said, external impersonal selection methods are all but inoperable: anyone with enough clout can by their opinion alone turn a nonsensical reading into one suddenly considered profound."

The one important new idea raised again in the 20th century (and before that across the centuries by a rather small minority of thinkers whose heritage is certainly not dominating western culture today, e.g. Cusanus or Montaigne, just to bring in new names :) is that

There is no *Outside*, no *impersonal selection* whatsoever as soon as human cognition, decision making, creation, or action gets involved. (And by the way, this, the *idealized, pure, untainted Outside* distinguishes Plato from say Derrida, read "Carte postale" for a real hefty postmodern treatment :)

There is no *invisible force* selecting useful knowledge claims - not in natural science, not in engineering, not in lit crit.

There is only *us*

... human beings, always self-referential as soon as we open our mouths because this is just the way we were made - we cannot speak about *the world outside* (outside of our "bodies", our "minds") without speaking about ourselves, our history, our kids - do you think you can conceive an experiment on quantum physics and keep your prior life somehow *out of it*?

So there is only *us*, the human species... and we somehow have to deal with that - and *This* is what people have been trying to figure out for the last 100,000 yrs or so. Call it whatever you like.

86.

Tom Hunter: I thought this was Marxist?

Yes! I said American, didn't I???

87.

Thomas: So, by reserving "science" for hypothesis-based methodologies, you are comfortable excluding a vast chunk of the social sciences, as well as those parts of the natural sciences that rely on observation and exploration (astronomy, paleontology, geology) for at least a part of their knowledge claims?

Much of the natural sciences make strong predictive claims that effectively circumvent conditions like distance in time or space: look in one place, make a prediction based on prior observation about what will be found in another; the veracity of that prediction, especially if confirmed via multiple observations, tests the hypothesis and builds theory.

There is great value in the social sciences as well, much of which is called (perhaps by courtesy as much as custom) science, but which should not be confused with the methods of hypothesis, observation, and prediction. Wundt, Freud, Skinner, Posner, Malinowski, Mead, etc., were bound by different conditions (all involving humans) that limited the predictive ability of their models. Their models did not and could not earn the level of assurance that comes through hard-won predictive observation (and in some cases experimentation). This isn't a knock against those areas, it's just what is. Even in physics this occurs -- some of Einstein's hypotheses could not be confirmed until years after his death, and some current interesting hypotheses labor under the scathing "not even wrong" label.

So I wouldn't say I'm "excluding" the social sciences -- that has a pejorative ring -- but I do think it's important not to confuse the hypotheses, models, and theories about, say, photosynthesis or covalent bond formation with the necessarily more tentative models and theories of the bases of community formation, the causes of neurosis, stellar death, or the conditions under which RNA first self-assembled on this planet.

I see no reason to reduce scientific inquiry in that way.

No, I suppose not, given your anthropological outlook. For anyone trained in the so-called harder sciences (physics, chemistry, biology, etc.) this really isn't controversial (or wasn't last I read much philosophy of science -- perhaps postmodernism has crept in there as well). My own training straddles biology, psychology, and computer science, so I'm used to skating across the lines of observational prediction and what might be called observational interpretation in areas where falsifiable hypotheses are few and far between, and without making value judgements as to which was "included" vs. "excluded."

This is one of the fundamental misunderstandings here -- a myopia about science itself, to say nothing of the humanities.

I do not agree that this is myopia at all. I'd say instead that it's dangerous (and perhaps myopic) to conflate those methods of inquiry that produce objectively, repetitively, predictive observation and hypothesis with those that necessarily rely more heavily on individual interpretation, credibility, and argument.

The reasons are as I and others have said already: doing so sets models and conclusions from both sources on equal footing, when those more reliant on individual interpretation have not earned the degree of certainty that hard science affords. This leads people to lean on the intellectual model of the day -- social Darwinism, racial superiority, the indistinguishability of the sexes (i.e., the idea that male and female brains and minds do not differ), logical positivism, postmodernism, Marxism, Libertarianism, what have you -- as heavily as they might lean on models of gravity, chemistry, or psychophysics (e.g., retinal-based illusions are observed only "in the mind" but require no individualized interpretation to understand them). The model may be completely right or completely wrong, but the rationale of a few respected members of the academy does not make it so either way. When that is all you ultimately have to rely on, building too much on such models makes for a poor foundation.

The mysterious "?" wrote: There is no *Outside*, no *impersonal selection* whatsoever as soon as human cognition, decision making, creation, or action gets involved.

I'm aware of this philosophy via Bateson and others; there's a (possibly extreme) form of this in overall apophatic belief that IMO slides towards solipsism... but most don't want to take it to that conclusion.

One formulation of predictive hypothesis and observation says that we perceive the world essentially as it is; a fairly modernist view. Others acknowledge that while none of us is separable from the world or our observations, we nevertheless can effectively cancel out each others' errors and influences by repeated observation; there is no objectivity in isolation, only in aggregate.

I know that this is unsatisfying to some who believe that we cannot disentangle ourselves from our observations to any significant degree, or even that there is no "thing" to be observed at all. At that point all I can do is shrug: we are down to axiomatic statements of belief at that point, essentially tenets of faith: if you do not believe that falsifiable, predictive hypotheses and observations are possibly reliable, then there is nothing anyone can say to change your mind. At its root, even science relies on this statement of faith.

88.

Mike: "Others acknowledge that while none of us is separable from the world or our observations, we nevertheless can effectively cancel out each others' errors and influences by repeated observation; there is no objectivity in isolation, only in aggregate. At that point all I can do is shrug: we are down to axiomatic statements of belief at that point, essentially tenets of faith: if you do not believe that falsifiable, predictive hypotheses and observations are possibly reliable, then there is nothing anyone can say to change your mind. At its root, even science relies on this statement of faith."

I can easily agree with this. I do use "falsifiable, predictive hypotheses" all the time (with modest success :) and I do acknowledge that my doing so is depending on my own decisions and on the people I care about and I chose to live with...

So coming back to VWs... what makes them so fascinating to some of us, I believe, is the fact that we do create a "world" we can observe from the "outside in" as well as from the "inside out" -- we create a "world", observe us while we are doing so, and then start talking about it to each other "in-world" and in RL... maybe this process is something that in fact approaches what people have tried to explain in "the old books"... and there is absolutely nothing to be ashamed of... why make it "scientific" or "critical" when "human" is more than enough?

89.

Talking of Bateson, Watzlawick et al.'s "Change: Principles of Problem Formation and Problem Resolution" might be just as good a practical, real-world starting point as, say, stuff about Drucker -- the world view and the language "postmodernism" was built on is simply not taught in western high schools or at western physics/engineering departments -- I guess one has to approach it more patiently :)

90.

@Mike: But you are slipping back and forth between an acknowledgment that the knowledge claims of hypothesis-based methods are only reliable, or provisional, which they are, and the too-easy characterization of them as objective or certain. You can't have it both ways, and I submit that the latter habit of characterization is a political, rather than epistemological, characterization, in its origins. To my mind, all of these claims are reliabilist; some generate more reliable claims, some generate less reliable ones, but there is no hard bright line which separates "objective" inquiry from "interpretive" (this is what I take to be ?'s point as well). When human beings have far less impact on the phenomenon under study, the predictive value usually increases, but that doesn't mean you can find the spot where science "ends" and the humanities begin. I'm staggered, I confess, by your holding to that view.

91.

A few more comments to try to sort this out, since I'm a bit gobsmacked...

Mike wrote:

without making value judgements as to which was "included" vs. "excluded."

So why do it here? If in practice one must move between these kinds of reasoning all the time (as many scientists do), then holding to a view that one is moving in and out of science, again, seems to be a political statement, not one with any usefulness for marking a kind of intellectual inquiry.

Again, you seem familiar with the fact that many scientists, when circumstances do not allow for extensive testing, replication, or prediction, nonetheless generate claims (such as about dinosaur evolution), but you steadfastly hold to the view that this is not science. It's just strange. Exploratory science has been a part of science since the beginning.

Mike wrote:

doing so sets models and conclusions from both sources on equal footing, when those more reliant on individual interpretation have not earned the degree of certainty that hard science affords.

This may be the source of the misunderstanding.
Saying these claims are on the same footing epistemologically does not mean that they have the same degree of reliability, which is what you seem to be saying here. There are more reliable claims, and less reliable ones, but there is no category difference of methods which maps perfectly onto that spectrum. It seems that you fear that calling exploratory science (for a start) science diminishes the claims of science. This is hard to believe, except, again, as part of a political position.

92.

@Thomas:

When/if you are in Columbus, Ohio, we must have lunch. I will take you by Cornhenge and you will be glad. The real story, and then the myth of the story are fun.

@Mike, who said: "The model may be completely right or completely wrong, but the rationale of a few respected members of the academy does not make it so either way."

Right. And that's a highly postmodernist take on the situation, eh? What's scary to lots of non-scientists is not the idea that science is a good way to figure out better toasters, Tang, Velcro, rockets or skin cream. What's scary is the idea that many Sciencians believe that if it is scientific, therefore it must be good. If it is right (accurate), therefore it must be right (moral). I'm a big believer in the aggregate good of science (not being a fan of dying at age 32 of polio or staph infections, and really liking central air conditioning and my Xbox 360). But lots of people look at stuff like atom bombs, bio weapons, global warming, etc. and say, "Did the scientists who invented that stuff really stop and think about the moral implications? Or did they just do it because they could?"

I'm not saying I agree with that... but it's part of the reason you have the "doubt everything" camp on the art-and-social side of the academic divide. Because on the science-test-ology, let's push the boundaries of known unprovability side, you have people that will sometimes build new stuff without much thought for the art-and-social welfare.

In D&D terminology, I am forever classified by my friends as "chaotic good." Which, to me, is a hoot. I love that. I will do you no harm, but you really can't count on me not to buy you a monkey for your anniversary. Science? Sometimes... "lawful evil." And that upsets folks in other grid positions.

So, yes. When deciding how to improve your steam engine or test my central air conditioner to make it better.... empiricism!!! Please, yes. All that neat, science-y, prove-it-y stuff. When deciding whether or not to do it *at all* in the first place... back up and think about all kinds of things that may not be provable, but may be interesting, informative and wise.

93.

>We're going to have to agree to disagree here. There is a nearly infinite list of knowledge that started with fire and is currently passing through the human genome that demonstrates what happens when we *do* attempt to explain the source of complexity. If you are going to take a moment for awe, why not focus on the amazing capabilities for groups of people to solve apparently insurmountable problems. That is something worth appreciating. Of celebrating.

And of investigating and understanding how we keep doing it.

Oh, we're definitely going to disagree, and strenuously : )

All you're doing here, Cory, is elevating the group-think to the status of God. All of a sudden, the mob is God. Why are they any better?

I'm awed and I cannot conceive of a greater thing -- and you reduce that as a scientist to merely a chemical admixture, hormones or stimuli. But then you suddenly elevate something called "the group working together" to magical, mythical proportions and never parse whether all that cloud of unknowing is merely a lot of hormones and stimuli. You can't play that game with the individual, then jettison it for the group.

I know you're very enamored of the "wisdom of the crowd" stuff. Surely you've seen the discussions here and everywhere else, including in the guy's book, that mitigate, conditionalize, and explain just what he meant, and it wasn't merely these happy picnics you're describing.

I don't understand why I have to get down on my knees and genuflect to some synthetic concept of people "solving problems together". First, I guess I don't see a lot of that in real life. Um, did you have something like, I dunno, the U.S. Congress or the General Assembly as an example of this...er...awesome group capacity to amazingly solve problems together?

Or do you just imagine it's little groups of people at conferences on Texas ranches or something having little epiphanies?

It might be in your little groups of work or personal life you have these awesome epiphanies but do they stand the test of explication to skeptics -- the kind of skeptics any believer in God faces in trying to defend the idea of God?

It seems to me you are merely celebrating a Golden Calf, the ecstasy of some mistaken, euphoric idea of group collaboration. I'm unpersuaded.

94.

Thomas: "To my mind, all of these claims are reliabilist; some generate more reliable claims, some generate less reliable ones, but there is no hard bright line which separates "objective" inquiry from "interpretive" (this is what I take to be ?'s point as well). When human beings have far less impact on the phenomenon under study, the predictive value usually increases, but that doesn't mean you can find the spot where science "ends" and the humanities begin."

But I'd like to add one important thing:

If I may continue to use the metaphor of the "outside" (roughly the axiomatic system) and the "inside" (language, "methods & procedures", "standard" devices, and metrics of accuracy of some sort, even in text interpretation: there is a certain set of Hebrew-English dictionaries you simply want to rely on if you are to work as a "theologian"...) then the "reliabilist" concept/meta-axiom of truth applies mostly if not always to the "outside", i.e. the axiomatic foundation of a field of inquiry. And this foundation is definitely changing over time due to the feedback loop between some humans and some other humans (and Chip rightly pointed out that there *are* differences to be observed, but in *human interaction*, not in "darwinian acts of...")

This simply means people *have to agree* on the axiomatic framework before they start making knowledge claims. BUT, and here I seem to agree with Mike, if these axioms have been established at some degree of usefulness (like, say, in *aircraft engineering*, an example R. Dawkins seems to like when bashing his anti-positivist critics) the reliability of claims "inside the framework" can be derived from the "outside" alone, by applying logic, mathematics, testing, and so forth... But in cases where the peer review investigation didn't work, you usually find that people relied only on parsing the language used, on trusting that the jargon is right... an important insight certainly highlighted again and again by post-whatever-ism.

Just one more example: the quantum hypothesis was so powerful not because of some "group think" that suddenly fell in line with what Planck, Einstein and others proposed, but because every physicist was ready to accept Newton's and Maxwell's laws *and* all the previous experimental data that seemed to contradict them. So the "outside" had to be changed in a reliabilist way to make the "inside" become logical and "hard scientific" again -- and it was *humans* who did it -- it was *not* an act of some metaphysical force -- neither god nor something else.

And could physicists have come up with some other theory instead? -- Is that a testable hypothesis? :))

95.

greglas>I was just pointing out that Cory, in describing the putative fallacy of different rule sets for some disciplines, was criticizing a statement that sounded a heck of a lot like your argument for magic circles.

It doesn't sound like it at all to me. Cory was saying that if research in domain A strays into the territory of domain B, and domain B is such that new theories have to be tested against a corpus of existing knowledge in order for progress to be made, then the research in domain A should fall into line rather than claiming to be exceptional.

Example: game design has a lot in common with novel-writing. There are long-standing tools available for analysing novels (and plays and poetry) that have been applied to new narrative forms (such as screenplays). If we want to analyse games, should we not attempt to repurpose these tools to our own ends? Or is there something fundamentally different about games that limits this in some way?

Personally, I feel that there is something about the way narrative unfolds in games that at times puts them below the granularity at which narrative theories operate, i.e. that there may well be different (or at least new) rules that apply to games at the "textual" level. Others, such as Lee Sheldon, think I'm over-invested in games and that existing tools are sufficient. So here's me on one side, strongly suspecting that different rules apply for games, and Lee on the other, strongly suspecting that they don't. What I can do is try to show where the existing rules break down or become vacuous; what Lee can do is show how gameplay can be described in traditional narrative terms; what neither of us can do is ignore the other, because in both our fields this is an unresolved question. We have to compare the new (games) with the old (literature) in order for the critical theories of either to progress. This is exactly Cory's point, albeit in a domain of comparative studies rather than of science: it's foolish for me to reject the existing practices out of hand, saying they don't apply, because they do apply. The results of the analysis may show that the existing theories don't apply (or they may not), but in order to show either of these, games theorists have to use the same practices that the literature theorists use; we can't just say "I'm special and get to play by different rules".

My argument for magic circles is talking about rule sets in terms of game rules, not in terms of the practices used for studying game rules.

Richard

96.

Richard -- yes, I suppose I should have said:

Cory> "I'm special and get to play by different rules."

greglas>This is actually *superficially reminiscent of* Richard's argument for virtual worlds, I think.

Merely an attempt to be clever... that apparently backfired.

97.

Richard said: "We have to compare the new (games) with the old (literature) in order for the critical theories of either to progress."

Why?

Games are not literature are not written traditions are not groups of books are not single books are not oral stories are not formal, dogmatic religion are not lore are not superstition are not shared suspicion are not hearsay and rumor are not wondering are not fear and ignorance. If you linearly trace the cultural roots of the narrative thread of games back to the first spark of their lives, you travel through many interesting pastures, sure.

But an MMO is surely not, in any empirical sense, a story told by Cub Scouts around a campfire, is it? Nor a creation myth. What could one possibly do to inform the other?

Are we making money on some of these MMOs and VWs? Sure. But why should you talk to someone who has studied the history of gambling in the Old West? WoW has nothing to do with riverboat casinos, eh?

And I hear that people have sex, virtually, in Second Life. But that's not a thing like other kinds of traditional pornography or erotic literature. So why bother making the comparison. Kids sneaking peeks at the bra section of the Sears catalog circa 1970 or going to the library to peer at nekid natives in National Geo in the 1950's has nothing to do with 15 year olds logging on to SL and pretending to be adults so that they can light up a sex ball...

A thing is what it is. Right?

Again... I'm a marketer by trade. What is "empirical" for me is The Funnel. You put a dollar in the top. What comes out the bottom? It better be more than a dollar in the long run, or you're gonna dry up and blow away. But that doesn't mean that the only and every act performed by a good company is the counting of pennies.

One thing to remember about even the most bizarre-sounding of the postmodern deconstructionist papers and circumlocutious arguments therein is that their set of equations for examining their universe *is* language, not physics. Why? Because language and metaphor are the sets of symbols by which man knows his own mind and each other. At least that's how the theory goes. You can say to me, "PV = nRT" until the cows come home, but that doesn't tell me much about you. "I like the Beatles more than the Rolling Stones" has a lot more of a viral load, though it is empirically, well... without much scientific weight.

What is a game? What is story? You can rely completely on science, or you can rely completely on feeling. I don't think anyone here is arguing for those extremes, eh?

01110010 01100101 01100100 01110101 01100011 01110100 01101001 01101111 00100000 01100001 01100100 00100000 01100001 01100010 01110011 01110101 01110010 01100100 01110101 01101101

98.

@Andy (previous post): Thanks for the invite -- I can't wait! Cornhenge looks perfect for a conversational stroll through the outsized and absurd. What fun. :-)

99.

This is kind of funny -- Andy, I am on "your" side, but the specific examples you give don't sit so well with me.

White Wolf always touted their roleplaying games as collaborative storytelling -- certainly they labelled everything as such. More shared fictional narrative than "game", at least, was how they hoped to portray their games (I don't know why). I think there's a good link between pen and paper RPGs and MMOs, so I wouldn't discount all literary theory when looking at MMOs. The main point, I think, is that you can't look at, say, an MMO using *only* comparisons to literature etc., but that does have to be part of the discussion, where they overlap.

Ironically, as far as cybersex goes (at least the style that doesn't involve playing whack-a-mole on a menu box) you might find that it's quite a lot like traditional erotic literature, in that it could be copy/pasted from literotica :D

I'm also told (I gave up on English Lit in school) that novels often follow a plot structure based on sex. Certainly written cybersex typically strives to follow a short-story-like structure in a kind of back-and-forth way. Study of traditional narratives has a lot to say about this aspect of online play -- not so much about why 15 year old boys might choose poseballs in SL over railway siding litter porn magazines, or more easily accessible googled pictures and videos perhaps.

I would never agree that games should only ever be looked at using techniques from literary arts or whatever. That seems utterly insane, but I think *some* of those tools are useful.

01100010 01101001 01101110 01100001 01110010 01111001 00100000 01110100 01101000 01101001 01101110 01101011 01101001 01101110 01100111 00100000 01110011 01110101 01100011 01101011 01110011

100.

@Ace... I should have used the [irony on] and [/irony off] code for that last post. I most assuredly think that we should be overlapping many of these tools.

My point was that Richard (and others) have been saying that over-the-top crit-lit types take it too far one way. But then Richard said: "We have to compare the new (games) with the old (literature) in order for the critical theories of either to progress."

My string of "games are not literature...etc." was meant to be ironic reductionism. Literally? Of course, games are not literature. In some sense of the words, though... sure. Games ARE literature. And games are economies. And they are chat rooms. And they are places of work and libraries and schools.

Now... a truly left-field deconstructionist might also start with the claim that an MMORPG is also "a primitive hunter gatherer society." There's, at least, a crit-lit sophomore paper waiting to be written there. Not that characters in a game like WoW actually hunt animals and gather resources like gold... but that the *players* themselves are engaged in behaviors similar to those of early man. Perhaps the PvP players are "hunters," who are "hunting for ego inflation that has been lost in a world bereft of most opportunities for physical violence." And the "gatherers" are the social types, who are using the platform to "gather" gifts of friendship and camaraderie.

Is that a bit further afield than looking at games as stories? Depends on who you're talking to. And when. People originally thought Freud was an idiot. Then they didn't. Then they did. Now... it's a toss-up. The consensus among many of my pals in the psych ward is that his tools were great for building other tools, but he used them badly himself. Misguided genius.

Richard said: "There are long-standing tools available for analyzing novels (and plays and poetry) that have been applied to new narrative forms (such as screenplays)."

Well... yes and no. In the early days of film that was 100% true, because nobody knew anything about movies, because they were new. As time has gone on, however, the tool sets have diverged widely and wildly. What makes for a great book often makes, really, for a lousy screenplay. Thus there are two entirely different Academy Awards for original screenplay and adaptation. The tools for analyzing some of the similar underlying structures are similar... but only in their similarities. What many novelists have found out, to their discomfort, is that the road from book-to-screen involves working not with someone who knows your work well, but with someone who knows film well.

Any particular weird-ass, flaky, deconstructionist, postmodern, unraveling of a work may, in fact, be crap. Yes. But just as the guys who created that "Mentos and Coke" movie did nothing that really accomplished anything... we are still moved to think, "Wow. I wonder how they did that." The mental exercise behind much (not all) deconstruction is, I think, worth consideration.

Is it going to cure cancer? Nope.

But neither are video games ; )

01111001 01101111 01110101 01110010 00100000 01101101 01100001 01101101 01100001 00100111 01110011 00100000 01110011 01101111 00100000 01100110 01100001 01110100 00101110 00101110 00101110

The comments to this entry are closed.