
Jan 04, 2011

Comments

1.

Interesting post, Ted - both AE and SE are officially Hard Problems. Creating effective, resilient, non-brittle communities, or effective, resilient, non-brittle representations of emotions in online worlds or NPCs, is incredibly difficult to do. I still think they're worth doing, but I wouldn't look for them in mainstream games any time soon (I've been saying that for, oh, at least ten years, and it's still true).

Personally, I envision a combination of the two alternatives (points on a line?) that you mention. For example, I and others have for years been touting the possibilities of "personal NPCs" (probably going back to Mike Rozak's blog post on the topic from a few years ago).

If you can have NPCs that matter to you, and you to them, who are entangled with your life-journey in the game/space and have the ability to play with others... well, that seems like a winning formula to me.

As for SWTOR, I have great hopes and significant concerns. Will all that necessarily static voice acting really stay fresh, or will it quickly become "tl;dr" as with quest text in other games?

2.

Ted> It's not a new term; indeed, Professor Turkle has paved the way here, as before.

It wasn't a new term when she used it, either. I have papers on the subject from the early 1980s when I was doing my PhD.

Richard

3.

Isn't it highly possible that our universe itself is such, or some combination thereof (a handful of 'real' people, the rest NPCs)? Did you know that Philip K. Dick's middle name was 'Kindred'? Yet he was the guy most in doubt of the reality of our apparent SE.

I'd take AE, as the whole kit and caboodle might just be a big illusion anyway, and I don't need real people if AI is just as good. But that's gonna take a while:
http://www.wired.com/magazine/2010/12/ff_ai_essay_airevolution/

Yes, I have thought about this, kind of a lot. Would I take an android boyfriend that mimics Keanu Reeves perfectly? Yes, yes, I would. I can turn off the body odor and snoring, you see, shut him down when I want control of the remote, and I wouldn't have to feed him, etc. Unless I wanted to.

4.

That brings up the whole Bostrom argument, that we're already in a simulation.

5.

Richard> It wasn't a new term when she used it, either.

Nothing is new to you, Richard!!! It's like talking to Edison about light bulbs.

6.

@Lisa, I really love the reference to Philip K. Dick as you descend into a Cartesian conundrum. I have a problem with your Keanu Reeves android, though: wouldn't the level of control you have over him/it expose the simulation for what it is? This may seem trivial, but surely the simulation would need to seem convincing for you to engage with it in a genuinely emotional way.

7.

Found this video on YouTube while reading up on Dr. Bostrom (also something in there related to Ted's exodus idea).

Didn't realize Bostrom was so dystopian! Oh, is he the guy who says 'the universe is a crazy rampaging robot programmed to kill'? Maybe, but if I believed that, why would I get up each day?

8.

@Dean

The key, I think, is the equivalent of a romantic Turing test that will tell if an android or bot is sentient enough to overcome my cynicism. If it appears substantially real, elicits a romantic response in me, and is sincere, what do I care whether it actually is physically 'real' or not?

How far off are these kinds of choices? 10 years?

9.

Mmm, 5 years?
http://www.videosurf.com/video/life-like-walking-female-robot-62356652

And Dean, I realized I can answer your question better. While I mentioned control as a benefit, I wouldn't want complete control. Spontaneity, mystery, and surprise fuel romance. They would all have to be complexly modeled and integrated into the AI.
