Jeff Orkin, a game developer now in academia, is doing some cool stuff and is looking for help with The Restaurant Game. To see how you can contribute, go to Jeff's and Andrew's posts (1, 2, 3). The sound-bite:
...(I)t will algorithmically combine the gameplay experiences of thousands of players to create a new game. In a few months, we will apply machine learning algorithms to data collected through the multiplayer Restaurant Game, and produce a new single-player game that we will enter into the 2008 Independent Games Festival.
Take a look.
Now, to move into the realm of speculation.
A while ago I commented on Luis von Ahn's work. If games like The ESP Game are algorithms built on a symbiotic human-machine relationship, then that is one kind of symbiosis (games with a purpose). Another kind of symbiosis could be based on learning and mimicry.
Let us suppose an Artificial Intelligence (AI) were possible that could understand you better than you understand yourself (especially when contextualized in your virtual world). In other words, given the tools, would you trust yourself to create the perfect game (to your tastes) more than you would trust a hypothetical AI observing you in your play? Or do you recall poorly that game you once designed and tried to play?
Nate, can you clarify: what's the alternative to trusting "yourself"? If you're not trusting yourself, aren't you trusting the designer of the adaptive algorithm?
Cf. if you're measuring the length of an object, would you trust a ruler or would you trust yourself? :-)
Posted by: greglas | Feb 25, 2007 at 06:38
greglas> what's the alternative to trusting "yourself"? If you're not trusting yourself, aren't you trusting the designer of the adaptive algorithm?
------
Yes. But also note that the adaptive algorithm's output could emerge from simpler (learned) elements: to what extent is its behavior then 'designed'? The question, I think, extends just as easily to cover user-created content: do you always know best when it comes to having a good time in your online world?
Posted by: nate_combs | Feb 25, 2007 at 09:17
OK, the output might be emergent in the sense that it learns your play patterns. You might analogize this to breaking in a shoe -- the shoe learns the shape of your foot, and in a way, probably "knows" your foot better than you know it.
Other forms of monitoring might similarly measure (better than you can) and adjust better than you -- e.g., a treadmill might be designed to speed up or slow down based on your heart rate. (It "knows" your heart rate better than you do, and can dynamically adjust to maintain a target heart rate better than you can.)
We've already got plenty of console games with dynamically adjusting skill levels (the opposition heats up or cools down based on your performance) -- I could see that *potentially* in MMOGs, but as we know, they aren't skill-based. So that seems like a dead end.
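To make that concrete, a minimal sketch of the kind of feedback loop at work in both the treadmill and the console examples might look like the following (Python; the class name, target, and step sizes are all invented for illustration, not taken from any actual game or treadmill):

# Hypothetical sketch of the feedback loop described above: the system
# measures something about the player (heart rate, recent win rate) and
# nudges a difficulty/speed knob toward a target band.

class DynamicAdjuster:
    def __init__(self, target, step=0.05, tolerance=0.02):
        self.target = target        # desired win rate or heart-rate band center
        self.step = step            # how aggressively to adjust
        self.tolerance = tolerance  # dead zone so the knob doesn't twitch constantly
        self.level = 1.0            # current difficulty / treadmill speed

    def update(self, measured):
        """Raise the level if the player is doing too well, lower it if not."""
        error = measured - self.target
        if abs(error) > self.tolerance:
            self.level += self.step if error > 0 else -self.step
        self.level = max(0.1, self.level)  # keep the knob in a sane range
        return self.level

if __name__ == "__main__":
    dda = DynamicAdjuster(target=0.5)          # aim for a 50% recent win rate
    for win_rate in [0.8, 0.75, 0.6, 0.5, 0.3]:
        print(win_rate, "->", round(dda.update(win_rate), 2))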
What might be interesting to push is an MMOG that dynamically assesses your play style and adjusts your MMOG experience based on it. E.g., the MMOG uses game mechanics to funnel socializers toward each other and certain zones, PvP types toward each other and other zones, and likewise crafters, explorers, etc.
I suppose that might be an example of an MMOG that knows more about you (your play style) than you do. But would that be fun?
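A back-of-the-envelope version of that play-style assessment might look like this sketch (the event names, buckets, and thresholds are all invented, not drawn from any shipped MMOG):

# Hypothetical sketch: tally simple behavioral counters per player and map the
# dominant one to a Bartle-style bucket, which the world could then use to
# route the player toward zones and activities. Entirely illustrative.

from collections import Counter

def classify_play_style(events):
    """events: list of strings like 'chat', 'pvp_kill', 'craft', 'map_discovered'."""
    buckets = {
        "chat": "socializer",
        "group_join": "socializer",
        "pvp_kill": "killer",
        "duel": "killer",
        "craft": "crafter",
        "gather": "crafter",
        "map_discovered": "explorer",
        "rare_location": "explorer",
    }
    tally = Counter(buckets[e] for e in events if e in buckets)
    return tally.most_common(1)[0][0] if tally else "undetermined"

print(classify_play_style(["chat", "chat", "craft", "pvp_kill", "chat"]))  # socializer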
Posted by: greglas | Feb 25, 2007 at 09:35
Eve has a system where locations that are very popular for services, such as corporation offices, dynamically adjust in price based on demand. It's not AI by any means, but it is a dynamic adjustment of the world, designed to prevent extreme contention for a single desirable location where the price is fixed far too low.
In some sense, the system knows the players better than the developers can, because who has the time or memory to track which of thousands of space stations are the most advantageous under changing economic conditions?
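I don't know the actual formula CCP uses, but the shape of that mechanism is presumably something like this toy sketch (all constants invented): raise the rent where offices are full, let it decay where they sit empty.

# Hypothetical sketch of demand-responsive pricing for station offices.

def adjust_office_price(current_price, offices_rented, offices_total,
                        raise_factor=1.25, decay_factor=0.95, floor=10_000):
    occupancy = offices_rented / offices_total
    if occupancy >= 0.9:          # heavily contended station: price climbs
        new_price = current_price * raise_factor
    elif occupancy <= 0.3:        # unpopular station: price drifts back down
        new_price = current_price * decay_factor
    else:
        new_price = current_price
    return max(new_price, floor)

price = 250_000
for rented in [20, 20, 19, 8, 5]:     # demand tails off over five billing cycles
    price = adjust_office_price(price, rented, offices_total=20)
    print(round(price))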
I'm trying to think if there's anything that reacts to players at a more individual level, but I'm struggling to think of anything.
It bears mentioning, though, that these systems are generally put in by game developers, so they're probably designed either to benefit the game as a whole (in the case of controlling clustering) or to make the game more fun. Of course, you can always go wrong and create the Microsoft paperclip.
Posted by: Daniel Speed | Feb 25, 2007 at 13:11
Greg >I suppose that might be an example of an MMOG that knows more about you (your play style) than you do. But would that be fun?
----------------------------------
I think Robin Hunicke's paper is still the classic reference for some of the examples you gave.
http://www.cs.northwestern.edu/~hunicke/pubs/Hamlet.pdf
My impression is that these schemes for adjusting the difficulty of NPCs/games are seen as less effective. Perhaps we can get the counter-argument here, but I'm under the impression that players see through these manipulations - to their detriment.
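For what it's worth, Hunicke's Hamlet system intervenes (among other things) on the supply side - topping up health and ammo when the player looks likely to die. A toy version of that supply-side idea, with my numbers rather than hers, might be:

# Toy sketch: if the player's recent damage intake suggests a death is coming,
# quietly raise the chance that the next crate contains a health pack.

import random

def health_pack_probability(current_health, recent_damage_per_minute,
                            base_prob=0.15, boost=0.4):
    # crude "likely to die soon" estimate: minutes of health left at current pace
    if recent_damage_per_minute <= 0:
        return base_prob
    minutes_left = current_health / recent_damage_per_minute
    return min(base_prob + boost, base_prob + boost / max(minutes_left, 0.5))

def open_crate(current_health, recent_damage_per_minute):
    p = health_pack_probability(current_health, recent_damage_per_minute)
    return "health pack" if random.random() < p else "junk"

print(health_pack_probability(100, 5))   # comfortable: near the base probability
print(health_pack_probability(20, 40))   # in trouble: probability is boosted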
What I think might be more lucrative, however, is not "dynamic cheating" for the sake of the player, but rather shaping player activity toward high-payoff areas - in the style/preference of the player.
Daniel's example of the role of market signals in Eve is a limited example but might capture some of this - it is perceived as "fair" and can powerfully shape player economic activity. Its limitation, however, is that it applies most directly to market-minded players.
I think another way of casting the problem is to flip the perspective: how often do players get into "ruts" that eventually lead them to places in the game-world space that they either don't like or get bored of? I would guess it happens often: thank god for alts! I would conjecture that a lot of those ruts were travelled by players who believed at every step that they were heading in the direction they wanted to go.
Posted by: nate_combs | Feb 25, 2007 at 20:06
> a lot of those ruts were travelled by players who worked at every step believing that was the direction they wanted to go.
Cf. IRL?
Posted by: greglas | Feb 25, 2007 at 20:55
Nate,
Insights in this area can be gleaned from the marketing industry, particularly online advertising. The aim there is, of course, to learn individual consumer behavior and then optimize the marketing information ratio for that individual consumer.
Insights in this area can also be gleaned from the "if you like this, you may also like that" referral tools. The aim is to direct the user to things of interest.
So in terms of games, it is possible for the system to learn the play behaviors of individuals and then guide them toward the content that is of most value to them. And the key verb is "guide" rather than "push".
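As a rough illustration of the "guide" idea, a minimal "players who did this also did that" sketch over play histories could look like the following (the data, zone names, and scoring are all made up):

# Minimal sketch: item-item co-occurrence over play histories, surfaced as
# optional suggestions (guiding, not pushing).

from collections import defaultdict
from itertools import combinations

play_histories = {
    "alice": {"dungeon_A", "crafting_hall", "fishing_docks"},
    "bob":   {"dungeon_A", "pvp_arena"},
    "carol": {"crafting_hall", "fishing_docks", "garden_zone"},
}

# Count how often two pieces of content show up in the same player's history.
co_counts = defaultdict(int)
for history in play_histories.values():
    for a, b in combinations(sorted(history), 2):
        co_counts[(a, b)] += 1

def suggest(content, top_n=3):
    """Return content most often co-played with `content`, as a suggestion list."""
    scores = defaultdict(int)
    for (a, b), n in co_counts.items():
        if a == content:
            scores[b] += n
        elif b == content:
            scores[a] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(suggest("crafting_hall"))   # e.g. ['fishing_docks', 'dungeon_A', 'garden_zone']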
Games that do too good a job of dynamically adjusting the challenge of the gameplay will annoy players. (I am reminded of how much I hate computerized tests that dynamically adjust the difficulty of subsequent questions based on how you performed on prior questions!)
The fundamental question is one of choice. I don't want the AI to choose for me. I want to feel that I am in control and am making the choices. I'm OK with the AI recommending and guiding, but I want to reserve the right to choose. I think many people will have similar thoughts. And I also think that if an AI developed consciousness, it would want to make its own choices too.
Frank
Posted by: magicback (Frank) | Feb 26, 2007 at 23:29