Power, information, and trust frequently trade off in interesting ways in virtual (as well as real) spaces.
For example, consider an MMOG universe, say Eve-Online, where I have a battleship and my buddies are arrayed, lurking. Say we are dripping with drones and missiles... and lo, a lone Industrial ("Indy") drops out of warp into our lap, plump with minerals from "0.0." Suppose I open a line to that Indy and say "dude! (what is safe passage worth to you... ref: Piracy in Eve-Online)." Odds are that I would care far less than the Indy pilot whether he was paying attention, was able to speak my dialect of dude, or was out of the bathroom. He, in contrast, would want to find out "darn quick" what the terms of my offer were. In other words, the transmitted information would be worth more to him than to me. But what about the trust side of the ledger? Should we come to terms, I would need to trust him far less to uphold his end of the deal than he would need to trust me. I have power, he doesn't; trust in the medium is less important to me than to him.
That the weaker agent has less control, and hence must rely on trust to a greater extent, is no surprise. In fact, working from a communication-based concept of trust, Ed Gerck (Trust as Qualified Reliance on Information) makes the point that absolute power can mitigate the need for trust, absolutely. An important property of trust from this perspective is that while it is essential to the information channel, it cannot be transferred via that same channel. So, for example, while I can communicate my demands to the victim in the above example, I cannot also communicate "trust me," for I could be lying. That has to happen somewhere else, somehow else.
This framing is less forgiving of the possibility of "the persuasion of strangers," though that happens all the time. But let's discount this as a definitional distinction: gullibility is not a synonym for trust as it is used here. Interestingly, again from a communications bent, Ed frames the trust question in terms of:
- "information is what you do not expect"
- "trust is what you know"
In the earlier Eve-Online ambush example, *information* applies to the trap as well as to the terms of passage, whereas *trust* concerns a deeper estimate of the ambusher's commitment to the bargain. In the end the victim will arrive at a judgement by recalling me explicitly, by recalling my corporation's reputation, or by inferring a belief from past experience or imagined fears. The important point is that this knowledge (loosely used) is developed outside of the encounter.
One implication for virtual worlds may go to the heart of avatar swagger: all things being equal, more powerful characters can afford to be less trusting of the world around them than weaker ones. This introduces an interesting set of trust asymmetries that are driven not by individual concerns, reputation, or history, but by the world's rules themselves. For example, noobs are designed to be paranoid.
Another implication is that more casual game worlds based on short play cycles, with highly transient social experiences, are inherently less trustworthy than, say, more world-y ones with deeper social network systems.
Wouldn't it be nice to go somewhere where everyone knows your name, and to trust someone?
Noobs, in some sense, are designed to be paranoid.
But when they're dealing with more powerful, older players they have a large volume of information, history and reputation to base their trust on.
The oldbie dealing with a newbie without history and reputation has less information on which to base their trust, but, as you say, less need for the trust as well.
So, everyone's happy. Except maybe in virtual worlds where everyone is equal. In these worlds the newbie can research the history of the oldbie, while the oldbie has an equal need for trust but less information on which to base it. So, in social worlds, are oldbies designed to be paranoid?
Posted by: Jim Purbrick | Jan 26, 2005 at 10:22
There is a large literature on managing risk in transactions. Last I checked, there are even a few books solely on the Prisoner's Dilemma, i.e. how do you manage risks in an unsecured transaction? Straight-up game theorists and economists go by the approach that we will do whatever maximizes our take (sometimes short run and sometimes long).
But there are folks called economic sociologists who ponder a much more likely variant: what if you have a relationship with the person? What if the other prisoner is your friend, father, doctor, paperboy, or sales clerk? All of which is to say that relationships matter. I think this has a lot to do with the spread of guilds and clans--you want to deal with people who you are fairly sure aren't going to screw you over, and you manage that by building a relationship over time.
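To make the point concrete, here is a minimal iterated Prisoner's Dilemma sketch (in Python, with invented payoff values): against a stranger you meet once, defection maximizes your take, but inside an ongoing relationship, sustained cooperation earns more in total.

```python
# Hypothetical payoffs: (my points, their points) for (my move, their move).
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # I get screwed over
    ("D", "C"): (5, 0),  # I do the screwing
    ("D", "D"): (1, 1),  # mutual defection
}

def play(my_strategy, their_strategy, rounds):
    """Play repeatedly; each strategy sees the opponent's previous move."""
    my_total, my_last, their_last = 0, None, None
    for _ in range(rounds):
        my_move = my_strategy(their_last)
        their_move = their_strategy(my_last)
        my_total += PAYOFFS[(my_move, their_move)][0]
        my_last, their_last = my_move, their_move
    return my_total

def always_defect(opp_last):
    return "D"

def tit_for_tat(opp_last):
    return "C" if opp_last in (None, "C") else "D"

print(play(always_defect, tit_for_tat, 1))    # one-shot deal: defection wins (5 vs 3)
print(play(tit_for_tat, tit_for_tat, 100))    # a relationship: 300 points
print(play(always_defect, tit_for_tat, 100))  # burning a relationship: only 104
```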
So too does reputation matter. One thing that keeps people from screwing each other over is their fear that they will not be able to transact in the future if their name is mud. I recommend Paul Resnick's succinct paper on this:
http://www.si.umich.edu/~presnick/papers/cacm00/
And here is where virtual game worlds have a mixed record. Where are the ratings systems for individuals? (I would point to 2L as a notable exception.) Where are the scores that tell you, Hey, this person has returned people's steel lockboxes without incident 45 times? Where is the score that says, This gal has PK'd 356 players of a lower rank in the past 20 weeks? These things do show up here and there in MMOGs, but I'm surprised at how infrequently. For my money, there is no better way to secure trust in an insecure environment than to base it on past actions or on others' evaluations of a player.
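To sketch what such a system might look like (this is speculative, and the character and event names are invented), the world could simply keep auditable per-character counts of specific actions rather than one opaque number:

```python
from collections import defaultdict

class ReputationLedger:
    """Per-character counts of specific, verifiable events."""

    def __init__(self):
        # events[character][event_type] -> count
        self.events = defaultdict(lambda: defaultdict(int))

    def record(self, character, event_type):
        self.events[character][event_type] += 1

    def summary(self, character):
        return dict(self.events[character])

ledger = ReputationLedger()
for _ in range(45):
    ledger.record("TrustyTrader", "lockbox_returned_intact")
for _ in range(356):
    ledger.record("GrimGal", "pk_lower_rank")

print(ledger.summary("TrustyTrader"))  # {'lockbox_returned_intact': 45}
print(ledger.summary("GrimGal"))       # {'pk_lower_rank': 356}
```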
I mean, if eBay can use it as their single-most potent weapon in increasing trust among a globe full of total strangers, why can't games? Or maybe that level of transparency would ruin the risk and the fun.
On this note, I'm really interested to see if Blizzard's forthcoming "honor system" has any metrics that can be used as reputation systems.
No doubt others here have examples of existing reputation systems or places that call out for them.
Posted by: Dmitri Williams | Jan 26, 2005 at 13:19
I have had a very lengthy meditation on trust and reputation percolating for almost a year now, ever since a conference at Swiss Re that Cory and I both attended...
There are a lot of pitfalls to reputation systems--most of them assume that trust is transitive, when it isn't really, not quite. Mistrust is more readily transitive, sadly. Real reputation relies on built-up networks; newbies cannot "research the oldbie" because they have no way to do so.
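One way to picture that asymmetry (a toy model with made-up numbers, not a real trust algebra): let direct opinions be weights in [-1, 1], attenuate positive trust at each referral hop, and let a single negative report poison the whole chain.

```python
TRUST_DECAY = 0.5  # assumed attenuation of trust per extra referral hop

def chain_opinion(weights):
    """Combine direct opinions (each in [-1, 1]) along a referral chain."""
    opinion = None
    for w in weights:
        if w < 0:
            return w  # mistrust transfers undiminished
        opinion = w if opinion is None else opinion * w * TRUST_DECAY
    return opinion

print(chain_opinion([0.9, 0.9]))        # 0.405: friend-of-a-friend trust fades fast
print(chain_opinion([0.9, 0.9, -0.8]))  # -0.8: one bad report dominates the chain
```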
It's not an easy nut to crack.
Posted by: Raph | Jan 26, 2005 at 16:40
Raph> newbies cannot "research the oldbie" because they have no way to do so.
Sounds like someone should hack together an automated expertise management system which provides trust information for newbies, and then pipe their experiences with the system into Raph's flotation tank.
Posted by: Jim Purbrick | Jan 26, 2005 at 19:44
The problem is that even that is not really valid--trust is generally limited to a given social group. Consider the case of two guilds that hate each other. What does the newbie see when he sees one of the members?
Posted by: Raph Koster | Jan 26, 2005 at 23:06
Dmitri wrote:
And here is where virtual game worlds have a mixed record. Where are the ratings systems for individuals? (I would point to 2L as a notable exception.) Where are the scores that tell you, Hey, this person has returned people's steel lockboxes without incident 45 times? Where is the score that says, This gal has PK'd 356 players of a lower rank in the past 20 weeks? These things do show up here and there in MMOGs, but I'm surprised at how infrequently. For my money, there is no better way to secure trust in an insecure environment than to base it on past actions or on others' evaluations of a player.
I mean, if eBay can use it as their single-most potent weapon in increasing trust among a globe full of total strangers, why can't games? Or maybe that level of transparency would ruin the risk and the fun.
Because eBay's task is relatively easy. It has only two activities (buying and selling) and one context to worry about.
Take one of your examples: Player X has killed Y players of lower level in the past T amount of time. What does this tell us? By itself, nothing really, beyond exactly what it says. You can't infer the player is a griefer. You can't infer the player isn't trustworthy, and so on. How do you know that the lower-level player wasn't calling him a fucking prick or something? In our games, for instance, if I, Joe Noob, walk up to you, Bob the Overlord, and say, "Bob, I think you're a fucking prick and I suspect you take it up the behind from syphilis-infected farm animals nightly," you'd certainly be within your rights to kill me for it, even if you ARE a jerk and my insults are well-deserved.
The problem with context in virtual worlds is mainly that code sucks at recognizing it, particularly with any subtlety. It can't tell the difference very well between griefing and justifiable homicide, for instance.
--matt
Posted by: Matt Mihaly | Jan 27, 2005 at 02:01
To be brutally simplistic about this: given the amount of infidelity around that does not seem to be discovered - if you can't tell that someone you live with is sleeping with someone else, or that Joe-schmo-delivery-boy might go 'postal' - what hope have you of establishing trust within the limited time and context of a virtual world?
So for me this is an exercise in working out whether 'good enough' can be achieved at a reasonable cost (cost being taken in broad terms).
The two fatal flaws that I have seen in putative reputation systems are:
1) The type of people that you want to be warned against are exactly the sort of people that can and would subvert the system
2) The data being gathered and then assessed in terms of trust is too contextual to extract meaning from. E.g., is sneaking up on someone and killing them an untrustworthy thing? Well, kinda – but if you are in a PvP system then everyone may be doing this, with honour as defined within their group.
But ye’ know, one of the things I actually like is being a strange stranger in a stranger land, so I’m not sure what level of trust information about the environment I want. I like the mystery and the danger – it’s not something I get day to day in the office.
Matt> "Bob, I think you're a..."
Thanks for adding to the rich texture of language used on TN Matt :)
Posted by: ren | Jan 27, 2005 at 04:46
matt> Because eBay's task is relatively easy. It has only two activities (buying and selling) and one context to worry about.
And it doesn't really work on eBay either; you have to read the comments to see what it is all about. Besides, when it comes to purchases you get sceptical if there is less than a 99% satisfaction rating...
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 05:27
Ren> But ye’ know, one of the things I actually like is being a strange stranger in a stranger land, so I’m not sure what level of trust information about the environment I want. I like the mystery and the danger – it’s not something I get day to day in the office.
Yes, but the very simple metric - this person has killed someone recently, so his name is written in blood - is useful and makes the world more exciting. (Meridian 59 had this.) I also like the concept of outlaws. It makes for interesting interaction, as the outlaws have to explain to strangers how they earned their reputation and get non-outlaws to do favours for them. You'll also get mafia-like behaviour: "take these swords to the smith and get them sharpened and I will protect you."
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 05:38
Dmitri> managing risk in transactions
Outside of highly circumscribed game worlds (i.e., ones that are not *world-y*), "transactions" may imply externalities where, well, trust is useful and cannot be directly managed.
For example, here is another one from the same Eve-Online universe. Imagine you play a trader. Imagine there is a very good deal on a mineral you really want today, offered in a 0.0 system where you have to go to take delivery. Now, the game world can ensure that the sale/price of the goods is conducted perfectly - risklessly.
The risk is in going to pick up the goods after purchase, at the point of delivery.
Consider a special case where the favorable offering was deliberate and served as a "honeypot" to attract folk into the system for ambush by pirates...
If I didn't trust (i.e., didn't know) the seller, I would wonder, always.
Posted by: Nathan Combs | Jan 27, 2005 at 07:11
Clearly there are good and bad variants of reputation systems. And it depends--as Ren and Ola note--on the flavor you want in the first place. Maybe you don't want transparency and trust. But maybe you do and the goal is to come up with systems that are harder to subvert and provide quality information. Sure there are plenty of pitfalls here, but that doesn't mean it can't be done.
Contrast MMOGs with dating services. Some of them give feedback, and some are relatively toothless because they allow the person to edit their feedback. That's obviously useless. And as Matt notes, sometimes pure numbers give no context (although a record showing someone had killed 300 lower-level players and, say, initiated 90% of the fights ought to suggest something). But some combination of scores and comments can be useful. Back to dating: Bob is a nice guy with a sense of humor, but 65% of the women he's dated say he has a fear of commitment that prevents a real relationship from forming. The male average score is currently at 53%, etc., etc. Whatever.
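As a rough sketch of how context could be folded into such a score (the thresholds and field names here are pure invention):

```python
def aggression_profile(fights):
    """fights: list of dicts like {"victim_lower_level": bool, "initiated": bool}."""
    if not fights:
        return "no record"
    kills = sum(1 for f in fights if f["victim_lower_level"])
    initiated_ratio = sum(1 for f in fights if f["initiated"]) / len(fights)
    if kills > 300 and initiated_ratio > 0.9:
        return "likely predator"  # picks fights with weaker players
    if initiated_ratio < 0.2:
        return "mostly defends"   # kills happen, but rarely starts them
    return "ambiguous"

# A player who initiated all 320 of his fights against lower-level victims:
bob = [{"victim_lower_level": True, "initiated": True}] * 320
print(aggression_profile(bob))  # likely predator
```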
Now take it the extra step. How likely is Bob to get a date in that system vs. walking up to a stranger in a bar? If he knows his behavior is tracked, maybe he's better behaved. And maybe that lets him initiate a "transaction" more readily than he would in a real-life situation, because the transactee is more willing to take the risk when he/she can see a good rating.
Now, can a virtual world get around the obstacles and come up with a system that gives people decent information? I fully agree that it's a tough nut to crack, but it can be done.
And I must disagree with Ola. The ratings system on eBay is ridiculously successful. The fear of a bad rating and the possibility of dropping below 95 or 99% is the very thing that prevents fraud. Despite what game theorists say ought to happen, people do not cheat. The incidence of fraud on eBay is far, far lower than it is in "real life."
Posted by: Dmitri Williams | Jan 27, 2005 at 12:45
Dmitri> And I must disagree with Ola. The ratings system on eBay is ridiculously successful.
Yes, but maybe not because of the rating metrics working properly.
Dmitri> The incidence of fraud on eBay is far, far lower than it is in "real life."
How do you know? I've experienced lots of fraud of the item-not-exactly-as-described variety. But I still give them a good rating, because if I don't then they will rate me badly. Then I put the facts in the text. So no, the rating doesn't work all that well, but the text does.
It works to a limited extent for high-volume trading, i.e., if you trade large volumes, then repeat customers become rather important.
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 12:56
Or let me put this another way: eBay doesn't work because of "trust", but due to a deep "distrust" which makes buyers do better background checks and which forces volume sellers to make themselves look good.
However, I almost never experience fraud offline, and I never really do any background checks either. When I experience fraud online I can't be bothered to report it; I more or less expect it and write it off as an annoying loss. If I experienced the same offline I would get into a fit and call a consumer agency. Don't believe metrics on these things for a second.
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 13:13
Ola> How do you know? I've experienced lots of fraud of the item-not-exactly-as-described variety. But I still give them a good rating, because if I don't then they will rate me badly. Then I put the facts in the text. So no, the rating doesn't work all that well, but the text does.
Yeah, but the text counts, so you're saying that part of the system does in fact work.
Retaliation can be a genuine problem, no doubt. It's just not a deal-killer in most contexts and text can help. Despite what most people fear, you tend to get virtuous rather than vicious cycles.
For those interested in the literature on rep systems, there is a lot of recent stuff:
http://databases.si.umich.edu/reputations/bib/bib.html
The paper I posted earlier in the thread is a nice short version, though.
Posted by: Dmitri Williams | Jan 27, 2005 at 13:29
Aye, at least for me I guess I would say that text is good for trust, and (low) scores are more a reason to distrust a seller than to trust one. For eBay I think it also matters that the volume buyers are rather experienced. Amazon.com doesn't work that great in my experience; too many noobs...
The most important indicators often cannot be reflected in quantitative reputation systems. E.g., I have the greatest trust in people who do what they do for the love of it (small record companies, audiophile small-scale manufacturers, etc.). That's a very subjective evaluation that a machine can't really help you with.
When one moves beyond simple auctioning systems, one has to ask if the cure is worse than the illness - i.e., whether a reputation system is invasive or gives the users a feeling of being misrepresented. Reputation measures in MMOs should be neutral and presented in a world-consistent manner. E.g., if you kill someone your sword will drip with blood for the next 24 hours, rather than "Player X is a griefer".
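A minimal sketch of that bloody-sword idea (the 24-hour window is from the example above; the class and field names are mine):

```python
import time

BLOOD_WINDOW = 24 * 60 * 60  # seconds the stain lingers after a kill

class Sword:
    def __init__(self):
        self.last_kill_at = None

    def on_kill(self):
        self.last_kill_at = time.time()

    def is_bloody(self, now=None):
        """World-consistent hint: no label, just visible blood for a while."""
        if self.last_kill_at is None:
            return False
        now = time.time() if now is None else now
        return (now - self.last_kill_at) < BLOOD_WINDOW

blade = Sword()
blade.on_kill()
print(blade.is_bloody())                                    # True: drips blood
print(blade.is_bloody(now=time.time() + BLOOD_WINDOW + 1))  # False: stain gone
```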
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 13:58
I wonder if this question of trust is worth looking at from the demand side, rather than the supply side.
The need for security (which determines the need for trusted systems) is one of those dimensions of personality on which people clearly differ. For some, their security (personal, financial, "physical," etc.) factors into every behavioral decision, and (to them) anyone who doesn't think this way is probably some kind of thief. For other people, security is not only something they don't personally think about; they actually regard those who do care about their security as uptight/repressed/bourgeois.
These are two very different sets of expectations about the need for tools to determine trustability.
So with respect to any system, the question is: how many of each of these types is likely to use the system? Will there be a lot of people who are so concerned with security/trust that they won't participate in a system without some kind of trust tools? Or will those likely to use the system be folks who want or are willing to accept a significant amount of uncertainty and loss (in return for other things)?
The answers to these questions are what tell you how your system needs to be designed with respect to trust. If you're expecting (or want) free-for-all, anything-goes types, then features to ensure trust will not only be a waste of your development time, they'll actually be considered insulting. If on the other hand you want (or think you're likely to get) a more security-conscious clientele, then failing to include some tools for tracking and displaying trustability will probably cause a large percentage of your potential user base to say, "no, thanks -- it's too dangerous there."
So before we jump into designing trust tools, maybe it would be helpful to ask: "Do my likely users really want trust tools? Why?"
--Flatfingers
Posted by: Flatfingers | Jan 27, 2005 at 14:31
Good question, Flatfingers, but from a design point of view you want players to lower their expectations somewhat, in order to reduce their anger and whining when they get screwed... To me this isn't a question of preventing people from being screwed, but of "playing horror movie music" when they enter the danger zone. Quite often the danger zone turns out not to be dangerous, but I don't see why that is a problem really. (In a game, that is.)
Conveying player histories in-character, in-game, is a good way to provide hints about what might be in store for you. So, distinguishing between angels/virgins (players who have never killed), regular players, and players who have killed recently might have some use. The accuracy isn't so important as long as the system doesn't judge the players (other players included). In fact, for a game I don't think the system should try to provide objective facts, just some fictionally consistent hints of past actions.
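Something like this three-tier hint, say (the one-week window and the deliberate chance of a misleading reading are invented knobs):

```python
import random
import time

RECENT = 7 * 24 * 3600  # "recently" = within the last week (assumed)

def aura_hint(kill_times, now=None, fuzz=0.1):
    """Return a vague in-fiction hint of a player's kill history."""
    now = time.time() if now is None else now
    if random.random() < fuzz:
        return "an unreadable aura"      # deliberate inaccuracy: the system doesn't judge
    if not kill_times:
        return "an innocent glow"        # angel/virgin: has never killed
    if any(now - t < RECENT for t in kill_times):
        return "a faint smell of blood"  # killed recently
    return "a weathered look"            # has killed, but long ago

print(aura_hint([]))                    # usually "an innocent glow"
print(aura_hint([time.time() - 3600]))  # usually "a faint smell of blood"
```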
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 15:30
Flatfingers said, "Do my likely users really want trust tools? Why?"
Trust tools enable players to establish and maintain relationships, especially transitive relationships. Without things like stable identity for avatars and some ability to assign trust metrics, relationship-building is curtailed.
Relationships are the currency of community. Without relationships communities won't form.
Communities are the soul of MMOGs. Without stable communities people soon tire of the game aspect and leave.
Without people sticking around, subscription income dwindles. People don't invite their friends, spend time in the world, or maintain their accounts.
So: from the commercial POV, trust metrics are critical to maintaining relationships, community, and the all-important revenue stream. I suspect that better trust tools, as you've called them, will lead to both greater customer acquisition and retention than we're seeing now.
Posted by: Mike Sellers | Jan 27, 2005 at 15:54
I think it's very unclear whether trust metrics will help much. Newbies already trust strangers (which is why they also get screwed, but they don't have much to lose anyway; some might quit over it, but then maybe the game wasn't right for them anyway). Oldbies are classified by guild membership, so how much help would some metrics be for them?
Maybe it could mean something for the middle ground, but trust is generally earned through interaction and altruistic behaviour, so what you want to do is lower the risk of interaction, not objectively assess trustworthiness (which, as Matt pointed out, is a gift to those who can exploit such systems).
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 16:22
Meta-irony: not everyone trusts trust metrics.
Well, to me a good trust metric seems pretty useful to a newbie precisely so they *don't* get screwed in the first place. Maybe they'd trust the right people, and maybe it would cut down on how often people screw over newbies in general. Thus the virtuous cycle. People who get frustrated and quit aren't necessarily the ones at fault.
And you can always build in some game dynamic that rewards those who earn higher trust scores, whether that's via a metric or testimonials or whatever. (I don't share the desire to avoid numbers here, but certainly want to avoid bad systems).
People love to tout Ultima's ethical behavior, so why not encode it? Or, borrowing from Ola's comments and KOTOR/Fable, what if those who kept screwing others over or who kept helping others started developing a different appearance? That fits a game's style without showing a number. Maybe you trust the guy with the glowing halo. And maybe you hesitate to make a deal with the guy who has developed a hunchback and drools a lot.
Posted by: Dmitri Williams | Jan 27, 2005 at 17:42
But Raph et al did encode Ultima's ethical behaviour... and after a long struggle they apparently removed it, I think? :-) I have to say that Raph's dialogues with players on the issues were very entertaining.
I am all for the halo, and well, not a hunchback perhaps, but maybe dark eyes with a red glow? Not as a trust metric, merely a reflection of past actions. If players want to infer trustworthiness from it then it is really their own business...
There are better ways to protect newbies than aggregating reputation scores: solid design! Simply make them immune to curses and PvP, and make it easy to get new equipment in case somebody tells them to hand it over for "improvements" and runs away with it... (sigh)
Reputation as a score with in-game effects is begging for abuse. Wouldn't it be a lot more fun to tease a person if you can trick him into ruining his reputation? Wouldn't it be a lot of fun to powerlevel a rich, twinked, angelic character with a heart of gold, start a guild, nurture it for a month, and then suddenly turn evil? It's not that reputation systems can't be done, but they really shouldn't tell the users that they can trust person X. Trust metrics are as troublesome as "distrust metrics", if not more so.
Posted by: Ola Fosheim Grøstad | Jan 27, 2005 at 18:40
fwiw I've done a fair amount of work on the relationship between power asymmetries and trust (although not applied to gameworlds) - see for example http://www.henryfarrell.net/distrust.pdf .
Posted by: Henry Farrell | Jan 30, 2005 at 23:14