
Jan 17, 2011


Listed below are links to weblogs that reference Automated community policing?:

Comments

1.

"gesellschaft", not "gesselschaft"

2.

Well... user forums have been "solving" customer service for more than a decade now. I can count the number of times I've received help on a Microsoft product from Microsoft itself on one finger (guess which one?). Most tutorials I've used on a variety of products have come from the community.

This is more like solving community policing. I think it sounds like a great idea. Not sure how you'd "game" this system, though, if the players on the jury are random. You grief, someone complains, details are provided to the random panel, they adjudicate, you get thumbs-up or down.

Seems like you could also use this system to discourage petty or frivolous griefing reports. If you accuse someone and the panel says, "No... t'weren't griefing...", you get hit with some minor negative stimulus.
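The flow just described (random panel, majority thumbs-up/down, small penalty for a rejected accusation) can be sketched in a few lines. This is a toy model, not any real game's system; the panel size, penalty, and function names are all invented for illustration.

```python
import random

PANEL_SIZE = 5  # assumed jury size, purely illustrative

def draw_panel(player_pool, accused, reporter):
    """Draw random jurors, excluding the parties to the case."""
    eligible = [p for p in player_pool if p not in (accused, reporter)]
    return random.sample(eligible, PANEL_SIZE)

def adjudicate(votes):
    """Majority thumbs-up/down from the panel.

    votes: list of booleans, True = "yes, this was griefing".
    Returns (guilty, penalize_reporter): if the panel says it
    wasn't griefing, the accuser eats a minor negative stimulus,
    discouraging frivolous reports.
    """
    guilty = sum(votes) > len(votes) / 2
    return guilty, (not guilty)
```

A report would then run roughly as `adjudicate(collect_votes(draw_panel(pool, accused, reporter)))`, with `collect_votes` left abstract since how a real game gathers verdicts is the interesting (and unspecified) part.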

I love it. The game of justice!

3.

I like the idea a lot too. I'm just super curious how it'll play out given the details.

The "gaming the system" part would be if adjudicators (is that the right name?) would be trying to guess the majority vote rather than deciding based on the merits of the case. At its worst, it's a test of perceptions of others rather than substance. Then it becomes "how tough are people?" Maybe it'll be fine, but that crowd-based mechanic stuck out to me. In jury duty, we get paid (ha-ha) for coming and doing the right thing, not agreeing with others. When we get lazy and agree to get things over with you get "12 Angry Men."

IANAL or a jury-selection expert, so I'd love to hear one weigh in with how this will play out.

4.

I believe a system on these lines has been used at BoardGameGeek to vet postings for a number of years, with good results. The incentives for the panel members are to align with perceived general opinion. The weak point comes if that perception points somewhere the site's/game's owners did not intend.

5.

"A Tale in the Desert" (ATITD - http://www.atitd.com/) is a niche game that exists since 2001 (I think) and has a kind of player policing.

Every once in a while, there is a vote in three rounds. Players are split into groups of seven that vote on each other after 48 hours of debate. The winner of each group moves to the next round. The second round repeats the process, and the third round is an open global vote on the seven remaining candidates. The winner becomes "Demi-Pharaoh", granting him (among other things) the right to PERMA-BAN seven other players.

Demi-Pharaohs end up being used as game police. If there is a major problem, players complain to them and ask them to intervene. But there is a taboo on using the ban, and as such I believe it was only used three or four times in all these years. The voting ends up picking the people with the best in-game reputation, who give assurance that they will "never" use the ban.
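The three-round election described here reduces to a simple tournament structure. This is a rough sketch under stated assumptions: group sizes, tie-breaking, and how votes are collected are simplified guesses, not ATITD's actual rules, and `cast_votes` is an invented callback.

```python
import random

GROUP_SIZE = 7  # per the description: groups of seven players

def run_round(candidates, cast_votes):
    """Split candidates into random groups; each group's winner advances.

    cast_votes(group) must return a dict mapping candidate -> vote count.
    """
    candidates = list(candidates)
    random.shuffle(candidates)
    winners = []
    for i in range(0, len(candidates), GROUP_SIZE):
        group = candidates[i:i + GROUP_SIZE]
        tally = cast_votes(group)
        winners.append(max(group, key=lambda c: tally.get(c, 0)))
    return winners

def elect_demi_pharaoh(candidates, cast_votes):
    """Two elimination rounds, then an open global vote on the finalists."""
    finalists = run_round(run_round(candidates, cast_votes), cast_votes)
    tally = cast_votes(finalists)
    return max(finalists, key=lambda c: tally.get(c, 0))
```

With 343 candidates the structure works out evenly (49 group winners, then 7 finalists, then one global vote); a real implementation would need to handle ragged group sizes and ties.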

Just to show that there is "automated policing" around ;)
By the way, that game also allows player-written laws to be voted on, which the devs then implement.

Disclosure: I played the game from 2004 to 2007.

6.

I think Richard Bartle raised this point during one of his recent online lectures. Wasn't EverQuest sued for using this exact strategy, because it constituted an indirect source of income for the players? I can see how small niche communities like ATITD and Achaea, which even outsourced content creation to volunteers, can get away with this, but LoL, being a huge player, is bound to fall under the watchful eye of some lawyer.

7.

One of the problems with the idea is that if the community has low standards for acceptable behavior, the judges will continue to uphold those low standards. For example, if the community believes it's okay to call people 'bitches' and 'retards', then the player-judges probably won't issue reprimands in those cases, because those terms are acceptable in the community. The problem with the majority-verdict notion is that if the majority's standards are poor, it doesn't make the community any less toxic, and it would be difficult for a game to raise its atmosphere above that baseline. There are probably minimum standards that community judges have to adhere to (i.e. players shouldn't call each other 'n*****'), but there are nuances in human interaction that judges may let slide, because a majority of them simply don't think that, for example, casual sexism ("You play like a girl!") is all that bad of a thing.
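The concern above can be made concrete with a toy model: if hard minimum standards are only a pre-filter and everything else goes to a majority vote, the effective bar settles wherever community norms sit. The word list and function below are invented placeholders, not anyone's actual moderation rules.

```python
# Placeholder for the non-negotiable terms a game might enforce
# regardless of what the player-judges vote.
HARD_BANNED = {"slur_a", "slur_b"}

def verdict(message_words, votes):
    """Return 'punish' or 'allow'.

    votes: list of booleans from the player-judges, True = punish.
    Hard-banned terms are punished no matter the vote; everything
    else falls to the majority, i.e. to community norms.
    """
    if HARD_BANNED & set(message_words):
        return "punish"  # minimum standard, vote ignored
    majority_punish = sum(votes) > len(votes) / 2
    return "punish" if majority_punish else "allow"
```

In this model, a message like "you play like a girl" is allowed whenever most jurors shrug at it, which is exactly the commenter's point: the system reproduces the community's baseline rather than raising it.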

8.

Here's a community-based approach detailed in my book, Building Web Reputation Systems from O'Reilly: http://oreilly.com/catalog/9780596159801

From Chapter 10 - Case Study: Yahoo! Answers Community Content Moderation: [Wiki version here -> http://buildingreputation.com/doku.php?id=chapter_10 ]

"In summer 2007, Yahoo! tried to address some moderation challenges with one of its flagship community products: Yahoo! Answers (answers.yahoo.com). The service had fallen victim to its own success and drawn the attention of trolls and spammers in a big way. The Yahoo! Answers team was struggling to keep up with harmful, abusive content that flooded the service, most of which originated with a small number of bad actors on the site.

Ultimately, the answer to these woes was provided by a clever (but simple) system that was rich in reputation: it was designed to identify bad actors, indemnify honest contributors, and take the overwhelming load off of the customer care team. Here's how that system came about."

9.

I think that this will be a great change, because so many people on LoN are just plain jerks.

Hopefully people are responsible with their "griefing decisions", or else there will be a new way to grief people (by getting them flagged for griefing!)

The comments to this entry are closed.