
Subgroups: Moderating an Online Community


How to moderate or manage a forum or online community using moderators and admins.

We divide our social networks by how we are like, and how we are not like, one another. Most communities are politically correct and only have "likeness" features: usually a friends list, or feeds that say "if you liked this, you'll love this" à la Amazon. Real communities also identify enemy lists (think of foes in online games), and occasionally you'll find an "if you liked that, you'll hate this". I believe it's critical to have a "slash ignore" function, as it cuts down on moderation. While you will always have to boot the occasional troll from the community, a lot of bickering can be settled by telling both parties to add their opponent to the /ignore list.
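A minimal sketch of how a per-user /ignore list works, assuming messages arrive as (author, text) pairs; the function and variable names here are illustrative, not any particular forum's API:

```python
# Per-viewer ignore list: filter a message feed so that ignored
# authors simply never appear. (Illustrative names throughout.)

def filter_feed(messages, ignore_list):
    """Drop messages whose author the viewer has chosen to ignore."""
    return [(author, text) for author, text in messages
            if author not in ignore_list]

feed = [("gandalf", "hello"), ("troll99", "flame bait"), ("frodo", "hi")]
visible = filter_feed(feed, ignore_list={"troll99"})
print(visible)  # the troll's message never reaches this viewer
```

The point of the design is exactly what the paragraph above says: the feud becomes invisible to both parties without a moderator having to adjudicate it.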

One of the fundamental features of an online community is subgroups. In gaming forums it might be guilds; on Facebook it's groups; other communities call subgroups tribes or factions. People swarm into smaller groups – if you don't allow for it, and instead try to manage that swarming down, the community forks anyway, usually ferociously, and hives off into another community. In software development it's called forking, and both groups take the source code and create competing products. This happened a while ago with Mambo/Joomla!, but it's not uncommon.

When managing an online community it’s imperative that you allow the moderators (online customer service) to feel that they have their own sub-community.
I set up private areas for them. In a forum or bulletin board the areas would look like this:

Moderator General Discussion area:
This is where they support each other:

  • Technical:  How do I IP-range ban a member?
  • Procedural: Is cross-posting in 3 forums spamming?
  • Emotional: An ex-member is harassing me and threatening me, and it’s upsetting me
  • Social: Anyone seen this funny video?
  • Internal: How do I apply for the Senior Moderator/Admin role?
  • Feedback: I think we need to add tagging to posts
  • and so on
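The "how do I IP-range ban a member?" question above has a concrete answer in most stacks. A hedged sketch using Python's standard `ipaddress` module; the banned range here is a documentation-only example network, not a real ban list:

```python
# IP-range (CIDR) ban check using the stdlib ipaddress module.
import ipaddress

# Example range only (203.0.113.0/24 is reserved for documentation).
BANNED_RANGES = [ipaddress.ip_network("203.0.113.0/24")]

def is_banned(ip: str) -> bool:
    """True if the address falls inside any banned CIDR range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BANNED_RANGES)

print(is_banned("203.0.113.42"))  # True  - inside the banned /24
print(is_banned("198.51.100.7"))  # False - outside it
```

A range ban like this catches a troll who re-registers from the same ISP block, at the cost of occasionally catching innocent neighbours on that block – which is exactly the kind of judgment call moderators should thrash out in their private area.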

Moderators' Rap Sheet:
I use the rap sheet to track naughty and allegedly naughty members. It is a thread or section that tracks:

  • the member's logon name,
  • any aliases (members create Jekyll and Hyde profiles so they can vent occasionally),
  • IP addresses (all of them, from year dot),
  • email addresses (all of them, from the verified signup email through the noone@nowhere.com changes),
  • dates and times of rule infractions, and any other tips on that member, such as "watching this one, seems to be spoiling for a fight but has not yet broken the rules (enough) to warrant a warning/banning",
  • warnings and bannings, and the emails/personal messages around them. It all gets kept as a communication audit of that member – absolutely all of it, from the email that says "you are hereby formally warned for xyz", through their response "get lost, I hate you", to the "I'm so sorry, please let me back in",
  • links to threads/posts placed in the evidence locker (see later)

Even if the member only gets drunk once a year and lets loose a tirade, at least this discussion post lets incoming moderators know the full history (and suspicions) around that member. Don’t let people create duplicate threads – there should be one rap sheet per member. Often a troll will create multiple accounts and it’s difficult to sort out – in that case:

  • on the smaller secondary rap sheet, for the troll account, link to the main Rap Sheet, perhaps saying “This file on Gandalf1980 is now closed and moved to his main character Rap Sheet at DarthVaderRulez” or whatever
  • then close the unwanted one
  • If you can’t be bothered copying and pasting all the links and discussions from the troll account to the main Rap Sheet, simply post a link to the closed (but not deleted) thread “for more information on his alternate character, check here”.

Communication trails are a beautiful thing. The member can't say "oh, but the other moderator said it was OK" when you are staring at a copy of the other mod's email and can quote it back verbatim. Keep EVERYTHING open in the moderator subgroup. Do NOT let mods get into public or private flame wars with members via personal messaging in the forums. If a mod knows they have to copy it to the thread, they will triple-check how they say things.

New moderators are often surprised that they may have unvoiced suspicions about someone, and then an experienced mod will post "be careful of member XYZ, I think they are trolling for personal information". We sense more about people's online behaviour than we realise. And no, it doesn't become a forum of negativity – just a tool for feeling safer and keeping the community safe online.

Evidence Locker
This is where you remove unwanted content from the main forums to a private, locked area. If some charming member has posted the same porn content 250 times, delete 249, move one copy to this locked forum and note the number removed. Also link to the rap sheet – the evidence locker is for content, the rap sheet for member profile information.
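The "delete 249, keep one" cleanup can be sketched as a dedupe pass. A minimal sketch, assuming posts are dicts carrying an id and a content hash; the field names are illustrative:

```python
# Group spam posts by identical content, keep one copy per group as
# evidence for the locker, and mark the rest for deletion.
from collections import defaultdict

def quarantine_duplicates(posts):
    """Return (posts to delete, one copy per duplicate group for the locker)."""
    groups = defaultdict(list)
    for post in posts:
        groups[post["content_hash"]].append(post)
    to_delete, to_locker = [], []
    for copies in groups.values():
        if len(copies) > 1:
            to_locker.append(copies[0])   # one copy kept as evidence
            to_delete.extend(copies[1:])  # the other 249 go
    return to_delete, to_locker
```

Keeping exactly one copy preserves the evidence (and the link target for the rap sheet) without leaving the spam in circulation.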

Moderator Wiki
Guidelines and rules evolve. Some rules are hard and fast: thou shalt not post pornography. Others are etiquette statements – criticise the argument, not the person – which aren't exactly rules.

Subgroups' values vary from each other as much as members' do. So do moderators'. One moderator might hate the use of fcuk or feck or fark to get around cussing filters; others couldn't care less. This can confuse the community, who will think one mod is "picking on them" or is a "forum nazi" while another is cool. Keep the playing field level! By making – sorry, enabling 😛 – moderators discuss these issues and their actions in private forums and wikis, away from the community, they pull together as a team. And no, it can't be discussed openly in the interests of "transparency". After all, it's not the 99% of good members that cause the work, but the 1% of predators, trolls, organised crime and so on.

The moderator subgroup should reflect the tools and structure and incentives you have set up in the wider community. Do you have a Leadership Spotlight program to encourage evangelists? Great, then have a Top Mod program too. Running rituals and events in the main community? Cool, then encourage the moderators to do the same for themselves.

Moderating is customer service, but it’s not the “answer the call in 20 seconds and resolve the issue in 2 minutes” type of customer service. Look after your moderators!

Hope this helps those who are setting up forums or communities and is at least interesting to those who aren’t.

Laurel Papworth

Named by Forbes™ Magazine in the Top 50 Social Media Influencers globally, named Head of Industry, Social Media (Marketing Magazine™) and in the Power150 Media bloggers (AdAge™). CERT IV Training and Assessment certified trainer (Diplomas and Certificates etc), Adult Education. Laurel has managed Facebook Pages for Junior Masterchef, Idol, Big Brother etc., and has consulted on private online communities for banks (Westpac), not-for-profits (UNHCR) and governments in SE Asia. Lecturer in social media at the University of Sydney for 10 years, Laurel has 11,000 online students, personally connects to 6 million followers online, and has taught around 100,000 people in the last 10 years how to be social media managers.

16 thoughts on “Subgroups: Moderating an Online Community”

  1. hey, this is a great introduction to online moderation. over at the http://www.indyish.com/stay-at-the-pension-house-when-you-visit-montreal#comments, we’re talking a lot about trolls and constructive negotiation of opinions (check the comment section for one recent conversation). i sometimes feel at a loss for ways to deal with members who are invested in our webspace, yet choose to comment in an aggressive and often offensive tone. it’s nice to know there’s a body of work developing around the subject. thanks for your great work!

  2. More moderation tools have been added into the brandstation social network platform. Site owners get to see who’s been reported, what was said about them, by who, on what date etc. Then site owners or admin users can inactivate users.
    Brandstation is a social networking platform enabling companies to enjoy the benefits of their own social network for internal and external communications. It’s been created by viewmy.tv specialists in social media. www.brandstation.tv

  3. Great post, I don’t see nearly enough written about the importance of moderation in building communities.

    I think an important add here is that the interaction the moderators have with the community help set the tone, with community members modelling the behaviour that is already occurring.

  4. @Mauritius12 You're welcome 😉 That guy there has the 4th one, and what there is of the 5th is uploaded, but only with English subtitles 😉 http://bit.ly/6BmT

  5. RT @SilkCharm: What support are you giving yr social media admins? http://bit.ly/7t0q4 The Moderators' Community – adapt for Facebook & …

  6. a truly interesting way to look at some thing that most people find difficult to comprehend. Before this post, I never quite envisaged that this is how things go.

