Rikki

Proactive and reactive moderation - which is right for your online community?

One of the bigger decisions a community manager has to make as a community grows is whether to employ proactive or reactive moderation (or a combination of both). This isn’t always a conscious decision; sometimes forum moderation features are toggled on or off without much explicit thought about the style of moderation they produce, or its pros and cons. It’s worth taking a moment to consider the reasoning behind each approach and to make a deliberate, justified choice between them.

Firstly, let’s discuss what we mean by proactive and reactive moderation.

 

Proactive Moderation
With a proactive approach to moderation, the goal is to prevent bad content from ever appearing in public. The primary way that this is achieved is by having moderation staff review all content posted, and manually approving it after deciding whether it is acceptable.

Another feature that could be classed as proactive moderation is administrator screening of new registrations. When a new user registers in the community, their account can be placed in a ‘validating’ state, requiring an administrator to review the information submitted and decide whether to approve the account.

As you might expect, proactive moderation is the safest way to ensure bad content doesn’t make it to public view. However, the significant drawback is that users won’t see their content immediately, which can be frustrating and severely stifle productive discussion. At worst, it can push users away from your community altogether. Heavy-handed moderation is often viewed negatively by members who are trying to participate, and can ultimately backfire. 

With a proactive moderation approach, it’s important to communicate one-to-one with members who post content in good faith but which doesn’t meet your criteria. This can reduce resentment over wasted effort, and gives them the opportunity to adjust their approach for future content.

 

Reactive Moderation
In contrast, a reactive approach to moderation allows users to post freely, without explicit pre-screening of content, with moderators reacting to issues as and when they arise. Reactive moderation is, generally speaking, a more pleasant experience for users because it allows them to engage fully with the community. However, there is of course the risk that unsuitable content is seen in public, at least temporarily.

Choosing a reactive approach doesn’t have to mean a free-for-all. There are many features you can use to make identifying and dealing with bad content a quick and painless process, while still allowing users to contribute freely to the community:

  • Report center
    Allows users to identify bad content and submit notifications to moderation staff for prompt action.
  • Badword filter, URL filtering and keyword triggers
    Prevent common swear words and other unwelcome terms from being used by censoring them or replacing them with ***. You can also blacklist undesirable URLs from being used within posts, and automatically watch or moderate posts that contain keywords you specify (a simple sketch of the general idea follows this list).
  • Warning system
    Where a user has proven to be problematic, the warning system in Invision Community allows you to track infractions and apply punishments to the account. These can range from a simple warning message, to suspension, to a complete ban. Users can be required to acknowledge the warning before being able to see the community again.
  • Moderation queue
    Individual users can be placed into the moderation queue, requiring all content they post to be screened by a moderator before being visible - a good compromise that means you don’t need to screen all content, just that from troublemakers.
  • Spam service
    The IPS Spam Defense Service is a free service that automatically reviews new registrations to your community to determine whether they match any known spammers, using data crowdsourced from other Invision Community sites. The service can virtually eliminate known spammers from your community, preventing them from ever causing a problem.
  • One-click spam cleanup
    If a spammer does make it into your community, removing their posts and banning them is a one-click action for moderators.
  • Saved actions
    Saved actions make it quick to apply multiple moderation actions in one go. For example, if members often post support topics in a non-support forum, a saved action would allow moderators to move the topic and reply to let the member know what happened - all with a single click.
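
The word filter and keyword triggers mentioned above are configured through the AdminCP rather than written as code, but the underlying idea is straightforward. The sketch below is a minimal, generic Python illustration of that idea only - it is not Invision Community's actual implementation, and the word lists, URLs and function name are invented purely for this example:

```python
import re

# Invented example lists; in Invision Community these are configured in the AdminCP.
CENSORED_WORDS = {"badword", "anotherbadword"}
WATCH_KEYWORDS = {"crypto giveaway", "free followers"}
BLOCKED_URLS = {"spam-site.example"}

def filter_post(text: str) -> tuple[str, bool]:
    """Return the text with censored words masked, plus a flag that is True
    when the post should be held for moderator review."""
    # Replace each censored word with asterisks of the same length.
    for word in CENSORED_WORDS:
        pattern = re.compile(re.escape(word), re.IGNORECASE)
        text = pattern.sub(lambda m: "*" * len(m.group()), text)

    # Hold the post if it mentions a watched keyword or a blacklisted URL.
    lowered = text.lower()
    needs_review = (any(k in lowered for k in WATCH_KEYWORDS)
                    or any(u in lowered for u in BLOCKED_URLS))
    return text, needs_review

cleaned, hold = filter_post("Check out this crypto giveaway, you badword!")
print(cleaned)  # Check out this crypto giveaway, you *******!
print(hold)     # True - contains a watched keyword
```

The real features are richer than this, of course, but the principle is the same: masking happens automatically, and anything matching a watch list is routed to moderators rather than blocked outright.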

 

Which is the right approach for your community?


Every community is different, so there’s no one answer here - that’s why Invision Community includes features that enable both approaches, to allow you to determine which to use.

In general, we suggest treating reactive moderation as the default stance, and increasing the level of oversight as circumstances require. There are exceptions, of course. For example, in a situation where a user posting personally-identifying information in a public forum could have profound implications for personal safety, a proactive moderation approach might be more desirable. Similarly, if it’s essential that users receive correct information that has been vetted by your staff, you may want to review content before it appears (though in this case, other techniques might be considered, such as staff labelling content once they have ‘approved’ it).

Your choice need not be entirely one or the other, either. While Invision Community has moderation settings that apply to the entire community, it’s also possible to apply different settings on a per-forum or per-member group basis.

Communities often make use of per-group moderation as a way of screening new members. This is achieved by putting new members into a ‘limited’ group that requires content to be reviewed by a moderator. Then, using Invision Community’s group promotion tools, the member is automatically moved to a regular member group once they have a specified number of approved posts (usually a low number; one to five works well). This approach reduces the danger of a rogue member signing up and creating a problem, without requiring the resources to screen every new post to the community.
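
Invision Community's group promotion settings handle this automatically, but the rule itself is easy to state. The following Python sketch only illustrates the logic; the group names, threshold and function are invented for this example and are not the product's actual code:

```python
APPROVED_POST_THRESHOLD = 3  # invented example value; one to five works well, as noted above

def group_for_member(approved_posts: int, is_new_member: bool) -> str:
    """Decide which group a member belongs in under the 'limited new member' scheme."""
    if is_new_member and approved_posts < APPROVED_POST_THRESHOLD:
        # Content from this group sits in the moderation queue until a moderator approves it.
        return "limited"
    # Once enough posts have been approved, the member is promoted to the regular
    # group, whose content appears immediately.
    return "members"

print(group_for_member(approved_posts=1, is_new_member=True))  # limited
print(group_for_member(approved_posts=3, is_new_member=True))  # members
```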

Finally, whichever approach to moderation your team ultimately finds works best, we recommend creating a clear, detailed set of community guidelines that outlines the boundaries of the community, and what you consider acceptable and unacceptable from members. Most users don’t set out to create problems for you, and referring to your guidelines can often put a lid on any trouble before it starts.

We hope this overview proves helpful to both new and established communities. If you have any approaches to moderation that you think others might be able to learn from, please go ahead and share them in the comments below!
 
