
Proactive and reactive moderation - which is right for your online community?

One of the bigger decisions a community manager has to make as a community grows is whether to employ proactive or reactive moderation (or a combination of both). This isn’t always a conscious decision; sometimes forum moderation features are toggled without giving much explicit thought to the style of moderation desired and the pros and cons of doing so. It’s worth taking a moment to consider the reasons behind each type, and come to a justification for one or the other.

Firstly, let’s discuss what we mean by proactive and reactive moderation.


Proactive Moderation
With a proactive approach to moderation, the goal is to prevent bad content from ever appearing in public. The primary way that this is achieved is by having moderation staff review all content posted, and manually approving it after deciding whether it is acceptable.

Another feature that could be classed as proactive moderation is administrator screening of new registrations. When a new user registers in the community, their account can be placed in a ‘validating’ state, requiring an administrator to review the information submitted and decide whether to approve the account.
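To illustrate the mechanics of pre-screening, here is a minimal sketch of a proactive moderation queue: content is held out of public view until a moderator explicitly approves or rejects it. This is a hypothetical model for illustration only, not Invision Community's actual implementation; the `Post` and `ModerationQueue` names are invented.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    body: str
    approved: bool = False  # proactive moderation: hidden until approved

class ModerationQueue:
    """Holds submitted content until a moderator acts on it."""

    def __init__(self) -> None:
        self.pending: list[Post] = []

    def submit(self, post: Post) -> None:
        # Nothing becomes public at submission time.
        self.pending.append(post)

    def approve(self, post: Post) -> None:
        post.approved = True
        self.pending.remove(post)

    def reject(self, post: Post) -> None:
        # Rejected content simply never becomes visible.
        self.pending.remove(post)

queue = ModerationQueue()
post = Post("alice", "Hello, community!")
queue.submit(post)
assert not post.approved      # invisible until a moderator reviews it
queue.approve(post)
assert post.approved and not queue.pending
```

The key property is that visibility defaults to *off*: the cost, as discussed below, is the delay between submission and approval.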

As you might expect, proactive moderation is the safest way to ensure bad content doesn’t make it to public view. However, the significant drawback is that users won’t see their content immediately, which can be frustrating and severely stifle productive discussion. At worst, it can push users away from your community altogether. Heavy-handed moderation is often viewed negatively by members who are trying to participate, and can ultimately backfire. 

With a proactive moderation approach, it’s important that you communicate with members one-to-one if they post content with good intentions but which doesn’t meet your criteria. This can reduce resentment over wasted effort, and gives them the opportunity to adjust their approach for future content.


Reactive Moderation
In contrast, a reactive approach to moderation allows users to post freely, without explicit pre-screening of content, with moderators reacting to issues as and when they arise. Reactive moderation is, generally speaking, a more pleasant experience for users because it allows them to engage fully with the community. However, there is of course the risk that unsuitable content is seen in public, at least temporarily.

Choosing a reactive approach doesn’t have to mean a free-for-all. There are many features you can use to make identifying and dealing with bad content a quick and painless process, while still allowing users to contribute freely to the community:

  • Report center
    Allows users to identify bad content and submit notifications to moderation staff for prompt action.
  • Badword filter, URL filtering and keyword triggers
    Prevent common swear words and other divisive terms from being used by censoring them or replacing them with ***. You can also blacklist undesirable URLs from being used within posts. Plus, automatically watch and moderate posts that contain terms you specify.
  • Warning system
Where a user has proven to be problematic, the warning system in Invision Community allows you to track infractions and apply punishments to the account. These can range from a simple warning message, to suspension, to a complete ban. Users can be required to acknowledge the warning before being able to see the community again.
  • Moderation queue
    Individual users can be placed into the moderation queue, requiring all content they post to be screened by a moderator before being visible - a good compromise that means you don’t need to screen all content, just that from troublemakers.
  • Spam service
    The IPS Spam Defense Service is a free service that automatically reviews new registrations to your community to determine whether they match any known spammers, using data crowdsourced from other Invision Community sites. The service can virtually eliminate known spammers from your community, preventing them from ever causing a problem.
  • One-click spam cleanup
    If a spammer does make it into your community, removing their posts and banning them is a one-click action for moderators.
  • Saved actions
    Saved actions make it quick to apply multiple moderation actions in one go. For example, if members often post support topics in a non-support forum, a saved action would allow moderators to move the topic and reply to let the member know what happened - all with a single click.
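As a rough illustration of how a badword filter and URL blacklist like those above might operate, here is a short sketch. The word list and blocked domain are made up, and this is not the Invision Community implementation — just a model of the censor-and-blacklist idea.

```python
import re

BADWORDS = {"darn", "heck"}         # assumed word list for illustration
BLOCKED_DOMAINS = {"spam.example"}  # assumed URL blacklist

def censor(text: str) -> str:
    """Replace each listed badword (whole words only) with ***."""
    for word in BADWORDS:
        text = re.sub(rf"\b{re.escape(word)}\b", "***", text,
                      flags=re.IGNORECASE)
    return text

def contains_blocked_url(text: str) -> bool:
    """True if the text mentions any blacklisted domain."""
    return any(domain in text for domain in BLOCKED_DOMAINS)

assert censor("What the heck!") == "What the ***!"
assert contains_blocked_url("visit http://spam.example/offer")
assert not contains_blocked_url("visit http://example.com/page")
```

A keyword *trigger*, by contrast, would not rewrite the text at all; it would simply flag the post for moderator review when a watched term matches.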


Which is the right approach for your community?


Every community is different, so there’s no one answer here - that’s why Invision Community includes features that enable both approaches, to allow you to determine which to use.

In general, we suggest thinking of reactive moderation as the default stance, and increasing the amount of oversight you apply depending on the circumstances. There are exceptions of course. For example, in a situation where a user posting personally-identifying information in a public forum could have profound implications for personal safety, a proactive moderation approach might be more desirable. Similarly, if it’s essential that users receive correct information that has been vetted by your staff, you may want to review content before it appears (though in this case, other techniques might be considered, such as staff labelling content once it is ‘approved’ by them).

Your choice need not be entirely one or the other, either. While Invision Community has moderation settings that apply to the entire community, it’s also possible to apply different settings on a per-forum or per-member group basis.

Communities often make use of per-group moderation as a way of screening new members. This is achieved by putting new members into a ‘limited’ group that requires content to be reviewed by a moderator. Then, using Invision Community’s group promotion tools, the member is automatically moved to a regular member group once they have a specified number of approved posts (usually a low number; one to five works well). This approach reduces the danger of a rogue member signing up and creating a problem, without requiring the resources to screen every new post to the community.
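The promotion flow described above can be sketched roughly as follows. This is a hypothetical model, not Invision Community's group promotion code; the threshold value is an assumption within the one-to-five range the article suggests.

```python
PROMOTION_THRESHOLD = 3  # assumed; the article suggests one to five

class Member:
    """New members start in a screened 'limited' group and are
    promoted automatically once enough posts have been approved."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.group = "limited"   # content requires moderator approval
        self.approved_posts = 0

    def record_approved_post(self) -> None:
        self.approved_posts += 1
        if (self.group == "limited"
                and self.approved_posts >= PROMOTION_THRESHOLD):
            self.group = "members"   # no further pre-screening needed

    def needs_screening(self) -> bool:
        return self.group == "limited"

m = Member("newbie")
for _ in range(PROMOTION_THRESHOLD):
    assert m.needs_screening()
    m.record_approved_post()
assert m.group == "members" and not m.needs_screening()
```

The appeal of this design is that moderator effort is spent only where the risk is: the first handful of posts from each unknown account.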

Finally, whichever approach to moderation your team ultimately finds works best, we recommend creating a clear, detailed set of community guidelines that outlines the boundaries of the community, and what you consider acceptable and unacceptable from members. Most users don’t set out to create problems for you, and referring to your guidelines can often put the lid on any trouble before it starts.

We hope this overview proves helpful to both new and established communities. If you have any approaches to moderation that you think others might be able to learn from, please go ahead and share them in the comments below!

Edited by Rikki



Recommended Comments

In any community it is of the utmost importance to monitor posts not just for bad language, but to judge the level of knowledge of the member and assess whether the member is worthy of contributing to the forum.

That said, it's never a good idea to ban a member altogether. If not at the beginning, then at some point in time he may grow and mature into an effective member of the forum.

IPS does most of this pretty well: content moderation, members can't see whether another member is banned unless you look at his profile, and only a preset number of infractions can ban a member. What it needs to consider is, as forums are all about opinions, should a member be able to read my post if I don't want him to? Is the member really interested, or just trolling and passing cheap comments? Is there a way to ban that computer's IP address from reading my posts? While this can be done at the moderator level, should IPS allow it at the member level?

As we grow with time and age we will make an impact on a community. Moderation is no easy task and needs a very high level of judgement. What would be the right way of going about this? Any community needs full-time moderators; let's say six moderators working in shifts. The first few posts should be approved, or disapproved with a warning, within a few hours. As mentioned above, the ability to assess whether a member is really asking a question or just trolling is a key factor in moderation.


Edited by Farook

On 21/10/2017 at 4:20 AM, Mr 13 said:

Could you post only News & Updates in the News & Updates section? Please stop bothering people (with useless notifications) who subscribed to this section in order to follow news and updates, not QOTW or obvious things about moderation, etc.

Create a separate section for that kind of thing.

I for one am getting fed up with seeing these off-topic notifications in my admin CP (News and Updates); come on now, start taking notice.

If I were to start posting off-topic comments and discussions in the wrong forum sections, I would soon get pulled up about it :thumbsup:

Edited by bearback

We use reactive moderation plus a special moderator tool. This tool (created as an application) adds a special 'checked' button to every post. It is visible only to moderators (for their own sections). Moderators must read all content and mark each item as 'checked'. The tool records what has already been checked, so other moderators don't need to re-read content that has already been reviewed. They also have a special tab in the moderation panel, named 'Unchecked content', which shows which topics have the most unchecked content (they only see topics from their own sections). This has been extremely helpful for us in checking content thoroughly. It's a must-have tool for us, because if we miss something bad (for example, an alcohol advert or terrorist text) we get big problems from our police and other government structures, so we are very careful about that. And we can't use the proactive method because we sometimes have high-speed discussions (~5 posts per minute as a daily average, 60+ posts per minute at hot times - all the graphs for that are available in Grafana).

I can show a closed demo of that system, if you think it would be interesting to implement as part of the IPS reactive methods.


1 hour ago, Upgradeovec said:

It is visible only to moderators (for their own sections). Moderators must read all content and mark each item as 'checked'. The tool records what has already been checked, so other moderators don't need to re-read content that has already been reviewed.

One of the forums I'm tech on moderates in the same way, in that moderators have sections that are theirs and that they should keep up with.

They don't have such a tool though. Would it be possible for me to see a demo? Considered sharing in the Marketplace?

Edited by TSP

5 hours ago, TSP said:

One of the forums I'm tech on moderates in the same way, in that moderators have sections that are theirs and that they should keep up with.

They don't have such a tool though. Would it be possible for me to see a demo? Considered sharing in the Marketplace?

Sorry, I couldn't share it. I've sent you a PM with some screenshots as a demo.


A comment: I am happy to receive any info on the better running of my forums; if I do not want to read it, I delete it. I would rather have too much than too little.

I hand-approve all new members, usually within 24 hours, and we have (so far) been spam-free. We have moderators, but take a light-touch approach: we do no pre-screening of content, but because we are still fairly small (2,000+ members), scanning the new content for the day is all we have needed, so far. Because we have a thoughtful "guide to how to use the forums" which stresses the need for posts to be like a conversation with a friend, we have not needed to stamp out any flame wars, and have only needed to remove one post so far.

