
Featured Replies

This is great. I've been doing it through Cloudflare for years and I'm happy to see IC do it too.

This is something that’s been very top of mind for me and it will be a huge relief knowing these measures are in place when I switch to the cloud.

Thank you for making this a priority!

I’d like to think it would never detect anything on my site, but so glad it’s there as protection if we do have some wrong’uns about. Thanks!

Only available for Cloud customers, I guess?

 

Note: this service is not available for Invision Community Classic

 

Only available for Cloud customers, I guess?

Correct...


Only available for Cloud customers, I guess?

That's correct. We do not host the media for Invision Community Classic customers.

 

This is great. I've been doing it through Cloudflare for years and I'm happy to see IC do it too.

How are you doing this through CF? Please share your tips. Every site owner should do this.

 

It provides no legal protection, and it ensures you are reported if there's an incident.
Not great!

Who the heck is going to provide legal protection? If it’s reported, then great! That’s a step toward finding the perpetrators.

 

Who the heck is going to provide legal protection? If it’s reported, then great! That’s a step toward finding the perpetrators.

I hear you. And look, I am certainly for anything that helps stop and prevent the spread of bad content. Protection for the victims is something we should ALL try our best to accommodate. But unfortunately, the CF solution creates risk for you and your site without doing much to solve the problem.

  • It Reports Your Site, Not the Uploader. The tool flags your platform instead of the actual user who uploaded the content. This can make your site look like the offender, even if you’re the one taking positive steps toward removing bad content.

  • Creates a Permanent Legal Paper Trail. Every flagged upload is reported to NCMEC under your site's name, potentially marking your platform as a repeat source of CSAM, even if the content was blocked immediately. CF can report false positives.

  • Can Increase Law Enforcement Scrutiny. Multiple reports can trigger investigations into your platform, forcing you to justify why CSAM is appearing instead of focusing on prevention.

  • Loss of Control Over Reporting. The tool auto-reports everything based on hash matches, meaning false positives, context, and your own moderation actions don't matter; the report is still sent (see the sketch after this list). As a site owner, you want control so you can report valid concerns only, and you want to report the bad actors, not your own site as the source.

  • Potential for Service Termination. If Cloudflare receives too many reports related to your site, they can terminate your account, leaving your platform vulnerable to downtime.

There are better solutions that don't increase intrinsic risk for you and your site.
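
To make that hash-matching point concrete, here is a minimal sketch of how this kind of scanner works. This is not Cloudflare's actual implementation: real systems such as PhotoDNA or Cloudflare's fuzzy hashing use proprietary perceptual hashes so that resized or re-encoded copies still match. The blocklist contents and file name below are made up for illustration.

```python
import hashlib
from pathlib import Path

# Made-up placeholder blocklist. Real scanners match against hash lists
# maintained by organizations like NCMEC, and use perceptual hashes
# rather than SHA-256.
KNOWN_BAD_HASHES = {"0" * 64}

def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large uploads aren't read into memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_upload(path: Path) -> bool:
    """Return True when an upload matches the blocklist.

    In Cloudflare's tool, a match triggers an automatic report filed
    under the site's name, regardless of context or moderator action.
    """
    return sha256_of(path) in KNOWN_BAD_HASHES
```

An exact hash like SHA-256 only matches byte-identical files; perceptual hashes deliberately match "similar" images, which is what makes the scanning robust, but it is also why a false positive is possible and, with auto-reporting, gets filed against your site anyway.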


 

I hear you. And look, I am certainly for anything that helps stop and prevent the spread of bad content. Protection for the victims is something we should ALL try our best to accommodate. But unfortunately, the CF solution creates risk for you and your site without doing much to solve the problem.


Show us a better solution.

 

Show us a better solution.

image.png

Well, there is this

 

image.png

Well, there is this

It’s the same thing. Cloudflare.

 

It’s the same thing. Cloudflare.

I did not know this.

 

That's correct. We do not host the media for Invision Community Classic customers.

Any way we can do this on Classic? Even if we have to sign up for something with an API?

Is this part of all cloud communities, or is it part of the AI image moderation listed for the Business level and up?

 

Any way we can do this on Classic? Even if we have to sign up for something with an API?

No, it's not possible. IPS does not host your media, so there is nothing for them to scan. As noted elsewhere in this topic, you can look at leveraging Cloudflare to do something similar on your own if you wish.

 

Is this part of all cloud communities, or is it part of the AI image moderation listed for the Business level and up?

It's not part of the AI scanning. It's done for all cloud-hosted accounts automatically.
