Posted February 18

We're pleased to announce that CSAM scanning is now available at no extra cost.

What is CSAM scanning?

CSAM stands for Child Sexual Abuse Material. The media served by your community will be scanned to identify any matches in a central CSAM database. These lists are provided by leading child safety advocacy groups, such as the National Center for Missing and Exploited Children (NCMEC).

CSAM scanning is now recommended to increase compliance in many areas, and you can rest assured that Invision Community will now block any harmful media from being shown in your community. Moderating large communities can be challenging when your members are able to upload photos, and this scanning tool offers great peace of mind.

What about privacy?

Rest assured that privacy is paramount to Invision Community. CSAM scanning is not an AI tool, nor does it mean media on your site is being shared with third parties for review. Each photo or video is assigned a digital fingerprint, which is checked against a known database. If there is a match, the photo is blocked from being viewed.

What happens if there is a match?

If we detect a match for CSAM, the media is blocked from being shown. We will then remove the offending media and, where possible, block the person who uploaded the material.

This tool is designed to help you, as an Invision Community owner, moderate your community and protect you from unwittingly sharing CSAM. We hope you never have a member upload such material, but it's reassuring to know that your site will not be complicit in its sharing. By identifying and blocking bad actors' access across our network, we can pre-emptively protect you, too.

Can I opt out?

No, there is no way to opt out. This scanning service offers great peace of mind, protects you from harmful material being shared, helps compliance and prevents bad actors from accessing other communities.

Note: this service is not available for Invision Community Classic.
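For readers curious how this kind of fingerprint matching works in principle, here is a minimal sketch. It is not Invision's implementation: real scanners use robust perceptual fingerprints (PhotoDNA-style hashes) supplied by organisations such as NCMEC, whereas this sketch substitutes a plain SHA-256 digest and a hypothetical KNOWN_FINGERPRINTS blocklist purely to show the block-on-match flow.

```python
# Conceptual sketch of hash-list matching, not Invision's implementation.
# A plain SHA-256 digest stands in for the perceptual fingerprints real
# CSAM scanners use; the blocklist below is hypothetical and empty.
import hashlib
from pathlib import Path

# Hypothetical blocklist of known fingerprints (hex digests) from a provider.
KNOWN_FINGERPRINTS = set()


def fingerprint(path: Path) -> str:
    """Compute a digital fingerprint for an uploaded file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_blocked(path: Path) -> bool:
    """Return True if the file matches a known fingerprint and must not be served."""
    return fingerprint(path) in KNOWN_FINGERPRINTS


if __name__ == "__main__":
    # Example moderation hook: refuse to serve a flagged upload.
    upload = Path("example-upload.jpg")
    if upload.exists() and is_blocked(upload):
        print("Match found: media blocked from being shown.")
```

The key point the sketch illustrates is that only a fingerprint is compared against a known list; the media itself is never sent anywhere for human or AI review.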
February 18

This is great. I've been doing it through Cloudflare for years and I'm happy to see IC do it too.
February 18

This is something that's been very top of mind for me and it will be a huge relief knowing these measures are in place when I switch to the cloud. Thank you for making this a priority!
February 18

I'd like to think it would never detect anything on my site, but so glad it's there as protection if we do have some wrong'uns about. Thanks!
February 19

> Note: this service is not available for Invision Community Classic

Only available for Cloud customers, I guess?

Correct...
February 19 (Author, Management)

> Only available for Cloud customers, I guess?

That's correct. We do not host the media for Invision Community Classic customers.
February 23

> This is great. I've been doing it through Cloudflare for years and I'm happy to see IC do it too.

How are you doing this through CF? Please share the tips. Every site owner should do this.
February 24

> How are you doing this through CF? Please share the tips. Every site owner should do this.

Here you go - https://developers.cloudflare.com/cache/reference/csam-scanning/
February 25

> Here you go - https://developers.cloudflare.com/cache/reference/csam-scanning/

Provides no legal protection, and ensures you are reported if there's an event. Not great!
February 25

> Provides no legal protection, and ensures you are reported if there's an event. Not great!

Who the heck is going to provide legal protection? If it's reported, then great! That's a step toward finding the perpetrators.
February 25

> Who the heck is going to provide legal protection? If it's reported, then great! That's a step toward finding the perpetrators.

I hear you. And look, I am certainly for anything that helps stop and prevent spreading bad content around. Protection for the victims is something we should ALL try our best to accommodate. But unfortunately the CF solution creates risk for you and your site without much benefit toward solving the problem.

- It Reports Your Site, Not the Uploader. The tool flags your platform instead of the actual user who uploaded the content. This can make your site look like the offender, even if you're the one taking positive steps toward removing bad content.
- Creates a Permanent Legal Paper Trail. Every flagged upload is reported to NCMEC under your site's name, potentially marking your platform as a repeat source of CSAM, even if the content was blocked immediately. CF can report false positives.
- Can Increase Law Enforcement Scrutiny. Multiple reports can trigger investigations into your platform, forcing you to justify why CSAM is appearing instead of focusing on prevention.
- Loss of Control Over Reporting. The tool auto-reports everything based on hash matches, meaning false positives, context, or your moderation actions don't matter; the report is still sent. As a site owner, you want control so you can focus on reporting valid concerns only, and you want to report bad people, not your own site as the source.
- Potential for Service Termination. If Cloudflare receives too many reports related to your site, they can terminate your account, leaving your platform vulnerable to downtime.

There are better solutions that don't increase intrinsic risk for you and your site.

Edited February 25 by SJ77
February 25

> There are better solutions that don't increase intrinsic risk for you and your site.

Show us a better solution.
March 2

> That's correct. We do not host the media for Invision Community Classic customers.

Any way we can do this on Classic? Even if we have to sign up to something for an API?
March 2

Is this part of all cloud communities, or is this part of the AI image moderation listed for Business level and up?
March 2

> Any way we can do this on Classic? Even if we have to sign up to something for an API?

No, it's not possible. IPS is not hosting your media for them to be able to scan. As noted elsewhere in this topic, you can look at leveraging Cloudflare to do something similar on your own if you wish.

> Is this part of all cloud communities, or is this part of the AI image moderation listed for Business level and up?

It's not part of the AI scanning. It's done for all cloud-hosted accounts automatically.