
NSFW Pictures?


Club Dark

Recommended Posts

Posted

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

Posted
6 minutes ago, bfarber said:

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

I've recently found a free API tool that identifies images automatically,

https://www.moderatecontent.com

I was thinking about using it in a plugin.
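For what it's worth, here is a rough sketch of what calling a service like that could look like from a script. The endpoint shape, query parameters and response fields below are my assumptions rather than a confirmed description of moderatecontent.com's current API, so treat it as pseudocode for the general idea:

```python
import requests

# Hypothetical sketch of an image-moderation API call.
# The endpoint shape and response fields are assumptions, not the
# documented moderatecontent.com API.
API_KEY = "your-api-key"                              # placeholder
IMAGE_URL = "https://example.com/uploads/photo.jpg"   # placeholder

resp = requests.get(
    "https://api.moderatecontent.com/moderate/",
    params={"key": API_KEY, "url": IMAGE_URL},
    timeout=10,
)
resp.raise_for_status()
data = resp.json()

# Assumed response: a rating label such as "everyone", "teen" or "adult".
if data.get("rating_label") == "adult":
    print("Hold this image for moderator review")
else:
    print("Image looks safe")
```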

Posted
On 11/13/2018 at 3:43 PM, Club Dark said:

Pro... when do you think it will be available for download?

And could the USER PROFILE have a button saying... SEE NSFW / DON'T SEE NSFW pictures?

So the user can turn it on and off and doesn't need to click every pic to see it.

Along with that, would it be possible to "lock" that setting to "Don't see NSFW" for accounts in a certain user group?

  • 2 weeks later...
Posted
3 hours ago, Club Dark said:

Any good news sir?

Actually, no progress beyond what I have posted before.

8 minutes ago, RevengeFNF said:

@A Zayed will it also work with images posted in topics or just in the gallery app?  

The initial release should be for the Gallery app only.

Posted
On 11/14/2018 at 11:21 AM, bfarber said:

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

Of course! Gallery needs improvements like that.

Posted
On 11/14/2018 at 8:21 AM, bfarber said:

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

(reaction GIF)

  • 2 weeks later...
Posted
On 11/14/2018 at 8:21 AM, bfarber said:

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

If Tumblr has taught us anything recently, it's that you have to be careful about relying too much on AI to identify those images, haha.

https://www.polygon.com/2018/12/4/18125997/tumblr-nsfw-guidelines-flagging-algorithm

Realistically, though, it could be useful, even if I'm a bit dubious. I assume you'd be relying on a third-party API to scan the images, right?

If it's a simple integration that can be enabled and disabled at will, some communities might get some use out of it, I'm sure. Holding potentially NSFW posts for moderator review, for example.

I'm just not sure if that's a feature useful enough that many people would be willing to pay a third party for such a service.

  • 3 weeks later...
Posted

I'd be interested in this as well.  I tried to do something similar with automatic rules, but the effect I wanted to achieve is outside my realm of knowledge.

I wanted there to be a custom field option for them to check if the content they were uploading was mature (or NSFW), and if checked, that content would be hidden from anyone who is not in a specific secondary group. I would need it to work site-wide, but getting it to work in the gallery would be a good start.

  • 2 weeks later...
Posted
On 11/14/2018 at 5:21 AM, bfarber said:

Quick question I've been tossing around - is there any use or desire for a plugin that will attempt to identify if an image is NSFW and treat it special if it is (rather than requiring an image to be manually flagged by the uploader or a moderator)?

So we own a fashion site with around 4 million images. We tried some APIs, both free and commercial, to identify NSFW content (nude vs. non-nude), and they all had pretty terrible accuracy rates.

How were you thinking of identifying them? We would for sure pay for a solution should one exist. 

Posted
11 hours ago, maddog107_merged said:

So we own a fashion site with around 4 million images. We tried some APIs, both free and commercial, to identify NSFW content (nude vs. non-nude), and they all had pretty terrible accuracy rates.

How were you thinking of identifying them? We would for sure pay for a solution should one exist. 

If I were to do this, I'd have to leverage some sort of API to scan the image. I had not spent any time investigating APIs to determine which ones work well and so forth, however. I'd be interested in hearing which ones you've tried and how they fared.

When I replied, I was looking at Google's Cloud AI APIs specifically.
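For context, here is a minimal sketch of what querying that looks like with the official google-cloud-vision Python client. The filename is a placeholder and this is just the raw SafeSearch lookup, not plugin code:

```python
from google.cloud import vision

# Minimal sketch: fetch the SafeSearch annotation for one image file.
# Credentials are read from the GOOGLE_APPLICATION_CREDENTIALS environment variable.
client = vision.ImageAnnotatorClient()

with open("upload.jpg", "rb") as f:            # placeholder filename
    image = vision.Image(content=f.read())

response = client.safe_search_detection(image=image)
annotation = response.safe_search_annotation

# Each field is a likelihood from VERY_UNLIKELY up to VERY_LIKELY.
for field in ("adult", "racy", "violence", "medical", "spoof"):
    print(field, getattr(annotation, field).name)
```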

Posted
4 hours ago, bfarber said:

If I were to do this, I'd have to leverage some sort of API to scan the image. I had not spent any time investigating APIs to determine which ones work well and so forth, however. I'd be interested in hearing which ones you've tried and how they fared.

When I replied, I was looking at Google's Cloud AI APIs specifically.

That actually works quite well (as long as you ignore "racy") compared to the stuff I looked at a few years back (TinEye comes to mind, and some open-source software, perhaps NudityDetectioni2v?). Mostly because Google has access to its web labels as well as its SafeSearch data. And considering the only reason I need this is that AdSense doesn't like our content sometimes, this may actually work.

$1.50/1,000 images x 4 million images = $6k 😞 Guess I'll be living off ramen for a while.

Although I don't like that it thinks every swimsuit is NSFW, as if they have never been on Instagram or Twitter and looked at celebrities 😄

 

(Screenshots of the SafeSearch results attached.)
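To make the "ignore racy" idea concrete, here is a small sketch of a decision rule over the SafeSearch likelihood names, plus the back-of-the-envelope cost check. The LIKELY threshold is my own illustrative choice, not guidance from Google:

```python
# SafeSearch likelihood names mapped to comparable ranks.
LIKELIHOOD_RANK = {
    "UNKNOWN": 0, "VERY_UNLIKELY": 1, "UNLIKELY": 2,
    "POSSIBLE": 3, "LIKELY": 4, "VERY_LIKELY": 5,
}

def should_flag(annotation: dict) -> bool:
    """Flag on adult/violence only and deliberately ignore "racy",
    so swimsuit shots are not treated as NSFW (illustrative threshold)."""
    threshold = LIKELIHOOD_RANK["LIKELY"]
    return any(
        LIKELIHOOD_RANK.get(annotation.get(field, "UNKNOWN"), 0) >= threshold
        for field in ("adult", "violence")
    )

# Back-of-the-envelope cost from above: 4 million images at $1.50 per 1,000.
print(f"${4_000_000 / 1_000 * 1.50:,.0f}")  # -> $6,000
```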


Posted

Well, you wouldn't run all of your existing images through the service; you'd only run newly submitted images through it. So unless you're getting more than 1,000 new images per month, at $1.50 per thousand images you're looking at roughly $1.50 per month. Not too bad, I suppose?

This sort of addon struck me as potentially useful in a passing thought, and then I happened across this topic. No idea if I'll carve out time to actually build such an addon, but I could definitely see the usefulness.

Posted

Such an addon would be useful if it came with the ability to flag content as NSFW, keep the content, and add a filter over it so members have to choose whether or not they want to see it. It should also allow members themselves to specify that the content they are uploading is 'sensitive' and have it flagged that way too (that way, if someone doesn't want to pay for an API service, they could still use this). For an artist community like ours, an API makes little sense, but a self-flag or moderator flag could work very well. I really like the way Mastodon handles this, and perhaps you can take a look at that @bfarber.

Posted
20 hours ago, AlexWright said:

Such an addon would be useful if it came with the ability to flag content as NSFW, keep the content, and add a filter over it so members have to choose whether or not they want to see it. It should also allow members themselves to specify that the content they are uploading is 'sensitive' and have it flagged that way too (that way, if someone doesn't want to pay for an API service, they could still use this). For an artist community like ours, an API makes little sense, but a self-flag or moderator flag could work very well. I really like the way Mastodon handles this, and perhaps you can take a look at that @bfarber.

 

I have a similar plugin to what you are looking for that I hired out to make. Let me know if you want it. Essentially, if you are a guest, you need to log in to see the content. The reason we went that route is that AdSense is essentially a guest, and they blacklist pages with NSFW content, so this way they can't see it.

The plugin has auto filters (you set them), so if the post contains a "spoiler"/hidden tag or keywords (nsfw, nudity, nude), it's auto-flagged. There is also a small button where a mod/admin can flag it as NSFW.

 

Let me know if you want it. 

 

 

(Two screenshots of the plugin attached.)
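As a purely illustrative aside, the keyword/tag auto-flag rule described above could look something like this. The function and keyword list are hypothetical, not the actual plugin code:

```python
import re

# Hypothetical sketch of the auto-flag rule described above.
# Keywords mirror the examples in the post; adjust to taste.
NSFW_KEYWORDS = ("nsfw", "nudity", "nude")

def should_auto_flag(post_text: str, tags: list[str]) -> bool:
    """Flag a post if it uses a spoiler/hidden tag or mentions an NSFW keyword."""
    text = post_text.lower()
    if "[spoiler]" in text or any(t.lower() in NSFW_KEYWORDS for t in tags):
        return True
    return any(re.search(rf"\b{re.escape(word)}\b", text) for word in NSFW_KEYWORDS)

print(should_auto_flag("Some nude photography from the shoot", ["photos"]))  # True
```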

Posted
23 minutes ago, maddog107_merged said:

 

I have a similar plugin to what you are looking for that I hired out to make. Let me know if you want it. Essentially, if you are a guest, you need to log in to see the content. The reason we went that route is that AdSense is essentially a guest, and they blacklist pages with NSFW content, so this way they can't see it.

The plugin has auto filters (you set them), so if the post contains a "spoiler"/hidden tag or keywords (nsfw, nudity, nude), it's auto-flagged. There is also a small button where a mod/admin can flag it as NSFW.

 

Let me know if you want it. 

Hiya! This wouldn't work for us, since we would need it for the Gallery application, and we already have a toggleable NSFW group that checks the member's birthday for permissions. The filter would be nice because some of our art deals with extreme kinks, and we want to keep things as simple as possible (one general gallery, and one NSFW gallery that has the image filter for extreme content). I appreciate the offer, though!

Archived

This topic is now archived and is closed to further replies.
