20

I need a way to detect adult images on a website where users upload pictures, so that I can determine whether a picture is acceptable for my site. Can anyone suggest a method for doing this?

I'm looking for open-source code (PHP) that could be integrated into the website and stop the user from uploading such a picture. I have already come across the Image Filter class at http://www.phpclasses.org/browse/package/3269.html, but I want code that is similar to it or even more advanced.

BalusC
sakthipriyabalaji
    Even Google's image search filters slip up from time to time. Your best bet is to hold all uploads in a queue where a human can publish the acceptable ones and reject the rest. – tadamson Mar 02 '10 at 17:47
  • This is a nice example for a visual Captcha. ;-) Better use human resources to decide whether an image is appropriate or not. Make it easy to report such images and get some moderators that delete them. – Gumbo Mar 02 '10 at 17:48
  • By the way, how is that "class image filter" insufficient for you? Didn't it work as expected? ;) – BalusC Mar 02 '10 at 17:58
  • possible duplicate of [What is the best way to programatically detect porn images?](http://stackoverflow.com/questions/713247/what-is-the-best-way-to-programatically-detect-porn-images) – Gordon Mar 16 '12 at 09:27

10 Answers

19

Sorry mate, I think the bottom line here is that you aren't going to get around manual checking. There are approaches, but none of them is really reliable, and all of them produce a lot of false positives.

For inspiration, check out this question: What is the best way to programmatically detect porn images?

If you ask me though, it's a waste of time to look for an automated solution. Given half an hour's time, I can find you twenty images that would trigger a "nude" alarm even though perfectly innocent, and the same the other way around. Also, nudity is not going to be the only thing you do not want on your site.

Better spend the time on a system that makes it really easy to manually verify the content users upload to your site, e.g. as a desktop widget, on your mobile phone, or whatever suits you best.

Pekka
  • There's lots of good stuff on that linked question. I especially liked the link to Amazon's mechanical turk. – a'r Mar 02 '10 at 17:44
  • @ar yes, there are approaches, but the mechanical turk costs money. In *most* cases for small sites and projects, manual checking is really the way to go. Correct me if I'm wrong of course - I'd be interested to hear whether anybody is using the Turk or other things successfully to fight bad uploaded content. – Pekka Mar 02 '10 at 17:48
  • 1
    +1 For acknowledging off the bat it can't really be done. – C. Ross Mar 02 '10 at 17:48
  • See my comment below about Crowdsifter. That's essentially Dolores Labs' business model: they are an intelligent, cheaper wrapper around the Turk. – mcpeterson Mar 02 '10 at 17:57
  • "Sorry mate, I think the bottom line here is you aren't going to get around manual checking". Why would you feel sorry about this? :P – PeeHaa Mar 16 '12 at 09:33
4

The only way I know to do this is to have uploads moderated. Images go into a moderation queue and are then "passed" or "rejected" by a human being.
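To make the idea concrete, here is a minimal sketch of such a queue using PDO with SQLite. The `images` table, its columns, and the status values are illustrative assumptions for this example, not something prescribed by the answer:

```php
<?php
// Minimal moderation-queue sketch (illustrative table and column names).
// Uploaded images start as 'pending'; a moderator later marks them
// 'approved' or 'rejected', and only approved images are shown publicly.

$db = new PDO('sqlite:moderation.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec("CREATE TABLE IF NOT EXISTS images (
    id     INTEGER PRIMARY KEY AUTOINCREMENT,
    path   TEXT NOT NULL,
    status TEXT NOT NULL DEFAULT 'pending'  -- pending | approved | rejected
)");

// Called right after a successful upload.
function enqueueImage(PDO $db, string $path): void {
    $stmt = $db->prepare("INSERT INTO images (path, status) VALUES (?, 'pending')");
    $stmt->execute([$path]);
}

// Called from the moderator's review screen.
function moderateImage(PDO $db, int $id, bool $approve): void {
    $stmt = $db->prepare("UPDATE images SET status = ? WHERE id = ?");
    $stmt->execute([$approve ? 'approved' : 'rejected', $id]);
}

// The public gallery only ever queries approved images.
$approved = $db->query("SELECT path FROM images WHERE status = 'approved'")
               ->fetchAll(PDO::FETCH_COLUMN);
```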

Robusto
4

Just an idea: you could come up with a solution that uses TinEye to see where else the image appears on the web, and then pair those results with a website-filtering service. If the image turns up mostly on pornographic sites, that could give you some level of filtering.

NebuSoft
4

I agree with Pekka's comments about automation. That's a rather difficult problem that is not quite solved yet.

Have you thought about http://crowdsifter.com/ or Dolores Labs? They can crowdsource a moderation queue for you, or check your existing images on the cheap. If it's a business website, that might do the trick.

Note: I am not affiliated with dolores labs.

mcpeterson
3

This works more or less, with a few complications to think about.

  • Don't automatically delete images when a picture gets flagged

The script does not detect nudity itself; it detects colors that are likely to appear in a pornographic picture.

I would send pictures that the script flags as "pornographic" into a moderation queue and then decide what to do with them. Simple as that. Keep in mind that pictures of babies and the like will definitely get flagged from time to time.

https://github.com/FreebieStock/PHP-Image-Pornographic-Content-Detection
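For illustration, here is a rough sketch of the skin-color-ratio idea using PHP's GD extension. The RGB thresholds, the pixel-sampling step, and the 40% cutoff are assumptions chosen for this example; they are not the rules used by the linked library:

```php
<?php
// Rough sketch of skin-color-ratio flagging with GD (requires ext-gd).
// All thresholds below are illustrative assumptions.

function skinRatio(string $file): float {
    $img = imagecreatefromstring(file_get_contents($file));
    if ($img === false) {
        throw new RuntimeException("Could not read image: $file");
    }
    imagepalettetotruecolor($img);  // so imagecolorat() returns RGB, not a palette index

    $width  = imagesx($img);
    $height = imagesy($img);
    $skin = 0;
    $total = 0;

    // Sample every 4th pixel to keep it cheap on large images.
    for ($y = 0; $y < $height; $y += 4) {
        for ($x = 0; $x < $width; $x += 4) {
            $rgb = imagecolorat($img, $x, $y);
            $r = ($rgb >> 16) & 0xFF;
            $g = ($rgb >> 8)  & 0xFF;
            $b = $rgb & 0xFF;

            // A common RGB skin heuristic (fails on grayscale images,
            // unusual lighting, and plenty of innocent content).
            if ($r > 95 && $g > 40 && $b > 20 &&
                (max($r, $g, $b) - min($r, $g, $b)) > 15 &&
                abs($r - $g) > 15 && $r > $g && $r > $b) {
                $skin++;
            }
            $total++;
        }
    }
    imagedestroy($img);

    return $total > 0 ? $skin / $total : 0.0;
}

// Send to the moderation queue rather than rejecting outright.
$needsReview = skinRatio('upload.jpg') > 0.4;  // 40% cutoff is arbitrary
```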

kanarifugl
3

One of the best resources for this is:

http://www.9lessons.info/2014/01/block-uploads-of-adult-or-nude-images.html

2

All current methods involve skin-color detection. This is never accurate and not even close to advanced.

If anything truly advanced existed, it would already be used to stop child pornography.

SysAdmin
  • Fun part...if the image is greyscale, there's no color to detect. What are you going to do, outlaw images containing too much `#959595` or something? :) – cHao Mar 18 '12 at 15:42
2

This is a general principle regardless of language or technique: the best way to combat this is to let a human check. Any automated approach involves a lot of work. How can you tell from the pixels alone whether an image contains nudity or child pornography? It can't really be done without a level of sophistication that doesn't exist yet. For instance, there is a trial at the moment in my country in which mobile operators automatically block images being sent, in order to clamp down on pornography. Don't ask me how it works, but reports apparently claim it is successful. I still wouldn't trust whatever algorithm they use, as it may well generate false positives.

This is quite similar to using a CAPTCHA to block spammers: only a human can look at the image and enter the magic words or numbers it contains, and that is what stops the flow of spammers.

In your case you could prevent image uploads entirely, but that's a bit too restrictive. Do what this site does: moderate. Hold the images first (in a queue or a safe holding directory) and decide whether they are suitable or not; a rough sketch of this approach follows at the end of this answer.

If they are not, then depending on the seriousness of the image, this could involve contacting the ISP or local law enforcement (this is where the grey area begins: how would you know, and how do you avoid over-reacting?).

It pays to be wise and prudent and simply alert the authorities in that case. If the image is of a pornographic nature, inform the ISP, pass the buck to them, and let them decide the best course of action, or report it to a local authority website that handles this kind of thing. In my country, for example, there is a hotline website where people can anonymously send an email reporting that they have encountered such an image.

I am not a lawyer, but IIRC (and this is where the matter gets murky) viewing such images may be treated differently from downloading them, which is illegal. Again the disclaimer: I am not a lawyer, and you need to check first. This is what I mean by the grey area.

So, in a nutshell: there is a smarter entity than the computer here, namely the human moderator, who should check and approve each image before it is published.
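As a concrete illustration of the holding-directory approach mentioned above, here is a minimal sketch. The directory layout, the `picture` form field name, and the helper functions are hypothetical choices for this example:

```php
<?php
// Sketch of a holding-directory upload handler (illustrative paths and names).
// Files land in a non-public 'pending' directory; a moderator script later
// moves approved files into the public 'images' directory.

$pendingDir = __DIR__ . '/../pending';   // not served by the web server
$publicDir  = __DIR__ . '/images';       // served publicly

if (isset($_FILES['picture']) && $_FILES['picture']['error'] === UPLOAD_ERR_OK) {
    // Basic sanity checks: must decode as an image and have a known extension.
    if (getimagesize($_FILES['picture']['tmp_name']) === false) {
        exit('Not an image.');
    }
    $ext = strtolower(pathinfo($_FILES['picture']['name'], PATHINFO_EXTENSION));
    if (!in_array($ext, ['jpg', 'jpeg', 'png', 'gif'], true)) {
        exit('Unsupported file type.');
    }

    $name = bin2hex(random_bytes(8)) . '.' . $ext;
    move_uploaded_file($_FILES['picture']['tmp_name'], "$pendingDir/$name");
    echo 'Thanks! Your picture will appear once a moderator has approved it.';
}

// Moderator actions (e.g. from an admin page): approve or reject a pending file.
function approve(string $name, string $pendingDir, string $publicDir): void {
    rename("$pendingDir/$name", "$publicDir/$name");
}

function reject(string $name, string $pendingDir): void {
    unlink("$pendingDir/$name");
}
```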

Kiquenet
t0mm13b
  • 34,087
  • 8
  • 78
  • 110
2

You might want to check this out; it's a JavaScript library. The problem is that it only runs in recent browsers: Firefox 3.6+, Chrome, Safari, Opera, and Internet Explorer 9.

http://nerd5.com/web-development/detect-nudity-using-javascript-nude-js.html

firedrawndagger
1

At Free Vectors we are also interested in this topic, so we have just implemented and shared a PHP nudity-detection class on GitHub based on the algorithm of J. Marcial-Basilio et al. (2011), which tries to detect nudity by measuring the proportion of skin-colored pixels.

Thank you @Dejan for pointing us to the original research paper.

MrTraffic