24

I am currently developing a website for a client. It lets users upload pictures to be shown in a gallery on the site.

The problem we have is that every uploaded image needs to be verified as safe for the website (no pornographic or explicit pictures). However, my client does not want to approve every image manually, as this would be time-consuming and users' images would not appear online instantly.

I am writing my code in PHP. If need be, I could change to ASP.NET or C#. Is there any way that this can be done?

halfer
Glen Robson

6 Answers

31

2019 Update

A lot has changed since this original answer way back in 2013, the main thing being machine learning. There are now a number of libraries and APIs available for programmatically detecting adult content:

Google Cloud Vision API, which uses the same models Google uses for safe search.

NSFWJS, which uses TensorFlow.js, claims to achieve ~90% accuracy and is open source under the MIT license.

Yahoo has a solution called Open NSFW under the BSD 2-clause license.
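As an illustration, the Cloud Vision API's safe-search detection can be called with a plain HTTP POST. Here is a hedged PHP sketch; the API key is a placeholder, and the "reject if adult or racy content is LIKELY or above" policy is my own illustrative choice, not an official recommendation:

```php
<?php
// Map a Cloud Vision likelihood string ("UNLIKELY", "VERY_LIKELY", ...)
// to a simple boolean flag.
function isLikelyUnsafe(string $likelihood): bool
{
    return in_array($likelihood, ['LIKELY', 'VERY_LIKELY'], true);
}

// Ask the Cloud Vision API for a SafeSearch annotation of a remote image.
// $apiKey is assumed to be a valid API key for your Google Cloud project.
function isImageAcceptable(string $imageUrl, string $apiKey): bool
{
    $request = json_encode([
        'requests' => [[
            'image'    => ['source' => ['imageUri' => $imageUrl]],
            'features' => [['type' => 'SAFE_SEARCH_DETECTION']],
        ]],
    ]);

    $context = stream_context_create(['http' => [
        'method'  => 'POST',
        'header'  => 'Content-Type: application/json',
        'content' => $request,
    ]]);

    $body = file_get_contents(
        'https://vision.googleapis.com/v1/images:annotate?key=' . $apiKey,
        false,
        $context
    );

    $annotation = json_decode($body, true)['responses'][0]['safeSearchAnnotation'];

    // Accept only if neither "adult" nor "racy" is likely.
    return !isLikelyUnsafe($annotation['adult'])
        && !isLikelyUnsafe($annotation['racy']);
}
```

You would call `isImageAcceptable()` right after the upload and queue anything rejected for manual review rather than deleting it outright.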

2013 Answer

There is a JavaScript library called nude.js which is designed for exactly this, although I have never used it myself. Here is a demo of it in use.

There is also PORNsweeper.

Another option is to "outsource" the moderation work using something like Amazon Mechanical Turk, a crowdsourced platform which "enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do". So you would basically pay a small amount per moderation item and have an actual human moderate the content for you.

The only other solution I can think of is to make the images user-moderated: let users flag inappropriate posts/images for moderation, and if nobody wants to moderate them manually, simply remove them after a certain number of flags.
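The flag-count approach can be as simple as hiding an image once its flags cross a threshold. A minimal PHP sketch; the `images` table, its `flag_count` and `hidden` columns, and the threshold of 5 are all made up for illustration:

```php
<?php
// Number of user flags after which an image is hidden automatically.
const FLAG_THRESHOLD = 5;

// Decide whether an image should be hidden pending manual review.
function shouldHide(int $flagCount, int $threshold = FLAG_THRESHOLD): bool
{
    return $flagCount >= $threshold;
}

// Record one more flag for an image and hide it if the threshold is reached.
// $pdo is assumed to be an open PDO connection; the schema is hypothetical.
function flagImage(PDO $pdo, int $imageId): void
{
    $pdo->prepare('UPDATE images SET flag_count = flag_count + 1 WHERE id = ?')
        ->execute([$imageId]);

    $stmt = $pdo->prepare('SELECT flag_count FROM images WHERE id = ?');
    $stmt->execute([$imageId]);
    $count = (int) $stmt->fetchColumn();

    if (shouldHide($count)) {
        $pdo->prepare('UPDATE images SET hidden = 1 WHERE id = ?')
            ->execute([$imageId]);
    }
}
```

Hiding rather than deleting means a moderator can still review and restore false positives later.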


Muhammad Dyas Yaskur
Brett Gregson
  • Thanks for your reply, I will look into nude.js and PORNsweeper. In regards to the latter part of your answer, I will be implementing a user-moderated function to allow users to flag an image if they think it is inappropriate. However, I cannot rely on this method alone, as it would allow images that could be illegal to be saved on my server. – Glen Robson Jan 18 '13 at 10:14
  • No problem, I have updated my answer with another possible solution for you (Amazon Mechanical Turk) – Brett Gregson Jan 18 '13 at 11:27
  • That looks like a good idea for the future, but for now we need to keep costs to a minimum. Again, thanks for the information, I'll see how I get on. – Glen Robson Jan 18 '13 at 11:34
  • No problem. Please let us know what you end up going with, will be interesting to see what your solution is :) – Brett Gregson Jan 18 '13 at 11:34
6

The example below does not give you 100% accurate results, but it should at least help a bit and works out of the box.

<?php
// Query the third-party rest7.com nudity-detection API for a remote image.
$url = 'http://server.com/image.png';
$response = file_get_contents('http://api.rest7.com/v1/detect_nudity.php?url=' . urlencode($url));
$data = ($response === false) ? null : json_decode($response);

if ($data === null || $data->success !== 1)
{
    die('Failed');
}
echo 'Contains nudity? ' . $data->nudity . '<br>';
echo 'Nudity percentage: ' . $data->nudity_percentage . '<br>';
Jack
  • This gets my vote - not a paid API like some of the answers here, relatively accurate results (with a % returned to allow you to queue up potential false results), and a super quick implementation... – freestate May 01 '19 at 11:32
  • Any idea who is behind this API? – Cocowalla Dec 08 '21 at 19:38
3

If you are looking for an API-based solution, you may want to check out Sightengine.com

It's an automated solution to detect things like adult content, violence, celebrities, etc. in images and videos.

Here is an example in PHP, using the SDK:

<?php
$client = new SightengineClient('YourApplicationID', 'YourAPIKey');

$output = $client->check('nudity')->image('https://sightengine.com/assets/img/examples/example2.jpg');

The output will then return the classification:

{
  "status": "success",
  "request": {
    "id": "req_VjyxevVQYXQZ1HMbnwtn",
    "timestamp": 1471762434.0244,
    "operations": 1
  },
  "nudity": {
    "raw": 0.000757,
    "partial": 0.000763,
    "safe": 0.999243
  },
  "media": {
    "id": "med_KWmB2GQZ29N4MVpVdq5K",
    "uri": "https://sightengine.com/assets/img/examples/example2.jpg"
  }
}

Have a look at the documentation for more details: https://sightengine.com/docs/#nudity-detection (disclaimer: I work there)

agrandiere
2

There is a free API that detects adult content (porn, nudity, NSFW).

https://market.mashape.com/purelabs/sensitive-image-detection

We've been using it in our production environment and I would say it works pretty well so far. There are some false detections, though; it seems they prefer to mark an image as unsafe when they are unsure.

DenisL
0

It all depends on the level of accuracy you are looking for. Simple skin-tone detection (like nude.js) will probably get you 60-80% accuracy on a generous sample set; for anything more accurate than that, say 90-95%, you are going to need a specialized computer vision system with an evolving model that is revised over time. For the latter you might want to check out http://clarifai.com or https://scanii.com (which I work on).
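To give a feel for why plain skin-tone detection tops out around that accuracy, here is a crude PHP/GD sketch of the idea: classify each pixel with a simple RGB rule and flag the image if too many pixels match. The RGB rule is one of several published heuristics, and the 40% threshold is purely illustrative; beaches, faces, and wood tones all defeat it easily.

```php
<?php
// A rough RGB skin-tone rule; real systems use better colour
// spaces and texture cues rather than a single threshold test.
function isSkinTone(int $r, int $g, int $b): bool
{
    return $r > 95 && $g > 40 && $b > 20
        && $r > $g && $r > $b
        && ($r - min($g, $b)) > 15
        && abs($r - $g) > 15;
}

// Fraction of pixels in a GD image that match the skin-tone rule.
function skinRatio(GdImage $img): float
{
    $w = imagesx($img);
    $h = imagesy($img);
    $skin = 0;
    for ($y = 0; $y < $h; $y++) {
        for ($x = 0; $x < $w; $x++) {
            $rgb = imagecolorat($img, $x, $y);
            if (isSkinTone(($rgb >> 16) & 0xFF, ($rgb >> 8) & 0xFF, $rgb & 0xFF)) {
                $skin++;
            }
        }
    }
    return $skin / ($w * $h);
}

// Flag images where more than ~40% of pixels look like skin (illustrative).
function looksNude(GdImage $img): bool
{
    return skinRatio($img) > 0.40;
}
```

Usage would be something like `looksNude(imagecreatefromjpeg($path))` at upload time, with flagged images routed to a review queue.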

Rafael Ferreira
0

Microsoft Azure has a very cool API called Computer Vision, which you can use for free (either through the UI or programmatically) and has tons of documentation, including for PHP.

It has some amazingly accurate (and sometimes humorous) results.

Outside of detecting adult and "racy" material, it will read text, guess your age, identify primary colours, and more.

You can try it out at azure.microsoft.com.

Sample output from a "racy" image:

FEATURE NAME:               VALUE:
Description                 { "tags": [ "person", "man", "young", "woman", "holding",
                              "surfing", "board", "hair", "laying", "boy", "standing", 
                              "water", "cutting", "white", "beach", "people", "bed" ], 
                              "captions": [ { "text": "a man and a woman taking a selfie", 
                              "confidence": 0.133149087 } ] }
Tags                        [ { "name": "person", "confidence": 0.9997446 }, 
                              { "name": "man", "confidence": 0.9587285 }, 
                              { "name": "wall", "confidence": 0.9546831 }, 
                              { "name": "swimsuit", "confidence": 0.499717563 } ]
Image format                "Jpeg"
Image dimensions            1328 x 2000
Clip art type               0
Line drawing type           0
Black and white             false
Adult content               true
Adult score                 0.9845981
Racy                        true
Racy score                  0.964191854
Categories                  [ { "name": "people_baby", "score": 0.4921875 } ]
Faces                       [ { "age": 37, "gender": "Female",
                                "faceRectangle": { "top": 317, "left": 1554, 
                                                   "width": 232, "height": 232 } } ]
Dominant color background   "Brown"
Dominant color foreground   "Black"
Accent Color                #0D8CBE
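An analysis like the sample above can be requested through the service's Analyze Image REST endpoint. A hedged PHP sketch; the endpoint URL and key are placeholders for your own resource's values, and the API version string is an assumption:

```php
<?php
// Ask Azure Computer Vision's Analyze endpoint for the Adult feature only.
// $endpoint (e.g. "https://<resource>.cognitiveservices.azure.com") and
// $key are placeholders for your own resource's values.
function analyzeAdultContent(string $imageUrl, string $endpoint, string $key): array
{
    $context = stream_context_create(['http' => [
        'method'  => 'POST',
        'header'  => "Content-Type: application/json\r\n" .
                     "Ocp-Apim-Subscription-Key: $key",
        'content' => json_encode(['url' => $imageUrl]),
    ]]);

    $body = file_get_contents(
        $endpoint . '/vision/v3.2/analyze?visualFeatures=Adult',
        false,
        $context
    );

    return json_decode($body, true);
}

// Decide whether to accept an upload based on the "adult" block of the
// response (the same fields shown in the sample output above).
function isAcceptable(array $analysis): bool
{
    $adult = $analysis['adult'];
    return !$adult['isAdultContent'] && !$adult['isRacyContent'];
}
```

For the sample image above, with adult score 0.98 and racy score 0.96, `isAcceptable()` would return false.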
ashleedawg