# guardlogic / AdultImageClassifier / 0.1.0

## Overview

Estimates the probability that an image contains "adult" content, i.e., sexually explicit, not-safe-for-work (NSFW) material.

## Usage

### Input

An image URL, e.g., http://www.domain.com/img.jpg

or

An Algorithmia data URL, e.g., data://path_to_image_file

or

A JSON request with the field `image` containing the image URL or data URL. Images can be color or grayscale. All three forms are accepted by the client call sketched below.
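
For illustration, here is a minimal sketch of submitting each input form through the standard Algorithmia Python client. The API key is a placeholder, and the URLs are the placeholder values from the list above; substitute your own.

```python
import Algorithmia

# Placeholder API key; substitute your own.
client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("guardlogic/AdultImageClassifier/0.1.0")

# The three accepted input forms:
inputs = [
    "http://www.domain.com/img.jpg",             # plain image URL
    "data://path_to_image_file",                 # Algorithmia data URL
    {"image": "http://www.domain.com/img.jpg"},  # JSON request
]

for request in inputs:
    # pipe() sends the input to the algorithm; .result holds its output.
    result = algo.pipe(request).result
    print(result)
```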

### Output

JSON containing `adult_prob`, the probability that the image contains adult content. The probability lies in (0, 1) and is returned as a string.

Note: It is up to the user to choose the threshold in (0, 1) above which an image is treated as "adult" in their application, as in the sketch below.
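
As a sketch of that decision step, the snippet below converts the string-valued probability and compares it to a threshold. The 0.8 value is an arbitrary example, not a recommendation; tune it for your application.

```python
# 'response' stands in for the parsed algorithm output.
response = {"adult_prob": "0.00106323"}

THRESHOLD = 0.8  # arbitrary example value; choose one suited to your use case

# adult_prob is returned as a string, so convert it before comparing.
adult_prob = float(response["adult_prob"])
is_adult = adult_prob > THRESHOLD

print(f"adult_prob={adult_prob}, treated as adult: {is_adult}")
```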

## Examples

Request:

{"image": "https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Sus_cebifrons_negrinus_piglet.jpg/1599px-Sus_cebifrons_negrinus_piglet.jpg"}

or

"https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Sus_cebifrons_negrinus_piglet.jpg/1599px-Sus_cebifrons_negrinus_piglet.jpg"

Result:

{"adult_prob": "0.00106323"}