deeplearning / EmotionRecognitionCNNMBP / 1.0.1

1. Introduction

This algorithm detects each person in the given photo and returns the most likely emotions for each detected face, along with a confidence score for each emotion.

Input:

  • (Required) An image, supplied as a Data API URL, a web (http/https) URL, a binary image, or a base64-encoded image.
  • (Optional) Number of results (default=3, max=7)

Output:

  • A list of emotions and bounding-box information for each detected person

Note: The first call to this algorithm will take a bit longer than subsequent calls due to algorithm initialization. All following calls will be significantly faster. A minimal calling sketch is shown below.
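
As a quick reference, here is one way the algorithm could be called from Python with the Algorithmia client; this is a minimal sketch, and the API key placeholder is an assumption you must replace with your own credentials.

import Algorithmia

# "YOUR_API_KEY" is a placeholder; use your own Algorithmia API key.
client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("deeplearning/EmotionRecognitionCNNMBP/1.0.1")

# Input mirrors Example 1 below: an image referenced by a Data API URL.
payload = {"image": "data://deeplearning/example_data/elon_musk.jpg"}

response = algo.pipe(payload).result
for person in response["results"]:
    print(person["person"], person["emotions"][0]["label"])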

2. Examples

Example 1.

  • Parameter 1: Data API Url
{
    "image": "data://deeplearning/example_data/elon_musk.jpg"
}

Output:

{
  "results": [
    {
      "bbox": {
        "bottom": 911,
        "left": 295,
        "right": 849,
        "top": 357
      },
      "emotions": [
        {"confidence": 0.9386989, "label": "Happy"},
        {"confidence": 0.0483937, "label": "Neutral"},
        {"confidence": 0.0120008, "label": "Disgust"}
      ],
      "person": 0
    }
  ]
}
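
The bbox values are pixel coordinates of the detected face (top, left, right, bottom). As a rough illustration, assuming a local copy of the photo (the filename below is a placeholder), the face could be cropped with Pillow:

from PIL import Image

# Bounding box taken from the result above (pixel coordinates).
bbox = {"bottom": 911, "left": 295, "right": 849, "top": 357}

# "elon_musk.jpg" is a hypothetical local copy of the input image.
img = Image.open("elon_musk.jpg")
face = img.crop((bbox["left"], bbox["top"], bbox["right"], bbox["bottom"]))
face.save("face_0.jpg")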

Example 2.

  • Parameter 1: HTTP Url
{
    "image": "https://s3.amazonaws.com/algorithmia-assets/algo_desc_images/deeplearning_EmotionRecognitionCNNMBP/jim_caviezel.jpg"
}

Output:

{
  "results": [
    {
      "bbox": {
        "bottom": 1094,
        "left": 354,
        "right": 1019,
        "top": 428
      },
      "emotions": [
        {"confidence": 0.9999458, "label": "Happy"},
        {"confidence": 0.0000528, "label": "Neutral"},
        {"confidence": 8e-7, "label": "Disgust"}
      ],
      "person": 0
    }
  ]
}

Example 3.

  • Parameter 1: Base64 image
{
    "image": "data:image/png;base64,..."
}

Output:

{
  "results": [
    {
      "bbox": {
        "bottom": 911,
        "left": 295,
        "right": 849,
        "top": 357
      },
      "emotions": [
        {"confidence": 0.9386989, "label": "Happy"},
        {"confidence": 0.0483937, "label": "Neutral"},
        {"confidence": 0.0120008, "label": "Disgust"}
      ],
      "person": 0
    }
  ]
}
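
To send a local image instead of a URL, a base64 data URI like the one in Example 3 can be built in a few lines; this is a sketch, and the local filename is a placeholder:

import base64

# "photo.png" is a hypothetical local image file.
with open("photo.png", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

# The resulting payload matches the shape of Example 3.
payload = {"image": "data:image/png;base64," + encoded}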

Example 4.

  • Parameter 1: Data API Url
  • Parameter 2: Number of results
{
  "image": "data://deeplearning/example_data/elon_musk.jpg",
  "numResults": 7
}

Output:

{
  "results": [
    {
      "bbox": {
        "bottom": 911,
        "left": 295,
        "right": 849,
        "top": 357
      },
      "emotions": [
        {"confidence": 0.9386989, "label": "Happy"},
        {"confidence": 0.0483937, "label": "Neutral"},
        {"confidence": 0.0120008, "label": "Disgust"},
        {"confidence": 0.000406, "label": "Sad"},
        {"confidence": 0.0003461, "label": "Fear"},
        {"confidence": 0.00015, "label": "Angry"},
        {"confidence": 0.0000046, "label": "Surprise"}
      ],
      "person": 0
    }
  ]
}
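
With numResults set to 7, the response contains the full distribution over all seven supported labels. As a small post-processing sketch (assuming the parsed output above is stored in a variable named response), the most confident emotion per person can be selected like this:

# "response" is assumed to hold the parsed JSON output shown above.
for person in response["results"]:
    top = max(person["emotions"], key=lambda e: e["confidence"])
    print(f"person {person['person']}: {top['label']} ({top['confidence']:.4f})")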

3. Credits

For more information, please refer to http://www.openu.ac.il/home/hassner/projects/cnn_emotions/ or to the paper: Gil Levi and Tal Hassner, "Emotion Recognition in the Wild via Convolutional Neural Networks and Mapped Binary Patterns," Proc. ACM International Conference on Multimodal Interaction (ICMI), Seattle, Nov. 2015.

dlib/FaceDetection was used to detect faces in the input images.

Demo images were taken from:

https://en.wikipedia.org/wiki/Elon_Musk#/media/File:Elon_Musk_2015.jpg

https://gl.wikipedia.org/wiki/Jim_Caviezel#/media/File:Jim_Caviezel_SDCC_2013.jpg