# Machine Generated Data
## Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/43161405/313,350,41,55/full/0/native.jpg)
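The face crops on this page are served through the IIIF Image API, whose URL path encodes `{region}/{size}/{rotation}/{quality}.{format}`. A minimal sketch of building such a crop URL from a pixel bounding box, using the base URL and region values from the image above:

```python
def iiif_crop_url(base: str, x: int, y: int, w: int, h: int) -> str:
    """Build an IIIF Image API URL for a pixel region, at full size,
    unrotated, in native quality (the pattern used by the URLs here)."""
    return f"{base}/{x},{y},{w},{h}/full/0/native.jpg"

url = iiif_crop_url("https://ids.lib.harvard.edu/ids/iiif/43161405", 313, 350, 41, 55)
print(url)
# → https://ids.lib.harvard.edu/ids/iiif/43161405/313,350,41,55/full/0/native.jpg
```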
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 53-61 |
| Gender | Male, 99.8% |
| Sad | 100% |
| Surprised | 6.5% |
| Calm | 6.3% |
| Fear | 6% |
| Confused | 1.8% |
| Angry | 0.8% |
| Happy | 0.5% |
| Disgusted | 0.4% |
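Values like these come from Rekognition's `DetectFaces` response, which reports an `AgeRange`, a `Gender`, and a list of `Emotions`, each with its own confidence; the emotion scores are independent, which is presumably why they sum to more than 100% above. A minimal sketch of pulling the top emotion from such a response (the sample dict mirrors the first face above; a live call would go through `boto3`'s `detect_faces` with `Attributes=['ALL']`):

```python
# Sample entry shaped like one FaceDetails item from a Rekognition
# DetectFaces response, populated with values reported for the first face.
face = {
    "AgeRange": {"Low": 53, "High": 61},
    "Gender": {"Value": "Male", "Confidence": 99.8},
    "Emotions": [
        {"Type": "SAD", "Confidence": 100.0},
        {"Type": "SURPRISED", "Confidence": 6.5},
        {"Type": "CALM", "Confidence": 6.3},
        {"Type": "FEAR", "Confidence": 6.0},
    ],
}

def top_emotion(face_detail: dict) -> tuple:
    """Return the (label, confidence) of the highest-confidence emotion."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

print(top_emotion(face))  # → ('SAD', 100.0)
```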
![](https://ids.lib.harvard.edu/ids/iiif/43161405/621,376,36,41/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 50-58 |
| Gender | Male, 100% |
| Sad | 99.9% |
| Confused | 13.1% |
| Surprised | 6.7% |
| Fear | 6.4% |
| Calm | 4.6% |
| Disgusted | 4.6% |
| Angry | 2.5% |
| Happy | 0.6% |
![](https://ids.lib.harvard.edu/ids/iiif/43161405/317,362,43,43/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 55 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43161405/623,378,40,40/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 64 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43161405/305,338,61,72/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
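Unlike the other services, Google Vision reports face attributes as likelihood buckets rather than percentages (UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY). A minimal sketch of thresholding those buckets, assuming the human-readable labels shown in the table above:

```python
# Likelihood buckets in ascending order, matching the Vision API's
# face-annotation enum; the table above uses the human-readable forms.
LIKELIHOOD_ORDER = [
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def at_least(label: str, threshold: str) -> bool:
    """True when `label` is at or above `threshold` on the likelihood scale."""
    return LIKELIHOOD_ORDER.index(label) >= LIKELIHOOD_ORDER.index(threshold)

# Every attribute of this face was rated "Very unlikely":
print(at_least("Very unlikely", "Possible"))  # → False
```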
## Feature analysis

### Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 97% |
| pets animals | 2.5% |
### Captions
Microsoft
created on 2018-05-10
| Caption | Confidence |
| --- | --- |
| a person sitting on a bench in front of a building | 87.9% |
| a person sitting on a bench | 87.8% |
| a person sitting on a bench next to a building | 87.7% |
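The three captions differ only in their trailing phrase and are near-ties on confidence; a consumer would typically keep just the highest-scoring one. A minimal sketch, assuming (caption, confidence) pairs like those above:

```python
# Candidate captions with confidences, as reported above.
captions = [
    ("a person sitting on a bench in front of a building", 87.9),
    ("a person sitting on a bench", 87.8),
    ("a person sitting on a bench next to a building", 87.7),
]

# Keep the single highest-confidence caption.
best_caption, confidence = max(captions, key=lambda c: c[1])
print(best_caption)  # → a person sitting on a bench in front of a building
```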