# Machine Generated Data

## Tags

## Color Analysis

## Face analysis

### Amazon
![](https://ids.lib.harvard.edu/ids/iiif/20222055/639,176,15,18/full/0/native.jpg)
#### AWS Rekognition

| Attribute | Value |
| --- | --- |
| Age | 23-31 |
| Gender | Male, 62.3% |
| Calm | 98.1% |
| Sad | 0.7% |
| Angry | 0.5% |
| Confused | 0.3% |
| Surprised | 0.2% |
| Disgusted | 0.1% |
| Happy | 0.1% |
| Fear | 0% |
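A response in this shape is what AWS Rekognition's `detect_faces` call (via `boto3`, with `Attributes=["ALL"]`) returns per face. A minimal sketch of picking out the dominant emotion, with the table's values hard-coded as a sample payload rather than fetched from a live API call:

```python
# Sample FaceDetail mirroring the table above (not a live Rekognition response).
face_detail = {
    "AgeRange": {"Low": 23, "High": 31},
    "Gender": {"Value": "Male", "Confidence": 62.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 98.1},
        {"Type": "SAD", "Confidence": 0.7},
        {"Type": "ANGRY", "Confidence": 0.5},
        {"Type": "CONFUSED", "Confidence": 0.3},
        {"Type": "SURPRISED", "Confidence": 0.2},
        {"Type": "DISGUSTED", "Confidence": 0.1},
        {"Type": "HAPPY", "Confidence": 0.1},
        {"Type": "FEAR", "Confidence": 0.0},
    ],
}

def dominant_emotion(detail):
    """Return the highest-confidence emotion label from one FaceDetail."""
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face_detail))  # -> ('CALM', 98.1)
```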
![](https://ids.lib.harvard.edu/ids/iiif/20222055/638,175,17,20/full/0/native.jpg)
### Google Vision

| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very likely |
![](https://ids.lib.harvard.edu/ids/iiif/20222055/592,349,40,47/full/0/native.jpg)
### Google Vision

| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20222055/834,167,23,26/full/0/native.jpg)
### Google Vision

| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
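The Cloud Vision API reports these per-face values as a `Likelihood` enum (`UNKNOWN=0` through `VERY_LIKELY=5`) on each face annotation. A minimal sketch of mapping the enum values to the readable names shown above, using a sample annotation that mirrors the first Google Vision table rather than a live response:

```python
# Likelihood enum order as defined by the Cloud Vision API.
LIKELIHOOD_NAMES = [
    "Unknown", "Very unlikely", "Unlikely",
    "Possible", "Likely", "Very likely",
]

# Sample face annotation mirroring the first Google Vision table above.
face_annotation = {
    "surprise_likelihood": 1,  # VERY_UNLIKELY
    "anger_likelihood": 1,
    "sorrow_likelihood": 1,
    "joy_likelihood": 1,
    "headwear_likelihood": 1,
    "blurred_likelihood": 5,   # VERY_LIKELY
}

def describe(annotation):
    """Map each *_likelihood enum value to its human-readable name."""
    return {
        field.replace("_likelihood", "").capitalize(): LIKELIHOOD_NAMES[value]
        for field, value in annotation.items()
    }

print(describe(face_annotation)["Blurred"])  # -> Very likely
```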
## Feature analysis

### Amazon

| Label | Confidence |
| --- | --- |
| Person | 96.8% |
| Christmas Tree | 96.7% |
## Categories

### Imagga

| Category | Confidence |
| --- | --- |
| streetview architecture | 88.1% |
| interior objects | 5.4% |
| paintings art | 2.7% |
| beaches seaside | 1.3% |
## Captions

### Microsoft

created on 2022-01-23

| Caption | Confidence |
| --- | --- |
| a group of people in a room | 88.2% |
| a group of people standing in a room | 85% |
| a group of people standing around a table | 71.7% |
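Caption services of this kind return several candidate descriptions ranked by confidence. A minimal sketch of selecting the best candidate above a cutoff, assuming a payload shaped like the table above (the `captions` list and the `best_caption` helper are illustrative, not part of any vendor SDK):

```python
# Sample caption candidates copied from the table above.
captions = [
    {"text": "a group of people in a room", "confidence": 0.882},
    {"text": "a group of people standing in a room", "confidence": 0.850},
    {"text": "a group of people standing around a table", "confidence": 0.717},
]

def best_caption(candidates, threshold=0.5):
    """Return the highest-confidence caption above the threshold, else None."""
    viable = [c for c in candidates if c["confidence"] >= threshold]
    return max(viable, key=lambda c: c["confidence"], default=None)

top = best_caption(captions)
print(top["text"])  # -> a group of people in a room
```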