Machine-Generated Data
Tags
Color Analysis
Face Analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18780167/230,295,45,56/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 22-34 |
| Gender | Female, 84.9% |
| Calm | 90.3% |
| Happy | 5.9% |
| Sad | 2.1% |
| Confused | 0.6% |
| Angry | 0.3% |
| Surprised | 0.3% |
| Disgusted | 0.3% |
| Fear | 0.3% |
![](https://ids.lib.harvard.edu/ids/iiif/18780167/367,341,42,52/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-42 |
| Gender | Female, 92.1% |
| Calm | 60.7% |
| Happy | 25.6% |
| Sad | 6.0% |
| Fear | 4.2% |
| Angry | 1.8% |
| Surprised | 0.9% |
| Confused | 0.5% |
| Disgusted | 0.3% |
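Each AWS Rekognition face record above can be summarized by its highest-confidence emotion. A minimal sketch, assuming the emotion scores are held in a plain label-to-percentage dict mirroring the tables (not the raw boto3 response shape):

```python
def dominant_emotion(emotions):
    """Return the (label, confidence) pair with the highest confidence.

    `emotions` maps emotion labels to confidence percentages, as in the
    tables above (assumed structure, not the actual Rekognition response).
    """
    label = max(emotions, key=emotions.get)
    return label, emotions[label]

# Values from the first detected face above.
face_1 = {"Calm": 90.3, "Happy": 5.9, "Sad": 2.1, "Confused": 0.6,
          "Angry": 0.3, "Surprised": 0.3, "Disgusted": 0.3, "Fear": 0.3}
print(dominant_emotion(face_1))  # ('Calm', 90.3)
```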
![](https://ids.lib.harvard.edu/ids/iiif/18780167/230,301,41,48/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/18780167/368,334,47,54/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
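Unlike Rekognition's percentages, Google Vision reports face attributes as an ordered likelihood enum. To compare or sort these values, they can be mapped to ordinal ranks; the ordering below follows the public `Likelihood` enum (UNKNOWN through VERY_LIKELY):

```python
# Ordering taken from Google Cloud Vision's Likelihood enum.
LIKELIHOOD_ORDER = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                    "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def likelihood_rank(label):
    """Map a display label (e.g. 'Very unlikely') to its ordinal rank."""
    return LIKELIHOOD_ORDER.index(label.upper().replace(" ", "_"))

print(likelihood_rank("Very unlikely"))  # 1
```

A rank of 1 (as for every attribute in the tables above) sits just above UNKNOWN, i.e. the model found essentially no evidence for the attribute.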
Feature Analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 99% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 90.9% |
| text visuals | 8.4% |
Captions
Microsoft
Created on 2021-12-14
| Caption | Confidence |
| --- | --- |
| a person standing in front of a sign | 46.7% |
| a person standing next to a sign | 42.2% |
| an old photo of a person | 38.0% |
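When several candidate captions are returned, a display caption is typically chosen by confidence. A minimal sketch, assuming the candidates are (text, confidence) pairs mirroring the table above rather than the raw Microsoft API response:

```python
def best_caption(captions):
    """Return the caption text with the highest confidence score.

    `captions` is a list of (text, confidence) pairs (assumed shape,
    not the actual Computer Vision response object).
    """
    return max(captions, key=lambda c: c[1])[0]

candidates = [
    ("a person standing in front of a sign", 46.7),
    ("a person standing next to a sign", 42.2),
    ("an old photo of a person", 38.0),
]
print(best_caption(candidates))  # a person standing in front of a sign
```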