Machine Generated Data
Tags
Color Analysis
Face analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18780654/371,231,49,62/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 13-25 |
| Gender | Female, 62.6% |
| Calm | 79.7% |
| Confused | 6.2% |
| Sad | 5.3% |
| Happy | 4.4% |
| Surprised | 1.6% |
| Fear | 1.1% |
| Angry | 1.1% |
| Disgusted | 0.6% |
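The age, gender, and emotion rows above have the shape of a Rekognition `DetectFaces` response (requested with `Attributes=["ALL"]` via boto3). A minimal sketch of reducing one `FaceDetails` record to those summary fields, using the first face's values as sample data rather than a live API call:

```python
# Sample mirroring the first face table above; a real response would
# come from boto3.client("rekognition").detect_faces(..., Attributes=["ALL"]).
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 13, "High": 25},
            "Gender": {"Value": "Female", "Confidence": 62.6},
            "Emotions": [
                {"Type": "CALM", "Confidence": 79.7},
                {"Type": "CONFUSED", "Confidence": 6.2},
                {"Type": "SAD", "Confidence": 5.3},
                {"Type": "HAPPY", "Confidence": 4.4},
                {"Type": "SURPRISED", "Confidence": 1.6},
                {"Type": "FEAR", "Confidence": 1.1},
                {"Type": "ANGRY", "Confidence": 1.1},
                {"Type": "DISGUSTED", "Confidence": 0.6},
            ],
        }
    ]
}

def summarize_face(face: dict) -> dict:
    """Reduce one FaceDetails record to the fields shown in the table."""
    # Rekognition returns all emotions with confidences; report the top one.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f'{face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        "gender": f'{face["Gender"]["Value"]}, {face["Gender"]["Confidence"]}%',
        "emotion": f'{top_emotion["Type"].title()}, {top_emotion["Confidence"]}%',
    }

summary = summarize_face(sample_response["FaceDetails"][0])
# summary["age"] → "13-25", summary["emotion"] → "Calm, 79.7%"
```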
![](https://ids.lib.harvard.edu/ids/iiif/18780654/458,274,50,68/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 22-34 |
| Gender | Female, 50.4% |
| Calm | 95.5% |
| Happy | 1.4% |
| Sad | 1.2% |
| Angry | 0.8% |
| Surprised | 0.7% |
| Confused | 0.2% |
| Disgusted | 0.2% |
| Fear | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/18780654/371,233,52,60/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/18780654/455,269,60,70/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
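The face crops above are served by Harvard's IIIF image server, whose URLs follow the IIIF Image API path template `{identifier}/{region}/{size}/{rotation}/{quality}.{format}`, with the region `x,y,w,h` giving the detected face's bounding box in pixels. A sketch of assembling such a crop URL (the helper name is illustrative, not part of any library):

```python
# Base prefix of the Harvard IDS IIIF endpoint, as used in the URLs above.
BASE = "https://ids.lib.harvard.edu/ids/iiif"

def iiif_crop(identifier: str, x: int, y: int, w: int, h: int,
              size: str = "full", rotation: int = 0,
              quality: str = "native", fmt: str = "jpg") -> str:
    """Build a IIIF Image API URL for a pixel-region crop."""
    region = f"{x},{y},{w},{h}"  # IIIF region parameter: x,y,width,height
    return f"{BASE}/{identifier}/{region}/{size}/{rotation}/{quality}.{fmt}"

url = iiif_crop("18780654", 371, 231, 49, 62)
# → the first face-crop URL on this page
```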
Feature analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 98% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 98.7% |
| paintings art | 1.1% |
Captions
Microsoft
created on 2021-12-14
| Caption | Confidence |
| --- | --- |
| a person standing next to a vase | 33.8% |
| a person standing next to a vase | 26.1% |
| a person standing next to a vase with flowers in it | 25.3% |