Machine Generated Data
Tags
Color Analysis
Face analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/20221901/156,331,24,51/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 37-45 |
| Gender | Male, 83% |
| Calm | 98.3% |
| Surprised | 0.5% |
| Happy | 0.4% |
| Confused | 0.4% |
| Sad | 0.1% |
| Angry | 0.1% |
| Disgusted | 0.1% |
| Fear | 0.1% |
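The rows above match the shape of an AWS Rekognition `DetectFaces` response. As a minimal sketch, the payload below is hand-built to mirror the table values (not a live API call), and the helper reduces one `FaceDetails` entry to the fields shown:

```python
# Hand-built sample mirroring the Rekognition DetectFaces response shape;
# values copied from the table above, not fetched from the API.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 37, "High": 45},
            "Gender": {"Value": "Male", "Confidence": 83.0},
            "Emotions": [
                {"Type": "CALM", "Confidence": 98.3},
                {"Type": "SURPRISED", "Confidence": 0.5},
                {"Type": "HAPPY", "Confidence": 0.4},
            ],
        }
    ]
}

def summarize_face(face: dict) -> dict:
    """Reduce one FaceDetails entry to the fields shown in the table."""
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        "gender": f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%",
        "top_emotion": f"{top_emotion['Type']}, {top_emotion['Confidence']}%",
    }

summary = summarize_face(sample_response["FaceDetails"][0])
# summary["age"] is "37-45"; the dominant emotion is CALM at 98.3%
```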
![](https://ids.lib.harvard.edu/ids/iiif/20221901/936,312,32,37/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20221901/309,196,41,47/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
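Unlike Rekognition's numeric confidences, Google Vision face annotations report likelihoods as enum values. A minimal sketch of mapping those enum names to the labels shown above, using a hand-built annotation in the shape of the API's `faceAnnotations` field (sample data, not a live call):

```python
# Vision API likelihood enum -> human-readable label, as rendered above.
LIKELIHOOD_LABELS = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

# Hand-built sample in the shape of one faceAnnotations entry.
sample_annotation = {
    "surpriseLikelihood": "VERY_UNLIKELY",
    "angerLikelihood": "VERY_UNLIKELY",
    "sorrowLikelihood": "VERY_UNLIKELY",
    "joyLikelihood": "VERY_UNLIKELY",
    "headwearLikelihood": "VERY_UNLIKELY",
    "blurredLikelihood": "VERY_UNLIKELY",
}

# Strip the "Likelihood" suffix and translate each enum to its label.
rows = {
    key.replace("Likelihood", "").capitalize(): LIKELIHOOD_LABELS[value]
    for key, value in sample_annotation.items()
}
# rows maps e.g. "Surprise" -> "Very unlikely"
```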
Feature analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 98.9% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 96.9% |
| paintings art | 1.7% |
Captions
Microsoft
Created on 2022-01-23
| Caption | Confidence |
| --- | --- |
| a man standing in front of a store | 48.4% |
| a man and a woman standing in front of a store | 25.2% |
| a man that is standing in front of a store | 25.1% |
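The image links in this record follow the IIIF Image API URL pattern `{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}`, where the region is an `x,y,w,h` pixel crop around each detected face. A minimal sketch of building such a crop URL (the helper name is ours; the base URL and identifier come from the links above):

```python
def iiif_crop_url(base: str, identifier: str,
                  x: int, y: int, w: int, h: int) -> str:
    """Build a IIIF Image API URL for a full-resolution crop of the
    pixel region (x, y, w, h), matching the links used in this record."""
    region = f"{x},{y},{w},{h}"
    return f"{base}/{identifier}/{region}/full/0/native.jpg"

url = iiif_crop_url("https://ids.lib.harvard.edu/ids/iiif", "20221901",
                    156, 331, 24, 51)
# -> https://ids.lib.harvard.edu/ids/iiif/20221901/156,331,24,51/full/0/native.jpg
```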