Machine Generated Data
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/20675760/275,440,77,105/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 22-34 |
| Gender | Female, 99.5% |
| Disgusted | 0.1% |
| Happy | 9.2% |
| Fear | 0% |
| Angry | 0.2% |
| Calm | 89.7% |
| Sad | 0.3% |
| Surprised | 0.2% |
| Confused | 0.3% |
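Face data in this shape comes from Rekognition's `DetectFaces` operation. The sketch below is a hedged illustration: the `boto3` call is shown only as commented-out context, and the `sample` dict is a hand-built stand-in modeled on the table above, not the actual stored response.

```python
# Sketch: summarizing an AWS Rekognition face detail. The boto3 call is
# context only; "sample" is a stand-in shaped like the table above.
# import boto3
# client = boto3.client("rekognition")
# resp = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

def dominant_emotion(face_detail):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

sample = {
    "AgeRange": {"Low": 22, "High": 34},
    "Gender": {"Value": "Female", "Confidence": 99.5},
    "Emotions": [
        {"Type": "CALM", "Confidence": 89.7},
        {"Type": "HAPPY", "Confidence": 9.2},
        {"Type": "SAD", "Confidence": 0.3},
        {"Type": "CONFUSED", "Confidence": 0.3},
        {"Type": "SURPRISED", "Confidence": 0.2},
        {"Type": "ANGRY", "Confidence": 0.2},
        {"Type": "DISGUSTED", "Confidence": 0.1},
        {"Type": "FEAR", "Confidence": 0.0},
    ],
}

print(dominant_emotion(sample))  # ('CALM', 89.7)
```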
![](https://ids.lib.harvard.edu/ids/iiif/20675760/274,461,80,80/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 31 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20675760/257,412,120,140/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
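Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than percentages. A minimal sketch of mapping the `Likelihood` enum values to the labels shown above (the client call is commented-out context only):

```python
# Sketch: Google Vision expresses face attributes as Likelihood buckets.
# The enum ordering below follows the published Likelihood enum.
# from google.cloud import vision
# client = vision.ImageAnnotatorClient()
# resp = client.face_detection(image=vision.Image(content=image_bytes))

LIKELIHOOD_NAMES = [
    "Unknown",        # 0
    "Very unlikely",  # 1
    "Unlikely",       # 2
    "Possible",       # 3
    "Likely",         # 4
    "Very likely",    # 5
]

def likelihood_label(value: int) -> str:
    """Translate a Likelihood enum value into its display label."""
    return LIKELIHOOD_NAMES[value]

# Every attribute in the table above sits in the lowest bucket:
print(likelihood_label(1))  # Very unlikely
```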
Feature analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 99% |
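Feature labels like "Person, 99%" come from Rekognition's `DetectLabels` operation. A hedged sketch of filtering such labels by confidence; the API call is commented-out context and `sample_labels` is a stand-in shaped like the documented response, with one hypothetical extra label for illustration:

```python
# Sketch: filtering Rekognition label results by confidence. The call is
# context only; sample_labels is a hand-built stand-in.
# import boto3
# resp = boto3.client("rekognition").detect_labels(
#     Image={"Bytes": image_bytes}, MinConfidence=50)

def confident_labels(labels, threshold=90.0):
    """Keep label names whose confidence meets the threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

sample_labels = [
    {"Name": "Person", "Confidence": 99.0},
    {"Name": "Art", "Confidence": 62.0},  # hypothetical extra label
]

print(confident_labels(sample_labels))  # ['Person']
```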
Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 85.8% |
| food drinks | 6.5% |
| paintings art | 5.3% |
| people portraits | 1.2% |
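Imagga's categorization endpoint returns a list of category objects with confidences. The sketch below ranks such a list; the JSON shape is an assumption modeled on the table above, and the HTTP call itself (`api.imagga.com`, basic auth) is omitted:

```python
# Sketch: ranking Imagga-style category results. "sample" is a stand-in
# shaped like the table above, not a real API response.

def top_categories(result, n=2):
    """Sort category entries by confidence, descending; keep the top n."""
    cats = sorted(result, key=lambda c: c["confidence"], reverse=True)
    return [(c["name"]["en"], c["confidence"]) for c in cats[:n]]

sample = [
    {"name": {"en": "interior objects"}, "confidence": 85.8},
    {"name": {"en": "food drinks"}, "confidence": 6.5},
    {"name": {"en": "paintings art"}, "confidence": 5.3},
    {"name": {"en": "people portraits"}, "confidence": 1.2},
]

print(top_categories(sample))
# [('interior objects', 85.8), ('food drinks', 6.5)]
```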
Captions
Microsoft
created on 2020-04-24
| Caption | Confidence |
| --- | --- |
| a close up of a glass vase | 35.1% |
| a glass vase | 29.9% |
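Microsoft's Computer Vision describe operation proposes several candidate captions, each with a confidence score, and pages like this keep the top candidates. A minimal sketch of selecting the best one; the response shape is an assumption mirroring the two captions above, and the REST/SDK call is omitted:

```python
# Sketch: choosing the highest-confidence caption from a Microsoft
# Computer Vision-style candidate list. "sample" mirrors the table above.

def best_caption(captions):
    """Pick the candidate caption with the highest confidence."""
    top = max(captions, key=lambda c: c["confidence"])
    return top["text"], top["confidence"]

sample = [
    {"text": "a close up of a glass vase", "confidence": 0.351},
    {"text": "a glass vase", "confidence": 0.299},
]

print(best_caption(sample))  # ('a close up of a glass vase', 0.351)
```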