Machine Generated Data
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/20575504/957,322,48,55/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 45-63 |
| Gender | Female, 52.5% |
| Sad | 53.4% |
| Disgusted | 45.7% |
| Angry | 45.3% |
| Calm | 45.3% |
| Confused | 45.2% |
| Happy | 45.1% |
| Surprised | 45.1% |
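Each face block above mirrors the `Emotions` array of an Amazon Rekognition `DetectFaces` response, flattened to `Emotion | confidence% |` rows. A minimal sketch (the parsing is ours, not part of any API) of recovering the dominant emotion from rows like the first face's:

```python
# Parse flattened "Emotion | confidence% |" rows (values copied from the
# first face above) and pick the emotion Rekognition scored highest.
rows = """
Disgusted | 45.7% |
Happy | 45.1% |
Confused | 45.2% |
Sad | 53.4% |
Angry | 45.3% |
Surprised | 45.1% |
Calm | 45.3% |
""".strip().splitlines()

emotions = {}
for row in rows:
    name, value, *_ = [cell.strip() for cell in row.split("|")]
    emotions[name] = float(value.rstrip("%"))

dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # Sad 53.4
```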
![](https://ids.lib.harvard.edu/ids/iiif/20575504/644,327,37,43/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Male, 50.2% |
| Sad | 50.4% |
| Angry | 46.7% |
| Calm | 46.4% |
| Disgusted | 45.5% |
| Happy | 45.4% |
| Confused | 45.3% |
| Surprised | 45.3% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/695,313,38,49/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 17-27 |
| Gender | Male, 50.6% |
| Sad | 50.5% |
| Calm | 47.5% |
| Angry | 45.8% |
| Disgusted | 45.5% |
| Confused | 45.4% |
| Surprised | 45.2% |
| Happy | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/566,311,35,45/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Male, 51.4% |
| Sad | 50.3% |
| Calm | 46.3% |
| Angry | 46.1% |
| Disgusted | 46.0% |
| Confused | 45.7% |
| Surprised | 45.4% |
| Happy | 45.2% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/408,313,50,76/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-38 |
| Gender | Male, 70.4% |
| Calm | 50.9% |
| Confused | 13.6% |
| Surprised | 8.9% |
| Sad | 8.8% |
| Angry | 6.7% |
| Happy | 6.6% |
| Disgusted | 4.5% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/117,344,45,57/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 15-25 |
| Gender | Male, 51.7% |
| Calm | 53.0% |
| Angry | 45.6% |
| Confused | 45.5% |
| Surprised | 45.4% |
| Sad | 45.2% |
| Disgusted | 45.1% |
| Happy | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/482,311,36,44/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-38 |
| Gender | Female, 50.5% |
| Calm | 52.8% |
| Surprised | 45.7% |
| Happy | 45.5% |
| Sad | 45.3% |
| Confused | 45.3% |
| Angry | 45.2% |
| Disgusted | 45.2% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/225,292,60,79/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-38 |
| Gender | Male, 80.7% |
| Calm | 71.3% |
| Angry | 7.9% |
| Sad | 7.0% |
| Confused | 6.7% |
| Surprised | 3.6% |
| Happy | 2.6% |
| Disgusted | 1.0% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/795,304,42,56/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 15-25 |
| Gender | Female, 50.7% |
| Calm | 52.4% |
| Sad | 46.6% |
| Happy | 45.3% |
| Angry | 45.3% |
| Disgusted | 45.1% |
| Confused | 45.1% |
| Surprised | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/329,335,27,44/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 15-25 |
| Gender | Female, 52.3% |
| Calm | 51.7% |
| Sad | 46.1% |
| Angry | 45.7% |
| Confused | 45.6% |
| Surprised | 45.6% |
| Disgusted | 45.3% |
| Happy | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/413,329,55,55/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 32 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/795,316,42,42/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 23 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/694,324,38,38/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 23 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/234,303,65,65/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 40 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/120,353,49,49/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 22 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/644,335,38,38/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 33 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/487,316,36,36/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 14 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/787,293,63,73/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/690,304,57,67/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/105,327,72,83/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/393,299,83,97/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/20575504/639,313,56,65/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Possible |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
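Unlike the percentage scores above, Google Vision reports face attributes as ordinal `Likelihood` buckets (`VERY_UNLIKELY` through `VERY_LIKELY`). A small sketch (the mapping helper is ours) of ranking the display strings so the blocks can be compared numerically:

```python
# Google Vision's Likelihood enum is ordinal; map the display strings used
# above onto their position in that order for numeric comparison.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def rank(label: str) -> int:
    """Ordinal position of a likelihood bucket (0 = Very unlikely)."""
    return LIKELIHOOD_ORDER.index(label)

# The last face above is the only one where Sorrow reaches "Possible".
print(rank("Possible") > rank("Very unlikely"))  # True
```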
Feature analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 99% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 50.8% |
| people portraits | 22.0% |
| events parties | 18.0% |
| food drinks | 6.4% |
| paintings art | 2.2% |
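Imagga's category confidences appear to behave like a distribution over its taxonomy rather than independent scores; a quick check (values copied from the table above, the interpretation is our assumption) that they account for essentially the whole image:

```python
# Imagga category confidences for this image, copied from the table above.
categories = {
    "interior objects": 50.8,
    "people portraits": 22.0,
    "events parties": 18.0,
    "food drinks": 6.4,
    "paintings art": 2.2,
}

# The scores sum to ~100%, consistent with a single distribution.
total = sum(categories.values())
print(round(total, 1))  # 99.4
```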
Captions
Microsoft
Created on 2019-06-05

| Caption | Confidence |
| --- | --- |
| a group of people sitting at a table | 95.6% |
| a group of people sitting around a table | 95.4% |
| a group of people sitting at a table in a restaurant | 93.9% |
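The face crops throughout this page are IIIF Image API URLs: the path segment after the image id is a pixel region `x,y,w,h`, followed by size, rotation, and quality/format segments. A minimal sketch of rebuilding such a crop URL (the helper name is ours):

```python
# Build a IIIF Image API URL for a pixel-region crop of the source image,
# following the {base}/{region}/{size}/{rotation}/{quality}.{format} pattern.
BASE = "https://ids.lib.harvard.edu/ids/iiif/20575504"

def crop_url(x: int, y: int, w: int, h: int) -> str:
    """Full-resolution ("full" size, 0 rotation) crop at region x,y,w,h."""
    return f"{BASE}/{x},{y},{w},{h}/full/0/native.jpg"

# Reproduces the first face crop on this page.
print(crop_url(957, 322, 48, 55))
# https://ids.lib.harvard.edu/ids/iiif/20575504/957,322,48,55/full/0/native.jpg
```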