# Machine Generated Data
## Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/43160005/754,124,133,167/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 99.9% |
| Calm | 84.7% |
| Sad | 15.3% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Confused | 0.2% |
| Angry | 0.1% |
| Disgusted | 0.1% |
| Happy | 0% |
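Each crop shown on this page is an IIIF Image API request: the path segment after the image identifier (`754,124,133,167` in the crop above) gives the region as `x,y,w,h` in source-image pixels, followed by size, rotation, and quality/format segments. A minimal sketch of rebuilding such a URL (the helper name is illustrative):

```python
def iiif_crop_url(identifier, x, y, w, h,
                  base="https://ids.lib.harvard.edu/ids/iiif",
                  size="full", rotation=0, quality="native", fmt="jpg"):
    """Build an IIIF Image API URL for a rectangular crop (region = x,y,w,h)."""
    region = f"{x},{y},{w},{h}"
    return f"{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{fmt}"

# Reconstructs the first face crop listed above:
url = iiif_crop_url(43160005, 754, 124, 133, 167)
# → https://ids.lib.harvard.edu/ids/iiif/43160005/754,124,133,167/full/0/native.jpg
```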
![](https://ids.lib.harvard.edu/ids/iiif/43160005/570,321,107,131/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 100% |
| Happy | 99.2% |
| Surprised | 6.4% |
| Fear | 5.9% |
| Sad | 2.2% |
| Calm | 0.2% |
| Confused | 0.1% |
| Angry | 0% |
| Disgusted | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/193,400,108,127/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 100% |
| Calm | 98.7% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.2% |
| Angry | 0.7% |
| Confused | 0.4% |
| Disgusted | 0% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/348,287,35,48/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 22-30 |
| Gender | Male, 99.9% |
| Calm | 78.1% |
| Confused | 11.3% |
| Surprised | 7% |
| Fear | 6% |
| Angry | 4.4% |
| Sad | 2.5% |
| Happy | 2.1% |
| Disgusted | 1.2% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/277,317,38,53/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 21-29 |
| Gender | Male, 100% |
| Surprised | 60.6% |
| Fear | 27.5% |
| Sad | 19.2% |
| Calm | 8.5% |
| Happy | 8.2% |
| Angry | 1.3% |
| Confused | 1% |
| Disgusted | 1% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/316,340,38,50/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 38-46 |
| Gender | Female, 100% |
| Confused | 31.6% |
| Calm | 20.7% |
| Happy | 16.8% |
| Angry | 14.1% |
| Fear | 8.4% |
| Surprised | 8.1% |
| Sad | 4.4% |
| Disgusted | 2.5% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/388,78,11,14/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 16-24 |
| Gender | Female, 81.8% |
| Calm | 72.9% |
| Happy | 9% |
| Sad | 8.1% |
| Fear | 7.3% |
| Surprised | 6.7% |
| Angry | 1.9% |
| Confused | 1.6% |
| Disgusted | 0.6% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/316,63,12,15/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 16-24 |
| Gender | Male, 81% |
| Calm | 93.4% |
| Surprised | 7% |
| Fear | 6% |
| Sad | 2.5% |
| Happy | 1.4% |
| Confused | 1% |
| Angry | 0.7% |
| Disgusted | 0.5% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/362,69,9,12/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 16-24 |
| Gender | Male, 98.2% |
| Calm | 74% |
| Sad | 17.9% |
| Surprised | 6.7% |
| Fear | 6.1% |
| Confused | 5% |
| Angry | 1.7% |
| Happy | 1.1% |
| Disgusted | 0.8% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/386,101,12,16/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 47-53 |
| Gender | Female, 90% |
| Calm | 94.1% |
| Surprised | 6.7% |
| Fear | 6% |
| Happy | 2.3% |
| Sad | 2.3% |
| Angry | 1% |
| Disgusted | 0.5% |
| Confused | 0.4% |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/336,63,11,14/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 4-12 |
| Gender | Male, 100% |
| Calm | 74.2% |
| Happy | 7% |
| Fear | 6.8% |
| Surprised | 6.8% |
| Sad | 6% |
| Angry | 4.5% |
| Confused | 1.8% |
| Disgusted | 1.6% |
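The per-face tables above mirror the shape of a Rekognition `DetectFaces` response, in which each entry of `FaceDetails[i]["Emotions"]` is a `Type`/`Confidence` pair. The confidences are scored independently, which is why a single face's emotion values can sum to more than 100%. A minimal sketch of ranking them, with values hand-copied from the first face above:

```python
# Emotion scores as they appear in a Rekognition DetectFaces response
# (FaceDetails[i]["Emotions"]); values copied from the first face table above.
emotions = [
    {"Type": "CALM", "Confidence": 84.7},
    {"Type": "SAD", "Confidence": 15.3},
    {"Type": "SURPRISED", "Confidence": 6.3},
    {"Type": "FEAR", "Confidence": 5.9},
    {"Type": "CONFUSED", "Confidence": 0.2},
    {"Type": "ANGRY", "Confidence": 0.1},
    {"Type": "DISGUSTED", "Confidence": 0.1},
    {"Type": "HAPPY", "Confidence": 0.0},
]

# Sort descending by confidence, as the tables on this page are ordered.
ranked = sorted(emotions, key=lambda e: e["Confidence"], reverse=True)
for e in ranked:
    print(f'{e["Type"].title():<10} {e["Confidence"]:.1f}%')
```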
![](https://ids.lib.harvard.edu/ids/iiif/43160005/193,415,112,112/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 29 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/278,328,46,46/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 32 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/314,349,40,40/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 44 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/349,298,38,38/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 29 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/750,161,129,129/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 34 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/570,336,112,112/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 75 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/719,95,180,209/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/546,283,152,177/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very likely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/163,356,160,187/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/307,324,61,72/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43160005/271,308,58,68/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
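Unlike Rekognition's percentages, the Vision API reports face attributes as bucketed likelihood enums (`VERY_UNLIKELY` through `VERY_LIKELY`). A small sketch for turning the labels above into comparable ordinals, using the API's enum order:

```python
# Google Vision likelihood buckets in ascending enum order (UNKNOWN = 0).
LIKELIHOOD = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
              "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def likelihood_rank(label: str) -> int:
    """Map a display label like 'Very unlikely' to its ordinal bucket (1-5)."""
    return LIKELIHOOD.index(label.strip().upper().replace(" ", "_"))

likelihood_rank("Very likely") > likelihood_rank("Very unlikely")  # True
```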
## Feature analysis

### Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 42.2% |
| people portraits | 26.6% |
| paintings art | 24% |
| pets animals | 2.6% |
| food drinks | 2.6% |
| text visuals | 1.2% |
### Captions
Microsoft
created on 2018-05-11
| Caption | Confidence |
| --- | --- |
| a group of young children sitting next to a window | 45.5% |
| a group of people sitting in front of a window | 45.4% |
| a girl sitting in front of a window | 45.3% |