Machine Generated Data
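The face crops below are served via the IIIF Image API, whose URL path encodes a pixel region (`x,y,w,h`), size, rotation, and quality/format. A minimal sketch of recovering the crop coordinates from such a URL (the `iiif_region` helper is illustrative, not part of any library):

```python
# Parse the region segment of a IIIF Image API URL of the form
# {scheme}://{server}/{prefix}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}
# The region "x,y,w,h" gives the crop's top-left corner and dimensions in pixels.

def iiif_region(url: str) -> dict:
    """Extract x, y, width, height from a IIIF image URL (illustrative helper)."""
    parts = url.rstrip("/").split("/")
    # .../{identifier}/{region}/{size}/{rotation}/{quality}.{format}
    region = parts[-4]
    x, y, w, h = (int(v) for v in region.split(","))
    return {"x": x, "y": y, "width": w, "height": h}

crop = iiif_region(
    "https://ids.lib.harvard.edu/ids/iiif/18782519/884,482,70,104/full/0/native.jpg"
)
print(crop)  # {'x': 884, 'y': 482, 'width': 70, 'height': 104}
```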
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/18782519/884,482,70,104/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 36-52 |
| Gender | Male, 96.6% |
| Calm | 79.8% |
| Sad | 19.1% |
| Confused | 0.3% |
| Happy | 0.3% |
| Angry | 0.2% |
| Disgusted | 0.1% |
| Fear | 0.1% |
| Surprised | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/354,511,59,76/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 32-48 |
| Gender | Male, 99.6% |
| Calm | 92.2% |
| Angry | 5.5% |
| Sad | 0.8% |
| Happy | 0.5% |
| Disgusted | 0.4% |
| Confused | 0.3% |
| Surprised | 0.2% |
| Fear | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/563,517,55,69/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 31-47 |
| Gender | Male, 99.4% |
| Calm | 87.3% |
| Happy | 7.7% |
| Angry | 1.5% |
| Surprised | 1.2% |
| Confused | 1.1% |
| Sad | 0.5% |
| Fear | 0.3% |
| Disgusted | 0.3% |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/163,465,15,18/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 34-50 |
| Gender | Male, 91.5% |
| Happy | 60.8% |
| Calm | 26.2% |
| Angry | 3.7% |
| Sad | 3.2% |
| Disgusted | 2% |
| Confused | 1.6% |
| Surprised | 1.5% |
| Fear | 0.9% |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/87,461,13,17/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 7-17 |
| Gender | Male, 67% |
| Calm | 51.7% |
| Happy | 24.4% |
| Sad | 13% |
| Fear | 3.5% |
| Angry | 2.4% |
| Surprised | 1.9% |
| Confused | 1.7% |
| Disgusted | 1.5% |
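The values above have the shape of Rekognition's DetectFaces response, which returns one FaceDetails entry per face with an AgeRange, a Gender, and a list of Emotions scored by confidence. A minimal sketch of reducing such an entry to its dominant emotion, using the first face's numbers from the tables above as sample data (the response is hardcoded rather than fetched, since a live DetectFaces call needs AWS credentials):

```python
# Sample shaped like one FaceDetails entry from Rekognition's DetectFaces
# response; the values are taken from the first face analyzed above.
face = {
    "AgeRange": {"Low": 36, "High": 52},
    "Gender": {"Value": "Male", "Confidence": 96.6},
    "Emotions": [
        {"Type": "CALM", "Confidence": 79.8},
        {"Type": "SAD", "Confidence": 19.1},
        {"Type": "CONFUSED", "Confidence": 0.3},
        {"Type": "HAPPY", "Confidence": 0.3},
        {"Type": "ANGRY", "Confidence": 0.2},
        {"Type": "DISGUSTED", "Confidence": 0.1},
        {"Type": "FEAR", "Confidence": 0.1},
        {"Type": "SURPRISED", "Confidence": 0.1},
    ],
}

def dominant_emotion(face_details: dict) -> tuple:
    """Return the highest-confidence emotion label for one detected face."""
    top = max(face_details["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 79.8)
```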
![](https://ids.lib.harvard.edu/ids/iiif/18782519/864,501,70,70/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 39 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/369,522,57,57/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 48 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/562,527,52,52/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 50 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/552,497,78,92/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very likely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/865,454,117,135/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Likely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/338,488,90,104/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very likely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/18782519/159,459,23,27/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Unlikely |
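Unlike Rekognition's numeric confidences, Google Vision reports face attributes on a five-step likelihood scale (plus UNKNOWN). A minimal sketch of comparing such labels, using the first Google Vision face above as sample data; the integer ranks are our own convention for ordering, not part of the API:

```python
# Google Vision's Likelihood enum, ordered from least to most likely.
# The integer ranks are our own convention for comparison, not part of the API.
LIKELIHOOD_RANK = {
    "UNKNOWN": 0,
    "VERY_UNLIKELY": 1,
    "UNLIKELY": 2,
    "POSSIBLE": 3,
    "LIKELY": 4,
    "VERY_LIKELY": 5,
}

def most_likely(attributes: dict) -> str:
    """Return the attribute name with the highest likelihood rank."""
    return max(attributes, key=lambda k: LIKELIHOOD_RANK[attributes[k]])

# Values from the first Google Vision face above.
face_attrs = {
    "Surprise": "VERY_UNLIKELY",
    "Anger": "VERY_UNLIKELY",
    "Sorrow": "VERY_UNLIKELY",
    "Joy": "VERY_UNLIKELY",
    "Headwear": "VERY_LIKELY",
    "Blurred": "VERY_UNLIKELY",
}
print(most_likely(face_attrs))  # Headwear
```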
Categories
Imagga
| Category | Confidence |
| --- | --- |
| interior objects | 76.8% |
| pets animals | 12.6% |
| people portraits | 3.7% |
| paintings art | 2.2% |
| food drinks | 1.6% |
| nature landscape | 1.3% |
Captions
Microsoft
created on 2021-12-15
| Caption | Confidence |
| --- | --- |
| a group of people standing in front of a store | 87.4% |
| a group of people standing in front of a building | 87.3% |
| a group of people in front of a store | 85.5% |