Machine Generated Data
Face Analysis
![](https://ids.lib.harvard.edu/ids/iiif/43188757/565,187,33,46/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 12-20 |
| Gender | Male, 99.1% |
| Calm | 99.1% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.2% |
| Confused | 0.3% |
| Angry | 0.1% |
| Disgusted | 0.1% |
| Happy | 0% |
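The face thumbnails on this page are served through the IIIF Image API, where the path segment after the image ID encodes a pixel region as `x,y,w,h`, followed by size, rotation, and quality parameters. A minimal sketch of building such a crop URL (the helper name is illustrative; the base URL and coordinates are taken from the first crop above):

```python
def iiif_crop_url(base: str, image_id: str, x: int, y: int, w: int, h: int) -> str:
    """Build an IIIF Image API URL for a pixel-region crop.

    The region segment is 'x,y,w,h'; 'full/0/native.jpg' requests the
    region at full size, unrotated, in the server's native quality.
    """
    return f"{base}/{image_id}/{x},{y},{w},{h}/full/0/native.jpg"

# Reconstruct the first face crop shown on this page:
url = iiif_crop_url("https://ids.lib.harvard.edu/ids/iiif", "43188757", 565, 187, 33, 46)
```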
![](https://ids.lib.harvard.edu/ids/iiif/43188757/161,182,36,46/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 21-29 |
| Gender | Male, 100% |
| Calm | 97.6% |
| Surprised | 6.5% |
| Fear | 5.9% |
| Sad | 2.3% |
| Angry | 0.6% |
| Confused | 0.2% |
| Happy | 0.2% |
| Disgusted | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/447,219,28,38/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 22-30 |
| Gender | Male, 100% |
| Calm | 39.1% |
| Surprised | 22.7% |
| Angry | 21.6% |
| Confused | 11.5% |
| Fear | 6.3% |
| Happy | 5.5% |
| Sad | 2.8% |
| Disgusted | 2.1% |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/266,164,41,53/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-28 |
| Gender | Male, 99.8% |
| Calm | 97.9% |
| Surprised | 6.4% |
| Fear | 6% |
| Sad | 2.3% |
| Confused | 0.7% |
| Disgusted | 0.3% |
| Angry | 0.2% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/797,168,37,47/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 21-29 |
| Gender | Male, 98.2% |
| Calm | 98.8% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.3% |
| Angry | 0.2% |
| Confused | 0.2% |
| Happy | 0.1% |
| Disgusted | 0.1% |
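Tables like the ones above reflect AWS Rekognition's `DetectFaces` response, which (when all attributes are requested) includes an `Emotions` list of `{Type, Confidence}` entries per face. A hedged sketch of ranking those scores into the descending order shown here, using a hard-coded sample shaped like the first face's table in place of a live API call:

```python
def rank_emotions(face_detail: dict) -> list[tuple[str, float]]:
    """Sort a Rekognition FaceDetail's emotion scores, highest confidence first."""
    emotions = face_detail.get("Emotions", [])
    return sorted(
        ((e["Type"], e["Confidence"]) for e in emotions),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Sample values taken from the first AWS Rekognition table on this page:
sample = {"Emotions": [
    {"Type": "HAPPY", "Confidence": 0.0},
    {"Type": "CALM", "Confidence": 99.1},
    {"Type": "SURPRISED", "Confidence": 6.3},
]}
ranking = rank_emotions(sample)  # CALM first, then SURPRISED, then HAPPY
```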
![](https://ids.lib.harvard.edu/ids/iiif/43188757/790,171,42,42/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 28 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/256,171,39,39/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 30 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/154,188,37,37/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 32 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/256,144,66,77/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Possible |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/153,165,57,66/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Likely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/438,202,52,61/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Possible |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/790,151,58,67/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Possible |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43188757/555,168,59,68/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Possible |
| Blurred | Very unlikely |
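Unlike Rekognition's percentages, Google Vision's face annotations report each feature as a likelihood enum (`VERY_UNLIKELY` through `VERY_LIKELY`) rather than a numeric score; the tables above show those enums as display labels. A small sketch of that mapping (the function and dictionary names are illustrative):

```python
# Google Vision likelihood enum names mapped to the display labels used above.
LIKELIHOOD_LABELS = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

def label(likelihood: str) -> str:
    """Map a Vision API likelihood enum name to a human-readable label."""
    return LIKELIHOOD_LABELS.get(likelihood, "Unknown")

# e.g. the headwear row of the second Google Vision table:
headwear = label("LIKELY")  # "Likely"
```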
Feature Analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 49.8% |
| interior objects | 41.7% |
| people portraits | 4.1% |
| pets animals | 3.5% |
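Imagga's categorizer assigns the image a confidence per category, as listed above. A minimal sketch of selecting the top category from scores like these (the function name is illustrative; the values are copied from the table):

```python
def top_category(scores: dict[str, float]) -> str:
    """Return the category with the highest confidence score."""
    return max(scores, key=scores.get)

# Imagga category scores from the table above:
imagga = {
    "paintings art": 49.8,
    "interior objects": 41.7,
    "people portraits": 4.1,
    "pets animals": 3.5,
}
best = top_category(imagga)  # "paintings art"
```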