Machine Generated Data
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/43159683/607,136,73,100/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-28 |
| Gender | Female, 97.6% |
| Sad | 69.4% |
| Calm | 38.8% |
| Fear | 20% |
| Surprised | 7.5% |
| Disgusted | 2% |
| Confused | 1.6% |
| Happy | 1.1% |
| Angry | 1% |
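Each thumbnail in this section is a IIIF Image API request against the full scan; the second path segment is a pixel region in `x,y,w,h` form. A minimal sketch of how those crop URLs are assembled (`iiif_crop` is a hypothetical helper, not part of any library; the region values come from the first face above):

```python
# Build IIIF Image API crop URLs like the thumbnails in this section.
# Region syntax per the IIIF Image API: {x},{y},{width},{height} in pixels.
IIIF_BASE = "https://ids.lib.harvard.edu/ids/iiif/43159683"

def iiif_crop(x: int, y: int, w: int, h: int) -> str:
    """Return a URL that crops the region (x, y, w, h) out of the full scan."""
    # "native" is the image-quality keyword this server uses (IIIF Image 1.x);
    # IIIF 2.x servers use "default" in that slot instead.
    return f"{IIIF_BASE}/{x},{y},{w},{h}/full/0/native.jpg"

# Region of the first detected face above.
print(iiif_crop(607, 136, 73, 100))
# -> https://ids.lib.harvard.edu/ids/iiif/43159683/607,136,73,100/full/0/native.jpg
```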
![](https://ids.lib.harvard.edu/ids/iiif/43159683/751,66,53,70/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 24-34 |
| Gender | Male, 98.3% |
| Surprised | 98.7% |
| Fear | 7.3% |
| Happy | 4.3% |
| Calm | 3.4% |
| Confused | 3.1% |
| Sad | 2.7% |
| Angry | 1.6% |
| Disgusted | 0.7% |
![](https://ids.lib.harvard.edu/ids/iiif/43159683/497,321,80,127/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 18-24 |
| Gender | Male, 61% |
| Calm | 67.1% |
| Surprised | 10.1% |
| Fear | 9% |
| Sad | 7.5% |
| Happy | 5% |
| Angry | 3.3% |
| Disgusted | 1.7% |
| Confused | 0.9% |
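The three tables above have the shape of Amazon Rekognition's `DetectFaces` response when all facial attributes are requested: an age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch with boto3 (the image file name is a placeholder, and a real call needs AWS credentials configured):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("page_scan.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        # Without Attributes=["ALL"], only bounding box, landmarks,
        # pose, and quality come back -- no age, gender, or emotions.
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age | {age['Low']}-{age['High']}")
    print(f"Gender | {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to mirror the tables above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} | {emotion['Confidence']:.1f}%")
```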
![](https://ids.lib.harvard.edu/ids/iiif/43159683/630,162,68,68/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 31 |
| Gender | Male |
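The single age/gender pair above matches what the Face API's `detect` operation returns when the `age` and `gender` face attributes are requested. A minimal REST sketch (endpoint, key, and file name are placeholders):

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-KEY"                                                # placeholder

with open("page_scan.jpg", "rb") as f:  # placeholder image file
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age | {attrs['age']:.0f}")                 # estimated age, e.g. 31
    print(f"Gender | {attrs['gender'].capitalize()}")  # "male" / "female"
```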
![](https://ids.lib.harvard.edu/ids/iiif/43159683/577,108,117,137/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Possible |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43159683/725,39,87,102/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
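Google Vision reports face attributes as bucketed likelihoods (`VERY_UNLIKELY` through `VERY_LIKELY`) rather than percentages, which is why these rows read "Very unlikely" instead of scores. A minimal sketch with the `google-cloud-vision` client (credentials and the image file are placeholders):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # needs Google Cloud credentials

with open("page_scan.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
    print("Surprise |", face.surprise_likelihood.name)
    print("Anger |", face.anger_likelihood.name)
    print("Sorrow |", face.sorrow_likelihood.name)
    print("Joy |", face.joy_likelihood.name)
    print("Headwear |", face.headwear_likelihood.name)
    print("Blurred |", face.blurred_likelihood.name)
```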
Feature analysis
Amazon
| Crop | Feature | Confidence |
| --- | --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/357,246,421,445/full/0/native.jpg) | Adult | 99% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/676,20,328,535/full/0/native.jpg) | Adult | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/20,342,205,335/full/0/native.jpg) | Adult | 95.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/135,13,280,472/full/0/native.jpg) | Adult | 94.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/365,31,149,219/full/0/native.jpg) | Adult | 94.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/24,17,230,525/full/0/native.jpg) | Adult | 91.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/357,246,421,445/full/0/native.jpg) | Male | 99% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/548,90,388,596/full/0/native.jpg) | Male | 98.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/676,20,328,535/full/0/native.jpg) | Male | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/135,13,280,472/full/0/native.jpg) | Male | 94.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/365,31,149,219/full/0/native.jpg) | Male | 94.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/24,17,230,525/full/0/native.jpg) | Male | 91.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/357,246,421,445/full/0/native.jpg) | Man | 99% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/676,20,328,535/full/0/native.jpg) | Man | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/135,13,280,472/full/0/native.jpg) | Man | 94.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/365,31,149,219/full/0/native.jpg) | Man | 94.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/24,17,230,525/full/0/native.jpg) | Man | 91.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/357,246,421,445/full/0/native.jpg) | Person | 99% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/548,90,388,596/full/0/native.jpg) | Person | 98.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/676,20,328,535/full/0/native.jpg) | Person | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/20,342,205,335/full/0/native.jpg) | Person | 95.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/135,13,280,472/full/0/native.jpg) | Person | 94.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/365,31,149,219/full/0/native.jpg) | Person | 94.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/24,17,230,525/full/0/native.jpg) | Person | 91.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/548,90,388,596/full/0/native.jpg) | Boy | 98.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/548,90,388,596/full/0/native.jpg) | Child | 98.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/20,342,205,335/full/0/native.jpg) | Bride | 95.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/20,342,205,335/full/0/native.jpg) | Female | 95.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/43159683/20,342,205,335/full/0/native.jpg) | Woman | 95.3% |
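The rows above pair Rekognition labels with crops of the detected instances. `DetectLabels` returns each instance's bounding box as ratios of the image dimensions, so turning a detection into one of the IIIF pixel regions used throughout this page is a matter of scaling. A sketch with boto3 (the scan's pixel dimensions below are placeholder values):

```python
import boto3

FULL_WIDTH, FULL_HEIGHT = 1024, 706  # placeholder scan dimensions in pixels

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("page_scan.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=90,  # drop low-confidence labels, as in the table above
    )

for label in response["Labels"]:
    # Labels with localizable objects carry Instances, each with a
    # bounding box expressed as ratios (0..1) of image width and height.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        x = round(box["Left"] * FULL_WIDTH)
        y = round(box["Top"] * FULL_HEIGHT)
        w = round(box["Width"] * FULL_WIDTH)
        h = round(box["Height"] * FULL_HEIGHT)
        # The x,y,w,h region slots straight into the IIIF crop URLs above.
        print(f"{label['Name']} | {instance['Confidence']:.1f}% | {x},{y},{w},{h}")
```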
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 86.3% |
| people portraits | 8.5% |
| pets animals | 4.6% |
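The three categories above ("paintings art", "people portraits", "pets animals") match the vocabulary of Imagga's `personal_photos` categorizer. A minimal sketch against the v2 REST API (the credentials are placeholders; the image URL is the full, uncropped scan from the same IIIF server used throughout this page):

```python
import requests

API_KEY = "YOUR-KEY"        # placeholder Imagga credentials
API_SECRET = "YOUR-SECRET"

# Full, uncropped scan: the IIIF region segment is "full" instead of x,y,w,h.
IMAGE_URL = "https://ids.lib.harvard.edu/ids/iiif/43159683/full/full/0/native.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} | {category['confidence']:.1f}%")
```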
Captions
Microsoft
created on 2018-05-11
| Caption | Confidence |
| --- | --- |
| a group of people around each other | 88.3% |
| a group of people sitting in front of a crowd | 85.2% |
| a group of people standing in front of a crowd | 85.1% |
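The ranked captions above have the shape of the Computer Vision "describe image" operation, which returns up to `maxCandidates` candidate captions, each with a confidence between 0 and 1. A minimal REST sketch (endpoint, key, and file name are placeholders; the v2.0 path is an assumption consistent with the 2018 creation date noted above):

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-KEY"                                                # placeholder

with open("page_scan.jpg", "rb") as f:  # placeholder image file
    resp = requests.post(
        f"{ENDPOINT}/vision/v2.0/describe",
        params={"maxCandidates": "3"},  # three captions, as listed above
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} | {caption['confidence'] * 100:.1f}%")
```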