Machine Generated Data
Tags
Color Analysis
Face Analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/20492452/476,200,31,38/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 25-35 |
| Gender | Male, 99.2% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 6% |
| Calm | 0.5% |
| Disgusted | 0.4% |
| Angry | 0.2% |
| Confused | 0.2% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/20492452/359,190,30,39/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 31-41 |
| Gender | Male, 100% |
| Calm | 100% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.2% |
| Angry | 0% |
| Disgusted | 0% |
| Confused | 0% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/20492452/416,170,28,39/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 25-35 |
| Gender | Male, 93.1% |
| Calm | 84.9% |
| Surprised | 8.2% |
| Fear | 7.3% |
| Sad | 3.9% |
| Confused | 1.8% |
| Disgusted | 1% |
| Angry | 0.6% |
| Happy | 0.3% |
![](https://ids.lib.harvard.edu/ids/iiif/20492452/259,184,32,42/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-31 |
| Gender | Male, 99% |
| Calm | 71.7% |
| Sad | 13.3% |
| Surprised | 7.8% |
| Fear | 6.8% |
| Disgusted | 3.9% |
| Confused | 3.2% |
| Angry | 2.2% |
| Happy | 0.5% |
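The face tables above follow the shape of an AWS Rekognition `DetectFaces` response: an age range, a gender estimate, and one confidence score per emotion (the emotion scores are independent and need not sum to 100%). A minimal sketch of flattening one `FaceDetail` record into rows like those above; the sample record below is illustrative, not the actual API output for this image:

```python
def face_detail_to_rows(face):
    """Flatten one Rekognition FaceDetail dict into (attribute, value) rows."""
    rows = [
        ("Age", f"{face['AgeRange']['Low']}-{face['AgeRange']['High']}"),
        ("Gender", f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%"),
    ]
    # Emotions carry independent confidences; list them highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        rows.append((emotion["Type"].capitalize(), f"{emotion['Confidence']:.1f}%"))
    return rows

# Illustrative FaceDetail record (assumed values, not the real response).
sample = {
    "AgeRange": {"Low": 25, "High": 35},
    "Gender": {"Value": "Male", "Confidence": 99.2},
    "Emotions": [
        {"Type": "SAD", "Confidence": 100.0},
        {"Type": "CALM", "Confidence": 0.5},
        {"Type": "HAPPY", "Confidence": 0.0},
    ],
}
for attr, value in face_detail_to_rows(sample):
    print(f"{attr} | {value}")
```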
Feature Analysis
Amazon
Adult

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/698,356,263,339/full/0/native.jpg) | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/62,223,214,405/full/0/native.jpg) | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/402,179,168,408/full/0/native.jpg) | 96.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/217,134,146,260/full/0/native.jpg) | 91.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/292,163,242,405/full/0/native.jpg) | 79% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/542,391,158,297/full/0/native.jpg) | 64.9% |

Bride

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/698,356,263,339/full/0/native.jpg) | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/62,223,214,405/full/0/native.jpg) | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/402,179,168,408/full/0/native.jpg) | 96.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/217,134,146,260/full/0/native.jpg) | 91.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/292,163,242,405/full/0/native.jpg) | 79% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/542,391,158,297/full/0/native.jpg) | 64.9% |

Female

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/698,356,263,339/full/0/native.jpg) | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/62,223,214,405/full/0/native.jpg) | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/402,179,168,408/full/0/native.jpg) | 96.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/217,134,146,260/full/0/native.jpg) | 91.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/292,163,242,405/full/0/native.jpg) | 79% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/542,391,158,297/full/0/native.jpg) | 64.9% |

Person

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/698,356,263,339/full/0/native.jpg) | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/62,223,214,405/full/0/native.jpg) | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/402,179,168,408/full/0/native.jpg) | 96.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/584,179,78,147/full/0/native.jpg) | 95.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/642,191,52,143/full/0/native.jpg) | 93.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/666,179,95,155/full/0/native.jpg) | 92.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/217,134,146,260/full/0/native.jpg) | 91.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/713,256,62,90/full/0/native.jpg) | 85.8% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/292,163,242,405/full/0/native.jpg) | 79% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/827,95,32,95/full/0/native.jpg) | 73.8% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/408,157,60,79/full/0/native.jpg) | 68.8% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/542,391,158,297/full/0/native.jpg) | 64.9% |

Woman

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/698,356,263,339/full/0/native.jpg) | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/62,223,214,405/full/0/native.jpg) | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/402,179,168,408/full/0/native.jpg) | 96.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/217,134,146,260/full/0/native.jpg) | 91.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/292,163,242,405/full/0/native.jpg) | 79% |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/542,391,158,297/full/0/native.jpg) | 64.9% |

Horse

| Crop | Confidence |
| --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/20492452/207,260,233,364/full/0/native.jpg) | 77.6% |
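Each crop above is an IIIF Image API URL whose region segment is `x,y,w,h` in pixels. Rekognition reports bounding boxes as ratios of the frame, so producing such a URL means scaling by the source image's pixel dimensions. A sketch under that assumption; the bounding box and dimensions below are made-up example values, not taken from this image's actual analysis:

```python
def iiif_crop_url(identifier, bbox, img_width, img_height):
    """Build an IIIF Image API crop URL from a Rekognition-style bounding box.

    bbox uses Rekognition's ratio convention (Left/Top/Width/Height in [0, 1]);
    the IIIF region segment wants absolute pixels as x,y,w,h.
    """
    x = round(bbox["Left"] * img_width)
    y = round(bbox["Top"] * img_height)
    w = round(bbox["Width"] * img_width)
    h = round(bbox["Height"] * img_height)
    return (f"https://ids.lib.harvard.edu/ids/iiif/{identifier}"
            f"/{x},{y},{w},{h}/full/0/native.jpg")

# Example values only: this bounding box and these pixel dimensions are
# assumptions for illustration.
url = iiif_crop_url("20492452",
                    {"Left": 0.5, "Top": 0.3, "Width": 0.2, "Height": 0.3},
                    img_width=1000, img_height=800)
print(url)  # → .../20492452/500,240,200,240/full/0/native.jpg
```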
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 88.1% |
| beaches seaside | 7.5% |
| streetview architecture | 3.4% |
Captions
Microsoft
created on 2019-02-18
| Caption | Confidence |
| --- | --- |
| a vintage photo of a person holding a book | 46.6% |
| a vintage photo of a person | 46.5% |
| a vintage photo of a book | 46.4% |