# Machine Generated Data

## Face analysis

Face detection results from Amazon (AWS Rekognition), Microsoft (Cognitive Services), and Google (Vision).
![](https://ids.lib.harvard.edu/ids/iiif/32640601/328,185,41,57/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 19-36 |
| Gender | Male, 55% |
| Angry | 45.7% |
| Sad | 47% |
| Calm | 51.4% |
| Confused | 45.7% |
| Surprised | 45.1% |
| Happy | 45% |
| Disgusted | 45.1% |
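Each face crop in this document is served through the IIIF Image API, where the `x,y,w,h` region segment of the URL selects the detected bounding box in pixels. A minimal sketch of building such a URL (the helper name is hypothetical; the endpoint is the one used by the crops above):

```python
def iiif_region_url(image_id: str, x: int, y: int, w: int, h: int) -> str:
    """Build an IIIF Image API URL that crops the pixel region (x, y, w, h).

    Assumes Harvard's IDS IIIF endpoint, as used by the face crops above.
    """
    base = "https://ids.lib.harvard.edu/ids/iiif"
    return f"{base}/{image_id}/{x},{y},{w},{h}/full/0/native.jpg"

# The first face crop in this document:
print(iiif_region_url("32640601", 328, 185, 41, 57))
```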
![](https://ids.lib.harvard.edu/ids/iiif/32640601/167,213,38,42/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-38 |
| Gender | Male, 52.3% |
| Angry | 52.6% |
| Surprised | 45.1% |
| Happy | 45% |
| Sad | 45.2% |
| Calm | 46.9% |
| Disgusted | 45.1% |
| Confused | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/661,239,44,53/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Male, 54.8% |
| Disgusted | 45.2% |
| Happy | 45.2% |
| Angry | 45.8% |
| Surprised | 45.3% |
| Sad | 45.4% |
| Calm | 52.8% |
| Confused | 45.3% |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/603,175,32,43/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Female, 51.2% |
| Happy | 45.1% |
| Calm | 46.6% |
| Surprised | 45.1% |
| Angry | 45.9% |
| Confused | 45.4% |
| Disgusted | 45.1% |
| Sad | 51.7% |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/872,241,50,65/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Male, 99.2% |
| Sad | 1.8% |
| Disgusted | 0.2% |
| Confused | 0.5% |
| Surprised | 0.5% |
| Calm | 96.3% |
| Angry | 0.6% |
| Happy | 0.2% |
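Rekognition reports a confidence score for every emotion rather than a single label, so the dominant emotion for each face is simply the highest-scoring entry. A short sketch, using the values reported for the face directly above:

```python
# Emotion confidences for the fifth detected face, as reported above.
emotions = {
    "Sad": 1.8, "Disgusted": 0.2, "Confused": 0.5, "Surprised": 0.5,
    "Calm": 96.3, "Angry": 0.6, "Happy": 0.2,
}

# The dominant emotion is the key with the highest confidence score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```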
![](https://ids.lib.harvard.edu/ids/iiif/32640601/331,197,48,48/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 30 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/656,246,46,46/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 33 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/860,257,49,49/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 44 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/601,185,35,35/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 48 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/649,222,69,81/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/314,170,73,86/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/595,168,53,61/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/32640601/861,228,75,88/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
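Unlike Rekognition's numeric confidences, the Google Vision face detector returns bucketed likelihood enums (`VERY_UNLIKELY` through `VERY_LIKELY`); the "Very unlikely" strings above are display forms of those enum values. A small sketch of that mapping (the display strings are assumed to match this document's rendering):

```python
# Google Vision Likelihood enum names mapped to the display strings used above.
LIKELIHOOD_DISPLAY = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

def display_likelihood(enum_name: str) -> str:
    """Convert a Vision API likelihood enum name to its display string."""
    return LIKELIHOOD_DISPLAY.get(enum_name, "Unknown")

print(display_likelihood("VERY_UNLIKELY"))  # Very unlikely
```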
## Feature analysis

### Categories (Imagga)
| Category | Confidence |
| --- | --- |
| paintings art | 46.1% |
| streetview architecture | 36.8% |
| nature landscape | 14.2% |
| people portraits | 1.9% |
| pets animals | 0.6% |
| beaches seaside | 0.3% |
### Captions (Microsoft, created on 2018-03-23)

| Caption | Confidence |
| --- | --- |
| a group of people walking down a dirt road | 97.5% |
| a group of people walking on a dirt road | 97% |
| a group of people on a dirt road | 96.7% |