Machine Generated Data
Tags
Amazon
created on 2019-03-25
| Tag | Confidence (%) |
| --- | --- |
| Crowd | 99.7 |
| Audience | 99.7 |
| Human | 99.7 |
| Person | 99.3 |
| Person | 99.1 |
| Indoors | 98.9 |
| Interior Design | 98.9 |
| Person | 98.7 |
| Person | 98.5 |
| Person | 97 |
| Person | 93.9 |
| Person | 91.1 |
| Speech | 89.4 |
| Person | 88.1 |
| Room | 87.2 |
| Lecture | 86.9 |
| Person | 81.6 |
| Person | 74.5 |
| Person | 72.5 |
| Person | 70.6 |
| Person | 69 |
| Classroom | 63.1 |
| School | 63.1 |
| Person | 61.3 |
| Priest | 59.7 |
| Seminar | 56 |
| Person | 43 |
Clarifai
created on 2019-03-25
Imagga
created on 2019-03-25
Google
created on 2019-03-25
| Tag | Confidence (%) |
| --- | --- |
| Photograph | 95.5 |
| Snapshot | 84.7 |
| Black-and-white | 78.1 |
| Room | 65.7 |
| Event | 62.7 |
| Photography | 62.4 |
| Monochrome | 54.4 |
Color Analysis
Face analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18832745/286,277,18,26/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Female, 50.3% |
| Sad | 49.8% |
| Surprised | 49.6% |
| Disgusted | 49.6% |
| Happy | 49.6% |
| Angry | 49.6% |
| Confused | 49.6% |
| Calm | 49.8% |
Feature analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18832745/866,400,122,187/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/385,552,239,223/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/728,407,124,166/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/573,459,166,312/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/18,511,223,279/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/306,454,168,245/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/458,389,103,122/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/150,467,169,218/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/39,314,88,146/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/249,264,73,155/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/159,133,66,176/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/82,424,103,179/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/497,410,72,85/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/354,406,110,145/full/0/native.jpg)
![](https://ids.lib.harvard.edu/ids/iiif/18832745/99,400,63,60/full/0/native.jpg)
| Feature | Confidence |
| --- | --- |
| Person | 99.3% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| streetview architecture | 89.8% |
| interior objects | 5.5% |
| events parties | 3.3% |
Captions
Microsoft
created on 2019-03-25
| Caption | Confidence |
| --- | --- |
| a group of people performing on a counter | 89% |
| a group of people in a room | 88.9% |
| a group of people sitting in front of a crowd | 84% |
Text analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18832745/993,644,0,0/full/0/native.jpg)
KODOK--2ELA--IRW