Machine Generated Data
Tags
Amazon
created on 2022-01-15
Clarifai
created on 2023-10-26
Imagga
created on 2022-01-15
Google
created on 2022-01-15
| Tag | Confidence (%) |
| --- | --- |
| Font | 84.6 |
| Poster | 81 |
| Art | 77.2 |
| Rectangle | 70.4 |
| Advertising | 65.2 |
| Visual arts | 59.6 |
| Illustration | 58.3 |
| Room | 57.7 |
| Sitting | 56.5 |
| Pattern | 56.2 |
| History | 54.8 |
| Publication | 53.8 |
Microsoft
created on 2022-01-15
| Tag | Confidence (%) |
| --- | --- |
| text | 99.9 |
| drawing | 97.9 |
| cartoon | 92.2 |
| outdoor | 90.1 |
| sketch | 89.7 |
| poster | 79.9 |
| handwriting | 77.6 |
| person | 61.5 |
| art | 59.3 |
| black and white | 54.3 |
Color Analysis
Face analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18820179/356,317,76,97/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-33 |
| Gender | Female, 57.9% |
| Happy | 75.8% |
| Fear | 9.5% |
| Surprised | 5.9% |
| Angry | 2.8% |
| Calm | 2.3% |
| Sad | 1.4% |
| Disgusted | 1.3% |
| Confused | 0.9% |
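The attribute rows above match the shape of an AWS Rekognition `DetectFaces` response (`FaceDetails` with `AgeRange`, `Gender`, and `Emotions` fields). A minimal sketch of flattening one such entry into the rows shown; `sample_face` is a hypothetical response fragment echoing the table values, not a live API call:

```python
# Flatten one Rekognition FaceDetails entry (detect_faces with
# Attributes=["ALL"]) into attribute/value rows like the table above.
sample_face = {  # hypothetical sample mirroring the response shape
    "AgeRange": {"Low": 23, "High": 33},
    "Gender": {"Value": "Female", "Confidence": 57.9},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 75.8},
        {"Type": "FEAR", "Confidence": 9.5},
        {"Type": "SURPRISED", "Confidence": 5.9},
    ],
}

def face_rows(face: dict) -> list[tuple[str, str]]:
    """Return (attribute, value) rows: age range, gender, then emotions
    sorted by descending confidence."""
    rows = [
        ("Age", f"{face['AgeRange']['Low']}-{face['AgeRange']['High']}"),
        ("Gender", f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%"),
    ]
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        rows.append((emo["Type"].capitalize(), f"{emo['Confidence']:.1f}%"))
    return rows

for attr, value in face_rows(sample_face):
    print(f"{attr} | {value}")
```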
Feature analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| streetview architecture | 75.1% |
| paintings art | 15.5% |
| text visuals | 6.6% |
| interior objects | 2.8% |
Captions
Microsoft
created on 2022-01-15
| Caption | Confidence |
| --- | --- |
| a man holding a sign | 60.4% |
| a man standing in front of a sign | 60.3% |
| a man standing in front of a building | 58.3% |
Text analysis
Amazon
![](https://ids.lib.harvard.edu/ids/iiif/18820179/441,224,54,31/full/0/native.jpg)
ON
![](https://ids.lib.harvard.edu/ids/iiif/18820179/186,151,104,61/full/0/native.jpg)
BIG
![](https://ids.lib.harvard.edu/ids/iiif/18820179/441,223,153,35/full/0/native.jpg)
ON THE
![](https://ids.lib.harvard.edu/ids/iiif/18820179/510,223,84,35/full/0/native.jpg)
THE
![](https://ids.lib.harvard.edu/ids/iiif/18820179/186,148,410,71/full/0/native.jpg)
BIG SAVINGS
![](https://ids.lib.harvard.edu/ids/iiif/18820179/206,75,381,76/full/0/native.jpg)
FATTEST
![](https://ids.lib.harvard.edu/ids/iiif/18820179/442,261,154,42/full/0/native.jpg)
ROAD
![](https://ids.lib.harvard.edu/ids/iiif/18820179/322,154,274,63/full/0/native.jpg)
SAVINGS
![](https://ids.lib.harvard.edu/ids/iiif/18820179/261,879,83,90/full/0/native.jpg)
Big
![](https://ids.lib.harvard.edu/ids/iiif/18820179/261,872,249,97/full/0/native.jpg)
Big Bertha
![](https://ids.lib.harvard.edu/ids/iiif/18820179/356,875,153,69/full/0/native.jpg)
Bertha
![](https://ids.lib.harvard.edu/ids/iiif/18820179/419,6,222,32/full/0/native.jpg)
46522-A
![](https://ids.lib.harvard.edu/ids/iiif/18820179/419,6,259,32/full/0/native.jpg)
46522-A OOX
![](https://ids.lib.harvard.edu/ids/iiif/18820179/637,18,41,13/full/0/native.jpg)
OOX
![](https://ids.lib.harvard.edu/ids/iiif/18820179/212,84,376,70/full/0/native.jpg)
FATTEST
![](https://ids.lib.harvard.edu/ids/iiif/18820179/189,156,104,70/full/0/native.jpg)
BIG
![](https://ids.lib.harvard.edu/ids/iiif/18820179/325,156,273,64/full/0/native.jpg)
SAVINGS
![](https://ids.lib.harvard.edu/ids/iiif/18820179/513,225,84,37/full/0/native.jpg)
ΤHE
![](https://ids.lib.harvard.edu/ids/iiif/18820179/434,263,165,49/full/0/native.jpg)
ROAD
![](https://ids.lib.harvard.edu/ids/iiif/18820179/266,877,95,86/full/0/native.jpg)
Big
![](https://ids.lib.harvard.edu/ids/iiif/18820179/365,871,146,87/full/0/native.jpg)
Beritha
![](https://ids.lib.harvard.edu/ids/iiif/18820179/189,10,410,953/full/0/native.jpg)
165
FATTEST
BIG SAVINGS
ΟΝ ΤHE
ROAD
Big Beritha
![](https://ids.lib.harvard.edu/ids/iiif/18820179/415,10,89,33/full/0/native.jpg)
165
![](https://ids.lib.harvard.edu/ids/iiif/18820179/443,226,57,34/full/0/native.jpg)
ΟΝ
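Every crop on this page is a region request against the Harvard image delivery service following the IIIF Image API URL pattern `{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}`. A minimal sketch of building such a URL from a detected bounding box (the identifier and box are taken from the links above; the helper name is illustrative):

```python
# Build an IIIF Image API region URL for a pixel bounding box.
# Pattern: {base}/{identifier}/{x},{y},{w},{h}/{size}/{rotation}/{quality}.{format}
BASE = "https://ids.lib.harvard.edu/ids/iiif"  # Harvard image delivery service

def iiif_crop(identifier: str, x: int, y: int, w: int, h: int) -> str:
    """Return the URL for the w-by-h pixel region at (x, y), full size,
    unrotated, in the server's native quality."""
    return f"{BASE}/{identifier}/{x},{y},{w},{h}/full/0/native.jpg"

# The face bounding box reported earlier on this page:
print(iiif_crop("18820179", 356, 317, 76, 97))
# → https://ids.lib.harvard.edu/ids/iiif/18820179/356,317,76,97/full/0/native.jpg
```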