Machine Generated Data
Tags
Amazon
created on 2023-10-25
| Tag | Confidence (%) |
| --- | --- |
| Face | 100 |
| Head | 100 |
| Photography | 100 |
| Portrait | 100 |
| Art | 99.9 |
| Painting | 99.9 |
| Person | 99.6 |
| Adult | 99.6 |
| Male | 99.6 |
| Man | 99.6 |
| Clothing | 97.2 |
| Formal Wear | 97.2 |
| Suit | 97.2 |
| Text | 89.7 |
| Handwriting | 81.6 |
| Accessories | 56.6 |
| Earring | 56.6 |
| Jewelry | 56.6 |
| Lady | 55.4 |
| Photo Booth | 55 |
Clarifai
created on 2019-02-26
Imagga
created on 2019-02-26
Google
created on 2019-02-26
| Tag | Confidence (%) |
| --- | --- |
| Photograph | 96.9 |
| Gentleman | 81.1 |
| Portrait | 75 |
| Art | 65.5 |
Microsoft
created on 2019-02-26
| Tag | Confidence (%) |
| --- | --- |
| gallery | 93.2 |
| room | 92 |
| scene | 90.7 |
| picture frame | 6.9 |
| art | 6.9 |
| museum | 3.9 |
| black and white | 3.6 |
| portrait | 2.3 |
| person | 2.1 |
Color Analysis
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/20410141/317,313,90,136/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 43-51 |
| Gender | Male, 100% |
| Calm | 97.8% |
| Surprised | 6.4% |
| Fear | 5.9% |
| Sad | 2.2% |
| Angry | 0.6% |
| Confused | 0.3% |
| Disgusted | 0.3% |
| Happy | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/20410141/300,344,93,93/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 49 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/20410141/290,282,152,178/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
Feature analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 91.2% |
| people portraits | 8.4% |
Captions
Microsoft
created on 2019-02-26
| Caption | Confidence |
| --- | --- |
| an old photo of a person | 49.5% |
| a person taking a selfie in a room | 35.8% |
| a person in a room | 35.7% |
Text analysis
Amazon
| Detected text | Image crop |
| --- | --- |
| to | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/258,987,14,14/full/0/native.jpg) |
| gray | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/303,991,36,18/full/0/native.jpg) |
| by | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/386,989,11,21/full/0/native.jpg) |
| L | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/409,990,17,13/full/0/native.jpg) |
| the | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/279,990,20,10/full/0/native.jpg) |
| Three | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/431,989,39,16/full/0/native.jpg) |
| given to the gray will by L Three | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/211,987,259,23/full/0/native.jpg) |
| given | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/211,987,43,23/full/0/native.jpg) |
| from | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/327,680,12,5/full/0/native.jpg) |
| 13 | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/303,679,8,6/full/0/native.jpg) |
| PICKERING | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/349,710,114,14/full/0/native.jpg) |
| TIMOTHY PICKERING | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/251,709,211,15/full/0/native.jpg) |
| TIMOTHY | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/251,711,92,11/full/0/native.jpg) |
| Express 36 13 from doming خالد CBrun | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/274,679,156,8/full/0/native.jpg) |
| will | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/343,990,32,13/full/0/native.jpg) |
| CBrun | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/410,679,20,6/full/0/native.jpg) |
| Hichening. | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/274,747,161,45/full/0/native.jpg) |
| Express | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/274,680,23,6/full/0/native.jpg) |
| 36 | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/297,681,5,4/full/0/native.jpg) |
| doming | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/342,680,21,6/full/0/native.jpg) |
| خالد | ![](https://ids.lib.harvard.edu/ids/iiif/20410141/398,680,11,6/full/0/native.jpg) |