Human Generated Data

Title

Reflex

Date

1935

People

Artist: Joseph Ehm, Czech, 1908 - 1989

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Loan from the Navigator Foundation, 17.2002

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 97.5
Apparel 97.5
Person 96.9
Human 96.9
Person 83.1
Evening Dress 79.5
Robe 79.5
Gown 79.5
Fashion 79.5
Home Decor 69.5
Art 62.3
Performer 56.6
Linen 55.1
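Tag lists like the Amazon one above are the typical output of a label-detection API. As a hedged sketch only (the live call is stubbed with a sample response shaped like AWS Rekognition's `detect_labels` output; the names and scores below are illustrative, not the museum's actual data), the raw response can be flattened into the name/confidence pairs shown:

```python
# Flatten a Rekognition-style DetectLabels response into (name, confidence)
# pairs like the Amazon tag list above. A real run would call boto3's
# rekognition client: client.detect_labels(Image={...}); here the response
# is stubbed so the sketch is self-contained.

def flatten_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: -p[1])

# Sample response in the shape detect_labels returns (values illustrative).
sample = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 97.54},
        {"Name": "Person", "Confidence": 96.91},
        {"Name": "Art", "Confidence": 62.31},
        {"Name": "Linen", "Confidence": 55.12},
        {"Name": "Chair", "Confidence": 12.0},  # below threshold, filtered out
    ]
}

for name, score in flatten_labels(sample):
    print(f"{name} {score}")
```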

Clarifai
created on 2023-10-15

people 99.9
wear 98.3
adult 97.9
monochrome 97
one 95.9
portrait 95.3
woman 94.4
two 93.7
street 92
group 90.9
actress 90.8
theater 90.8
man 90.7
administration 88.6
actor 87.4
outfit 86.1
music 85.7
group together 84.7
veil 84.2
opera 84

Imagga
created on 2021-12-14

statue 57.9
sculpture 39.3
architecture 27.3
art 27.2
monument 26.2
religion 23.3
marble 21.5
history 21.5
stone 21.3
old 18.8
building 18.3
culture 17.9
dress 16.3
travel 16.2
detail 16.1
tourism 15.7
landmark 15.4
face 14.9
ancient 14.7
religious 14.1
famous 14
costume 13.4
city 13.3
clothing 12.7
fountain 12.5
traditional 12.5
god 12.4
covering 12.4
catholic 11.8
garment 11.8
mask 11.6
person 11.5
faith 11.5
church 11.1
historic 11
robe 10.8
carnival 10.7
saint 10.6
antique 10.5
portrait 10.4
historical 10.3
boutique 10.1
masquerade 9.8
disguise 9.8
spirituality 9.6
festival 9.6
love 9.5
golden 9.5
temple 9.2
structure 9.2
lady 8.9
romantic 8.9
decoration 8.9
spiritual 8.6
design 8.5
cross 8.5
people 8.4
figure 8.4
exterior 8.3
makeup 8.2
gold 8.2
carving 8.1
venetian 7.9
hidden 7.9
theater 7.8
bronze 7.8
kimono 7.7
facade 7.7
man 7.7
outdoor 7.6
column 7.6
decorative 7.5
style 7.4
cloak 7.3
color 7.2
film 7.2
colorful 7.2
memorial 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 98.8
statue 96.8
text 94.9
outdoor 92.1
black and white 88.5
sculpture 73.5
black 67
dress 63.1
clothing 57.7
old 45.3

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 77.3%
Calm 97.1%
Sad 1.4%
Happy 0.9%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 32-48
Gender Male, 65.5%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
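Emotion breakdowns like the two AWS Rekognition blocks above come from a face-detection response. A minimal sketch, assuming the response follows the shape of Rekognition's DetectFaces `Emotions` list (the call itself is stubbed; the values here are illustrative):

```python
# Turn a Rekognition-style DetectFaces face record into the sorted
# "Emotion percent%" lines shown above. A live call would use boto3's
# detect_faces with Attributes=["ALL"]; the response is stubbed here.

def emotion_summary(face):
    """Sort a face's emotions by descending confidence as 'Name pct%' strings."""
    emotions = sorted(face["Emotions"], key=lambda e: -e["Confidence"])
    return [f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%" for e in emotions]

# Sample face record in DetectFaces shape (values illustrative).
sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 97.1},
        {"Type": "SAD", "Confidence": 1.4},
        {"Type": "HAPPY", "Confidence": 0.9},
        {"Type": "ANGRY", "Confidence": 0.2},
    ]
}

for line in emotion_summary(sample_face):
    print(line)
```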

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.9%

Categories

Text analysis

Amazon

DICA
DON
al
جديع
its
YNTYRY
Аудан

Google

NEDA 3337 YNTY
NEDA
3337
YNTY
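The Google text-analysis list above follows a common OCR output convention: the full detected string first, then its individual tokens. A small sketch of that split (the OCR call itself is not reproduced; only the post-processing of an already-detected string is shown):

```python
# Split an OCR result into the full string plus its tokens, matching the
# "NEDA 3337 YNTY" / "NEDA" / "3337" / "YNTY" layout in the Google section.

def text_tokens(full_text):
    """Return the full detected string followed by its whitespace-split tokens."""
    return [full_text] + full_text.split()

for item in text_tokens("NEDA 3337 YNTY"):
    print(item)
```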