Human Generated Data

Title

Untitled (king and queen process through group)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1474

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.4
Person 99.4
Person 98.7
Person 96.3
Person 95.6
Text 93.3
Stage 92.3
Person 92.1
Person 90.1
Person 88.3
Apparel 86.3
Clothing 86.3
Person 74.2
Person 72.4
Person 72.4
Person 71.6
Crowd 70.7
People 65.8
Person 65.5
Person 62.4
Figurine 60.6
Person 55.3
Person 55.1

Clarifai
created on 2019-06-01

people 99.9
man 99.1
group 98.5
adult 97.6
group together 96.9
woman 95.7
many 94.8
leader 94.5
wear 94.3
partnership 93.1
several 91.3
administration 88.8
outfit 88
actor 86.2
crowd 85.1
two 84.1
musician 83.7
one 83.6
three 83.1
singer 82.6

Imagga
created on 2019-06-01

statue 61.9
architecture 38.3
sculpture 37.9
monument 29.9
boutique 29.1
religion 27.8
history 27.7
fan 26.9
landmark 26.2
marble 23.2
building 22.9
follower 21.8
ancient 21.6
old 21.6
tourism 21.4
travel 19.7
art 19.4
stone 19.1
city 19.1
famous 18.6
god 18.2
culture 17.9
historic 17.4
altar 16.4
fountain 16
structure 15.9
person 15.3
religious 15
vestment 14.1
tourist 13.6
gown 13
church 12.9
symbol 12.8
sky 12.8
temple 12.7
column 12.3
historical 12.2
faith 11.5
catholic 11.4
antique 11.2
worship 10.6
saint 10.6
memorial 10.5
scene 10.4
cross 10.4
traditional 10
national 10
holy 9.6
spiritual 9.6
attraction 9.6
park 9.1
cathedral 8.9
detail 8.8
sacred 8.8
pray 8.7
spirituality 8.6
ruler 8.6
golden 8.6
outerwear 8.4
color 8.3
gold 8.2
tranquil 8.1
night 8
black 7.8
roman 7.8
prayer 7.7
meditation 7.7
capital 7.6
man 7.5
vintage 7.4
decoration 7.3
peace 7.3
cemetery 7.3
people 7.2
clothing 7

Google
created on 2019-06-01

(no tags returned)

Microsoft
created on 2019-06-01

white 87.8
statue 86.8
clothing 85.7
person 82.7
black 82
old 79.1
black and white 70.2
posing 57.1
man 50.4

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.1%
Angry 45.5%
Sad 46.2%
Happy 49.3%
Confused 45.5%
Surprised 45.6%
Calm 47.2%
Disgusted 45.7%

AWS Rekognition

Age 20-38
Gender Female, 50.8%
Calm 46.6%
Disgusted 47.4%
Confused 46.3%
Surprised 45.4%
Happy 46.1%
Sad 47.2%
Angry 46%

AWS Rekognition

Age 20-38
Gender Male, 50.1%
Happy 49.5%
Sad 49.7%
Surprised 49.5%
Confused 49.5%
Disgusted 49.6%
Calm 49.5%
Angry 50.1%

AWS Rekognition

Age 48-68
Gender Male, 50.1%
Disgusted 50.1%
Calm 49.6%
Surprised 49.5%
Angry 49.7%
Happy 49.6%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Disgusted 45%
Confused 45.3%
Sad 46.9%
Happy 47.5%
Angry 45.2%
Surprised 45.2%
Calm 49.9%

AWS Rekognition

Age 17-27
Gender Male, 50.4%
Angry 49.6%
Happy 49.6%
Sad 49.5%
Disgusted 49.6%
Confused 49.6%
Calm 50%
Surprised 49.5%

AWS Rekognition

Age 45-63
Gender Female, 52.8%
Angry 46.1%
Calm 48.4%
Sad 46%
Surprised 45.9%
Disgusted 46.4%
Happy 46.5%
Confused 45.6%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a black and white photo of a man 87.6%
an old black and white photo of a man 86.6%
a vintage photo of a man 86.5%