Human Generated Data

Title

Untitled (tree dedication ceremony with graduates)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1488

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.2%
Human 98.2%
Person 98.1%
Person 97%
Person 96.4%
Person 93.6%
Clothing 83.2%
Apparel 83.2%
Crowd 77.2%
Person 76.4%
People 74.5%
Person 72.3%
Person 71.3%
Symbol 65.8%
Text 65.2%
Female 57.7%
Parade 55.1%

Clarifai
created on 2019-06-01

people 99.9%
group 98.8%
adult 97.4%
wear 96.4%
many 95.8%
group together 95.7%
man 94.8%
woman 92.9%
veil 90.5%
outfit 88.5%
dress 87%
wedding 86.4%
illustration 85.2%
child 85.1%
art 85.1%
leader 85%
administration 84.7%
ceremony 83.2%
one 83%
crowd 77.5%

Imagga
created on 2019-06-01

umbrella 17.8%
flag 17.6%
seller 17.1%
dress 15.3%
old 14.6%
people 13.9%
canopy 13.8%
bride 13.4%
emblem 13.3%
parasol 12.6%
shelter 11.7%
couple 11.3%
love 11%
wedding 11%
religion 10.7%
barbershop 10.1%
holiday 10%
shop 10%
travel 9.9%
ceremony 9.7%
summer 9.6%
vintage 9.1%
person 9.1%
boutique 9%
color 8.9%
art 8.7%
holy 8.7%
grunge 8.5%
flower 8.5%
black 8.4%
tourism 8.2%
celebration 8%
stall 7.9%
happiness 7.8%
coat hanger 7.7%
faith 7.6%
two 7.6%
fashion 7.5%
dark 7.5%
religious 7.5%
outdoors 7.5%
style 7.4%
church 7.4%
man 7.4%
adult 7.2%
history 7.2%
romantic 7.1%
sunlight 7.1%
life 7.1%

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 76.9%
white 66.3%
person 55.9%
black and white 53.3%

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Angry 49.5%
Sad 50.5%
Surprised 49.5%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 48-68
Gender Male, 53%
Angry 45.5%
Calm 51.4%
Sad 45.4%
Surprised 45.7%
Disgusted 45.1%
Happy 46.7%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Male, 51.4%
Confused 45.1%
Happy 45.1%
Calm 45.6%
Disgusted 45.1%
Sad 54%
Angry 45.1%
Surprised 45.1%

AWS Rekognition

Age 48-68
Gender Female, 50.4%
Happy 49.5%
Sad 50.1%
Confused 49.6%
Angry 49.6%
Disgusted 49.5%
Surprised 49.6%
Calm 49.6%

AWS Rekognition

Age 17-27
Gender Female, 50.2%
Sad 49.9%
Happy 49.6%
Confused 49.5%
Angry 50%
Surprised 49.5%
Disgusted 49.5%
Calm 49.5%

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a group of people standing in front of a building 69.9%
a group of people standing in front of a white building 61.4%
a group of people standing in front of a store 52.4%

Text analysis

Amazon

VE2nbEbbrE