Human Generated Data

Title

Untitled (portrait of adults with children in front of painted backdrop)

Date

1934

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4276

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.8
Person 99.8
Person 99.5
Person 99.4
People 98.8
Person 98.6
Family 98.5
Person 97.9
Person 97.9
Person 96.9
Person 95.8
Clothing 95.8
Apparel 95.8
Footwear 95.8
Shoe 95.8
Shoe 94.5
Person 93.7
Person 92.7
Person 83.3
Photography 58.5
Photo 58.5
Shoe 56.1
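
The Amazon tags above have the shape of an AWS Rekognition DetectLabels response: a label name paired with a confidence score. A minimal sketch of how such tags could be generated with boto3, assuming configured AWS credentials and a placeholder local file name:

import boto3

# Placeholder file name for the digitized photograph.
rekognition = boto3.client("rekognition")
with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence of 50 roughly matches the lowest scores listed above.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")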

Clarifai
created on 2019-06-01

people 99.9
group 99.5
woman 98.8
adult 98.8
group together 98.7
many 97.7
man 97.2
monochrome 95.7
wear 95.3
child 95
wedding 89.9
several 89.2
indoors 87.7
music 87.7
outfit 87.4
dancer 82
veil 81.9
facial expression 80.8
room 80.3
actress 80

Imagga
created on 2019-06-01

groom 80.3
kin 62.5
bride 37.7
wedding 33.1
dress 31.6
couple 30.5
love 27.6
people 24.5
bouquet 22.7
married 21.1
man 19.5
marriage 18
happiness 18
bridal 17.5
gown 17.1
male 17
happy 16.9
two 15.2
celebration 15.1
flowers 14.8
adult 14.8
wed 14.7
ceremony 14.5
church 12.9
portrait 12.9
veil 12.7
women 12.6
person 12.3
flower 12.3
together 12.3
musical instrument 11.4
wife 11.4
matrimony 10.8
smile 10.7
fashion 10.5
men 10.3
commitment 9.8
romantic 9.8
attractive 9.8
old 9.7
new 9.7
husband 9.7
art 9.2
religion 9
romance 8.9
family 8.9
accordion 8.5
rose 8.4
future 8.4
suit 8.1
history 8
black 7.8
party 7.7
summer 7.7
holy 7.7
loving 7.6
elegance 7.6
wind instrument 7.5
historical 7.5
human 7.5
city 7.5
monument 7.5
outdoors 7.5
keyboard instrument 7.4
indoor 7.3
smiling 7.2
looking 7.2
home 7.2
column 7.1
hair 7.1
face 7.1
mother 7.1
interior 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 97.6
person 96.7
window 96.2
posing 94.3
smile 84.6
boy 75.9
baby 70.6
toddler 70.2
child 66.8
gallery 56.7
human face 54
old 41
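
The Microsoft tags resemble the output of the Azure Computer Vision tag operation. A hedged sketch against the REST endpoint, with the resource endpoint, subscription key, image URL, and API version all placeholder assumptions (Azure reports confidences on a 0-1 scale, so they are rescaled to match the percentages above):

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<subscription-key>"                                        # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.1/tag",
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.org/portrait.jpg"},  # placeholder image URL
)
for tag in resp.json().get("tags", []):
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")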

Face analysis

Amazon

AWS Rekognition

Age 15-25
Gender Female, 53.8%
Disgusted 45.4%
Sad 45.5%
Confused 45.4%
Happy 48.4%
Angry 45.3%
Surprised 45.5%
Calm 49.6%

AWS Rekognition

Age 23-38
Gender Male, 50%
Disgusted 45.1%
Happy 47.4%
Sad 45.7%
Calm 51%
Angry 45.2%
Surprised 45.3%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 52.5%
Happy 45.4%
Disgusted 45.2%
Angry 45.1%
Surprised 45.1%
Sad 46%
Calm 53%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Confused 45.1%
Surprised 45.2%
Sad 46.9%
Angry 45.2%
Happy 45.5%
Calm 52.1%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.6%
Confused 45.3%
Surprised 45.3%
Happy 45.5%
Angry 45.4%
Sad 45.9%
Disgusted 51.7%
Calm 45.9%

AWS Rekognition

Age 20-38
Gender Female, 53.6%
Confused 45.1%
Angry 45.1%
Sad 54.4%
Calm 45.1%
Disgusted 45.1%
Happy 45.1%
Surprised 45%

AWS Rekognition

Age 35-53
Gender Female, 53.4%
Disgusted 45.4%
Confused 45.2%
Angry 45.4%
Calm 46.9%
Surprised 45.3%
Happy 45.2%
Sad 51.5%

AWS Rekognition

Age 20-38
Gender Female, 51%
Happy 45.4%
Angry 45.5%
Surprised 45.3%
Disgusted 45.5%
Sad 49.3%
Calm 48.7%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Sad 45.5%
Angry 45.3%
Disgusted 45.1%
Surprised 45.4%
Happy 51.2%
Calm 47.4%
Confused 45.2%
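
The nine face entries above follow the shape of an AWS Rekognition DetectFaces response with all attributes requested: an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. A minimal sketch, again assuming boto3 and a placeholder file name:

import boto3

rekognition = boto3.client("rekognition")
with open("portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")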

Feature analysis

Amazon

Person 99.8%
Shoe 95.8%