Human Generated Data

Title

Untitled (portrait of group in living room wearing costumes)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4269

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.1
Person 98.8
Person 97.3
Person 97
Person 96.8
Person 96.2
Person 95.7
Person 94.9
Person 93.2
Leisure Activities 88.5
Musical Instrument 88.3
Musician 88.3
Person 84.4
Guitar 80.8
People 75.5
Helmet 75.1
Apparel 75.1
Clothing 75.1
Person 74.9
Helmet 74.9
Crowd 65.9
Room 64
Indoors 64
Music Band 62.4
Performer 58.2
Guitarist 58.2
Advertisement 56.8
Poster 56.5
Female 55.3
Girl 55.3
Person 50.5

Clarifai
created on 2019-06-01

people 99.5
group 98
man 94.9
woman 94.7
child 94.3
adult 93.7
desktop 88.1
illustration 88
monochrome 87.2
group together 85.3
family 81.7
room 79.6
business 75.9
wear 75.6
music 75.1
many 75
boy 74.9
communication 74.8
crowd 74.4
actor 74.4

Imagga
created on 2019-06-01

art 21.1
black 18.6
grunge 17.9
silhouette 16.5
kin 16
man 15.5
drawing 15.5
design 15.2
person 14.3
sketch 12.9
people 12.3
style 11.9
decoration 11.1
pattern 10.9
retro 10.6
human 10.5
men 10.3
male 9.9
film 9.6
dirty 9
painting 9
science 8.9
costume 8.8
chandelier 8.8
graphic 8.7
ink 8.7
party 8.6
negative 8.6
frame 8.4
modern 8.4
equipment 8.4
vintage 8.3
world 8.2
music 8.1
paint 8.1
light 8
play 7.7
old 7.7
poster 7.5
fashion 7.5
symbol 7.4
star 7.2
team 7.2
cartoon 7.1
dance 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 95.6
window 95.5
person 93.2
clothing 92.5
holding 92
posing 80.7
group 77
old 65.5

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.7%
Surprised 45.1%
Happy 45.2%
Disgusted 45%
Calm 54.4%
Sad 45.1%
Confused 45.1%
Angry 45.1%

AWS Rekognition

Age 15-25
Gender Female, 54.1%
Confused 45.5%
Calm 48%
Sad 50.3%
Surprised 45.4%
Angry 45.3%
Disgusted 45.2%
Happy 45.3%

AWS Rekognition

Age 26-43
Gender Female, 52.6%
Surprised 45.8%
Sad 46.1%
Happy 51.5%
Angry 45.3%
Disgusted 45.3%
Confused 45.4%
Calm 45.4%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Calm 52.3%
Surprised 45.4%
Sad 45.5%
Confused 45.2%
Disgusted 45.3%
Happy 46.2%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Male, 52.7%
Confused 45.4%
Angry 45.6%
Sad 46.9%
Calm 47.1%
Disgusted 45.4%
Happy 49.3%
Surprised 45.5%

AWS Rekognition

Age 23-38
Gender Male, 53.2%
Angry 45.1%
Surprised 45.1%
Disgusted 54.3%
Sad 45.1%
Calm 45.3%
Happy 45.1%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 52.2%
Sad 45.6%
Angry 45.5%
Calm 45.6%
Confused 45.2%
Disgusted 45.2%
Happy 52.2%
Surprised 45.6%

AWS Rekognition

Age 26-43
Gender Female, 51.3%
Angry 45.3%
Happy 45.4%
Confused 45.3%
Calm 52.4%
Disgusted 45.5%
Sad 45.7%
Surprised 45.4%

AWS Rekognition

Age 12-22
Gender Female, 54%
Calm 51.1%
Surprised 45.7%
Sad 46.3%
Confused 45.5%
Disgusted 45.4%
Happy 45.6%
Angry 45.4%

Feature analysis

Amazon

Person 99.5%
Helmet 75.1%
