Human Generated Data

Title

Untitled (studio photograph of three women costumed as clowns and dancers)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3731

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Dance Pose 98
Leisure Activities 98
Human 96.1
Person 95.8
Dance 92.8
Person 92.6
Stage 87.3
Person 83.3
Person 82.3
Mammal 79.8
Horse 79.8
Animal 79.8
Apparel 65.5
Clothing 65.5
Ballet 59.6
Performer 59.5
Ballerina 56.3

Clarifai
created on 2019-06-01

people 99.7
adult 98.2
wear 96.5
two 96
man 94.9
woman 93.5
one 93.2
outfit 92.2
portrait 87.5
retro 85.9
group 85
leader 84.4
group together 84
administration 83.8
three 83.2
chair 82.1
veil 77
uniform 76.1
vehicle 75.8
child 73.3

Imagga
created on 2019-06-01

astronaut 34.2
automaton 32.8
helmet 21.6
statue 18.4
sculpture 16.8
art 16
city 15.8
tourism 12.4
culture 12
person 11.8
history 11.6
man 11.4
travel 11.3
brass 10.7
costume 10.7
sword 10.4
mask 10.3
monument 10.3
traditional 10
human 9.7
decoration 9.6
celebration 9.6
people 9.5
old 9
religion 9
weapon 8.8
horse 8.5
portrait 8.4
amulet 8.3
football helmet 8.2
knight 7.9
urban 7.9
war 7.7
clothing 7.5
famous 7.4
historic 7.3
protection 7.3
color 7.2
building 7.2
colorful 7.2
adult 7.2
face 7.1
male 7.1
day 7.1
architecture 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

black 88.2
statue 83.7
horse 74.2
person 73.5
old 59.6
clothing 50.9

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 23-38
Gender Male, 80.9%
Angry 5.3%
Happy 23.2%
Confused 4.8%
Sad 2.4%
Calm 45.7%
Surprised 11.8%
Disgusted 6.8%

AWS Rekognition

Age 35-52
Gender Male, 52.4%
Surprised 45.1%
Sad 46.7%
Happy 45.4%
Angry 45.1%
Disgusted 45%
Confused 45.2%
Calm 52.4%

Feature Analysis

Amazon

Person 95.8%
Horse 79.8%

Categories

Imagga

interior objects 99.8%