Human Generated Data

Title

Untitled (studio portrait of children with stuffed animals on floor)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4357

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Clothing 99.6
Apparel 99.6
Human 99.5
Person 99.5
Person 99.4
Person 98.8
Person 92.6
People 79.7
Dress 76.6
Baby 75.9
Face 75.3
Female 72.2
Hat 72.1
Pet 71.2
Canine 71.2
Animal 71.2
Mammal 71.2
Dog 71.2
Person 65.8
Bonnet 65.6
Girl 65.3
Photo 64.9
Photography 64.9
Portrait 64.9
Poster 62.5
Advertisement 62.5
Kid 61.7
Child 61.7
Woman 58.2
Text 56.7
Costume 55.1
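The Amazon tags above follow the shape of AWS Rekognition's `DetectLabels` response, which returns a list of `{Name, Confidence}` entries. A minimal sketch of flattening such a response into the tag/score lines shown here; the sample response is illustrative only, not the actual API output for this photograph:

```python
# Sketch: flatten an AWS Rekognition DetectLabels-style response into
# "Tag Confidence" lines like the Amazon tag list above.
# sample_response is a hypothetical example, not the stored results.

sample_response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Dog", "Confidence": 71.2},
    ]
}

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, sorted by descending confidence."""
    labels = [l for l in response["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:.1f}" for l in labels]

for line in format_labels(sample_response):
    print(line)
```

A live call would use `boto3`'s `rekognition.detect_labels(Image=..., MinConfidence=...)` and feed its response to the same formatter.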

Clarifai
created on 2019-06-01

people 99.9
group 96.9
adult 96.5
man 95.4
print 94.3
woman 91.8
wear 90.8
illustration 89.3
two 87.6
art 86.3
group together 86
leader 84.8
several 84.6
portrait 82.7
lithograph 80.1
four 80
child 80
one 77.1
three 77.1
vertical 76.9

Imagga
created on 2019-06-01

sketch 100
drawing 80.5
representation 61.6
art 23.9
people 15.1
symbol 14.8
statue 14.6
body 12.8
male 12.8
man 12.8
sculpture 12.8
black 12.6
silhouette 11.6
fashion 11.3
portrait 11
history 10.7
design 10.7
human 10.5
figure 10.2
sport 9.9
cartoon 9.8
amulet 9.8
head 9.2
face 9.2
dress 9
shape 8.9
posing 8.9
sexy 8.8
marble 8.8
charm 8.7
artistic 8.7
person 8.5
elegance 8.4
old 8.4
clip art 8.3
paint 8.1
decoration 8.1
religion 8.1
team 8.1
architecture 7.9
angel 7.8
3d 7.7
culture 7.7
historical 7.5
city 7.5
monument 7.5
film 7.3
colorful 7.2

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

window 97
text 96
drawing 95.6
sketch 95.1
clothing 94.1
person 92
cartoon 78.1
human face 69.5
painting 53.9
posing 35.7
picture frame 12.4

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Angry 45.3%
Happy 45.1%
Sad 45.9%
Disgusted 45.2%
Confused 45.8%
Calm 52.3%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 51.9%
Sad 52.9%
Angry 45.2%
Happy 45.4%
Confused 45.3%
Calm 45.9%
Surprised 45.1%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Confused 45.4%
Surprised 45.3%
Calm 46.5%
Sad 52.2%
Happy 45.3%
Disgusted 45.1%
Angry 45.3%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Happy 1.6%
Surprised 5.3%
Calm 26.8%
Sad 38%
Disgusted 2.2%
Angry 5.5%
Confused 20.5%
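Each face block above matches the fields of Rekognition's `DetectFaces` response (`AgeRange`, `Gender`, `Emotions`). A hedged sketch of summarizing one `FaceDetail` entry into the age/gender/emotion lines shown, again with illustrative sample values rather than the real response:

```python
# Sketch: summarize one AWS Rekognition DetectFaces FaceDetail entry into
# the Age / Gender / emotion lines shown above. Sample values are
# illustrative, not the stored results for this photograph.

sample_face = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Male", "Confidence": 54.4},
    "Emotions": [
        {"Type": "CALM", "Confidence": 52.3},
        {"Type": "SAD", "Confidence": 45.9},
    ],
}

def summarize_face(face):
    """Return summary lines: age range, gender, then each emotion score."""
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    g = face["Gender"]
    lines.append(f"Gender {g['Value']}, {g['Confidence']:.1f}%")
    for e in face["Emotions"]:
        lines.append(f"{e['Type'].capitalize()} {e['Confidence']:.1f}%")
    return lines

for line in summarize_face(sample_face):
    print(line)
```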

Feature analysis

Amazon

Person 99.5%
Dog 71.2%

Captions

Microsoft

a group of people posing for a photo in front of a window 67%
a group of men posing for a photo in front of a window 51.7%
a group of men standing next to a window 50.4%