Human Generated Data

Title

Untitled (two photographs: studio portrait of baby in pram; studio portrait of girl in white dress and veil)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6078

Machine Generated Data

Tags

Values are each classifier's confidence score, in percent.

Amazon
created on 2019-11-16

Person 98.3
Human 98.3
Clothing 97.9
Apparel 97.9
Person 95.2
Wheel 90.9
Machine 90.9
Wheel 80.6
Wheel 73.5
Canopy 68.9
Female 62.9
Footwear 59.8
Shoe 59.8
Furniture 59.6
Chair 57.1
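
The Amazon list above matches the shape of output from Amazon Rekognition's DetectLabels operation: each entry is a label name plus a 0-100 confidence score, and parent labels (Person/Human, Wheel/Machine) are reported alongside their children. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name, region, and thresholds are illustrative assumptions, not part of this record.

    import boto3

    # Hypothetical local scan of the two photographs; the path is an assumption.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("durette_studio_scan.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with 0-100 confidence scores,
    # the same shape as the "Person 98.3" entries above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")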

Clarifai
created on 2019-11-16

people 99.8
woman 98.1
chair 96.9
adult 96.5
two 96.3
child 95.8
sit 94.6
furniture 94.6
seat 93.5
room 93.4
group 93.3
man 92.5
actor 92.3
wear 92.2
actress 91.7
family 91.4
wedding 91
three 85.1
monochrome 84.2
one 83.3

Imagga
created on 2019-11-16

cradle 38.7
furniture 33.4
baby bed 32.8
chair 25.2
furnishing 20.8
seat 19.6
adult 17.7
man 17.5
old 17.4
person 16.9
people 16.7
wheeled vehicle 15.2
male 14.2
wheelchair 12.9
handcart 12.4
couple 12.2
clothing 11.1
bench 11
holiday 10.7
scene 10.4
musical instrument 10.1
rocking chair 9.7
cold 9.5
men 9.4
winter 9.4
travel 9.1
shopping cart 9.1
portrait 9.1
dress 9
family 8.9
love 8.7
sitting 8.6
snow 8.4
carriage 8.4
fashion 8.3
holding 8.2
outdoors 8.2
style 8.2
suit 8.1
vehicle 8.1
history 8
black 7.8
season 7.8
mother 7.6
traditional 7.5
tricycle 7.5
vintage 7.4
barrow 7.1
women 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 94.5
furniture 93.1
clothing 88.6
white 88.1
black 87.4
indoor 87.2
umbrella 85.2
chair 84.1
person 79.9
black and white 74.6
dress 73.8
old 62.1
woman 52.9
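
The Microsoft tags correspond to the Tag operation of Azure's Computer Vision REST API, which returns name/confidence pairs (0-1 in the raw JSON; rendered above as percentages). A hedged sketch using plain HTTP; the endpoint, subscription key, API version, and file name are placeholders:

    import requests

    # Placeholder Azure resource endpoint and key.
    endpoint = "https://<resource>.cognitiveservices.azure.com"
    with open("durette_studio_scan.jpg", "rb") as f:
        body = f.read()

    # The Tag operation returns {"tags": [{"name": ..., "confidence": ...}]},
    # with confidence in the 0-1 range.
    resp = requests.post(
        f"{endpoint}/vision/v2.0/tag",
        headers={
            "Ocp-Apim-Subscription-Key": "<key>",
            "Content-Type": "application/octet-stream",
        },
        data=body,
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")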

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 54%
Fear 45.1%
Happy 45.2%
Calm 53.8%
Sad 45.6%
Angry 45%
Surprised 45.3%
Disgusted 45%
Confused 45.1%

AWS Rekognition

Age 17-29
Gender Female, 52.6%
Fear 45%
Sad 45.1%
Disgusted 45.1%
Surprised 45.2%
Calm 53.8%
Happy 45.2%
Angry 45.6%
Confused 45.1%
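
Both face records follow the shape of Rekognition's DetectFaces response when all attributes are requested: an estimated age range, a gender guess with its confidence, and one confidence score per emotion. A sketch reusing the client and image bytes from the DetectLabels example above:

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each
    # detected face, matching the two records above.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")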

Microsoft Cognitive Services

Age 1
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
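
The likelihood buckets above (Very unlikely through Very likely) are how Google Cloud Vision's face detector reports expressions and image properties. A minimal sketch with the google-cloud-vision client, assuming default application credentials and the same illustrative file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("durette_studio_scan.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY),
        # matching the "Surprise Very unlikely" entries above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)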

Feature analysis

Amazon

Person 98.3%
Wheel 90.9%
Shoe 59.8%
