Human Generated Data

Title

Untitled (two photographs: studio portrait of two girls with open book and white hair bows; studio portrait of girl standing by toddler in chair, both in white)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6114

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6
Person 99
Person 98.6
Person 97.2
Furniture 96.3
Sitting 96
Shoe 75.7
Apparel 75.7
Footwear 75.7
Clothing 75.7
Face 71.2
Photography 70.3
Photo 70.3
Portrait 70.3
Worker 67.5
People 66.1
Flooring 64.6
Hairdresser 58.7
Chair 57
Shoe 52.2
Chair 51.7
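Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale, and a label may itself contain spaces ("group together"). A minimal sketch of parsing such plain-text lines into label/score pairs; the helper is illustrative only and is not part of any vendor's or the museum's actual pipeline:

```python
def parse_tags(lines):
    """Parse 'label confidence' lines like the tag lists above.

    The trailing token is the confidence score; everything before
    it is the label, which may contain spaces.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample lines copied from the Amazon and Clarifai lists above.
sample = [
    "Person 99.6",
    "group together 98.2",
    "Chair 57",
]
print(parse_tags(sample))
# [('Person', 99.6), ('group together', 98.2), ('Chair', 57.0)]
```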

Clarifai
created on 2019-11-16

people 100
group 99.5
child 98.8
woman 98.5
group together 98.2
room 98.2
furniture 98.1
adult 97.5
man 96.8
family 96.5
sit 96.3
several 94.9
chair 94
five 93.3
four 92.4
seat 92.1
three 91.5
indoors 91.3
offspring 90
administration 89.1

Imagga
created on 2019-11-16

man 27.5
person 22.8
adult 21.5
people 20.1
musical instrument 18.8
kin 18.5
male 17.8
couple 17.4
world 16.9
portrait 15.5
room 13.7
fashion 13.6
clothing 13.5
chair 12.2
teacher 12.1
men 12
old 11.8
dark 11.7
accordion 11.4
black 10.2
business 9.7
keyboard instrument 9.7
home 9.6
happiness 9.4
happy 9.4
clothes 9.4
two 9.3
historic 9.2
wind instrument 9.2
family 8.9
interior 8.8
businessman 8.8
professional 8.7
standing 8.7
love 8.7
sitting 8.6
attractive 8.4
educator 8.4
worker 8.3
human 8.2
dress 8.1
religion 8.1
romantic 8
building 8
women 7.9
model 7.8
wall 7.7
window 7.3
life 7.3
mother 7.2
sexy 7.2
shop 7.2
history 7.2
hair 7.1
architecture 7
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 99.2
person 98.6
human face 94.3
man 93.4
furniture 92
smile 90.7
text 73.8
footwear 70.5
boy 68.3
chair 67.9
child 53
baby 51.2
old 50.1

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 54.1%
Calm 53.6%
Sad 45.9%
Happy 45%
Angry 45.3%
Fear 45%
Surprised 45%
Confused 45.2%
Disgusted 45%

AWS Rekognition

Age 4-14
Gender Female, 53.6%
Surprised 49.4%
Fear 45.6%
Angry 46.2%
Happy 45%
Sad 45.4%
Confused 45.5%
Calm 47.7%
Disgusted 45.2%

AWS Rekognition

Age 0-3
Gender Female, 52.9%
Happy 45%
Confused 45%
Calm 45%
Angry 45.1%
Disgusted 45%
Surprised 45%
Fear 45%
Sad 54.8%

AWS Rekognition

Age 5-15
Gender Female, 54.2%
Confused 45.2%
Happy 45%
Disgusted 45%
Calm 45.8%
Angry 45%
Sad 46.1%
Surprised 45.2%
Fear 52.6%
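Each AWS Rekognition face block above reports one confidence value per emotion; the face's apparent emotion is simply the highest-scoring entry (e.g. Sad at 54.8% for the third face). A small sketch of that selection, using scores copied from the third face block; the helper name is illustrative, not a Rekognition API call:

```python
# Emotion scores copied from the third AWS Rekognition face above (age 0-3).
emotions = {
    "Happy": 45.0, "Confused": 45.0, "Calm": 45.0, "Angry": 45.1,
    "Disgusted": 45.0, "Surprised": 45.0, "Fear": 45.0, "Sad": 54.8,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))
# Sad
```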

Microsoft Cognitive Services

Age 10
Gender Female

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 75.7%
Chair 57%
