Human Generated Data

Title

Untitled (mother and baby sitting in armchair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17141

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 97.1
Clothing 96.9
Apparel 96.9
Person 95.5
Person 89.6
Baby 89
Furniture 87.6
Face 83.2
Chair 81.2
Newborn 77
Home Decor 70.5
Portrait 68.6
Photography 68.6
Photo 68.6
Hat 65.7
Couch 65.2
Performer 58.9
Kid 58.5
Child 58.5
Female 58
Person 50.2

Clarifai
created on 2023-10-29

people 99.7
monochrome 99.6
child 98.4
woman 97.6
two 96.4
man 95.4
adult 94.9
family 94.9
baby 93.6
sit 90.7
offspring 89.5
son 89.3
indoors 88.9
furniture 88.7
chair 88.7
wear 88.1
seat 88
girl 85.7
wedding 84.9
actress 84.2

Imagga
created on 2022-02-26

people 24
person 20.6
man 20.1
adult 16.9
male 15.6
portrait 14.9
face 14.2
shower cap 12.1
salon 11.8
working 11.5
smile 11.4
equipment 11.3
cap 11.2
health 11.1
professional 11.1
dress 10.8
black 10.8
clothing 10.7
science 10.7
art 10.7
happy 10.6
medical 10.6
fashion 10.5
human 10.5
doctor 10.3
hair 10.3
headdress 9.8
negative 9.6
mask 9.5
sitting 9.4
luxury 9.4
senior 9.4
indoor 9.1
hand 9.1
care 9
technology 8.9
worker 8.9
medicine 8.8
film 8.7
work 8.7
women 8.7
love 8.7
lifestyle 8.7
costume 8.6
men 8.6
chair 8.4
room 8.4
mature 8.4
hospital 8.1
romantic 8
celebration 8
smiling 8
indoors 7.9
color 7.8
old 7.7
device 7.6
nurse 7.6
television 7.5
brass 7.5
sculpture 7.5
holding 7.4
surgeon 7.3
musical instrument 7.3
sexy 7.2
looking 7.2
home 7.2
look 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.8
window 93
person 91.1
baby 90.9
toddler 89.1
clothing 87.1
human face 86.2
smile 58.7
old 44.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Female, 97.4%
Happy 73.7%
Calm 13.8%
Surprised 8.4%
Fear 1.6%
Disgusted 0.8%
Angry 0.6%
Sad 0.6%
Confused 0.5%

AWS Rekognition

Age 29-39
Gender Female, 98.4%
Calm 44%
Sad 35.7%
Happy 12.2%
Fear 2.3%
Surprised 2.2%
Disgusted 1.3%
Angry 1.3%
Confused 0.9%
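The emotion scores above sum to roughly 100%, and a single dominant emotion can be read off as the highest-scoring entry. A minimal sketch of that reduction, using the scores from the second face result (the reduction itself is an illustration, not part of the Rekognition response):

```python
# Emotion scores from the second AWS Rekognition face result above.
emotions = {
    "Calm": 44.0, "Sad": 35.7, "Happy": 12.2, "Fear": 2.3,
    "Surprised": 2.2, "Disgusted": 1.3, "Angry": 1.3, "Confused": 0.9,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

For this face the sketch reports "Calm", matching the top line of the list above.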

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Couch
Person 95.5%
Person 89.6%
Person 50.2%
Couch 65.2%
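The feature analysis lists the label "Person" three times, once per detected instance, each with its own confidence. A minimal sketch of how such a per-instance listing could be grouped from raw (label, confidence) pairs; the label names and scores are taken from the Amazon tags above, but the grouping logic is an assumption about presentation, not the museum's actual pipeline:

```python
from collections import defaultdict

# Raw (label, confidence) pairs as listed under the Amazon tags above.
detections = [
    ("Person", 95.5), ("Person", 89.6), ("Person", 50.2),
    ("Couch", 65.2),
]

# Group every detection instance under its label, keeping all scores,
# so multi-instance labels like "Person" list each hit separately.
by_label = defaultdict(list)
for label, score in detections:
    by_label[label].append(score)

for label, scores in by_label.items():
    for score in scores:
        print(f"{label} {score}%")
```

Keeping every instance's score (rather than only the maximum) preserves the fact that three distinct people were detected at very different confidences.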

Categories

Text analysis

Amazon

KODAK-SLA

Google

YT37A2-XAO