Human Generated Data

Title

Untitled (portrait of woman with baby and child in living room)

Date

c.1945

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18903

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-11-16

Furniture 96.8
Human 96.3
Person 96.3
Person 94.9
Couch 94.6
Living Room 89.9
Room 89.9
Indoors 89.9
Interior Design 89.6
Home Decor 88
Apparel 86.5
Clothing 86.5
Tree 82.8
Plant 82.8
People 65.2
Dress 58.5
Female 58.4
Face 58.1
Chair 56.9
Cushion 55.7
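
The Amazon labels above are the output of AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable label/confidence pairs can be produced with boto3 follows; the filename, region, and thresholds are illustrative assumptions, not values taken from this record.

    # Minimal sketch (assumes AWS credentials are configured; "photo.jpg" and
    # the region are placeholders, not part of this record).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,       # cap the number of labels returned
            MinConfidence=50,   # drop low-confidence labels
        )

    # Print label/confidence pairs in the same form as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")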

Clarifai
created on 2019-11-16

people 99.7
adult 96.5
child 95.6
two 94.8
woman 94.5
group 92.5
family 92.4
dress 91.6
portrait 91.3
one 90.6
room 90.5
wedding 89.4
man 87.6
monochrome 86.5
chair 86.3
sit 85.1
wear 84.7
girl 83.6
offspring 79.1
bride 76.1

Imagga
created on 2019-11-16

iron lung 26.5
respirator 21.2
decoration 19.7
people 19
graffito 18.8
person 17.7
room 16.5
breathing device 16
snow 16
man 15.4
old 15.3
city 13.3
device 11.4
weather 11.3
toilet 11.1
male 10.8
vintage 10.7
art 10.4
grunge 10.2
black 10.2
kin 10.2
design 10.1
fashion 9.8
adult 9.7
winter 9.4
street 9.2
aged 9
family 8.9
happy 8.8
house 8.4
hand 8.3
hospital 8.3
outdoors 8.2
retro 8.2
style 8.2
paint 8.1
dirty 8.1
currency 8.1
detail 8
child 8
smile 7.8
antique 7.8
scene 7.8
ancient 7.8
portrait 7.8
men 7.7
finance 7.6
building 7.4
business 7.3
dress 7.2
work 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 92
clothing 87.8
toddler 85.4
person 84.4
baby 83.8
child 73.2
human face 54
old 53.1

Color Analysis

Face analysis

AWS Rekognition

Age 16-28
Gender Female, 89.9%
Calm 37.9%
Angry 14.5%
Surprised 14.8%
Disgusted 10.4%
Happy 8.1%
Sad 1.1%
Fear 2.8%
Confused 10.5%

AWS Rekognition

Age 0-3
Gender Female, 72.4%
Surprised 22.1%
Angry 5.8%
Happy 1.4%
Fear 8.2%
Calm 50.8%
Disgusted 3%
Sad 3%
Confused 5.6%

AWS Rekognition

Age 1-5
Gender Female, 96.2%
Happy 99.5%
Disgusted 0.1%
Fear 0%
Calm 0.3%
Confused 0%
Sad 0.1%
Surprised 0.1%
Angry 0%
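
The three AWS Rekognition entries above give an estimated age range, a gender estimate with confidence, and per-emotion confidences for each detected face. A minimal sketch of how such values can be obtained with the DetectFaces operation via boto3, assuming configured AWS credentials and a placeholder local filename:

    # Minimal sketch (assumes AWS credentials are configured; "photo.jpg"
    # is a placeholder filename).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        faces = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # Print one block per detected face, in the same form as the entries above.
    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")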

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 1
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
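
The Google Vision entries above are per-face likelihood ratings (Very unlikely through Very likely) rather than numeric confidences. A minimal sketch using the google-cloud-vision client, with the filename as a placeholder; the printed enum names (e.g. VERY_UNLIKELY, LIKELY) correspond to the ratings shown above.

    # Minimal sketch (assumes Google Cloud credentials are configured;
    # "photo.jpg" is a placeholder filename).
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Print the likelihood ratings in the same order as the entries above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)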

Feature analysis

Amazon

Person 96.3%

Categories

Imagga

paintings art 64%
people portraits 35.1%

Captions