Human Generated Data

Title

Untitled (man in suit seated in upholstered chair with little girl holding doll on lap, seated by fireplace)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12898

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.8
Human 99.4
Person 99.4
Person 96.8
Indoors 96.3
Fireplace 96.3
Armchair 90.1
Hearth 84.2
Couch 79.6
Room 75.7
Living Room 75.7
Clothing 74.3
Apparel 74.3
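For context, label lists like the Amazon block above are the typical output of an image label-detection API. Below is a minimal, hypothetical sketch of how such "Name Confidence" lines could be produced from an AWS Rekognition detect_labels response; the client setup and image filename are assumptions, and the sample response is illustrative only, not a live call for this photograph:

```python
def format_labels(response, min_confidence=70.0):
    """Turn a Rekognition detect_labels response into 'Name Confidence' lines."""
    return [
        f"{label['Name']} {label['Confidence']:.1f}"
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

if __name__ == "__main__":
    # A real call would need AWS credentials and an image (names assumed):
    #   import boto3
    #   client = boto3.client("rekognition")
    #   with open("photo.jpg", "rb") as f:
    #       response = client.detect_labels(Image={"Bytes": f.read()},
    #                                       MinConfidence=70)
    # Illustrative response shaped like the actual API output:
    response = {
        "Labels": [
            {"Name": "Furniture", "Confidence": 99.8},
            {"Name": "Fireplace", "Confidence": 96.3},
            {"Name": "Couch", "Confidence": 65.0},  # below threshold, filtered out
        ]
    }
    for line in format_labels(response):
        print(line)
```

Each service (Clarifai, Imagga, Microsoft) exposes an analogous endpoint, which is why the blocks below share the same tag-plus-confidence shape.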

Clarifai
created on 2019-11-16

people 100
two 99.6
adult 99
man 98.2
actress 98.2
child 97.4
woman 97.3
group 97.2
portrait 95.9
family 95.8
offspring 95.8
actor 95.5
administration 94.8
leader 94.7
furniture 94.5
three 94.3
one 93.5
sit 91.9
room 91.2
music 90.3

Imagga
created on 2019-11-16

man 20.1
black 19.5
person 15.4
window 15.3
world 15.2
people 13.9
male 13.8
kin 13.5
old 13.2
portrait 12.9
dress 12.6
dark 11.7
adult 11.7
family 11.6
home 11.2
culture 11.1
house 10.9
vintage 10.7
happy 10.6
sibling 10.4
hair 10.3
wall 10.3
happiness 10.2
performer 10
comedian 10
sexy 9.6
child 9.5
face 9.2
historic 9.2
fashion 9
history 8.9
building 8.9
mother 8.9
decoration 8.8
ancient 8.6
grunge 8.5
historical 8.5
attractive 8.4
one 8.2
holiday 7.9
urban 7.9
sill 7.9
couple 7.8
shop 7.7
youth 7.7
room 7.7
human 7.5
city 7.5
future 7.4
dirty 7.2
looking 7.2
body 7.2
art 7.1
entertainer 7.1
women 7.1
architecture 7.1
night 7.1
love 7.1
interior 7.1
daughter 7
grandfather 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

human face 93.4
person 91.8
toddler 90.8
clothing 90.1
baby 90
black and white 87.6
text 85.4
smile 73.3
boy 68.6
old 42.2
fireplace 21.8

Face analysis

AWS Rekognition

Age 20-32
Gender Male, 95.5%
Calm 90.7%
Disgusted 1%
Sad 1%
Happy 0.6%
Angry 1.2%
Fear 0.2%
Surprised 1.9%
Confused 3.5%

AWS Rekognition

Age 1-5
Gender Female, 88.7%
Angry 0.6%
Happy 91%
Disgusted 0.2%
Fear 0.3%
Confused 1.2%
Surprised 0.3%
Calm 3.9%
Sad 2.5%
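The age, gender, and emotion entries above follow the shape of Rekognition's detect_faces output, which returns one FaceDetail per detected face. This is a hedged sketch of a formatter rendering one such detail into the lines shown; the sample detail is modeled on the first face entry in this record, not a fresh API call:

```python
def summarize_face(detail):
    """Render one Rekognition detect_faces FaceDetail as age/gender/emotion lines."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']:.1f}%",
    ]
    # Rekognition reports emotion types in upper case (e.g. "CALM");
    # capitalize() matches the style used in this record.
    lines += [
        f"{e['Type'].capitalize()} {e['Confidence']:.1f}%"
        for e in detail["Emotions"]
    ]
    return lines

if __name__ == "__main__":
    # Sample FaceDetail mirroring the first AWS Rekognition entry above.
    detail = {
        "AgeRange": {"Low": 20, "High": 32},
        "Gender": {"Value": "Male", "Confidence": 95.5},
        "Emotions": [
            {"Type": "CALM", "Confidence": 90.7},
            {"Type": "CONFUSED", "Confidence": 3.5},
        ],
    }
    for line in summarize_face(detail):
        print(line)
```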

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

VEEVWLLBVLE
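The string above is raw OCR output, most likely noise rather than legible text in the photograph. For reference, Rekognition's detect_text returns a list of detections that could be filtered down to such strings as in this sketch; the response shown is illustrative, not the actual API result for this image:

```python
def extract_text(response, min_confidence=50.0):
    """Collect LINE-type detections from a Rekognition detect_text response."""
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "LINE" and d["Confidence"] >= min_confidence
    ]

if __name__ == "__main__":
    # Illustrative response shaped like the real API output; WORD-level
    # detections are skipped so each line appears only once.
    response = {
        "TextDetections": [
            {"DetectedText": "VEEVWLLBVLE", "Type": "LINE", "Confidence": 55.0},
            {"DetectedText": "VEE", "Type": "WORD", "Confidence": 55.0},
        ]
    }
    print(extract_text(response))
```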