Human Generated Data

Title

Untitled (mother with two children)

Date

copy negative made c. 1960 of an earlier image

People

Artist: John Howell, American, active 1930s–1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21624

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.8
Human 98.8
Clothing 97.6
Apparel 97.6
Person 97.3
Painting 88.8
Art 88.8
Furniture 88.4
Face 83.2
Person 81
Baby 76.5
People 67.4
Bed 67.1
Hat 65.9
Portrait 65.3
Photography 65.3
Photo 65.3
Person 64.6
Drawing 64.3
Collage 62.7
Poster 62.7
Advertisement 62.7
Kid 60.8
Child 60.8
Couch 58.5
Bonnet 56.8

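The machine-generated lists above pair each label with a confidence score on a 0–100 scale. As a minimal sketch (assuming the plain "label score" line format shown in this record, with the data below copied from the Amazon list and an arbitrary example threshold), such a list can be parsed and filtered:

```python
# Parse "label score" lines like those above into (tag, confidence)
# pairs, then keep only high-confidence tags. The sample data is copied
# from the Amazon Rekognition list in this record.
raw = """\
Person 98.8
Clothing 97.6
Painting 88.8
Baby 76.5
Poster 62.7
Bonnet 56.8"""

def parse_tags(text):
    tags = []
    for line in text.splitlines():
        # A label may contain spaces ("facial expression 89.6"),
        # so split the score off the right-hand side only.
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, threshold=90.0):
    return [label for label, score in tags if score >= threshold]

print(filter_tags(parse_tags(raw)))  # -> ['Person', 'Clothing']
```

The same parsing applies to the Clarifai and Imagga lists below, which use the identical "label score" layout.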
Clarifai
created on 2023-10-22

people 100
child 99.5
group 99.2
portrait 98.7
son 98.1
monochrome 97.3
woman 97.2
two 97
baby 96.9
adult 96.4
offspring 95.8
family 95.2
man 93.9
three 93
sibling 92.2
interaction 90.2
girl 90.1
wear 89.8
facial expression 89.6
documentary 88.5

Imagga
created on 2022-03-05

man 37
male 30.6
person 26.5
people 26.2
adult 26.1
grandfather 24.4
black 21.6
portrait 21.3
human 18.7
dark 17.5
attractive 17.5
model 16.3
hair 15.8
love 15.8
bow tie 15
sexy 14.4
one 14.2
couple 13.9
body 13.6
sitting 12.9
expression 12.8
serious 12.4
necktie 11.9
face 11.4
fitness 10.8
lifestyle 10.8
suit 10.8
posing 10.7
happy 10.6
guy 10.6
fashion 10.5
looking 10.4
men 10.3
room 9.6
passion 9.4
clothing 9.2
macho 9
handsome 8.9
happiness 8.6
muscular 8.6
smile 8.5
child 8.4
pretty 8.4
sensuality 8.2
dress 8.1
family 8
home 8
cute 7.9
youth 7.7
jeans 7.6
husband 7.6
wife 7.6
garment 7.6
silhouette 7.4
fit 7.4
device 7.3

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

human face 98.5
person 98.1
text 97.7
wall 97.3
man 96.8
clothing 95.4
baby 75.5
boy 50.9

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Male, 96.6%
Calm 94.4%
Sad 3.7%
Surprised 0.6%
Confused 0.4%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 4-12
Gender Male, 90%
Calm 97%
Sad 1.7%
Fear 0.5%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 27-37
Gender Male, 77.3%
Calm 57.5%
Sad 31.5%
Disgusted 3.1%
Surprised 2.4%
Fear 1.6%
Confused 1.5%
Happy 1.3%
Angry 1%

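Each Rekognition face block above reports a full emotion distribution; the headline reading is simply the highest-scoring entry. A minimal sketch of reducing such a distribution to its dominant emotion (scores copied from the third face above):

```python
# Emotion scores (percent) copied from the third Rekognition face block.
emotions = {
    "Calm": 57.5, "Sad": 31.5, "Disgusted": 3.1, "Surprised": 2.4,
    "Fear": 1.6, "Confused": 1.5, "Happy": 1.3, "Angry": 1.0,
}

def dominant_emotion(scores):
    # Return the (label, score) pair with the highest confidence.
    return max(scores.items(), key=lambda kv: kv[1])

label, score = dominant_emotion(emotions)
print(f"{label} {score}%")  # -> Calm 57.5%
```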
Feature analysis

Amazon

Person
Painting
Person 98.8%
Person 97.3%
Person 81%
Person 64.6%
Painting 88.8%

Text analysis

Amazon

AS