Human Generated Data

Title

Untitled (bearded man and child, seated, three-quarter view)

Date

c. 1860

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs; Kate, Maurice R. and Melvin R. Seiden Purchase Fund for Photographs; Discretionary Fund for the Photograph Department; Robert M. Sedgwick II Fund, P1997.37.9

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.3
Clothing 99.3
Human 99.2
Person 99.2
Person 98.7
Art 89
Painting 89
Footwear 77
Boot 72.9
Furniture 60.2
Home Decor 59.1
Drawing 56.5
Floor 56.1
Sketch 56
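
The Amazon tags above have the shape of output from the AWS Rekognition DetectLabels operation, which returns label names with percentage confidences. A minimal illustrative sketch using boto3 follows; the bucket and object names are hypothetical placeholders, and this is not necessarily how the record was produced.

    import boto3

    # Sketch only: list object/scene labels for one image with AWS Rekognition.
    # "example-bucket" and "photo.jpg" are hypothetical placeholders.
    rekognition = boto3.client("rekognition")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MaxLabels=20,
        MinConfidence=50.0,  # Rekognition confidences are percentages, e.g. "Apparel 99.3"
    )

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))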

Imagga
created on 2022-01-29

kin 31.4
mother 26.5
parent 21
portrait 18.1
old 17.4
dress 17.2
man 16.8
ancient 16.4
statue 16.2
religion 16.1
child 15.5
people 15.1
culture 14.5
art 13.7
male 13.6
face 13.5
sculpture 13.4
religious 12.2
stone 11.8
family 11.6
decoration 11.1
love 11
adult 11
historic 11
traditional 10.8
vintage 10.8
grandfather 10.7
costume 10.5
happy 10
person 9.8
boy 9.6
antique 9.5
historical 9.4
wall 9.4
architecture 9.4
sibling 9.3
makeup 9.2
father 9.1
black 9
dad 8.9
mask 8.8
plaything 8.6
dark 8.4
fashion 8.3
style 8.2
romantic 8
couple 7.8
sepia 7.8
golden 7.7
bride 7.7
grunge 7.7
card 7.7
head 7.6
human 7.5
monument 7.5
one 7.5
closeup 7.4
church 7.4
building 7.3
world 7.2
celebration 7.2
history 7.2
interior 7.1
disguise 7
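
The Imagga tags follow the response shape of Imagga's v2 tagging endpoint, which returns English tag names with confidence scores. An illustrative sketch using the requests library; the credentials and image URL are placeholders, not the museum's workflow.

    import requests

    # Sketch only: request auto-tags for an image URL from the Imagga v2 API.
    # API_KEY, API_SECRET, and the image URL are hypothetical placeholders.
    API_KEY, API_SECRET = "your_api_key", "your_api_secret"
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], item["confidence"])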

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

wall 99.5
human face 96.9
drawing 96.2
clothing 94
person 93.5
baby 91.9
sketch 91.1
toddler 88.5
old 63.8
text 63.2
child 61.7
boy 53.3
stone 6.6
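
The Microsoft tags correspond to the image-tagging feature of Azure's Computer Vision service. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are hypothetical placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch only: tag an image with Azure Computer Vision.
    # Endpoint, key, and image URL are hypothetical placeholders.
    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    result = client.tag_image("https://example.org/photo.jpg")
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))  # scaled to the 0-100 range used above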

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 98.8%
Calm 69.9%
Sad 8%
Fear 6.3%
Confused 5.6%
Surprised 5.3%
Angry 3.1%
Disgusted 1.2%
Happy 0.5%
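
The age range, gender, and emotion percentages above are the fields returned by the Rekognition DetectFaces operation when all facial attributes are requested. An illustrative sketch follows; the image location is a placeholder, not the museum's asset.

    import boto3

    # Sketch only: facial attributes (age range, gender, emotions) from AWS Rekognition.
    # "example-bucket" and "photo.jpg" are hypothetical placeholders.
    rekognition = boto3.client("rekognition")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')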

Microsoft Cognitive Services

Age 3
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
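
The Google Vision entries report likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) for each detected face rather than numeric scores. A minimal sketch with the google-cloud-vision client; the image URI is a hypothetical placeholder.

    from google.cloud import vision

    # Sketch only: per-face likelihoods from the Google Cloud Vision API.
    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/photo.jpg"  # hypothetical placeholder

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)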

Feature analysis

Amazon

Person 99.2%
Painting 89%

Captions

Microsoft

a man standing in front of a building 77.5%
an old photo of a man 77.4%
a man standing in front of an old building 77.3%
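
The captions above match the output of the Azure Computer Vision describe operation, which returns candidate sentences with confidence scores. A short self-contained sketch; the endpoint, key, and image URL are hypothetical placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch only: caption candidates from Azure Computer Vision.
    # Endpoint, key, and image URL are hypothetical placeholders.
    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    description = client.describe_image("https://example.org/photo.jpg", max_candidates=3)
    for caption in description.captions:
        print(caption.text, f"{caption.confidence * 100:.1f}%")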