Human Generated Data

Title

Untitled (portrait of a seated man)

Date

1860s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2688

Machine Generated Data

Tags

Clarifai
created on 2018-08-20

people 99.3
man 97.2
adult 95.5
one 95.3
portrait 92.8
monochrome 91.7
dark 87.6
wear 85.5
sit 85.2
lid 84.9
indoors 83.3
music 81.2
street 79.6
Halloween 78.3
art 77.2
retro 74.5
chair 74.2
smoke 74
woman 73.8
veil 72.2
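
The Clarifai concepts above pair a label with a confidence score, apparently on a 0–100 scale. A minimal sketch of requesting such tags through Clarifai's v2 predict endpoint is shown below; the API key, model ID, and image URL are placeholders, and the response shape follows Clarifai's documentation rather than anything stated in this record.

```python
import requests

# Placeholders: a real Clarifai API key, the ID of a concept model (e.g. the
# general model), and a public URL for the image being tagged.
CLARIFAI_API_KEY = "YOUR_API_KEY"
MODEL_ID = "GENERAL_MODEL_ID"
IMAGE_URL = "https://example.org/seated-man.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 confidence; the list above appears to
# show the same values scaled to 0-100.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```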

Imagga
created on 2018-08-20

banjo 85
stringed instrument 77.2
musical instrument 66.1
man 28.2
person 27.7
male 23.4
black 20.8
adult 20.8
people 20.1
sax 15.7
old 13.9
serious 13.3
portrait 12.9
human 12.7
religion 12.5
one 11.9
attractive 11.9
dark 11.7
couple 11.3
money 11.1
vintage 10.7
statue 10.5
art 10.4
antique 10.4
style 10.4
business 10.3
currency 9.9
body 9.6
love 9.5
grunge 9.4
dollar 9.3
face 9.2
hand 9.1
suit 9
posing 8.9
device 8.8
sexy 8.8
conceptual 8.8
hair 8.7
light 8.7
men 8.6
wind instrument 8.5
silhouette 8.3
sensuality 8.2
dress 8.1
shadow 8.1
looking 8
sculpture 7.8
god 7.6
passion 7.5
religious 7.5
banking 7.3
lady 7.3
figure 7.3
success 7.2
music 7.2
romance 7.1
handsome 7.1
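
Imagga's tags follow the same label-plus-confidence pattern. A comparable sketch against Imagga's v2 /tags endpoint, which uses HTTP Basic auth with an API key and secret (placeholders below, as is the image URL), might look like this:

```python
import requests

# Placeholders: real Imagga credentials and a public image URL.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/seated-man.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Imagga reports confidences on a 0-100 scale, matching the list above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```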

Google
created on 2018-08-20

Microsoft
created on 2018-08-20

person 96.4
man 93.9
indoor 85.9
old 62.8
posing 36.7
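
Tags of this kind can be requested from Azure's Computer Vision service. The sketch below assumes the azure-cognitiveservices-vision-computervision Python package; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: a real Computer Vision endpoint and key, and a public image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

analysis = client.analyze_image(
    "https://example.org/seated-man.jpg",
    visual_features=[VisualFeatureTypes.tags],
)

# Tag confidences come back on a 0-1 scale; the list above shows them as percentages.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```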

Face analysis

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
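
The likelihood wording above ("Very unlikely" through "Very likely") matches the Likelihood enum returned by Google Cloud Vision's face detection. A minimal sketch, assuming a recent google-cloud-vision release and application default credentials; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/seated-man.jpg"  # placeholder

# Likelihood values are integers 0-5 in this order.
likelihoods = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihoods[face.surprise_likelihood])
    print("Anger", likelihoods[face.anger_likelihood])
    print("Sorrow", likelihoods[face.sorrow_likelihood])
    print("Joy", likelihoods[face.joy_likelihood])
    print("Headwear", likelihoods[face.headwear_likelihood])
    print("Blurred", likelihoods[face.blurred_likelihood])
```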

Captions

Microsoft

an old photo of a man 89.1%
old photo of a man 87.4%
a man posing for a photo 83.1%
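
Captions with confidence percentages like these can be produced by Azure Computer Vision's describe operation. A minimal sketch, again assuming the azure-cognitiveservices-vision-computervision package with placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: a real Computer Vision endpoint and key, and a public image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image("https://example.org/seated-man.jpg")

# Each caption pairs free text with a 0-1 confidence, matching the percentages above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```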