Human Generated Data

Title

Mary Tippie (Zouave Mary)

Date

19th century

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2019.155

Machine Generated Data

Tags (confidence %)

Amazon
created on 2019-10-05

Human 98.8
Person 98.8
Apparel 97.3
Hat 97.3
Clothing 97.3
Leisure Activities 94
Banjo 92.4
Musical Instrument 92.4
Chair 85.3
Furniture 85.3
Footwear 66.3
Shoe 66.3
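
The Amazon tags above are the kind of label output produced by AWS Rekognition's DetectLabels operation. A minimal sketch of how such labels might be retrieved with boto3 follows; the file name and the MaxLabels/MinConfidence values are illustrative assumptions, not values taken from this record.

# Minimal sketch: image labels via AWS Rekognition DetectLabels (boto3).
# The file path and thresholds below are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition")

with open("mary_tippie.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=20,
        MinConfidence=60.0,
    )

# Print each label with its confidence, mirroring the "tag score" lines above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")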

Clarifai
created on 2019-10-05

people 99.8
wear 98.6
one 98.5
adult 98.3
portrait 96.2
woman 93.8
veil 91.3
lid 88.7
military 86.4
retro 86
art 85.5
administration 82.6
print 81.2
outfit 80.8
coat 80.8
soldier 78.7
man 77.9
uniform 77.5
war 75.5
weapon 72.5

Imagga
created on 2019-10-05

banjo 47.5
musical instrument 47.4
stringed instrument 42.7
dress 29.8
people 24
adult 24
person 23.2
man 21.5
wind instrument 20.4
portrait 20.1
fashion 19.6
face 17.7
brass 16.5
couple 15.7
male 15.6
happy 14.4
happiness 14.1
cornet 14.1
model 14
attractive 14
black 13.9
clothing 13.8
bride 13.6
smile 13.5
women 12.6
love 12.6
old 12.5
sexy 12
culture 12
hair 11.9
style 11.9
costume 11.7
posing 11.5
married 11.5
outdoor 11.5
brunette 11.3
outdoors 11.2
pretty 11.2
art 11.1
wedding 11
vintage 10.7
holding 10.7
cheerful 10.6
lady 10.5
riding crop 10.2
smiling 10.1
instrument 9.9
religion 9.9
romantic 9.8
outfit 9.7
elegant 9.4
lifestyle 9.4
whip 9.1
device 9.1
one 9
romance 8.9
celebration 8.8
guitar 8.7
cute 8.6
expression 8.5
bouquet 8.5
clothes 8.4
elegance 8.4
color 8.3
life 8.3
human 8.2
makeup 8.2
fun 8.2
stylish 8.1
musician 8.1
singer 8
interior 8
hat 7.8
day 7.8
outside 7.7
hairstyle 7.6
marriage 7.6
studio 7.6
passion 7.5
joy 7.5
dark 7.5
weapon 7.5
guy 7.5
retro 7.4
skirt 7.3
sensual 7.3
music 7.2
sunlight 7.1
family 7.1

Google
created on 2019-10-05

Microsoft
created on 2019-10-05

wall 99.3
person 96
floor 93.2
clothing 86.9
old 78.6
text 60.8
man 55.6

Face analysis

AWS Rekognition

Age 17-29
Gender Male, 95.1%
Surprised 0.4%
Sad 9.6%
Calm 78.8%
Fear 0.2%
Happy 0.1%
Disgusted 0.4%
Confused 7.6%
Angry 2.9%
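
The AWS Rekognition block above (age range, gender, and emotion scores) corresponds to the DetectFaces operation with all facial attributes requested. A minimal sketch with boto3 follows; the file path is an assumption for illustration.

# Minimal sketch: face attributes via AWS Rekognition DetectFaces (boto3).
# The file path is an assumption; Attributes=["ALL"] requests age, gender,
# and emotion estimates like those listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("mary_tippie.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")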

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
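
The Google Vision values above come from face detection, which reports categorical likelihoods (for example VERY_UNLIKELY or VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision Python client follows; the file path is an assumption for illustration.

# Minimal sketch: face likelihoods via the Google Cloud Vision API
# (google-cloud-vision Python client). The file path is an assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("mary_tippie.jpg", "rb") as image_file:  # hypothetical local file
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum; .name gives e.g. "VERY_UNLIKELY".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)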

Feature analysis

Amazon

Person 98.8%
Hat 97.3%
Chair 85.3%
Shoe 66.3%

Categories

Imagga

interior objects 77.4%
paintings art 21.7%

Captions