Human Generated Data

Title

Untitled (two merchants and their wares)

Date

c. 1860-1880

People

Artist: Willoughby Wallace Hooper, British, 1837 - 1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.55

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Clothing 99.2
Apparel 99.2
Human 98.3
Person 98.3
Person 97.7
Art 92.6
Painting 92.6
People 81.8
Hat 81.4
Helmet 80.4
Headband 74
Turban 74
Person 66.3
Cake 62.2
Food 62.2
Dessert 62.2
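
The Amazon tags above follow the shape of an Amazon Rekognition `DetectLabels` response: each label carries a name and a confidence score. A minimal sketch of how such a response might be filtered to high-confidence tags (the sample values are taken from the list above; the helper name is hypothetical):

```python
# Sketch of filtering a Rekognition DetectLabels-style response.
# sample_response mirrors a few of the tags listed above.
sample_response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.2},
        {"Name": "Person", "Confidence": 98.3},
        {"Name": "Painting", "Confidence": 92.6},
        {"Name": "Turban", "Confidence": 74.0},
        {"Name": "Cake", "Confidence": 62.2},
    ]
}

def tags_above(response, threshold):
    """Return (name, confidence) pairs at or above the given threshold."""
    return [(label["Name"], label["Confidence"])
            for label in response["Labels"]
            if label["Confidence"] >= threshold]

print(tags_above(sample_response, 80))
# → [('Clothing', 99.2), ('Person', 98.3), ('Painting', 92.6)]
```

Lower-confidence tags such as "Cake" (62.2) illustrate why a threshold is useful: the scene shows merchants' wares, not desserts.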

Clarifai
created on 2018-10-18

people 100
two 99.2
adult 99
group 98.5
wear 98.2
three 96.7
veil 96.3
woman 95
one 94.8
several 93.2
four 92.3
man 90.8
many 90.2
furniture 89.4
sit 87.4
leader 87.4
group together 86
five 85.8
child 84.9
administration 83.9

Imagga
created on 2018-10-18

dress 30.7
person 25.3
portrait 23.3
bride 23.2
adult 22.2
fashion 21.9
clothing 21.6
people 21.2
women 20.6
wedding 19.3
happy 16.3
happiness 15.7
face 15.6
home 15.2
love 15
smile 15
mother 14.9
lady 14.6
bouquet 14.1
traditional 13.3
attractive 13.3
marriage 12.3
male 12.2
old 11.8
elegance 11.8
gown 11.7
holding 11.6
married 11.5
cheerful 11.4
wife 11.4
celebration 11.2
culture 11.1
hair 11.1
model 10.9
smiling 10.9
hand 10.6
interior 10.6
human 10.5
pretty 10.5
flowers 10.4
mature 10.2
man 10.1
posing 9.8
ceremony 9.7
sexy 9.6
men 9.4
holiday 9.3
indoor 9.1
garment 9
blond 9
family 8.9
look 8.8
couple 8.7
lifestyle 8.7
room 8.6
day 8.6
sitting 8.6
groom 8.5
costume 8.4
black 8.4
summer 8.4
one 8.2
sensuality 8.2
child 8.1
white 8.1
religion 8.1
romance 8
life 7.8
bridal 7.8
luxury 7.7
two 7.6
seller 7.6
clothes 7.5
style 7.4
church 7.4
tradition 7.4
domestic 7.2
cute 7.2
husband 7.1

Google
created on 2018-10-18

Microsoft
created on 2018-10-18

person 97.7
old 91.6
military vehicle 61.7

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 95.9%
Angry 5.1%
Sad 62.2%
Disgusted 4.8%
Surprised 4%
Happy 4.8%
Confused 4.4%
Calm 14.8%

AWS Rekognition

Age 35-53
Gender Male, 90.7%
Disgusted 0.5%
Angry 4.6%
Sad 42%
Surprised 1.2%
Confused 5.6%
Happy 1.9%
Calm 44.4%
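
The emotion percentages above match the shape of a Rekognition `DetectFaces` emotion list. A minimal sketch of picking the dominant emotion for a face (values copied from the first face above; the helper name is hypothetical):

```python
# Sketch of selecting the top emotion from a Rekognition DetectFaces-style
# emotion list; confidences are those reported for the first face above.
face_emotions = [
    {"Type": "ANGRY", "Confidence": 5.1},
    {"Type": "SAD", "Confidence": 62.2},
    {"Type": "DISGUSTED", "Confidence": 4.8},
    {"Type": "SURPRISED", "Confidence": 4.0},
    {"Type": "HAPPY", "Confidence": 4.8},
    {"Type": "CONFUSED", "Confidence": 4.4},
    {"Type": "CALM", "Confidence": 14.8},
]

def dominant_emotion(emotions):
    """Return the emotion entry with the highest confidence score."""
    return max(emotions, key=lambda e: e["Confidence"])

print(dominant_emotion(face_emotions)["Type"])
# → SAD
```

Note that for the second face the scores are nearly split (Calm 44.4% vs. Sad 42%), so a single dominant label should be read with caution.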

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Painting 92.6%
Hat 81.4%
Helmet 80.4%

Captions

Microsoft
created on 2018-10-18

a vintage photo of a person 93.6%
an old photo of a person 93.5%
old photo of a person 93.4%

Text analysis

Amazon

ooo