Human Generated Data

Title

Melon Seller

Date

c. 1891

People

Artist: Sébah & Joaillier

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Daniel and Janet Tassel, 2015.137

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 98.5
Person 98.5
Soil 83.4
Worker 81
Outdoors 64.6
Plant 64
Apparel 57.1
Clothing 57.1

Clarifai
created on 2018-02-16

people 100
adult 99.7
one 99.7
two 98.6
man 98.1
military 97.5
war 97.2
wear 97.1
group 96.1
sit 94.8
soldier 94.6
three 94.3
woman 93.4
four 93.3
portrait 92.9
veil 92.5
bucket 91.3
group together 90.9
weapon 87.9
facial hair 87.5

Imagga
created on 2018-02-16

statue 19.1
man 18.8
sculpture 18.6
person 18.2
fountain 15.2
history 15.2
architecture 14.8
model 14.8
container 14.7
stone 14.6
culture 14.5
people 14.5
rustic 14.4
portrait 14.2
travel 14.1
attractive 14
art 13.3
old 13.2
outdoor 13
ancient 13
adult 12.9
seller 12.4
structure 11.6
black 11.5
can 11.5
lady 11.4
male 11.3
fashion 11.3
human 11.2
sexy 11.2
monument 11.2
body 11.2
tourism 10.7
water 10
religion 9.9
wall 9.7
style 9.6
youth 9.4
famous 9.3
face 9.2
city 9.1
posing 8.9
women 8.7
men 8.6
historical 8.5
religious 8.4
hand 8.4
sky 8.3
bucket 8
lifestyle 7.9
sand 7.9
building 7.9
vessel 7.9
standing 7.8
work 7.7
pretty 7.7
tree 7.7
beach 7.6
tourist 7.5
outdoors 7.5
holding 7.4
figure 7.3
girls 7.3
dirty 7.2
child 7.2
sunset 7.2
cute 7.2

Google
created on 2018-02-16

Microsoft
created on 2018-02-16

outdoor 96.7
ground 96.3
old 93.9
stone 27.2

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 83%
Happy 3.9%
Disgusted 1.6%
Angry 1.8%
Calm 79.5%
Surprised 1.7%
Sad 8.9%
Confused 2.7%

Microsoft Cognitive Services

Age 58
Gender Male

Feature analysis

Amazon

Person 98.5%

Text analysis

Amazon

22
IK