Human Generated Data

Title

Untitled (vegetable merchant)

Date

c. 1860-1880

People

Artist: Willoughby Wallace Hooper, British, 1837-1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.74

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Person 98.8
Human 98.8
Person 97.8
Person 97.7
Person 96.8
Market 94.3
Nature 92.2
Outdoors 85.8
Clothing 84.5
Apparel 84.5
Urban 81.5
Shop 78.9
Bazaar 78.9
Countryside 73.3
Building 65.8
Rural 65.6
Hut 58.9
Shelter 58.7
Plant 56.9
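Tag lists like the one above are label/confidence pairs returned by each service's label-detection API (here, Amazon's). A minimal sketch of filtering such pairs by a confidence threshold, using a few values copied from the list above (the list structure and the 80-point threshold are illustrative assumptions, not the service's actual response format):

```python
# A few label/confidence pairs copied from the Amazon tag list above.
labels = [
    ("Person", 98.8), ("Market", 94.3), ("Nature", 92.2),
    ("Outdoors", 85.8), ("Bazaar", 78.9), ("Hut", 58.9), ("Plant", 56.9),
]

def confident_labels(pairs, threshold=80.0):
    """Keep labels at or above the confidence threshold, highest first."""
    return sorted(
        [(name, conf) for name, conf in pairs if conf >= threshold],
        key=lambda item: item[1],
        reverse=True,
    )

print(confident_labels(labels))
```

With the sample data, only the four labels scoring 80 or above survive the filter.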

Clarifai
created on 2018-10-18

people 100
adult 99.6
group 99.2
woman 98
two 97.2
home 96.7
man 96.7
several 96.6
four 96.6
wear 94
child 93.7
three 93.3
one 93
many 92.8
group together 91.5
five 90.9
vehicle 90.6
merchant 90.4
war 88.7
offspring 85.9

Imagga
created on 2018-10-18

seller 100
travel 20.4
house 20
roof 19.6
old 18.8
rock 17.4
sky 17.2
landscape 16.4
stone 16
mountain 16
tourism 14
village 13.9
outdoors 13.4
people 13.4
architecture 13.3
summer 12.9
construction 12.8
man 12.8
building 12.7
rural 12.3
vacation 12.3
home 12
outdoor 11.5
wall 11.1
religion 10.7
park 10.7
tourist 10.4
culture 10.3
thatch 10.1
person 9.8
ancient 9.5
brick 9.4
relax 9.3
tree 9.2
outside 8.6
temple 8.5
tropical 8.5
adult 8.4
garden 8.4
peaceful 8.2
farm 8
women 7.9
country 7.9
sitting 7.7
dirt 7.6
hut 7.6
stall 7.5
serene 7.5
relaxation 7.5
resort 7.5
desert 7.5
mountains 7.4
historic 7.3
sun 7.2
smiling 7.2
lifestyle 7.2
day 7.1

Google
created on 2018-10-18

Microsoft
created on 2018-10-18

outdoor 94.4
old 87.3
building 84.1
posing 40.6

Face analysis

AWS Rekognition

Age 20-38
Gender Male, 54%
Angry 10.4%
Calm 31.2%
Confused 5.4%
Disgusted 12.8%
Surprised 4%
Happy 8.6%
Sad 27.6%

AWS Rekognition

Age 35-52
Gender Male, 99.1%
Sad 56.4%
Calm 11.8%
Happy 1.3%
Confused 16.3%
Disgusted 1.4%
Surprised 2.8%
Angry 10%

AWS Rekognition

Age 35-55
Gender Male, 59.2%
Happy 1.9%
Disgusted 2.6%
Angry 5.7%
Confused 2.6%
Sad 33.7%
Surprised 3.8%
Calm 49.8%

AWS Rekognition

Age 19-36
Gender Male, 93%
Angry 14.1%
Disgusted 53.2%
Confused 5.7%
Sad 6.7%
Surprised 6.9%
Calm 8.1%
Happy 5.3%

AWS Rekognition

Age 35-55
Gender Female, 55.6%
Surprised 7%
Happy 4.6%
Angry 4.5%
Calm 27.2%
Sad 50.8%
Disgusted 2.5%
Confused 3.4%
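Each AWS Rekognition face record above is an emotion distribution, and the dominant emotion is simply the highest-scoring entry. A minimal sketch using the values from the last face listed (the dict structure is an illustrative assumption, not Rekognition's actual response shape):

```python
# Emotion scores for one detected face, copied from the record above.
emotions = {
    "Surprised": 7.0, "Happy": 4.6, "Angry": 4.5, "Calm": 27.2,
    "Sad": 50.8, "Disgusted": 2.5, "Confused": 3.4,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # "Sad" is the highest at 50.8%
```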

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people standing in front of a building 88.9%
an old photo of a person 88.8%
a person standing in front of an old building 88.7%