Human Generated Data

Title

Untitled ("Making Kucha Well")

Date

c. 1860-1880

People

Artist: Willoughby Wallace Hooper, British, 1837-1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.75

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Human 99.7
Person 99.7
Person 99.6
Person 99.5
Soil 99.4
Person 99.3
Person 98.1
Animal 97.4
Cow 97.4
Cattle 97.4
Mammal 97.4
Outdoors 96.2
Garden 95.7
Gardener 92.9
Worker 92.9
Gardening 92.9
Person 77.8
Wood 73.1
Ground 59
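
The tags above have the shape of Amazon Rekognition DetectLabels output: a label name plus a confidence percentage, with repeated entries for each detected instance. A minimal sketch of how such tags could be regenerated with boto3, assuming configured AWS credentials; the file name and thresholds below are illustrative placeholders, not part of this record:

import boto3

client = boto3.client("rekognition")

# "photo.jpg" is a placeholder path, not a file referenced by this record.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the record lists roughly twenty tags
    MinConfidence=50.0,  # illustrative cutoff
)

for label in response["Labels"]:
    # Prints one "Name Confidence" pair per label, e.g. "Soil 99.4".
    print(f'{label["Name"]} {label["Confidence"]:.1f}')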

Clarifai
created on 2018-10-18

people 100
adult 99.9
group 99.8
group together 99.3
many 98.9
man 98.5
war 98.4
two 98.2
one 98
military 98
soldier 97.1
several 96.7
woman 96.1
child 95
wear 93.8
three 93.5
vehicle 93.1
four 92.3
weapon 92
skirmish 91.7

Imagga
created on 2018-10-18

tool 54.4
plow 35.4
barrow 30.1
handcart 24.5
wheeled vehicle 23.2
shovel 20.3
outdoor 19.1
outdoors 19
man 18.8
beach 17
sunset 16.2
people 16.2
summer 16.1
person 15.8
vehicle 15.7
adult 14.9
sand 14.4
landscape 14.1
travel 14.1
hand tool 13.5
male 13.5
boy 13
rake 12.8
hoe 12.7
sea 12.5
park 12.4
water 12
sport 12
field 11.7
vacation 11.5
grass 11.1
sky 10.8
tree 10.5
sun 10.5
outside 10.3
mountain 9.8
hiking 9.6
men 9.5
child 9.4
active 9.1
sunlight 8.9
rural 8.8
couple 8.7
rock 8.7
day 8.6
conveyance 8.6
old 8.4
fun 8.2
countryside 8.2
teenager 8.2
farm 8
lifestyle 8
model 7.8
attractive 7.7
two 7.6
earth 7.6
sunrise 7.5
leisure 7.5
ocean 7.5
sports 7.4
lake 7.3
activity 7.2
women 7.1

Google
created on 2018-10-18

tree 78.5
soil 74.1
black and white 63.5
plant 58.9
vehicle 51.2
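
Google's tags come from label detection, which scores labels between 0 and 1 internally; the record appears to show them scaled to percentages. A minimal sketch with the google-cloud-vision client, assuming configured Google Cloud credentials and a placeholder local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path; the record does not reference a local file.
with open("photo.jpg", "rb") as f:
    content = f.read()

response = client.label_detection(image=vision.Image(content=content))

for label in response.label_annotations:
    # label.score is in [0, 1]; scale to match the record's percentages.
    print(f"{label.description} {label.score * 100:.1f}")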

Microsoft
created on 2018-10-18

outdoor 99.9
old 91.3
standing 79.7
vintage 32
raft 10.8

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Disgusted 45.2%
Sad 49.5%
Angry 45.3%
Surprised 45.3%
Happy 46.8%
Calm 47.6%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Disgusted 45%
Calm 45.5%
Sad 54.1%
Angry 45.2%
Confused 45.1%
Happy 45.1%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Female, 70.3%
Calm 89.6%
Confused 1%
Sad 4.5%
Disgusted 0.7%
Happy 0.9%
Angry 2.4%
Surprised 1%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Disgusted 45.9%
Happy 49.9%
Surprised 45.3%
Angry 45.5%
Calm 45.4%
Confused 45.3%
Sad 47.7%
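
The four face records above follow the shape of Rekognition DetectFaces output: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch, under the same placeholder-file and credentials assumptions as the labels example above:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "CALM"); the record shows
        # them capitalized.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')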

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely
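
Unlike the other services, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of how rows like these could be reproduced, with the same placeholder-file and credentials assumptions as the earlier Google example:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)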

Feature analysis

Amazon

Person 99.7%
Cow 97.4%