Human Generated Data

Title

Untitled (three workers and two yoked cattle working at a waterwheel or a mill)

Date

c. 1860-1880

People

Artist: Willoughby Wallace Hooper, British, 1837-1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.68

Machine Generated Data

Tags (label, confidence 0-100)

Amazon
created on 2019-04-10

Mammal 98.7
Cow 98.7
Cattle 98.7
Animal 98.7
Human 98.4
Person 98.4
Person 91.8
Cow 75.1
Person 72.7
Person 72.2
Machine 63.6
Transportation 61.6
Vehicle 61.6
Spoke 58.9
Tree 56
Plant 56
Wheel 56
Construction 55.3
Person 54
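
These label/confidence pairs match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of producing such tags with boto3 follows; the region, file path, and thresholds are assumptions, not details from this record.

import boto3

# Rekognition client; the region is an assumption.
client = boto3.client("rekognition", region_name="us-east-1")

# Load the photograph from a hypothetical local path.
with open("hooper_waterwheel.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# the same shape as the "Cow 98.7" / "Person 98.4" pairs above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")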

Clarifai
created on 2018-10-18

people 100
group together 99.7
group 99.6
adult 99.6
vehicle 99.3
war 98.8
many 98.6
military 98.5
two 98.2
skirmish 98.1
soldier 98
one 97.6
several 96.9
transportation system 96.6
man 96
cavalry 94.7
weapon 93.6
three 93.2
four 92.9
cart 88.7
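
Clarifai's concepts come from its public general image-recognition model. A sketch against Clarifai's v2 REST API follows; the model path, API key, and image URL are placeholder assumptions, and the API has evolved since this record's 2018 tagging date.

import requests

# Clarifai v2 predict call for the public general model; the key,
# model path, and image URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
url = ("https://api.clarifai.com/v2/users/clarifai/apps/main"
       "/models/general-image-recognition/outputs")
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/hooper.jpg"}}}]}

resp = requests.post(url, json=payload,
                     headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts carry a 0-1 "value"; scaling by 100 gives the
# "people 100" / "group together 99.7" style scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")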

Imagga
created on 2018-10-18

shopping cart 85.6
handcart 74
wheeled vehicle 59.1
container 35.1
snow 24.2
landscape 22.3
tree 20.9
conveyance 20.3
winter 17.9
cold 16.4
swing 15.5
old 15.3
sky 15.3
wood 15
architecture 14.8
chair 14.8
building 14.4
trees 14.2
outdoors 14.2
forest 13.9
outdoor 13
travel 12.7
park 12.3
machine 11.9
season 11.7
rural 11.5
water 11.3
scene 11.3
structure 11.2
mechanical device 11
plaything 11
house 10.9
empty 10.3
construction 10.3
industry 10.2
vintage 9.9
tool 9.7
sun 9.7
frozen 9.6
path 9.4
weather 9.3
mechanism 9.3
concrete mixer 9.2
city 9.1
environment 9
scenery 9
beach 8.9
light 8.7
day 8.6
garden 8.4
ice 8.3
device 8.1
seat 7.9
country 7.9
scenic 7.9
textured 7.9
sea 7.8
black 7.8
barrow 7.7
grunge 7.7
relax 7.6
relaxation 7.5
tourism 7.4
bench 7.4
exterior 7.4
vacation 7.4
danger 7.3
tourist 7.3
furniture 7.2
area 7.1
river 7.1
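
Imagga's tags follow the same label-plus-confidence pattern, served by its v2 tagging endpoint. A sketch follows; the credential pair and image URL are placeholders.

import requests

# Imagga v2 tagging endpoint with HTTP basic auth; the key/secret
# pair and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/hooper.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

# Imagga reports confidences on a 0-100 scale, matching pairs such
# as "shopping cart 85.6" and "handcart 74" above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")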

Google
created on 2018-10-18

(no tags recorded)

Microsoft
created on 2018-10-18

outdoor 99.9
tree 99.2
ground 98.3
old 94
drawn 63.4
carriage 56
pulling 48.9
cart 46.3
vintage 40
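
The Microsoft tags correspond to Azure Computer Vision's image-analysis "Tags" feature. A sketch against the REST endpoint follows; the resource endpoint, key, and v3.2 API version are assumptions, and the service would have run an earlier API version in 2018.

import requests

# Azure Computer Vision analyze call; endpoint, key, and the v3.2
# API version are assumptions.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"},
    json={"url": "https://example.org/hooper.jpg"},
)
resp.raise_for_status()

# Confidence is returned on a 0-1 scale; x100 gives the
# "outdoor 99.9" / "tree 99.2" style values above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")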

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Angry 48.3%
Calm 48.5%
Confused 45.3%
Disgusted 45.6%
Surprised 45.5%
Happy 45.8%
Sad 46%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Calm 45.4%
Disgusted 45%
Surprised 45%
Angry 45%
Happy 45%
Confused 45.1%
Sad 54.4%

AWS Rekognition

Age 10-15
Gender Female, 53.4%
Sad 54.8%
Disgusted 45%
Angry 45.1%
Happy 45%
Confused 45%
Surprised 45%
Calm 45.1%
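
The age-range, gender, and emotion estimates above are the per-face fields returned by Rekognition's DetectFaces operation. A sketch with boto3 follows; the region and file path are assumptions.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("hooper_waterwheel.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion
# estimates, the same fields reported in the blocks above.
response = client.detect_faces(Image={"Bytes": image_bytes},
                               Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")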

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female
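
The point age and gender labels match what the Face API's detect operation returned at the time; Microsoft has since retired these attributes. A sketch against the historical v1.0 REST endpoint follows; the resource endpoint, key, and image URL are placeholders.

import requests

# Face API v1.0 detect; endpoint and key are placeholders, and the
# age/gender attributes have since been retired by Microsoft.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"},
    json={"url": "https://example.org/hooper.jpg"},
)
resp.raise_for_status()

# Each face carries a point age estimate and a gender label,
# matching the "Age 32 / Gender Male" entries above.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {round(attrs['age'])}")
    print(f"Gender {attrs['gender'].title()}")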

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
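
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why these entries read "Very unlikely". A sketch with the google-cloud-vision client follows; the 2.x library version and the file path are assumptions.

from google.cloud import vision

# Credentials are picked up from GOOGLE_APPLICATION_CREDENTIALS.
client = vision.ImageAnnotatorClient()

with open("hooper_waterwheel.jpg", "rb") as f:  # hypothetical path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each annotation exposes the six likelihood fields shown above.
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)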

Feature analysis

Amazon

Cow 98.7%
Person 98.4%
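
These feature entries correspond to per-instance detections: alongside each whole-image label, Rekognition's DetectLabels can return bounding-box Instances for labels such as Cow and Person. A sketch of reading them, under the same boto3 assumptions as the earlier sketches:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("hooper_waterwheel.jpg", "rb") as f:  # hypothetical path
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels such as Cow and Person may carry Instances with bounding
# boxes and per-instance confidences, which is what the cropped
# "Cow 98.7%" / "Person 98.4%" features correspond to.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"box left={box['Left']:.2f} top={box['Top']:.2f} "
              f"width={box['Width']:.2f} height={box['Height']:.2f}")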