Human Generated Data

Title

Untitled (a cart with men)

Date

c. 1860-1880

People

Artist: Willoughby Wallace Hooper, British, 1837-1912

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2018.61

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Mammal 99.7
Animal 99.7
Horse 99.7
Wheel 99.7
Machine 99.7
Horse Cart 99.6
Vehicle 99.6
Transportation 99.6
Wagon 99.6
Person 99.5
Human 99.5
Person 99.1
Person 93
Carriage 86.2
Person 76.9
Person 71.6
Bicycle 69.6
Bike 69.6

Clarifai
created on 2018-10-18

people 100
group 98.9
adult 98.7
group together 97.9
cart 97.8
transportation system 97.6
vehicle 97.5
carriage 97.2
man 96.3
two 96.3
wagon 94.8
one 94
many 93.9
street 93.4
three 91.7
cavalry 90.7
administration 89.2
four 87.6
several 86.6
soldier 85.8

Imagga
created on 2018-10-18

carriage 100
horse cart 64.2
cart 59.8
wagon 43.2
old 27.9
wheeled vehicle 26.2
transport 21
horse 20.9
transportation 20.6
vehicle 20.1
wheel 19.8
bicycle 19.6
street 17.5
city 17.5
bike 16.6
chair 16.1
seat 15.5
travel 15.5
antique 13
architecture 12.5
outdoor 12.2
tourist 11.8
summer 11.6
urban 11.4
sun 11.3
building 11.2
house 10.9
vacation 10.6
rural 10.6
wall 10.3
wood 10
vintage 9.9
tourism 9.9
wheels 9.8
cycle 9.8
ride 9.7
wooden 9.7
table 9.5
history 8.9
sky 8.9
culture 8.5
brick 8.5
stone 8.4
relax 8.4
window 8.2
park 8.2
trees 8
grass 7.9
country 7.9
scene 7.8
sunny 7.7
outside 7.7
tree 7.7
rustic 7.7
beach 7.6
outdoors 7.5
landscape 7.4
historic 7.3
road 7.2
sunlight 7.1
patio 7.1

Google
created on 2018-10-18

Microsoft
created on 2018-10-18

building 100
outdoor 100
carriage 99.8
ground 99.3
drawn 99
pulling 97.9
horse 95.6
cart 91.5
old 82.9
street 81.7
transport 78.7
horse-drawn vehicle 72.3
attached 39.1
pulled 38

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Sad 48.9%
Calm 48.7%
Surprised 45.2%
Disgusted 45.5%
Happy 45.3%
Angry 45.5%
Confused 45.9%

AWS Rekognition

Age 26-44
Gender Female, 53.6%
Disgusted 48.9%
Happy 45.1%
Surprised 45.3%
Sad 45.8%
Calm 45.8%
Angry 49%
Confused 45.1%

AWS Rekognition

Age 38-59
Gender Female, 50.9%
Angry 48.2%
Calm 47.6%
Confused 45.4%
Disgusted 46.1%
Surprised 46%
Happy 45.2%
Sad 46.6%

AWS Rekognition

Age 30-47
Gender Female, 53.2%
Calm 50.1%
Surprised 45.6%
Angry 45.3%
Happy 46.4%
Disgusted 45.3%
Confused 45.5%
Sad 46.8%

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Horse 99.7%
Wheel 99.7%
Person 99.5%
Bicycle 69.6%