Human Generated Data

Title

A house in the Delta

Date

1963, printed 2010

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 2013.114

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Nature 99.8
Outdoors 99.7
Person 99.5
Human 99.5
Building 98.9
Shelter 98.9
Countryside 98.9
Rural 98.9
Person 98.7
Person 98.6
Housing 98.1
Person 96.7
Person 96.6
Machine 96.3
Wheel 96.3
Person 94.9
Person 94.9
Person 94.2
Hut 94.2
Shack 93.2
Person 91.6
House 87.7
Wheel 86.9
Furniture 80.8
Bench 80.8
Cabin 70.3
Soil 59.3
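
The Amazon tags above, like the feature analysis percentages at the end of this record, are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of such a call via boto3 follows; the bucket and object names are hypothetical placeholders, and configured AWS credentials are assumed.

    # Sketch: generating image labels with AWS Rekognition DetectLabels.
    # Bucket and object names are hypothetical; requires boto3 and AWS credentials.
    import boto3

    client = boto3.client("rekognition")

    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "lyon-delta.jpg"}},
        MinConfidence=50,  # drop labels below 50% confidence
    )

    # Each label carries a name and a 0-100 confidence, e.g. "Nature 99.8".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')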

Clarifai
created on 2018-02-09

vehicle 98.3
people 95.9
vintage 93.2
house 93
transportation system 92.2
machine 89.4
tractor 89.4
truck 89.2
building 88.7
old 88
wagon 86
industry 85.8
expression 79.8
no person 79.3
equipment 76.9
abandoned 76.4
street 76.3
home 75.9
machinery 75.3
family 74.4
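
The Clarifai tags follow the same label/confidence pattern, scored by Clarifai's general image-recognition model. A rough sketch against the v2 REST endpoint is below; the API key, image URL, and model identifier are placeholders, and the response shape is an assumption based on the public v2 API of that era.

    # Sketch: tagging an image with Clarifai's general model via the v2 REST API.
    # API key, image URL, and model id are placeholders.
    import requests

    MODEL_ID = "general-image-recognition"  # hypothetical model identifier
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/lyon-delta.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Concepts arrive with a 0-1 "value"; scale to match the 0-100 list above.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')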

Imagga
created on 2018-02-09

building 41.9
device 33.8
architecture 33.3
house 30.2
structure 27.8
sky 24.2
old 23.7
snow 23.3
landscape 20.8
travel 18.3
history 17.9
winter 16.2
shelter 14
city 13.3
hut 13
exterior 12.9
construction 12.8
landmark 12.6
wood 12.5
roof 12.3
wooden 12.3
scenic 12.3
outdoor 12.2
window 12
home 12
tourism 11.6
facade 11.4
wall 11.2
cold 11.2
historic 11
scenery 10.8
vintage 10.8
water 10.7
office 10.5
outdoors 10.5
church 10.2
barn 10
park 9.9
religion 9.9
river 9.8
rural 9.7
country 9.7
brick 9.4
culture 9.4
light 9.4
clouds 9.3
tower 9.2
trees 8.9
grass 8.7
sea 8.6
estate 8.5
industry 8.5
tree 8.5
monument 8.4
mountain 8.3
weather 8.2
farm 8
ancient 7.8
mechanism 7.6
buildings 7.6
field 7.5
traditional 7.5
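
Imagga exposes its tagger as a REST endpoint with HTTP Basic authentication, returning the same kind of 0-100 confidences listed above. A short sketch follows; the credentials and image URL are placeholders.

    # Sketch: requesting tags from the Imagga v2 tagging endpoint.
    # Credentials and image URL are placeholders; Imagga uses HTTP Basic auth.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/lyon-delta.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        # Each entry pairs a 0-100 confidence with a language-keyed tag name.
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')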

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

outdoor 97.2
old 61.5

Face analysis

Amazon

AWS Rekognition

Age 49-69
Gender Male, 51.7%
Disgusted 45.7%
Calm 46.4%
Confused 45.4%
Surprised 45.3%
Sad 51.2%
Happy 45.3%
Angry 45.7%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
Sad 50.4%
Calm 49.5%
Disgusted 49.5%

AWS Rekognition

Age 35-53
Gender Female, 52.5%
Happy 45.3%
Surprised 45.1%
Disgusted 45.2%
Sad 49%
Angry 45.2%
Confused 45.1%
Calm 50.1%

AWS Rekognition

Age 9-14
Gender Female, 50.2%
Happy 50.1%
Disgusted 49.5%
Calm 49.7%
Surprised 49.5%
Sad 49.6%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Calm 50.1%
Sad 49.7%
Happy 49.6%
Surprised 49.5%
Angry 49.6%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 14-23
Gender Female, 50.4%
Happy 49.6%
Calm 50.3%
Confused 49.5%
Sad 49.6%
Surprised 49.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 16-27
Gender Male, 50.3%
Confused 49.5%
Surprised 49.5%
Happy 50%
Angry 49.5%
Disgusted 49.5%
Sad 49.6%
Calm 49.9%
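
Each face record above pairs an estimated age range and gender with confidence scores over seven emotions, which is the shape of output the AWS Rekognition DetectFaces API produces when all facial attributes are requested. A minimal sketch via boto3; the bucket and object names are hypothetical placeholders.

    # Sketch: per-face age, gender, and emotion estimates with AWS Rekognition.
    # Bucket and object names are hypothetical; requires boto3 and AWS credentials.
    import boto3

    client = boto3.client("rekognition")
    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "lyon-delta.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # Emotion types arrive uppercase (e.g. "HAPPY"); match the record's casing.
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')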

Feature analysis

Amazon

Person 99.5%
Wheel 96.3%
Bench 80.8%