Human Generated Data

Title

The Line

Date

1967-1969

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of The Mr. and Mrs. Stanley Marcus Foundation, P1972.23

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-03-29

Person 99.7
Human 99.7
Person 99.3
Person 99.1
Person 98.8
Person 98.4
Person 98.3
Person 97.6
Outdoors 94.1
Person 93.5
Person 93.1
Person 92.7
Mammal 91.6
Horse 91.6
Animal 91.6
Soil 89.9
Water 87.7
Person 84.7
Person 84.7
Person 79.1
Crowd 76.1
Person 73.2
Person 72.4
Person 71.9
People 71
Person 70.2
Fishing 67.1
Person 66.8
Field 66.4
Person 65.4
Ground 62.7
Leisure Activities 62
Military 58.7
Angler 56.1
Military Uniform 55.5
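
The Amazon tags above are label/confidence pairs of the kind produced by an object-and-scene detection service. As a minimal sketch only (it assumes the AWS Rekognition DetectLabels API, a hypothetical local file name, and an assumed confidence cutoff, none of which are stated in the source), output in this shape could be reproduced with boto3:

    # Hedged sketch: assumes AWS Rekognition DetectLabels on a local copy of
    # the image. The file name and MinConfidence value are illustrative, not
    # taken from the catalog entry.
    import boto3

    rekognition = boto3.client("rekognition")  # region/credentials come from the environment

    with open("the_line.jpg", "rb") as f:      # hypothetical local file name
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,                  # assumed cutoff; the lowest tag shown is ~55.5
        )

    # Print "Label Confidence" pairs, matching the layout of the tag list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')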

Clarifai
created on 2018-02-09

people 99.9
group together 99.5
group 99.5
many 99.3
military 98.8
war 98.1
man 97.7
adult 97.4
soldier 97.3
crowd 92.3
skirmish 92
vehicle 91.4
cavalry 90.4
several 87.8
watercraft 85.7
weapon 85.1
gun 82.5
army 80.8
wear 77.6
woman 74.9

Imagga
created on 2018-02-09

sky 27
landscape 25.3
travel 23.9
line 19.1
rope 17.3
outdoor 16.8
people 16.2
old 16
fisherman 15.8
tourism 15.7
summer 15.4
city 15
vacation 14.7
tree 14.7
water 14.7
park 14.2
man 13.4
structure 13
building 12.8
clouds 12.7
road 12.6
sea 12.5
architecture 12.5
scenery 11.7
silhouette 11.6
rural 11.5
walking 11.4
leisure 10.8
mountain 10.8
hiking 10.6
device 10.4
day 10.2
weapon 10.2
lake 10.1
outdoors 10
male 9.9
environment 9.9
history 9.8
rock 9.6
season 9.4
house 9.3
field 9.2
active 9
recreation 9
snow 8.9
trees 8.9
sun 8.9
beach 8.8
urban 8.7
grass 8.7
valley 8.6
sport 8.6
ocean 8.5
fountain 8.5
person 8.4
countryside 8.2
tourist 8.2
landmark 8.1
horizon 8.1
sunset 8.1
river 8
sand 8
scenic 7.9
maze 7.9
stone 7.7
shore 7.6
hill 7.5
town 7.4
holiday 7.2

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

sky 98.2
outdoor 97.4
grass 96.1
white 71.5
old 70.3
black 69.5
group 61.3
line 30.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Happy 49.7%
Angry 49.6%
Surprised 49.5%
Calm 50.1%
Disgusted 49.6%
Sad 49.6%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.3%
Disgusted 49.6%
Angry 49.6%
Sad 49.8%
Confused 49.5%
Happy 49.5%
Calm 49.9%
Surprised 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Sad 49.7%
Surprised 49.6%
Calm 49.6%
Confused 49.5%
Angry 49.6%
Happy 49.9%
Disgusted 49.6%

AWS Rekognition

Age 35-55
Gender Female, 50.2%
Angry 49.6%
Disgusted 49.5%
Calm 49.9%
Happy 49.5%
Sad 49.8%
Surprised 49.6%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Sad 49.6%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 49.5%
Calm 50.3%

AWS Rekognition

Age 38-57
Gender Female, 50%
Angry 49.5%
Confused 49.5%
Disgusted 49.5%
Sad 50.4%
Happy 49.6%
Calm 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Male, 55%
Disgusted 45%
Surprised 45.1%
Confused 45.1%
Calm 54.6%
Happy 45.1%
Sad 45.1%
Angry 45.1%
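
Each face analysis block above reports an estimated age range, a gender estimate with confidence, and per-emotion confidences. As a hedged sketch (the file name and attribute selection are assumptions, not documented in the source), results with these fields could be obtained from the AWS Rekognition DetectFaces API via boto3:

    # Hedged sketch: assumes AWS Rekognition DetectFaces with all facial
    # attributes requested. The file name is illustrative.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("the_line.jpg", "rb") as f:      # hypothetical local file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],                # request age range, gender, emotions, etc.
        )

    # Print one block per detected face, mirroring the fields shown above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')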

Feature analysis

Amazon

Person 99.7%
Horse 91.6%