Human Generated Data

Title

Demonstrations at an “all-white” swimming pool in Cairo, Illinois

Date

1962, printed 2010

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 2013.108

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 99.9
Person 99.9
Person 99.8
Person 99.6
Person 99.5
Person 99.5
Apparel 99.5
Clothing 99.5
Person 98.1
Person 96.3
Person 96.2
Person 95.4
Person 83
Pants 73.8
Footwear 72.5
Shoe 72.5
Hat 72.3
Shorts 70.9
Person 70

Clarifai
created on 2018-02-09

people 99.7
group 97.6
group together 97.2
man 96.2
adult 95
many 91
administration 89.7
woman 89.5
several 89.4
offense 88.7
police 86
street 84.8
wear 82.5
child 80.2
war 80.1
law 77.6
leader 74.9
military 74.5
segregation 74.4
five 74.4

Imagga
created on 2018-02-09

barbershop 100
shop 100
establishment 35.2
city 21.6
building 16.8
urban 16.6
people 15.6
window 15.6
street 14.7
door 13.6
business 13.4
old 13.2
man 12.8
travel 12.7
sign 12
wall 12
architecture 11.7
glass 10.9
men 10.3
office 10.3
road 9.9
male 9.9
historic 9.2
transportation 9
history 8.9
entrance 8.7
light 8.7
adult 8.4
house 8.4
structure 8.3
alone 8.2
dirty 8.1
home 8
sky 7.6
outdoors 7.5
vintage 7.4
exterior 7.4
transport 7.3
room 7.3
black 7.2
women 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 97.9
standing 91.2
group 73.2
people 64.4
posing 47.8

Face analysis

AWS Rekognition

Age 30-47
Gender Male, 99.6%
Angry 5.7%
Confused 20.6%
Disgusted 1.7%
Happy 0.8%
Surprised 3.7%
Sad 10.3%
Calm 57.1%

AWS Rekognition

Age 26-43
Gender Female, 67.8%
Disgusted 1.4%
Sad 35.7%
Angry 57.1%
Calm 3.4%
Surprised 0.9%
Happy 0.7%
Confused 0.9%

AWS Rekognition

Age 49-69
Gender Male, 81.4%
Surprised 4.1%
Happy 0.8%
Disgusted 1.4%
Calm 85.4%
Sad 3.1%
Angry 3.3%
Confused 1.9%

AWS Rekognition

Age 30-47
Gender Male, 99.8%
Calm 45.5%
Confused 10.9%
Surprised 4%
Disgusted 5.2%
Happy 1.4%
Angry 18.5%
Sad 14.5%

AWS Rekognition

Age 35-52
Gender Male, 73.7%
Disgusted 1.9%
Sad 50.4%
Calm 12.1%
Surprised 4.2%
Happy 2.3%
Confused 3%
Angry 26.1%

AWS Rekognition

Age 15-25
Gender Male, 87.4%
Calm 79.4%
Disgusted 3.6%
Angry 4.8%
Happy 3.1%
Confused 1.7%
Sad 4%
Surprised 3.4%

AWS Rekognition

Age 26-43
Gender Male, 95.8%
Disgusted 2.5%
Calm 59.4%
Happy 1.2%
Angry 6.6%
Sad 22.5%
Confused 5.8%
Surprised 2.1%

AWS Rekognition

Age 20-38
Gender Female, 54.1%
Happy 45.3%
Surprised 45.2%
Disgusted 45.3%
Confused 45.2%
Angry 45.3%
Calm 47.7%
Sad 50.9%

AWS Rekognition

Age 26-43
Gender Female, 51%
Disgusted 45.2%
Surprised 45.4%
Happy 48.1%
Confused 45.2%
Calm 46%
Sad 49.9%
Angry 45.3%

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Shoe 72.5%

Text analysis

Amazon

PRIVATE
MEMBERS
PRIVATE PooL
MEMBERS ONLy
PooL
ONLy

Google

RIVATE『00L ENBERS UNLL
RIVATE
00L
ENBERS
UNLL