Human Generated Data

Title

Untitled (copy photograph of Ben Shahn's Rikers Island Penitentiary mural study: prisoners descending stairs)

Date

1934-1935

People

Artist: Walker Evans, American, 1903-1975

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P2000.45

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 99.2
Person 97.8
Person 94.1
Person 92.7
Person 92
Person 91.1
Handrail 87.9
Banister 87.9
People 84.7
Painting 80.5
Art 80.5
Person 62
Person 61.9
Military 61.5
Sailor Suit 61.2
Military Uniform 57.2

Imagga
created on 2021-12-14

architecture 56.6
prison 50.2
building 47.3
correctional institution 40.4
old 34.2
city 31.8
penal institution 30.3
tourism 27.2
church 26.8
travel 26.8
step 26.7
historic 23.9
support 23.7
institution 23.1
ancient 21.6
urban 21
landmark 19.9
history 19.7
window 18.7
religion 17.9
cathedral 17.8
stone 17.7
column 17.6
arch 17.6
street 17.5
tourist 17.3
famous 16.8
device 16.7
exterior 16.6
balcony 16.4
wall 15.4
historical 15.1
house 15
monument 15
construction 14.6
town 13.9
buildings 13.2
structure 12.9
brick 12.6
facade 12.4
attraction 12.4
tower 11.6
marble 10.8
statue 10.6
place 10.2
establishment 10.2
bridge 10.2
roof 10.2
stairs 9.8
vacation 9.8
detail 9.7
cityscape 9.5
art 9.2
columns 8.8
antique 8.7
door 8.6
palace 8.5
alley 8.3
vintage 8.3
catholic 7.8
houses 7.8
dome 7.7
architectural 7.7
windows 7.7
world 7.7
sky 7.7
lamp 7.6
destination 7.5
cell 7.4
night 7.1
interior 7.1
summer 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

drawing 96
cartoon 89.7
text 82.4
person 78.8
clothing 78.5
sketch 75.5
painting 71.8
old 57.7

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 12-22
Gender Male, 99.6%
Calm 83.7%
Angry 4.1%
Confused 3.4%
Surprised 2.7%
Happy 2%
Disgusted 1.6%
Fear 1.6%
Sad 1%

AWS Rekognition

Age 50-68
Gender Male, 72.7%
Calm 66.1%
Surprised 12.9%
Angry 6.6%
Sad 6.2%
Fear 4.3%
Confused 2.9%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 45-63
Gender Male, 84.1%
Sad 75.7%
Fear 20.1%
Confused 2%
Surprised 1.1%
Calm 0.6%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 20-32
Gender Male, 95.4%
Angry 36.9%
Calm 25.8%
Disgusted 18%
Surprised 5.2%
Sad 5%
Confused 4.7%
Fear 2.5%
Happy 1.9%

AWS Rekognition

Age 32-48
Gender Male, 99.4%
Calm 99.4%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 50-68
Gender Male, 85.1%
Sad 64.9%
Calm 29.9%
Angry 2.6%
Fear 1%
Confused 0.9%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 34-50
Gender Male, 93.8%
Calm 50.7%
Sad 43.8%
Angry 2.7%
Confused 1%
Fear 0.7%
Surprised 0.5%
Happy 0.5%
Disgusted 0.3%

AWS Rekognition

Age 39-57
Gender Male, 94.9%
Calm 66.3%
Sad 26.7%
Angry 3.9%
Fear 1.8%
Confused 0.5%
Happy 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 16-28
Gender Male, 97.5%
Calm 96.1%
Sad 2.8%
Angry 0.6%
Happy 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 24-38
Gender Female, 62%
Sad 94.2%
Calm 2.8%
Fear 1.5%
Angry 0.8%
Confused 0.3%
Surprised 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 33-49
Gender Male, 97%
Angry 39.3%
Sad 22.8%
Calm 21.6%
Disgusted 5.8%
Confused 4.4%
Fear 2.8%
Happy 2.2%
Surprised 1.2%

AWS Rekognition

Age 20-32
Gender Male, 57%
Calm 40.8%
Sad 38.2%
Confused 8.3%
Fear 5%
Surprised 3.8%
Angry 1.8%
Happy 1%
Disgusted 1%

AWS Rekognition

Age 26-40
Gender Female, 81.9%
Sad 65.2%
Calm 16.9%
Angry 8.3%
Happy 3.8%
Confused 2%
Surprised 1.5%
Fear 1.4%
Disgusted 0.8%

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Painting 80.5%

Captions

Microsoft

a group of people posing for a photo 69.5%
a group of people in an old photo of a person 57.3%
a group of people pose for a photo 57.2%