Human Generated Data

Title

Untitled (Middleboro, Kentucky)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1515

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.8
Hat 99.5
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.7
Person 98.2
Person 97.5
Footwear 92.4
Shoe 92.4
Person 91.9
Shoe 90.7
Shoe 90.7
Shoe 87.5
Coat 84.9
Shoe 84.5
Coat 81.5
People 78.1
Face 76.1
Head 76.1
Person 75.8
Coat 74.9
Shoe 62.3
Shoe 62.1
Person 61
Officer 57.7
Cap 56.4
Overcoat 56
Formal Wear 55.8
Suit 55.8
Captain 55.8
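
Runs of identical labels such as the repeated "Adult 99.1 / Male 99.1 / Man 99.1 / Person 99.1" entries above occur because the tagging service reports one confidence per detected instance, not per label. A minimal sketch (using a small hypothetical subset of the pairs listed above) of collapsing per-instance tags to one entry per label at its highest confidence:

```python
# Collapse per-instance (label, confidence) pairs to one entry per label,
# keeping the highest confidence seen for each.
# The pairs below are a hypothetical subset of the Amazon tag list above.
pairs = [
    ("Person", 99.4), ("Person", 99.1), ("Person", 98.9),
    ("Shoe", 92.4), ("Shoe", 90.7),
    ("Coat", 84.9), ("Coat", 81.5),
]

best = {}
for label, conf in pairs:
    best[label] = max(best.get(label, 0.0), conf)

# Print unique labels, highest confidence first.
for label, conf in sorted(best.items(), key=lambda kv: -kv[1]):
    print(f"{label} {conf}")
```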

Clarifai
created on 2018-05-11

people 99.9
group together 99.4
group 98.2
military 98.1
adult 97.6
outfit 97.6
uniform 97.2
administration 96.6
police 96.6
man 95.3
soldier 94.7
offense 94.2
war 94
many 93.7
several 93.2
vehicle 91.5
five 90.4
military uniform 88.5
gun 88.3
weapon 87.6

Imagga
created on 2023-10-06

uniform 34.5
man 26.9
clothing 26.5
military uniform 25.1
military 22.2
male 21.3
person 21.3
danger 20.9
people 20.6
soldier 20.5
protection 20
mask 18.5
adult 16.9
gun 15.9
war 15.5
weapon 14.6
camouflage 13.7
army 13.7
sport 13.4
covering 12.7
toxic 12.7
protective 12.7
dirty 12.7
men 12
radioactive 11.8
radiation 11.7
accident 11.7
nuclear 11.6
chemical 11.6
gas 11.6
consumer goods 11.1
safety 11
street 11
helmet 10.9
industrial 10.9
destruction 10.7
disaster 10.7
outdoor 10.7
walking 10.4
world 10.2
stalker 8.9
stretcher 8.9
rifle 8.7
walk 8.6
culture 8.5
black 8.5
travel 8.4
dark 8.3
pedestrian 8.3
outdoors 8.2
history 8
hat 7.9
photographer 7.8
protect 7.7
industry 7.7
old 7.7
two 7.6
musical instrument 7.6
traditional 7.5
smoke 7.4
player 7.4
environment 7.4
kin 7.4
equipment 7.3
building 7.1
portrait 7.1
litter 7.1
to 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
outdoor 99.4
ground 96
people 56.8
group 56.7
old 47.7
posing 35.2

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Sad 79.5%
Calm 55.3%
Surprised 6.5%
Fear 6.1%
Happy 2.6%
Confused 2%
Angry 1.9%
Disgusted 1.1%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 91.8%
Surprised 9.7%
Fear 6.4%
Sad 2.2%
Happy 0.5%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Male, 100%
Happy 86.2%
Calm 10.6%
Surprised 6.8%
Fear 6.1%
Sad 2.3%
Angry 0.6%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 42-50
Gender Male, 86.2%
Happy 90.8%
Fear 6.5%
Surprised 6.5%
Sad 3.2%
Calm 2.9%
Angry 0.5%
Disgusted 0.4%
Confused 0.4%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
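
Each AWS Rekognition face block above lists one confidence score per emotion, and the dominant emotion is simply the highest-scoring one. A small sketch using the first face's scores, copied from the list above (note the scores need not sum to 100, as that list shows):

```python
# Emotion scores (percent) for the first AWS Rekognition face above.
emotions = {
    "Sad": 79.5, "Calm": 55.3, "Surprised": 6.5, "Fear": 6.1,
    "Happy": 2.6, "Confused": 2.0, "Angry": 1.9, "Disgusted": 1.1,
}

# The dominant emotion is the key with the highest score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Sad
```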

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Shoe 92.4%
Coat 84.9%

Text analysis

Amazon

AND
VEGETA
RUI
Al