Human Generated Data

Title

Untitled (Jenkins, Kentucky)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1231

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Adult 99
Male 99
Man 99
Person 99
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.3
Person 98.3
Person 98.1
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Person 95.9
Adult 95.5
Male 95.5
Man 95.5
Person 95.5
Person 92.2
Person 87.1
Railway 78.8
Transportation 78.8
Face 75.2
Head 75.2
Train 71.9
Vehicle 71.9
Car 65.7
Car 62.6
Footwear 62.5
Shoe 62.5
Shoe 58.9
Terminal 57.1
People 56.9
Walking 56.8
Jeans 56.8
Pants 56.8
Train Station 56.3
Shoe 56.2
Worker 55.8
Overcoat 55.8
Shoe 55.1
Outdoors 55.1
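
The Amazon tags above follow the output shape of AWS Rekognition's label detection: a label name plus a confidence score, with repeated names coming from per-instance detections. A minimal sketch of how such a list could be produced with boto3 (the file name and the ~55 confidence cutoff are assumptions inferred from the lowest scores listed, not the museum's actual pipeline):

# Sketch: reproducing an Amazon-style tag list with AWS Rekognition.
# Assumptions: the image file name and the MinConfidence cutoff are
# illustrative only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_jenkins_ky.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the lowest tag above is 55.1
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeats such as "Person 98.9 / Person 98.3" above correspond to
    # per-instance detections, reported under label["Instances"].
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")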

Clarifai
created on 2018-05-11

people 100
group together 99.2
adult 99.1
military 98.7
group 98.7
many 98.2
war 97.9
vehicle 96.6
administration 96.3
soldier 95.8
man 94.7
transportation system 92.3
several 89.5
child 89.2
wear 88.2
outfit 87
leader 84.3
uniform 84
skirmish 81.5
weapon 80
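
The Clarifai tags carry a 2018 creation date, which matches the era of Clarifai's legacy 2.x Python client. A hedged sketch with that client (the API key and image URL are placeholders, and the client has since been deprecated):

# Sketch: general-model concepts in the style of the Clarifai list above,
# using the legacy clarifai 2.x client. Key and URL are placeholders.
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model

response = model.predict_by_url("https://example.org/shahn_jenkins_ky.jpg")

# Concepts arrive as name/value pairs with value in 0-1, so the
# "people 100 / group together 99.2" list above is value * 100.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")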

Imagga
created on 2023-10-06

man 29.6
person 22.5
people 22.3
outdoor 19.1
dairy 18.7
male 18.6
adult 17.5
kin 16.2
beach 16.1
boy 14.8
danger 14.5
mountain 14.2
hiking 13.5
outdoors 13
summer 12.9
sky 12.8
active 12.7
destruction 12.7
travel 12.7
nuclear 12.6
vacation 12.3
track 11.8
walking 11.4
sport 11.3
couple 11.3
love 11
two 11
protection 10.9
lifestyle 10.8
life 10.6
child 10.6
sun 10.5
adventure 10.4
world 10.2
pedestrian 9.9
activity 9.8
hike 9.7
landscape 9.7
risk 9.6
men 9.4
sea 9.4
water 9.3
ocean 9.2
dark 9.2
leisure 9.1
environment 9
engineer 9
sunset 9
family 8.9
soldier 8.8
disaster 8.8
together 8.8
uniform 8.7
gas 8.7
war 8.7
walk 8.6
industrial 8.2
dirty 8.1
women 7.9
hiker 7.9
urban 7.9
sand 7.9
accident 7.8
toxic 7.8
portrait 7.8
father 7.7
military 7.7
chemical 7.7
parent 7.7
explosion 7.7
outside 7.7
mask 7.7
old 7.7
power 7.6
fun 7.5
silhouette 7.4
mountains 7.4
building 7.4
against 7.3
group 7.3
horizon 7.2
working 7.1
spring 7.1
work 7.1
happiness 7
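
The Imagga tags match the response shape of Imagga's v2 tagging endpoint. A sketch using plain HTTP (the key/secret pair and image URL are placeholders):

# Sketch: Imagga's v2 tagging endpoint, which returns the
# confidence-scored tags listed above. Credentials and URL are
# placeholders; the response shape (result.tags[].tag.en) is Imagga's
# documented one.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn_jenkins_ky.jpg"},
    auth=("API_KEY", "API_SECRET"),  # HTTP Basic auth, per Imagga docs
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")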

Google
created on 2018-05-11

(no tags recorded)

Microsoft
created on 2018-05-11

outdoor 100
sky 98.1
person 97.4
ground 96
grass 96
standing 91.8
people 73.9
group 64.5
posing 46.9
crowd 0.9
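
The Microsoft tags look like the output of Azure Computer Vision's tag operation; a 2018 creation date fits the v2.0 REST endpoint. A sketch (region, key, and image URL are placeholders; confidences come back in 0-1 and are scaled to match the list above):

# Sketch: the Azure Computer Vision "tag" operation, whose output matches
# the Microsoft list above. Endpoint region, key, and URL are placeholders.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/tag"
response = requests.post(
    endpoint,
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/shahn_jenkins_ky.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")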

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Fear 57.7%
Disgusted 22.7%
Confused 18.8%
Surprised 7%
Sad 5.5%
Calm 2.3%
Angry 1.4%
Happy 0.6%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 96.2%
Surprised 6.9%
Fear 6%
Sad 2.3%
Angry 0.9%
Confused 0.5%
Disgusted 0.5%
Happy 0.2%

AWS Rekognition

Age 18-24
Gender Female, 82.2%
Sad 100%
Surprised 6.7%
Fear 6.7%
Calm 6.5%
Happy 1.7%
Angry 1.6%
Confused 0.9%
Disgusted 0.5%

AWS Rekognition

Age 7-17
Gender Female, 57.3%
Calm 66.6%
Fear 11.4%
Happy 8.7%
Surprised 7.8%
Angry 3.7%
Sad 3.4%
Confused 2.4%
Disgusted 1.6%
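
The four blocks above have the shape of AWS Rekognition's face-detection output: one age range, one gender call with confidence, and independently scored emotions per detected face. A minimal boto3 sketch (the image source is an assumption):

# Sketch: per-face attributes in the shape of the AWS Rekognition blocks
# above. Attributes=["ALL"] is required to get age, gender, and emotions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_jenkins_ky.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion confidences are not mutually exclusive probabilities, which
    # is why a block above can show Sad 100% next to nonzero scores for
    # the other emotions.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")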

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
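
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely". A sketch with the google-cloud-vision client (credentials and file path are assumptions):

# Sketch: Google Vision face detection, whose likelihood fields match the
# rows above. Requires application-default credentials to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_jenkins_ky.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)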

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Train 71.9%
Car 65.7%
Shoe 62.5%
Jeans 56.8%