Human Generated Data

Title

Untitled (Jenkins, Kentucky)

Date

October 1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1236

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Person 99
Person 98.6
Person 98.6
Person 98.5
Person 98.5
Person 98.3
Person 97.9
People 93.7
Car 90.2
Transportation 90.2
Vehicle 90.2
Railway 84.7
Train 84.7
Coat 82.7
Outdoors 81
Walking 80.2
Person 79.6
Coat 74.7
Face 72.2
Head 72.2
Footwear 67.7
Shoe 67.7
Car 62.3
Car 61.9
Machine 58.8
Wheel 58.8
Nature 57.6
Road 56.9
Terminal 56.2
Train Station 56.2
Formal Wear 56
Suit 56
Shoe 55.8
Worker 55.7
Overcoat 55.3
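
The scores above are confidence percentages from Amazon Rekognition's label detection. A minimal sketch of how such a tag list can be produced with boto3's detect_labels call follows; the region, bucket, and object name are placeholder assumptions, not the museum's actual storage.

```python
# Minimal sketch: label tags like the list above via AWS Rekognition's
# DetectLabels API (boto3). Bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-jenkins-ky.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # the list above bottoms out around 55%
)

for label in response["Labels"]:
    # Confidence is a float percentage, e.g. "Clothing 99.9"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```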

Clarifai
created on 2018-05-11

people 100
many 99.8
group together 99.6
group 99.5
adult 99.1
war 98.3
military 98.1
administration 97
man 96.9
soldier 96.1
several 94.6
vehicle 93.2
woman 91.3
skirmish 89.9
wear 87.7
child 87.3
weapon 85.3
gun 83.9
police 83.4
transportation system 83.3
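
Clarifai's concept tags (confidences shown here as percentages) come from its predict endpoint. A minimal sketch against Clarifai's v2 REST API follows; the API key, model alias, and image URL are placeholder assumptions, and the model id in particular may differ from whatever the 2018 run used.

```python
# Minimal sketch: concept tags like the list above via Clarifai's v2 REST
# predict endpoint and its general-purpose model. Key, model alias, and
# image URL are placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed general-model alias

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts 0-1; the list above shows percentages
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```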

Imagga
created on 2023-10-06

stretcher 28.2
man 24.9
litter 22.7
landscape 20.8
outdoor 20.6
engineer 19.4
sky 19.1
mountain 18.7
conveyance 17.7
weapon 16.1
chemical weapon 16
people 15.6
male 14.9
person 14.4
pedestrian 14.3
walking 13.3
outdoors 13.1
vacation 13.1
weapon of mass destruction 12.8
summer 12.2
active 12
travel 12
beach 11.8
hiking 11.5
sport 11.3
group 11.3
clouds 11
protection 10.9
backpack 10.7
old 10.5
sea 10.2
dairy 10
horizon 9.9
tourism 9.9
environment 9.9
uniform 9.7
adult 9.7
nuclear 9.7
military 9.7
men 9.4
natural 9.4
child 9.2
leisure 9.1
ocean 9.1
danger 9.1
snow 8.9
gun 8.9
family 8.9
destruction 8.8
scenic 8.8
hike 8.8
fog 8.7
water 8.7
cloud 8.6
walk 8.6
outside 8.6
tourist 8.5
hill 8.4
dark 8.4
industrial 8.2
clothing 8.2
coast 8.1
life 8.1
farm 8
lifestyle 8
grass 7.9
war 7.9
boy 7.8
animal 7.8
father 7.8
sunny 7.7
military uniform 7.7
winter 7.7
seascape 7.6
dusk 7.6
stone 7.6
adventure 7.6
rifle 7.6
park 7.4
history 7.2
trees 7.1
rural 7
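
Imagga's tags come from its /v2/tags REST endpoint, which scores each tag from 0 to 100 and is authenticated with an API key/secret pair over HTTP Basic auth. A minimal sketch follows; the credentials and image URL are placeholders.

```python
# Minimal sketch: tags like the list above via Imagga's /v2/tags endpoint.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Imagga returns a 0-100 confidence and a localized tag name
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```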

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 100
person 98
standing 79.1
group 78
people 76.8
white 62.1
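
Microsoft's tags resemble output from the Azure Computer Vision "analyze" endpoint with the Tags visual feature. A minimal sketch follows; the endpoint, key, image URL, and API version string are placeholder assumptions (the 2018 run would have used an earlier API version).

```python
# Minimal sketch: tags like the list above via Azure Computer Vision's
# analyze endpoint. Endpoint, key, and image URL are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is 0-1; the list above shows percentages
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```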

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 66.8%
Confused 11%
Disgusted 9.2%
Surprised 8.7%
Fear 6%
Sad 4%
Happy 2.3%
Angry 1.8%

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 77%
Confused 8%
Surprised 7.4%
Fear 7.2%
Sad 5%
Angry 2.2%
Happy 0.8%
Disgusted 0.6%

AWS Rekognition

Age 21-29
Gender Male, 95.1%
Calm 40.6%
Happy 20.9%
Confused 18.2%
Surprised 10.2%
Fear 6.4%
Angry 4.6%
Disgusted 4%
Sad 3.8%

AWS Rekognition

Age 19-27
Gender Male, 100%
Happy 53.3%
Calm 45.6%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 23-31
Gender Female, 98.6%
Confused 74.3%
Calm 17.4%
Surprised 6.7%
Fear 6%
Sad 3.1%
Angry 2.3%
Disgusted 1.7%
Happy 0.4%

AWS Rekognition

Age 18-24
Gender Female, 50.1%
Fear 57.9%
Sad 27.4%
Calm 22%
Surprised 8.1%
Angry 3.4%
Disgusted 2.4%
Happy 1.6%
Confused 1.4%
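
The age ranges, gender estimates, and emotion percentages in the blocks above are the per-face attributes Rekognition returns from DetectFaces when the full attribute set is requested. A minimal boto3 sketch follows; the bucket and object name are placeholders.

```python
# Minimal sketch: per-face age range, gender, and emotions like the
# blocks above via Rekognition's DetectFaces API. Bucket/key are
# hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-jenkins-ky.jpg"}},
    Attributes=["ALL"],  # required to get age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "CALM 66.8"
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```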

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
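
Google Vision reports face attributes as likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the "Very unlikely" entries above. A minimal sketch with the google-cloud-vision client follows; the image URI is a placeholder.

```python
# Minimal sketch: likelihood ratings like the ones above via the Cloud
# Vision face-detection annotator. The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is an enum: UNKNOWN, VERY_UNLIKELY, ... VERY_LIKELY
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```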

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Car 90.2%
Train 84.7%
Coat 82.7%
Shoe 67.7%
Wheel 58.8%
