Human Generated Data

Title

Untitled (Jenkins, Kentucky)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1235

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 99
Person 98.7
Person 98.6
Person 98.4
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 97.9
Adult 95.5
Male 95.5
Man 95.5
Person 95.5
Clothing 93.8
Coat 93.8
Person 91.3
Person 86.8
Person 86.4
Jeans 83.6
Pants 83.6
People 81.2
Person 80.5
Footwear 77.4
Shoe 77.4
Face 71.6
Head 71.6
Shoe 71.5
Person 70.1
Railway 68.7
Train 68.7
Transportation 68.7
Vehicle 68.7
Outdoors 68.4
Person 67.6
Car 59.7
Shoe 57.5
Road 56.9
Airfield 56.3
Airport 56.3
Formal Wear 56.2
Suit 56.2
Worker 55.9
Terminal 55.4
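
The label/confidence pairs above have the shape of output from Amazon Rekognition's label-detection endpoint. Below is a minimal sketch of how such pairs are typically retrieved with boto3; the image file name, MaxLabels cap, and MinConfidence threshold are illustrative assumptions, not taken from this record or from how the museum actually produced these tags.

# Sketch: fetching label/confidence pairs like the list above from Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("jenkins_kentucky.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,          # illustrative cap on returned labels
    MinConfidence=55.0,    # illustrative floor; drops low-confidence guesses
)

# Print "Label Confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')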

Clarifai
created on 2018-05-11

people 100
group together 99.4
military 98.5
adult 98.4
war 98.3
many 98.1
group 96.4
soldier 96.1
administration 93.6
man 92.3
outfit 92.1
child 89.9
wear 89.4
several 87.2
uniform 87.1
weapon 84.2
skirmish 82.3
gun 82
vehicle 79.2
five 78.5

Imagga
created on 2023-10-06

military uniform 73.8
uniform 72.9
clothing 46.4
covering 30.5
consumer goods 29.6
engineer 28.5
man 26.2
military 24.1
war 23.4
soldier 22.5
weapon 21.3
male 21.3
gun 20.8
rifle 19.3
army 18.5
camouflage 17.9
danger 16.4
person 16.3
walking 16.1
hiking 15.4
sport 15.1
mountain 15.1
commodity 14.7
protection 13.6
people 13.4
travel 13.4
stretcher 13.2
action 13
adult 13
outdoors 12.8
active 12.7
outdoor 12.2
summer 11.6
sky 11.5
adventure 11.4
men 11.2
battle 10.8
backpack 10.7
litter 10.5
grass 10.3
target 9.9
vacation 9.8
fight 9.7
competition 9.2
tourist 9.1
hiker 8.9
conveyance 8.9
combat 8.9
mask 8.8
hike 8.8
helmet 8.8
boy 8.7
vehicle 8.6
extreme 8.6
walk 8.6
horse 8.5
field 8.4
sports 8.3
leisure 8.3
group 8.1
warfare 7.9
forces 7.9
sand 7.9
protective 7.8
shoot 7.7
two 7.6
beach 7.6
dairy 7.4
mountains 7.4
animal 7.3
speed 7.3
child 7.3
activity 7.2
family 7.1
trees 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
person 98.2
group 81
people 76.1
standing 75.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 98.2%
Surprised 6.3%
Fear 5.9%
Sad 2.6%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Calm 98.8%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Calm 97.6%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Happy 0.8%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 29-39
Gender Male, 85.8%
Happy 97.3%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Calm 1.6%
Angry 0.2%
Disgusted 0.1%
Confused 0%
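
The per-face age range, gender, and emotion blocks above follow the structure of Amazon Rekognition's face-detection output. A minimal sketch of how such attributes are usually obtained with boto3 follows; the file name is an assumed stand-in and the sorting is only there to mirror the highest-first ordering shown above.

# Sketch: per-face age range, gender, and emotion scores via Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("jenkins_kentucky.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],    # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort descending to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')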

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
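
Unlike the Rekognition blocks, the Google Vision block reports likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch of how those enum values are typically read with the google-cloud-vision client is shown below; the image path is again an assumption.

# Sketch: reading face-attribute likelihood buckets from Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("jenkins_kentucky.jpg", "rb") as f:   # hypothetical local copy of the photograph
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood fields are enums (VERY_UNLIKELY .. VERY_LIKELY), not numeric scores.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)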

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Jeans 83.6%
Shoe 77.4%
Train 68.7%
Car 59.7%

Categories