Human Generated Data

Title

Untitled (squatter's camp, near U.S. Highway 70, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2082

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-05

Architecture 99.9
Building 99.9
Countryside 99.9
Hut 99.9
Nature 99.9
Outdoors 99.9
Rural 99.9
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Person 99
Shelter 80.5
Shack 79.6
Face 71
Head 71
Clothing 68.8
Footwear 68.8
Shoe 68.8
Chair 60.1
Furniture 60.1
Weapon 57
Shorts 55.2
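
The Amazon tags above have the shape of AWS Rekognition DetectLabels output: a label name followed by a confidence score. A minimal sketch of producing such a list with boto3, assuming configured AWS credentials and a hypothetical local file photo.jpg (an illustration, not the pipeline that generated the data above):

import boto3

# Hypothetical local copy of the photograph; AWS credentials assumed configured.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,      # cap on labels returned
        MinConfidence=55,  # drop labels scored below 55%
    )

# Print "label confidence" pairs in the same format as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')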

Clarifai
created on 2018-05-10

people 100
adult 99.9
group together 99.6
group 99.4
wear 98.8
two 98.6
child 97.6
one 97.5
man 97.4
war 95.5
several 95.1
three 95
woman 94.9
home 93.6
four 92.6
weapon 91.8
administration 91.3
outfit 90.5
military 89.7
many 86.6
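
The Clarifai tags (created 2018) look like concepts from Clarifai's general prediction model. A minimal sketch using the 2.x-era Python client current at that time (now deprecated); the API key and image URL are hypothetical:

from clarifai.rest import ClarifaiApp

# Hypothetical API key and image URL; ClarifaiApp is the deprecated 2.x client.
app = ClarifaiApp(api_key="your_api_key")
model = app.public_models.general_model

response = model.predict_by_url("https://example.com/photo.jpg")

# Concepts carry a name and a 0-1 value; scale to percent to match the list above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')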

Imagga
created on 2023-10-05

industry 24.8
construction 23.9
industrial 22.7
sky 22.3
building 19
man 16.8
male 15.6
machine 14.8
equipment 14.4
architecture 14.1
maypole 13.8
person 13.4
world 13.2
device 12.6
site 12.2
crane 12.1
factory 11.9
city 11.6
post 11.6
transportation 10.8
urban 10.5
metal 10.5
power 10.1
danger 10
dirty 9.9
landmark 9.9
outdoors 9.7
beam 9.6
brass 9.5
people 9.5
men 9.4
truck 9.2
house 9.2
old 9.1
statue 9
trombone 8.9
steel 8.8
labor 8.8
structure 8.6
wind instrument 8.5
child 8.5
weapon 8.3
upright 8.3
environment 8.2
protection 8.2
business 7.9
loading 7.9
day 7.8
destruction 7.8
adult 7.8
dirt 7.6
engineering 7.6
sport 7.6
energy 7.6
bridge 7.5
activity 7.2
job 7.1
working 7.1
work 7.1
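
The Imagga tags match the shape of Imagga's REST tagging endpoint (a tag name plus a confidence score). A minimal sketch with the requests library, assuming hypothetical API credentials and a publicly reachable image URL:

import requests

# Hypothetical credentials; Imagga's v2 API uses HTTP Basic auth.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)

# Each entry holds a confidence score and a per-language tag name.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')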

Google
created on 2018-05-10

(no tags recorded)

Microsoft
created on 2018-05-10

outdoor 99.8
person 94.8
man 93.6
old 88.1
black 79.3
white 65.4
weapon 64.1
gun 63.5
posing 57.1
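
The Microsoft tags resemble output from Azure's Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are hypothetical:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and subscription key.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

result = client.tag_image("https://example.com/photo.jpg")

# Tag confidences are 0-1; scale to percent to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")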

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 27.4%
Sad 22.9%
Angry 20.7%
Confused 12%
Happy 10.1%
Surprised 9.8%
Fear 7%
Disgusted 3.2%

AWS Rekognition

Age 33-41
Gender Female, 94.9%
Sad 80.3%
Calm 38.3%
Happy 14.3%
Surprised 7.7%
Fear 6.5%
Disgusted 2.5%
Angry 2.4%
Confused 1.6%
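
The two blocks above (age range, gender, and emotion scores) have the shape of AWS Rekognition DetectFaces output with full attributes, one block per detected face. A minimal sketch under the same hypothetical setup as the DetectLabels example:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# One FaceDetails entry per detected face.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')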

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
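
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition ones. A minimal sketch with the google-cloud-vision client, assuming configured Google credentials and the same hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)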

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Shoe 68.8%
Chair 60.1%

Text analysis

Amazon

24
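
The single entry "24" is the kind of string AWS Rekognition DetectText extracts from an image (here, presumably a number visible somewhere in the photograph). A minimal sketch under the same hypothetical setup as above:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Print each detected line of text; word-level detections are skipped.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])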