Human Generated Data

Title

Untitled (Kentucky or Tennessee?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1159

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2023-10-06

Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Car 92.8
Transportation 92.8
Vehicle 92.8
Architecture 92.6
Building 92.6
Factory 92.6
Face 90.7
Head 90.7
Car 85.1
Manufacturing 83.5
Assembly Line 80.2
Machine 70.3
Wheel 70.3
Wheel 64.8
Car 63.1
Text 62
Adult 60.6
Male 60.6
Man 60.6
Person 60.6
Photography 60.1
Portrait 60.1
License Plate 57.9
Reading 57.8
Clothing 56
Coat 56
Terminal 55.1
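
The Amazon tags above match the output of AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming a local copy of the photograph (the file name and the MinConfidence threshold are placeholders, not part of this record):

    import boto3

    rekognition = boto3.client("rekognition")

    # Load the photograph as raw bytes; "shahn_p1970_1159.jpg" is a placeholder name.
    with open("shahn_p1970_1159.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=55 roughly matches the lowest score listed above (Terminal 55.1).
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Repeated labels with different scores (Car 92.8, 85.1, 63.1) are consistent with per-instance detections, which DetectLabels reports under each label's Instances field along with bounding boxes.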

Clarifai
created on 2018-05-11

people 99.9
adult 99.4
vehicle 98.6
transportation system 98.3
man 97.5
one 97.4
two 94.5
group together 93
administration 93
woman 92.9
military 92.1
war 92
group 90.8
wear 90.4
three 88.4
portrait 85.3
train 84.7
actor 83
street 83
four 81.2
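
The Clarifai tags are concept predictions from its general image-recognition model, queried over a REST API. A hedged sketch (the API key, image URL, and model identifier are assumptions; Clarifai's concept scores are returned on a 0–1 scale, so they are rescaled to match the percentages above):

    import requests

    # Placeholder API key; the model slug below refers to Clarifai's public
    # "general" model and may differ from the one used for this record.
    headers = {"Authorization": "Key YOUR_API_KEY"}
    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.org/shahn_p1970_1159.jpg"}}}
        ]
    }

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))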

Imagga
created on 2023-10-06

man 19.7
daily 15.9
black 15.6
male 15.6
product 15
old 14.6
people 13.4
person 12.9
machinist 12.5
smoke 12.1
creation 12
newspaper 11.9
world 11.8
adult 11.6
device 11.5
fire 11.2
men 11.2
protection 10.9
danger 10.9
dark 10.8
light 10.7
building 10.6
sky 9.6
industry 9.4
show 9.2
movie 9.1
city 9.1
industrial 9.1
art 8.6
shop 8.4
power 8.4
safety 8.3
vintage 8.3
outdoors 8.2
transportation 8.1
night 8
working 7.9
urban 7.9
work 7.8
disaster 7.8
scene 7.8
ancient 7.8
horror 7.8
portrait 7.8
negative 7.7
car 7.7
equipment 7.6
film 7.5
human 7.5
barbershop 7.4
musical instrument 7.3
aged 7.2
dirty 7.2
face 7.1
machine 7.1
travel 7
respirator 7
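
The Imagga tags likewise come from a REST endpoint. A sketch against the v2 tagging API, assuming Imagga's documented response shape (the key/secret pair and image URL are placeholders):

    import requests

    # Imagga uses HTTP Basic auth with an API key/secret pair (placeholders here).
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/shahn_p1970_1159.jpg"},
        auth=auth,
    )
    response.raise_for_status()

    # Each entry carries a confidence score and a per-language tag name.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))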

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 92

Face analysis

Amazon

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Calm 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
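
The age range, gender, and emotion scores above are the attribute fields DetectFaces returns when full attributes are requested. A minimal sketch, reusing the rekognition client and image_bytes from the tagging example above:

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")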

Feature analysis

Amazon

Adult 98.8%
Male 98.8%
Man 98.8%
Person 98.8%
Car 92.8%
Wheel 70.3%

Categories

Imagga

pets animals 97.3%
paintings art 1.5%

Captions

Microsoft
created on 2018-05-11

a man sitting at a table 75.1%
a man looking at the camera 75%
a man sitting on a table 63.7%
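
Ranked captions like these are what Azure's Computer Vision Describe feature produces. A sketch against the current v3.2 endpoint (the resource host, subscription key, and image URL are placeholders; a run dated 2018 would have used an earlier API version):

    import requests

    # Placeholder Azure resource endpoint and subscription key.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    key = "YOUR_SUBSCRIPTION_KEY"

    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        headers={"Ocp-Apim-Subscription-Key": key},
        params={"maxCandidates": 3},
        json={"url": "https://example.org/shahn_p1970_1159.jpg"},
    )
    response.raise_for_status()

    # Confidence is returned on a 0-1 scale; format as a percentage to match above.
    for caption in response.json()["description"]["captions"]:
        print(caption["text"], f"{caption['confidence']:.1%}")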

Text analysis

Amazon

on
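
A single detected string ("on") is consistent with Rekognition's DetectText operation, which returns both line- and word-level detections. A minimal sketch, again reusing the client and image bytes from the examples above:

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # Each detection is typed as a LINE or a WORD with its own confidence.
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")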