Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2826

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (each tag is followed by the service's confidence score, in percent)

Amazon
created on 2023-10-05

Clothing 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Coat 98.1
Hat 96.1
Photography 95
Face 93.8
Head 93.8
Sun Hat 90.1
Handrail 86
Text 81.1
Hardhat 78.2
Helmet 78.2
Railing 77.2
Person 76.8
Reading 71.2
Person 71.2
Portrait 60.9
Fence 57.1
Formal Wear 56.3
Suit 56.3
Garden 55.6
Gardener 55.6
Gardening 55.6
Nature 55.6
Outdoors 55.6
People 55.3
Newspaper 55.3
Worker 55.1
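
These labels match the shape of output from the AWS Rekognition DetectLabels API. A minimal sketch of how such a tag list can be produced with boto3 (the bucket and object names are hypothetical, and AWS credentials are assumed to be configured):

    import boto3

    # Bucket/key names are placeholders, not the museum's actual storage.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "P1970.2826.jpg"}},
        MinConfidence=55,  # the lowest score in the list above is 55.1
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')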

Clarifai
created on 2018-05-10

people 99.7
one 97.2
adult 97.1
monochrome 96.4
man 96.1
wear 91.2
group together 89.6
administration 89
two 88.4
military 87.7
war 87.4
veil 87.4
outfit 85
group 84.9
vehicle 84.6
industry 82.9
lid 79.8
indoors 79.3
street 78.8
actor 75.6
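
Clarifai exposes a comparable concept-prediction endpoint. A hedged sketch against its v2 REST API; the model ID, endpoint path, and image URL are assumptions, and the API key is a placeholder:

    import requests

    # "general-image-recognition" is Clarifai's public general model alias; treat it
    # and the endpoint version as assumptions. Concept values are 0-1, hence * 100.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/P1970.2826.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')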

Imagga
created on 2023-10-05

man 36.4
male 25.5
people 23.4
person 23
work 21.2
mask 20.6
worker 19.6
working 18.6
business 16.4
adult 16.2
black 16.2
safety 15.6
industry 15.4
job 15
men 14.6
labor 14.6
building 13.9
construction 13.7
protection 13.6
equipment 13.5
urban 13.1
industrial 12.7
newspaper 12.5
criminal 11.7
city 11.6
steel 11.5
professional 11.1
occupation 11
danger 10.9
holding 10.7
helmet 10.7
welder 9.9
crime 9.7
businessman 9.7
metal 9.7
hat 9.5
world 9.4
suit 9.1
portrait 9.1
product 9
uniform 8.9
computer 8.9
technology 8.9
factory 8.7
using 8.7
architecture 8.6
hand 8.4
looking 8
smiling 8
engineer 7.9
clothing 7.8
skill 7.7
passenger 7.6
site 7.5
street 7.4
security 7.3
businesswoman 7.3
gun 7.2
laptop 7
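
Imagga's tagging endpoint returns a similar tag-plus-confidence list. A sketch using its v2 REST API with HTTP basic auth; the key, secret, and image URL are placeholders:

    import requests

    # Imagga reports confidence directly as a percentage, matching the list above.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/P1970.2826.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')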

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 95.9
black 77
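
Tags like these can come from Azure's Computer Vision "analyze" operation. A hedged sketch; the resource name, API version, key, and image URL are all assumptions (the 2018 run would have used an earlier API version):

    import requests

    # Azure returns confidence as 0-1, hence * 100 to match the list above.
    endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze"
    resp = requests.post(
        endpoint,
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY",
                 "Content-Type": "application/json"},
        json={"url": "https://example.org/P1970.2826.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')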

Face analysis

Amazon

AWS Rekognition

Age 48-54
Gender Male, 50.4%
Calm 59.6%
Confused 30%
Surprised 6.7%
Fear 5.9%
Angry 5.3%
Sad 3.2%
Disgusted 0.6%
Happy 0.6%
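
The age range, gender, and ranked emotion scores above match the shape of AWS Rekognition's DetectFaces output. A minimal sketch (image location is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "P1970.2826.jpg"}},
        Attributes=["ALL"],  # needed for age range and emotions, not just bounding boxes
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')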

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Hat 96.1%
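
This shorter list appears to be the subset of labels for which Rekognition located object instances (bounding boxes) in the image. A sketch of pulling them from the same DetectLabels response used earlier:

    # Labels with located instances, from the detect_labels response above.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(f'{label["Name"]} {instance["Confidence"]:.1f}%')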

Text analysis

Amazon

ASTO
88
85
M
pum
MOM
WT MOM pum
med M joyos THL
THL
WT
joyos
AIR
med
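
The fragments above are raw OCR tokens detected in the photograph itself (signage, newsprint), left verbatim; they match the shape of output from AWS Rekognition's DetectText API. A sketch (image location hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "P1970.2826.jpg"}}
    )
    # Rekognition returns both LINE and WORD detections; printing only the
    # words mirrors the token list above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])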

Google

ASTO
ASTO
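
Google Cloud Vision's text detection returns the full detected text as the first annotation, followed by each individual word, which is why a single word can appear twice above. A hedged sketch using the google-cloud-vision client library; credentials are assumed to be configured and the image URI is a placeholder:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/P1970.2826.jpg"
    response = client.text_detection(image=image)
    # The first annotation is the full text block; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)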