Human Generated Data

Title

Untitled (Columbus, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3615

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.8
Human 99.8
Hat 99.4
Apparel 99.4
Clothing 99.4
Person 95
Meal 67.7
Food 67.7
Building 63.9
Kiosk 63.5
Face 61.4
Hardhat 61.3
Helmet 61.3

Imagga
created on 2021-12-15

stall 38.6
seller 32.4
people 21.7
male 21.3
man 20.8
shop 19.6
person 18.7
business 18.2
construction 15.4
building 15.4
looking 15.2
work 15
newspaper 15
house 14.2
adult 13.7
engineer 13.6
transportation 13.4
job 13.3
industrial 12.7
technology 12.6
sea 12.5
sky 12.1
architecture 11.7
businessman 11.5
customer 11.4
plan 11.3
home 11.2
billboard 10.9
mercantile establishment 10.8
product 10.5
sign 10.5
chart 10.5
engineering 10.5
old 10.4
men 10.3
builder 10.2
manager 10.2
smiling 10.1
transport 10
outdoors 9.7
happy 9.4
lifestyle 9.4
industry 9.4
smile 9.3
bar 9.2
horizontal 9.2
one 9
new 8.9
designing 8.9
circuit 8.8
ship 8.8
pensive 8.8
designer 8.7
architect 8.7
water 8.7
project 8.6
structure 8.6
serious 8.6
bookshop 8.5
space 8.5
casual 8.5
design 8.4
office 8.4
portrait 8.4
sale 8.3
signboard 8.3
creation 8.2
bartender 8
draftsman 7.9
documentation 7.9
processing 7.9
technique 7.9
client 7.8
organizer 7.8
manufacturing 7.8
production 7.8
marine 7.6
writing 7.5
boat 7.4
worker 7.4
alone 7.3
equipment 7.2
place of business 7.1
market 7.1
working 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.9
person 99.3
man 98.9
hat 98.6
fashion accessory 95.7
clothing 95.5
outdoor 91.7
fedora 86.4
cowboy hat 83.1
human face 73.7
sun hat 59.3

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 32-48
Gender Male, 98%
Calm 89.9%
Happy 3.6%
Sad 2.4%
Disgusted 1.4%
Surprised 1.2%
Confused 0.9%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 29-45
Gender Female, 75.3%
Happy 88.4%
Calm 5.8%
Surprised 2.9%
Angry 0.7%
Fear 0.6%
Disgusted 0.6%
Sad 0.6%
Confused 0.4%

AWS Rekognition

Age 41-59
Gender Female, 89.1%
Happy 79%
Calm 14.2%
Sad 2.7%
Angry 1.3%
Surprised 0.8%
Fear 0.8%
Disgusted 0.6%
Confused 0.6%

AWS Rekognition

Age 9-19
Gender Female, 79%
Sad 89.1%
Calm 7.5%
Happy 1.2%
Confused 0.7%
Fear 0.6%
Angry 0.5%
Surprised 0.2%
Disgusted 0.2%

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 99.4%

Captions

Microsoft

a man posing for a photo 86%
a man wearing a hat 80.1%
a man posing for a picture 80%

Text analysis

Amazon

EAT
DRINK
RESTAURANT
S
TOVES
ZOC
Colos
et
AVEVESTEM
50

Google

RESTAURAN
ORINK
TOMS
RESTAURAN TOMS EAT ORINK
EAT