Human Generated Data

Title

Untitled (Eighth Avenue and Forty-second Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2967

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 98
Person 98
Female 98
Woman 98
Face 89.9
Head 89.9
Gun 88.4
Weapon 88.4
Photography 84.8
Shooting 84.3
Machine 65
Wheel 65
Clothing 62.5
Coat 62.5
Duel 57.9
Advertisement 55.2

Clarifai
created on 2018-05-10

people 100
group 99
group together 99
adult 98.6
military 97.2
vehicle 96.8
aircraft 95.7
man 95.5
war 94.8
three 93.9
two 93.1
five 92.6
many 92.3
administration 92.3
four 91.1
child 90.5
several 90.4
watercraft 90
woman 89.2
education 89.1

Imagga
created on 2023-10-05

man 25.5
person 21.6
male 20.6
people 19.5
brass 18.9
world 18.5
musical instrument 16.8
adult 16.7
stage 16.2
cornet 16.1
wind instrument 15.4
black 14.4
men 13.7
business 12.7
photographer 12.5
human 12
silhouette 11.6
group 11.3
building 11.1
platform 10.8
lifestyle 10.1
industrial 10
city 10
equipment 9.7
professional 9.5
mask 9.5
power 9.2
music 9.1
hand 9.1
danger 9.1
life 8.9
sport 8.8
body 8.8
factory 8.7
device 8.6
work 8.6
youth 8.5
club 8.5
portrait 8.4
dirty 8.1
women 7.9
clothing 7.8
respirator 7.8
model 7.8
concert 7.8
dark 7.5
star 7.5
weight 7.5
leisure 7.5
entertainment 7.4
sports equipment 7.3
protection 7.3
sexy 7.2
activity 7.2
handsome 7.1
love 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

standing 93
posing 62.5

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 45-53
Gender Male, 100%
Angry 40.7%
Fear 22.2%
Calm 21%
Surprised 12.8%
Sad 2.8%
Happy 2.1%
Confused 1.3%
Disgusted 1.2%

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 93.6%
Surprised 6.5%
Fear 6%
Sad 2.9%
Confused 1.9%
Angry 1%
Happy 0.3%
Disgusted 0.2%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Female 98%
Woman 98%
Wheel 65%
Coat 62.5%

Text analysis

Amazon

WITH
HIS
FEAT
TEE
NAV
PUL
ATTENTION
MOT
ED MOT
ED
FRONT
CK WITH HIS TEE
DE
CAPITOL
of
AN
0-389
DE UNEMPLO
I FEAT FRONT of CAPITOL WASHING
CK
AN IN U.S NAV
UNEMPLO
WASHING
LOR
I
IN U.S

Google

3a9 K WITH HIS TE FEAT FRONT TTENTION PLO
3a9
K
TE
FEAT
WITH
HIS
FRONT
TTENTION
PLO