Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3716

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 99.6
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99
Adult 99
Male 99
Man 99
Formal Wear 98.3
Person 95.7
Clothing 94.9
Suit 94.9
Baseball 93.2
Baseball Glove 93.2
Glove 93.2
Sport 93.2
Accessories 79.6
Tie 79.6
Footwear 77.9
Shoe 77.9
Shoe 76.4
Outdoors 72.4
Shoe 69.4
Shoe 69.4
Face 69
Head 69
Dancing 69
Leisure Activities 69
Shirt 56.8
Hat 56.7
Architecture 56.4
Building 56.4
Wall 56.4
Smoke 55.4
Dress 55.2
Archaeology 55.1
Photography 55.1

Clarifai
created on 2018-05-10

people 100
group together 98.7
adult 98.2
group 97.9
two 96.9
man 96.1
wear 95.1
three 91.5
many 91.1
child 89.7
woman 89
one 88.8
four 87.5
several 87.1
sit 86.3
outfit 86.2
furniture 82.8
leader 82.6
administration 82.1
military 80

Imagga
created on 2023-10-07

negative 30.4
film 25.7
television 22.2
old 21.6
windowsill 17.9
building 17.7
vintage 16.5
grunge 15.3
art 15.1
travel 14.8
sill 14.4
photographic paper 13.9
antique 13.8
history 13.4
black 13.2
man 12.8
architecture 12.5
telecommunication system 12.3
decoration 12
frame 11.7
city 11.6
window 11.5
retro 11.5
grungy 11.4
texture 11.1
monitor 11
rough 10.9
freight car 10.9
dirty 10.8
structural member 10.8
tourism 10.7
people 10
car 9.9
sculpture 9.9
structure 9.9
design 9.6
old fashioned 9.5
culture 9.4
photographic equipment 9.3
screen 9.2
blackboard 9.1
paint 9.1
border 9
support 8.9
color 8.9
pattern 8.9
ancient 8.6
mask 8.6
space 8.5
male 8.5
monument 8.4
head 8.4
historic 8.2
landmark 8.1
graphic 8
light 8
water 8
wall 7.9
scene 7.8
collage 7.7
house 7.5
memorial 7.5
silhouette 7.4
life 7.4
barbershop 7.4
street 7.4
computer 7.3
tourist 7.2
aged 7.2
balcony 7.2
businessman 7.1
paper 7.1
sky 7

Google
created on 2018-05-10

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 98.7%
Sad 54.6%
Calm 35.4%
Disgusted 9.9%
Confused 9.5%
Surprised 9%
Fear 9%
Angry 3.5%
Happy 2.1%

AWS Rekognition

Age 36-44
Gender Male, 94.9%
Calm 61.7%
Confused 24.2%
Surprised 7.3%
Fear 6%
Sad 5.1%
Happy 4%
Angry 1%
Disgusted 0.5%

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%
Tie 79.6%
Shoe 77.9%

Categories

Text analysis

Amazon

College
(Harvard
Fellows
Harvard
of
Museums)
and
University
Art
President and Fellows of Harvard College (Harvard University Art Museums)
P1970.3716.0000
President

Google

@ President and Fellows of Harvard College (Harvard University Art Museums) P1970.3716.0000
@
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.3716.0000