Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3853

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 97.8
Male 97.8
Man 97.8
Person 97.8
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Musical Instrument 94.6
Person 94.6
Baby 94.6
Face 93.7
Head 93.7
Person 86.6
Accordion 86.3
Clothing 82.3
Hat 82.3
Body Part 78.8
Finger 78.8
Hand 78.8
Formal Wear 77.7
Suit 77.7
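
These label-and-confidence pairs have the shape of AWS Rekognition DetectLabels output, where person-like labels can repeat once per detected instance. A minimal sketch of such a call, assuming boto3 and a hypothetical local copy of the image; the file name, region, and thresholds are placeholders, not values used by the museum:

```python
# Sketch: reproduce label/confidence pairs like the list above with
# AWS Rekognition DetectLabels. File name and thresholds are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970.3853.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap roughly matching the length of the list above
    MinConfidence=75.0,  # assumed cutoff; the lowest score above is 77.7
)

# Each label carries a name and a confidence percentage.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```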

Clarifai
created on 2018-05-10

people 99.6
adult 97.5
group 96.6
man 96.1
one 94
music 92
group together 91.7
musician 90.4
many 90.1
leader 88.9
two 88.8
monochrome 87.4
woman 87.3
several 86.9
wear 85.7
vehicle 85
administration 83.7
jazz 79.4
instrument 79.1
portrait 79

Imagga
created on 2023-10-07

accordion 100
keyboard instrument 100
wind instrument 100
musical instrument 100
piano 37.4
music 34.3
keyboard 31
instrument 27.9
playing 25.6
play 24.1
musical 24
people 20.6
sound 20.6
man 20.2
musician 19.5
black 18.6
person 17.7
business 17.6
hand 17.5
adult 16.8
keys 16.6
key 15.9
male 15.6
portrait 15.5
businessman 15
education 14.7
corporate 14.6
happy 14.4
face 14.2
professional 13.5
performance 13.4
work 13.3
entertainment 12.9
classical 12.4
learn 12.3
attractive 11.9
old 11.8
smiling 11.6
pianist 10.9
chord 10.9
lifestyle 10.8
child 10.8
song 10.7
success 10.5
home 10.4
paper 10.2
laptop 10
melody 9.8
working 9.7
practice 9.7
indoors 9.7
office 9.6
boy 9.6
classic 9.3
finger 9.3
note 9.2
businesswoman 9.1
suit 9
technology 8.9
ivory 8.9
job 8.8
computer 8.8
looking 8.8
performer 8.8
lesson 8.8
sitting 8.6
close 8.6
one person 8.5
holding 8.3
human 8.3
student 8.1
equipment 8.1
closeup 8.1
tune 7.9
jazz 7.9
happiness 7.8
art 7.8
artist 7.7
fingers 7.6
player 7.6
learning 7.5
leisure 7.5
executive 7.4
successful 7.3
financial 7.1
smile 7.1
worker 7.1
together 7

Microsoft
created on 2018-05-10

person 98.7
accordion 98.3
music 94.5

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Male, 95.3%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Happy 0.1%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 48-54
Gender Male, 88.8%
Calm 95.8%
Surprised 6.5%
Fear 5.9%
Sad 2.5%
Happy 1.3%
Confused 0.9%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 45-51
Gender Male, 81.9%
Calm 45%
Sad 16.1%
Angry 15.8%
Surprised 9.9%
Confused 7.9%
Fear 6.6%
Happy 6.3%
Disgusted 2%
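
Per-face age ranges, gender estimates, and independently scored emotions like those above match the shape of an AWS Rekognition DetectFaces response. A minimal sketch, assuming boto3; the file name and region are placeholders:

```python
# Sketch: request full face attributes (age range, gender, emotions)
# with AWS Rekognition DetectFaces. File name and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970.3853.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion scores
)

# One FaceDetail per detected face; three were reported for this photograph.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion scores are independent; sort by confidence as in the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```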

Feature analysis

Amazon

Adult 97.8%
Male 97.8%
Man 97.8%
Person 97.8%
Baby 94.6%
Hat 82.3%
Suit 77.7%

Text analysis

Amazon

Harvard
College
and
Museums)
University
Art
(Harvard
President and Fellows of Harvard College (Harvard University Art Museums)
of
Fellows
le
President
P1970.3853.0000
Bea
ERSARY
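
The mix above of one full line and scattered single words (including partial tokens such as "le", "Bea", and "ERSARY" from the image margin) matches an AWS Rekognition DetectText response, which returns both LINE and WORD detections. A minimal sketch, again assuming boto3 and a hypothetical file name:

```python
# Sketch: detect text in the photograph with AWS Rekognition DetectText.
# File name and region are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1970.3853.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Type is either "LINE" or "WORD"; the word-level results account for
    # the scattered single tokens in the list above.
    print(detection["Type"], detection["DetectedText"])
```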

Google

@ President and Fellows of Harvard College (Harvard University Art Museums) P1970.3853.0000
@
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.3853.0000