Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2091

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%
Adult 96.7%
Male 96.7%
Man 96.7%
Person 96.7%
Adult 95.9%
Male 95.9%
Man 95.9%
Person 95.9%
Person 94.9%
Person 93.8%
Person 93.3%
Face 93.1%
Head 93.1%
Musical Instrument 93.1%
Accordion 83.5%
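
These labels match the output shape of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be produced with the boto3 SDK; the file name, label limit, and confidence threshold are illustrative assumptions, not details from the museum's actual pipeline.

```python
# Sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "photo.jpg" stands in
# for the museum's image file, and the thresholds are illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=80.0,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```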

Clarifai
created on 2018-05-10

people 99.8%
music 98%
piano 97.9%
adult 97.8%
man 97.6%
musician 97.1%
jazz 97.1%
one 96.2%
instrument 96.1%
pianist 95.2%
portrait 94.2%
two 93.6%
outfit 87.8%
wear 87.6%
group 86.1%
military 85%
administration 81.4%
leader 80.2%
war 79.7%
accordion 78.8%
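
Concept tags in this form are what Clarifai's general image-recognition model returns. A rough sketch against Clarifai's v2 REST endpoint follows; the API key, model alias, image URL, and payload shape are assumptions and may differ across client versions.

```python
# Sketch: concept tagging with Clarifai's v2 REST API.
# CLARIFAI_API_KEY, the model alias, and the image URL are
# placeholders; the payload shape follows Clarifai's v2 docs
# but may vary by API version.
import requests

API_KEY = "CLARIFAI_API_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/photo.jpg"}}}
    ]
}
response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Key {API_KEY}"},
)
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")
```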

Imagga
created on 2023-10-06

accordion 100%
keyboard instrument 100%
wind instrument 100%
musical instrument 100%
piano 45.3%
music 41.6%
keyboard 38.5%
instrument 33.5%
musical 29.7%
play 28.5%
playing 27.4%
keys 24.4%
sound 23.4%
black 22.3%
musician 20.5%
hand 19.8%
key 19.7%
people 16.2%
classical 15.3%
adult 14.2%
man 14.1%
person 13.6%
male 13.5%
performance 13.4%
entertainment 12.9%
education 12.1%
computer 12%
home 12%
pianist 11.9%
song 11.7%
child 11.7%
practice 11.6%
business 11.5%
indoors 11.4%
learn 11.3%
finger 11.1%
portrait 11%
chord 10.9%
close 10.9%
melody 10.8%
hands 10.4%
closeup 10.1%
happy 10%
laptop 10%
attractive 9.8%
concert 9.7%
businessman 9.7%
technology 9.7%
sitting 9.5%
work 9.4%
lifestyle 9.4%
classic 9.3%
professional 9.3%
note 9.2%
old 9.1%
human 9%
ivory 8.9%
tune 8.9%
working 8.8%
lesson 8.8%
boy 8.7%
fingers 8.6%
art 8.5%
equipment 8.2%
student 8.2%
office 8%
jazz 7.9%
face 7.8%
performer 7.8%
grand 7.8%
corporate 7.7%
pretty 7.7%
player 7.6%
one person 7.5%
learning 7.5%
leisure 7.5%
handsome 7.1%
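
Imagga serves tags from a plain REST endpoint with HTTP basic auth. A short sketch of a v2 /tags request follows; the key, secret, and image URL are placeholders.

```python
# Sketch: auto-tagging with the Imagga v2 REST API.
# API_KEY/API_SECRET and the image URL are placeholders.
import requests

API_KEY, API_SECRET = "IMAGGA_KEY", "IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")
```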

Microsoft
created on 2018-05-10

person 100%
man 96.7%
accordion 89.6%
music 85.3%
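
These tags correspond to the tagging operation of Microsoft's Computer Vision service (now Azure AI Vision). A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
# Sketch: image tagging with Azure Computer Vision.
# The endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}%")
```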

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 75.9%
Surprised 8.8%
Fear 8.2%
Sad 5.5%
Confused 2.4%
Angry 2.1%
Happy 1.6%
Disgusted 1.5%

AWS Rekognition

Age 42-50
Gender Male, 99.5%
Calm 93.3%
Surprised 6.4%
Fear 6.1%
Sad 3.9%
Confused 0.5%
Angry 0.4%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 18-26
Gender Male, 99.5%
Sad 99.1%
Calm 38.1%
Surprised 6.4%
Fear 6.1%
Angry 0.6%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%
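
The three blocks above have the shape of per-face results from Amazon Rekognition's DetectFaces operation with full attributes requested. A sketch of extracting age range, gender, and emotion scores with boto3 follows; the file name is a placeholder.

```python
# Sketch: face attribute analysis with Amazon Rekognition DetectFaces.
# "photo.jpg" is a placeholder for the museum image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```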

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
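
Unlike the other services, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client follows; the file path is a placeholder.

```python
# Sketch: face detection with the Google Cloud Vision API.
# "photo.jpg" is a placeholder; results come back as likelihood buckets.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```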

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%

Categories

Imagga

food drinks 96.5%
interior objects 2.3%
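
Category scores like these resemble Imagga's v2 categorizer output; "food drinks" and "interior objects" are categories of its personal_photos categorizer. A hedged sketch follows; the categorizer ID, key, secret, and image URL are assumptions.

```python
# Sketch: scene categorization with Imagga's v2 categorizer API.
# The "personal_photos" categorizer ID and all credentials/URLs
# here are placeholders.
import requests

API_KEY, API_SECRET = "IMAGGA_KEY", "IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```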

Captions

Microsoft
created on 2018-05-10

a man wearing a suit and tie 91.9%
a man in a suit and tie 91.8%
a man holding a microphone 65.7%
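
Ranked caption candidates with confidence scores are what the Computer Vision describe operation returns. A short sketch follows; the endpoint, key, image URL, and candidate count are placeholders.

```python
# Sketch: caption generation with Azure Computer Vision "describe".
# The endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_KEY"),
)

description = client.describe_image(
    "https://example.com/photo.jpg",
    max_candidates=3,  # ask for several ranked captions
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```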

Text analysis

Amazon

1934
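
A lone detected string like "1934" is consistent with Amazon Rekognition's DetectText operation, which finds text in an image. A minimal sketch with boto3 follows; the file name is a placeholder.

```python
# Sketch: text detection (OCR) with Amazon Rekognition DetectText.
# "photo.jpg" is a placeholder for the museum image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```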