Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1933–1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3150

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Adult 98
Male 98
Man 98
Person 98
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Face 96.1
Head 96.1
Person 96.1
Person 93.8
Person 93.8
Musical Instrument 93.7
Accordion 80.6
Person 59.9
Photography 56.4
Portrait 56.4
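
For reference, a minimal sketch of how label tags in this shape can be generated with the AWS Rekognition DetectLabels API (Python, boto3). The file name and confidence threshold below are placeholders, not values from this record; the Clarifai and Imagga lists that follow come from analogous tagging endpoints on those services.

# Minimal sketch: label tagging with AWS Rekognition DetectLabels.
# File name and threshold are hypothetical, not taken from this record.
import boto3

client = boto3.client("rekognition")

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or higher
    )

# Each label carries a name and a confidence score, e.g. "Accordion 80.6"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')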

Clarifai
created on 2018-05-10

people 99.9
adult 98.7
group 98.6
man 97.6
leader 96.6
group together 96.5
administration 95.8
portrait 94.8
two 90.2
many 89.8
wear 88.7
several 88
war 87.3
music 87.2
military 86.5
one 84.7
woman 84.5
three 80.8
musician 79.6
outfit 79

Imagga
created on 2023-10-07

accordion 100
keyboard instrument 100
wind instrument 100
musical instrument 100
concertina 59
free-reed instrument 47.2
man 33.6
male 27.6
adult 23.3
person 22.4
people 19
business 18.2
hand 17.5
businessman 16.8
face 15.6
music 15.3
old 14.6
portrait 14.2
musical 13.4
instrument 13
play 12.9
playing 12.8
device 12.7
happy 11.9
musician 11.7
performance 11.5
sound 11.2
attractive 11.2
men 11.2
culture 11.1
building 11.1
guy 11
professional 11
handsome 10.7
indoors 10.5
office 10.4
corporate 10.3
holding 9.9
one 9.7
looking 9.6
work 9.4
paper 9.4
smiling 9.4
model 9.3
smile 9.3
lifestyle 8.7
education 8.7
sculpture 8.6
sitting 8.6
statue 8.6
casual 8.5
entertainment 8.3
fun 8.2
confident 8.2
religion 8.1
clothing 8
computer 8
hands 7.8
architecture 7.8
art 7.8
performer 7.8
concert 7.8
artist 7.7
youth 7.7
studio 7.6
one person 7.5
keyboard 7.5
human 7.5
technology 7.4
finger 7.4
retro 7.4
executive 7.4
color 7.2
suit 7.2
working 7.1

Microsoft
created on 2018-05-10

person 99.9
accordion 97.9
man 97.3
music 85.2
old 84.1
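
A comparable sketch for the Microsoft tags above, assuming the Computer Vision v3.2 Tag Image REST operation (the 2018 tags in this record were produced by an earlier API version). Endpoint, key, and file name are placeholders.

# Minimal sketch: image tagging via the Azure Computer Vision v3.2
# Tag Image REST endpoint. Endpoint, key, and file are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Confidence is reported on a 0-1 scale; the record above shows percentages
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')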

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 23-31
Gender Male, 98.3%
Confused 42.6%
Fear 22.1%
Surprised 11.4%
Calm 8.6%
Angry 6.9%
Disgusted 6.1%
Sad 3.8%
Happy 1.6%

AWS Rekognition

Age 27-37
Gender Male, 100%
Calm 78.4%
Sad 10%
Surprised 6.8%
Fear 6.5%
Confused 4.7%
Angry 1.8%
Disgusted 0.7%
Happy 0.4%

AWS Rekognition

Age 24-34
Gender Female, 96.3%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 0.1%
Calm 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 76.5%
Confused 8.1%
Surprised 7.5%
Fear 6.7%
Sad 4.9%
Angry 4.4%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 94.8%
Surprised 6.6%
Fear 5.9%
Sad 2.4%
Happy 1.5%
Confused 1.1%
Disgusted 0.6%
Angry 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99%
Happy 97.9%
Surprised 6.5%
Fear 6%
Sad 2.3%
Disgusted 0.3%
Confused 0.3%
Angry 0.2%
Calm 0.1%
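
The age ranges, gender calls, and emotion percentages above match the output shape of Rekognition's DetectFaces API with all facial attributes requested; each emotion is scored independently, which is why the values in a block need not sum to 100%. A minimal sketch, with the file name as a placeholder:

# Minimal sketch: face attribute analysis with AWS Rekognition
# DetectFaces, requesting all attributes. File name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are scored independently, so they need not sum to 100%
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')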

Microsoft Cognitive Services

Age 53
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
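
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above read "Very unlikely" or "Very likely". A minimal sketch using the google-cloud-vision client, with the file name as a placeholder:

# Minimal sketch: face detection with the Google Cloud Vision client.
# Attributes come back as likelihood enums, not numeric confidences.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)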

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%

Captions

Microsoft
created on 2018-05-10

an old photo of a man 93.6%
old photo of a man 92.9%
a black and white photo of a man 89.4%
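
Captions of this form come from the Computer Vision Describe Image operation, which returns candidate captions ranked by confidence. A sketch assuming the v3.2 REST endpoint (the 2018 captions above predate it); endpoint, key, and file name are placeholders:

# Minimal sketch: image captioning via the Azure Computer Vision v3.2
# Describe endpoint. Endpoint, key, and file are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Each candidate caption has text and a 0-1 confidence score
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')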

Text analysis

Amazon

Luvin
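
A single token such as "Luvin" is the shape of result returned by Rekognition's DetectText API for lettering or signage in a photograph. A minimal sketch, with the file name as a placeholder:

# Minimal sketch: OCR with AWS Rekognition DetectText.
# File name is hypothetical, not taken from this record.
import boto3

client = boto3.client("rekognition")

with open("shahn_fourteenth_street.jpg", "rb") as f:  # hypothetical file
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections group WORD detections; print each detected line
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])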