Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3129

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Clarifai
created on 2018-03-23

people 99.7
music 99.6
musician 98.9
jazz 98.2
piano 98.1
instrument 97.7
one 97.7
adult 97.1
man 96.9
singer 94.7
two 94.4
microphone 92.2
concert 92
pianist 91.7
monochrome 90.3
portrait 90.1
band 89.6
wear 88.3
accordion 88.2
war 88.2

Imagga
created on 2018-03-23

accordion 100
wind instrument 100
keyboard instrument 100
musical instrument 100
piano 43.7
music 40.6
keyboard 34.8
instrument 33.5
playing 29.2
play 28.5
musical 27.8
musician 22.4
sound 21.6
black 21
keys 20.5
hand 19
key 16.8
performance 14.4
classical 14.3
adult 13.6
man 13.4
people 12.8
close 12.6
pianist 11.9
song 11.7
portrait 11.7
male 11.3
hands 11.3
education 11.3
person 11.2
finger 11.1
entertainment 11
chord 10.9
child 10.8
melody 10.8
practice 10.7
indoors 10.5
human 10.5
fingers 10.5
player 10.4
closeup 10.1
attractive 9.8
equipment 9.8
concert 9.7
classic 9.3
professional 9.3
note 9.2
ivory 8.9
tune 8.9
lesson 8.8
home 8.8
lifestyle 8.7
work 8.6
learn 8.5
old 8.4
technology 8.2
light 8
business 7.9
jazz 7.9
performer 7.8
sitting 7.7
artist 7.7
communication 7.6
happy 7.5
learning 7.5
computer 7.2
cute 7.2
art 7.2

Microsoft
created on 2018-03-23

person 100
music 94.8
accordion 83.9
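
The tag lists above pair each label with the provider's confidence score, which appears to be on a 0-100 scale. As a minimal sketch of how such label lists are typically requested (here using Amazon Rekognition through boto3; the file path photo.jpg is a hypothetical placeholder, and this is illustrative rather than the museum's actual pipeline):

    # Minimal sketch: request image labels from Amazon Rekognition.
    # Assumes AWS credentials are configured; "photo.jpg" is a
    # hypothetical local copy of the photograph.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,         # cap the number of tags returned
            MinConfidence=80.0,   # drop low-confidence tags
        )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Clarifai, Imagga, and Microsoft expose comparable tagging endpoints, though the request and response formats differ by provider.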

Color Analysis

Face analysis

AWS Rekognition

Age 23-38
Gender Male, 99.4%
Happy 1.5%
Confused 4.8%
Angry 2.2%
Disgusted 3.6%
Sad 1.8%
Calm 84%
Surprised 2.1%

AWS Rekognition

Age 20-38
Gender Male, 92.2%
Happy 1.7%
Calm 68.7%
Surprised 2.8%
Angry 11%
Confused 3.2%
Disgusted 1.6%
Sad 11%

AWS Rekognition

Age 26-43
Gender Male, 84.6%
Surprised 1.1%
Happy 0.5%
Angry 4.3%
Sad 21.8%
Calm 65.7%
Confused 2.2%
Disgusted 4.5%

AWS Rekognition

Age 26-43
Gender Male, 54.2%
Surprised 3.9%
Happy 8.5%
Angry 4%
Sad 61.9%
Calm 16.8%
Confused 2.4%
Disgusted 2.5%

AWS Rekognition

Age 35-52
Gender Male, 95.5%
Calm 23.1%
Angry 11.7%
Disgusted 1.5%
Sad 54.3%
Surprised 2.3%
Happy 0.7%
Confused 6.4%

AWS Rekognition

Age 27-44
Gender Male, 72.3%
Angry 3.8%
Surprised 7.6%
Happy 8.9%
Sad 67.2%
Calm 10.1%
Disgusted 0.5%
Confused 1.8%
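
The six AWS Rekognition entries above report, for each face detected in the photograph, an estimated age range, a gender guess with its confidence, and a score for each emotion. A minimal sketch of the underlying call, again using boto3 with a hypothetical local file photo.jpg:

    # Minimal sketch: per-face age range, gender, and emotion scores
    # from Amazon Rekognition. Assumes AWS credentials are configured.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")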

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
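
The two Google Vision entries above express each attribute as a likelihood bucket (Very unlikely through Very likely) rather than a percentage. A minimal sketch of the corresponding face-detection call, assuming the google-cloud-vision 2.x client and the same hypothetical photo.jpg:

    # Minimal sketch: Google Cloud Vision face detection, reporting the
    # same likelihood buckets shown above. Assumes Google credentials are
    # configured and the 2.x (proto-plus) client library.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)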

Feature analysis

Amazon

Person 98%

Categories

Imagga

food drinks 99.9%
interior objects 0.1%
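
The Imagga category scores above come from a scene categorizer rather than a tagger. A minimal sketch of the request, assuming Imagga's v2 REST API and its personal_photos categorizer (the API key, secret, and image URL are placeholders, and the exact JSON field names should be checked against Imagga's current documentation):

    # Minimal sketch: scene categorization via the Imagga v2 REST API.
    # Credentials and image URL are hypothetical placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("<api-key>", "<api-secret>"),
    )
    for category in response.json()["result"]["categories"]:
        print(category["name"]["en"], f"{category['confidence']:.1f}%")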

Captions

Microsoft
created on 2018-03-23

a man in a suit and tie 94.7%
a man wearing a suit and tie 91.9%
a man holding a microphone 68.8%
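
The captions above are ranked sentence candidates, each with a confidence, from Microsoft's image-description service. A minimal sketch of how comparable captions can be requested, assuming the azure-cognitiveservices-vision-computervision package with a placeholder endpoint, key, and file path (the 2018 service that produced the values above may have behaved differently):

    # Minimal sketch: ranked image captions from Azure Computer Vision.
    # Endpoint, key, and file path are placeholders, not real values.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )
    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")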

Text analysis

Amazon

Mlrumnmwl
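
The text-analysis result above is Amazon Rekognition's reading of text visible inside the photograph (here an apparently garbled transcription of signage). A minimal sketch of the call, again with boto3 and a hypothetical photo.jpg:

    # Minimal sketch: detect text that appears inside the photograph
    # with Amazon Rekognition. Assumes AWS credentials are configured.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")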