Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

1933-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2925

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Scores are each service's reported confidence on a 0-100 scale.

Amazon
created on 2019-03-29

Accessory 99.4
Tie 99.4
Accessories 99.4
Person 99.1
Human 99.1
Person 98.9
Coat 96.5
Overcoat 96.5
Suit 96.5
Apparel 96.5
Clothing 96.5
Hat 95
Face 61.1
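
The label and confidence pairs above have the shape of output from Amazon Rekognition's DetectLabels call. As an illustration only, not the museums' actual pipeline, here is a minimal boto3 sketch that would produce comparable name/confidence values; the file name "photo.jpg", the label cap, and the confidence floor are assumptions.

```python
import boto3

# Illustrative sketch: how labels like those listed above could be generated.
# Assumptions: a local image file "photo.jpg" and default AWS credentials/region.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # assumed cap, not from the source record
    MinConfidence=60.0,  # assumed threshold, not from the source record
)

for label in response["Labels"]:
    # Confidence is reported on a 0-100 scale, matching the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```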

Clarifai
created on 2018-02-09

people 99.9
two 98.9
adult 98.7
three 96.4
woman 95.9
man 95.3
group 94.6
one 94.2
wear 94.1
administration 93.4
portrait 90.9
four 90.5
leader 89.6
group together 88.9
outfit 87.1
doorway 87
child 86.6
veil 85
music 84
five 83.1

Imagga
created on 2018-02-09

groom 25.9
old 25.1
person 21.7
religion 20.6
man 20.2
statue 18.5
male 18.4
waiter 18.2
architecture 18
people 17.3
sculpture 16.5
ancient 16.4
grandfather 15.2
history 15.2
couple 14.8
portrait 14.2
love 12.6
marble 12.6
art 12.4
god 12.4
religious 12.2
stone 11.9
employee 11.5
historical 11.3
vintage 10.8
world 10.5
ruler 10.5
men 10.3
monument 10.3
culture 10.3
church 10.2
worker 10.1
historic 10.1
family 9.8
adult 9.8
pray 9.7
catholic 9.7
antique 9.5
happiness 9.4
happy 9.4
travel 9.2
dress 9
seller 8.7
father 8.7
bride 8.6
face 8.5
mother 8.5
two 8.5
column 8.4
traditional 8.3
temple 8.3
tourism 8.2
fan 8
home 8
building 7.9
scene 7.8
prayer 7.7
patient 7.7
married 7.7
house 7.5
senior 7.5
city 7.5
famous 7.4
wedding 7.4
smiling 7.2

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 99.5
man 96.7
standing 93.1
outdoor 90.9
old 88.6
posing 76.5

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 95.8%
Surprised 4.5%
Calm 59.6%
Confused 9.6%
Happy 4.3%
Sad 11.3%
Disgusted 5.8%
Angry 5%

AWS Rekognition

Age 48-68
Gender Male, 98.6%
Happy 0.1%
Angry 1.6%
Confused 2.4%
Sad 6.5%
Calm 88.3%
Surprised 0.5%
Disgusted 0.6%

AWS Rekognition

Age 23-38
Gender Female, 99.1%
Surprised 3%
Sad 15.2%
Confused 1.8%
Disgusted 8.5%
Happy 9.1%
Calm 53%
Angry 9.5%
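
The three face records above follow the structure returned by Rekognition's DetectFaces call when all attributes are requested. The sketch below is a hedged illustration under the same assumptions as the labeling example (local file, default credentials); the field names are the documented response keys.

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 35, "High": 52}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 95.8}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)

    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']} ({gender['Confidence']:.1f}%)")
    for emotion in emotions:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```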

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
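
The Google Vision rows report face attributes as likelihood buckets (from "Very unlikely" to "Very likely") rather than percentages. A minimal google-cloud-vision sketch that prints the same six attributes is shown below; again this is an illustration, with the local file name assumed.

```python
from google.cloud import vision

# Illustrative sketch of Google Cloud Vision face detection.
# Assumptions: a local image file "photo.jpg" and default application credentials.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```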

Feature analysis

Amazon

Tie 99.4%
Person 99.1%
Suit 96.5%
Hat 95%

Categories