Human Generated Data

Title

Untitled (Eighth Avenue and Forty-second Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2965

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Furniture 99.5
Chair 99.5
Human 99.4
Person 99.4
Person 97.7
Person 97.6
Performer 90.9
Indoors 84.4
Interior Design 84.4
Crowd 82.3
Apparel 78.5
Clothing 78.5
Face 71.1
Suit 70.3
Overcoat 70.3
Coat 70.3
Text 69.8
Audience 64.7
Room 63.5
Photography 62.9
Portrait 62.9
Photo 62.9
Hair 62.9
Musical Instrument 61
Musician 61
Teacher 56.3
Speech 55.5
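
The Amazon tags above read as label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags might be generated, assuming boto3 with configured AWS credentials and a local copy of the photograph under the hypothetical filename shahn_eighth_avenue.jpg:

    import boto3

    # Rekognition client; region and credentials are assumed to be configured elsewhere
    rekognition = boto3.client("rekognition")

    # Read the image as raw bytes (the filename is hypothetical)
    with open("shahn_eighth_avenue.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above a confidence floor, comparable to the 55-99 range listed above
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=30,
        MinConfidence=55.0,
    )

    # Print label name and confidence, e.g. "Furniture 99.5"
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))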

Clarifai
created on 2018-03-23

people 99.9
adult 98.5
group 97.3
street 97.2
two 97
woman 96.5
administration 96.2
one 95.9
war 95.8
monochrome 95.4
man 94.9
group together 94.9
three 94
four 92
military 90.3
vehicle 89.3
several 89.1
music 86.7
furniture 86.6
child 86.6

Imagga
created on 2018-03-23

man 31.6
iron lung 30.1
male 26.2
respirator 25
person 24.9
people 20.6
breathing device 18.3
adult 17.8
black 17.4
device 17.3
world 15.9
human 14.2
old 13.9
musical instrument 13.8
men 13.7
barbershop 13.4
portrait 12.9
shop 11.4
vintage 10.7
statue 10.7
soldier 9.8
military 9.6
war 9.6
couple 9.6
face 9.2
suit 9.2
hand 9.1
mercantile establishment 9
one 9
banjo 8.9
sculpture 8.7
antique 8.6
work 8.6
industry 8.5
city 8.3
street 8.3
stringed instrument 8.2
dress 8.1
grandfather 8
body 8
business 7.9
love 7.9
helmet 7.8
army 7.8
ancient 7.8
fashion 7.5
future 7.4
protection 7.3
looking 7.2
worker 7.2
history 7.1
smile 7.1
job 7.1
patient 7

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.3

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 96.1%
Angry 53.1%
Confused 2.5%
Disgusted 0.6%
Surprised 0.5%
Sad 10.5%
Calm 32.5%
Happy 0.2%

AWS Rekognition

Age 35-52
Gender Male, 89.1%
Disgusted 9.9%
Calm 76.1%
Happy 1.1%
Angry 2.8%
Sad 3.4%
Confused 3.1%
Surprised 3.6%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Calm 49.7%
Sad 49.7%
Confused 49.5%
Happy 49.6%
Disgusted 49.6%
Angry 49.9%
Surprised 49.6%

AWS Rekognition

Age 12-22
Gender Female, 50.4%
Surprised 49.6%
Happy 49.7%
Angry 49.5%
Sad 49.6%
Calm 49.9%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Calm 49.5%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Sad 50.3%

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
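
The face records above combine per-face estimates from three services. For the AWS Rekognition entries (an age range, a gender estimate, and an emotion distribution), a minimal sketch of a call that returns fields of that shape, assuming boto3 and the same hypothetical local image file as above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_eighth_avenue.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] returns AgeRange, Gender, and Emotions for each detected face
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]       # e.g. {"Low": 26, "High": 43}
        gender = face["Gender"]      # e.g. {"Value": "Male", "Confidence": 96.1}
        emotions = face["Emotions"]  # list of {"Type": "CALM", "Confidence": ...}
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in emotions:
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")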

Feature analysis

Amazon

Chair 99.5%
Person 99.4%

Text analysis

Amazon

INSTITUTE
ONGRESS
TEETH
OF
NATURAL INSTITUTE
OF NFALATN
NATURAL
NFALATN
CWASHINGOLIC

Google

URAL INSTITUTE TEET NG
URAL
INSTITUTE
TEET
NG
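
The text analysis lines above are raw OCR transcriptions from Amazon and Google, which is why the fragments from the signage are garbled. For the Amazon side, a minimal sketch of the Rekognition DetectText call that yields such strings, under the same boto3 and filename assumptions as above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_eighth_avenue.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a whole LINE or a single WORD; print the word-level strings
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], round(detection["Confidence"], 1))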