Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1667

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 99.6
Person 99.6
Person 99.2
Person 98.9
Person 97.3
Clothing 90.2
Apparel 90.2
Person 75.3
Pants 69.5
Sleeve 69.1
Person 64.2
Long Sleeve 57.4
Kiosk 56.2
Billboard 56.1
Advertisement 56.1

Clarifai
created on 2018-03-23

people 99.9
adult 98.8
group 98.7
man 96.9
child 95.6
one 95.4
two 94.6
many 93.3
woman 93
group together 91.9
war 90.5
administration 88
boy 87.1
monochrome 86.7
wear 85.6
education 85.3
recreation 83.8
street 82.4
several 81
three 80.4

Imagga
created on 2018-03-23

shop 29.4
architecture 23.6
old 23
mercantile establishment 21.3
building 19.2
barbershop 18.2
religion 17
stall 14.6
statue 14.4
place of business 14.1
city 14.1
window 14.1
travel 14.1
sculpture 12.4
vintage 12.4
man 12.1
stone 11.9
people 11.7
ancient 11.2
staff 11
crutch 10.6
god 10.5
brick 10.5
religious 10.3
wall 10.3
culture 10.2
child 10.2
bakery 9.8
art 9.8
structure 9.6
stick 9.2
dirty 9
percussion instrument 8.9
temple 8.8
urban 8.7
person 8.7
grunge 8.5
house 8.4
historic 8.2
world 8.2
life 8.1
history 8
door 8
male 7.8
portrait 7.8
construction 7.7
weathered 7.6
sign 7.5
tourism 7.4
church 7.4
musical instrument 7.3
establishment 7.3
metal 7.2
aged 7.2
transportation 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 98.7
outdoor 98

Face analysis

Amazon

AWS Rekognition

Age 15-25
Gender Male, 50.4%
Happy 49.5%
Disgusted 49.7%
Surprised 49.5%
Calm 49.5%
Angry 49.5%
Sad 50.2%
Confused 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Calm 50.1%
Angry 49.6%
Disgusted 49.6%
Sad 49.6%
Surprised 49.5%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Disgusted 49.6%
Surprised 49.6%
Happy 49.7%
Confused 49.5%
Calm 49.8%
Sad 49.7%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Sad 49.5%
Surprised 49.5%
Calm 49.6%
Happy 49.5%
Angry 49.7%
Disgusted 50.2%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Disgusted 49.7%
Calm 50.1%
Happy 49.5%
Angry 49.5%
Sad 49.6%
Confused 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

TUESDAYS
ADMOM
ADMOM C
C

Google

TUESDAYS
TUESDAYS