Human Generated Data

Title

Untitled (Maynardville, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1205

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Apparel 100
Shoe 100
Footwear 100
Clothing 100
Human 99.7
Person 99.7
Person 95.8
Sleeve 83.1
Shoe 82.4
Overcoat 80.5
Brick 80
Sitting 70.5
Pants 70
Coat 65.6
Path 64.5
Face 60.1
Suit 57.6
Female 57.3

Clarifai
created on 2018-03-23

people 100
one 99.2
adult 99.2
man 97
wear 96.4
administration 95.2
two 94.3
portrait 92.9
street 91.7
child 91.2
woman 88.9
group 88.1
leader 87.1
military 86.3
home 84.1
group together 83
three 80.8
doorway 80.8
outfit 80.2
position 78.4

Imagga
created on 2018-03-23

world 23
man 22.9
silhouette 20.7
person 19.9
people 19.5
sunset 17.1
statue 16.5
black 15.9
male 15.8
adult 15.8
beach 13.5
kin 12.6
sky 12.1
building 11.9
walking 11.4
travel 11.3
street 11
sport 10.8
water 10.7
sculpture 10.6
human 10.5
sun 10.5
stone 10.3
architecture 10.2
dark 10
tourism 9.9
religion 9.9
outdoors 9.7
couple 9.6
lifestyle 9.4
monument 9.3
city 9.1
ocean 9.1
dirty 9
sax 8.8
love 8.7
clothing 8.5
historical 8.5
outdoor 8.4
action 8.4
light 8.2
urban 7.9
boy 7.8
sea 7.8
portrait 7.8
life 7.8
sidewalk 7.7
culture 7.7
old 7.7
dusk 7.6
walk 7.6
landscape 7.4
leg 7.4
alone 7.3
pose 7.3
body 7.2
active 7.2
recreation 7.2
romantic 7.1
posing 7.1
summer 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

building 99.9
outdoor 99
ground 98.8
person 92
black 85.2
old 50.5

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 95.3%
Sad 30.7%
Disgusted 5.2%
Confused 1.9%
Surprised 2.8%
Calm 42.9%
Angry 15.3%
Happy 1.2%

AWS Rekognition

Age 35-52
Gender Male, 54.2%
Confused 45.2%
Surprised 45.2%
Calm 47.5%
Happy 45.4%
Angry 46%
Sad 45.2%
Disgusted 50.5%

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Shoe 100%
Person 99.7%
Coat 65.6%

Text analysis

Amazon

2