Human Generated Data

Title

Untitled (Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1280

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Clothing 100
Apparel 100
Human 99.5
Person 99.5
Person 99.2
Person 98.5
Person 97.2
Overcoat 94.1
Coat 94.1
Sun Hat 79.9
Cowboy Hat 78
Hat 77.5
Helmet 66
Person 61.9

Clarifai
created on 2018-03-23

people 100
group together 98.6
group 98.6
adult 98.5
three 96.7
administration 96.4
several 96.3
military 95.5
two 94.8
man 94.7
leader 94.5
four 94.4
uniform 93.8
wear 93
war 91
five 90.1
outfit 88.9
woman 88.4
soldier 86
veil 85.7

Imagga
created on 2018-03-23

hat 79.8
cowboy hat 68
headdress 42.6
man 36.3
clothing 35.2
male 26.9
people 25.6
person 24
men 19.7
two 17.8
covering 17.7
old 17.4
consumer goods 16.5
adult 15.6
cowboy 14.8
senior 14.1
outdoors 13.4
hand 12.9
western 12.6
together 12.3
old-timer 11.9
style 11.9
love 11.8
washboard 11.7
couple 11.3
shirt 11.2
guy 11.2
device 11
portrait 11
weapon 11
outdoor 10.7
face 10.7
helmet 10.6
building 10.4
industry 10.2
harmonica 10.2
uniform 10.1
safety 10.1
leisure 10
worker 9.6
play 9.5
happy 9.4
wind instrument 9.4
grandfather 9.3
musical instrument 9.3
smile 9.3
emotion 9.2
occupation 9.2
travel 9.1
protection 9.1
looking 8.8
standing 8.7
lifestyle 8.7
war 8.7
work 8.6
walk 8.6
outside 8.6
model 8.6
horse 8.5
black 8.4
head 8.4
city 8.3
fashion 8.3
street 8.3
fun 8.2
playing 8.2
free-reed instrument 8.2
industrial 8.2
equipment 8.2
family 8
job 8
hair 7.9
look 7.9
boy 7.8
gun 7.8
world 7.8
culture 7.7
elderly 7.7
enjoy 7.5
dark 7.5
holding 7.4
vacation 7.4
tourist 7.2
women 7.1
child 7

Google
created on 2018-03-23

(no tags listed)

Microsoft
created on 2018-03-23

outdoor 99.9
person 99.6
people 86
group 79.3
old 53

Color Analysis

(no color data listed)

Face analysis

AWS Rekognition

Age 27-44
Gender Female, 51.9%
Calm 53%
Sad 45.7%
Confused 45.2%
Happy 45.2%
Disgusted 45.2%
Angry 45.3%
Surprised 45.3%

AWS Rekognition

Age 26-44
Gender Male, 98.7%
Surprised 2.7%
Confused 2.3%
Disgusted 1.9%
Calm 87.2%
Angry 1.8%
Sad 1.9%
Happy 2.2%

AWS Rekognition

Age 4-9
Gender Female, 77.4%
Disgusted 0.9%
Confused 5.6%
Angry 4.1%
Surprised 2.2%
Happy 0.5%
Sad 17.4%
Calm 69.3%

AWS Rekognition

Age 26-43
Gender Male, 98.1%
Calm 95.9%
Angry 1.1%
Happy 0.2%
Disgusted 1.1%
Sad 0.6%
Surprised 0.5%
Confused 0.7%

AWS Rekognition

Age 26-43
Gender Female, 53.5%
Calm 52.1%
Disgusted 45.2%
Angry 45.2%
Happy 45.5%
Confused 45.3%
Surprised 46%
Sad 45.8%

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Hat 77.5%
Helmet 66%

Text analysis

Amazon

P.
LDUY
LDUY 1GIO3
1GIO3

Google

LTRT
LTRT