Human Generated Data

Title

Untitled (Crossville, Tennessee)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.2822

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.5
Clothing 98.4
Footwear 98.4
Apparel 98.4
Shoe 98.4
Nature 96.9
Outdoors 95.6
Shoe 94.3
Shoe 93.3
Hat 93.2
Countryside 89
Building 86.5
Person 86
Rural 80.1
People 77.9
Shoe 77.2
Hut 75.8
Shack 75.8
Hat 75.8
Person 73.6
Hat 64.9
Face 64.1
Dugout 62.1
Urban 60.2
Shoe 59.3
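
These labels have the shape of output from AWS Rekognition's DetectLabels operation. Below is a minimal Python sketch of how tags like these could be reproduced; the local file name and the confidence threshold are illustrative assumptions, since the record does not include them.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; the record gives no file path.
with open("crossville_tennessee.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50 or above
    )

# Print each label and confidence in the "Person 99.8" style used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")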

Clarifai
created on 2018-03-23

people 99.9
group 98.7
group together 98.3
many 98
child 97.9
adult 96.6
wear 95.8
woman 95.1
several 94
outfit 93.5
man 93.3
boy 93.2
uniform 93.2
administration 92.7
recreation 91.5
offense 89.9
veil 87
military 86.8
sit 85.6
police 85.2
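
The Clarifai concepts match the output of Clarifai's general-prediction model. A sketch using the 2.x-era Python client that was current when these tags were generated (the API key and hosted image URL are placeholders; Clarifai has since moved to a gRPC client):

from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_CLARIFAI_API_KEY")  # placeholder credential
model = app.public_models.general_model

# Hypothetical hosted copy of the image; the record gives no URL.
response = model.predict_by_url("https://example.org/crossville_tennessee.jpg")

# Concepts carry a name and a 0-1 score; scale by 100 to match the list above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")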

Imagga
created on 2018-03-23

hairdresser 68.3
man 36.9
people 28.4
male 27.7
person 25.8
salon 21.3
adult 18.2
musical instrument 17.3
patient 16.5
men 16.3
indoors 15.8
drum 13.7
shop 13.4
senior 13.1
old 12.5
happy 11.9
barbershop 11.9
medical 11.5
percussion instrument 11.4
wheelchair 11.2
health 11.1
lifestyle 10.8
banjo 10.8
equipment 10.8
chair 10.5
home 10.4
portrait 10.3
sitting 10.3
work 10.2
city 10
smile 10
together 9.6
elderly 9.6
smiling 9.4
two 9.3
mature 9.3
family 8.9
hospital 8.9
stringed instrument 8.8
sick person 8.8
retirement 8.6
case 8.6
illness 8.6
room 8.5
doctor 8.5
inside 8.3
human 8.2
uniform 8
surgeon 8
women 7.9
couple 7.8
mask 7.8
train 7.7
mercantile establishment 7.7
seat 7.6
fashion 7.5
religious 7.5
leisure 7.5
outdoors 7.5
occupation 7.3
industrial 7.3
looking 7.2
religion 7.2
face 7.1
interior 7.1
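
The Imagga tags are consistent with Imagga's auto-tagging REST endpoint. A hedged sketch using plain HTTP; the credentials and image URL are placeholders, and only the /v2/tags request shape is assumed from Imagga's public API:

import requests

# Placeholder credentials and image location.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/crossville_tennessee.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

# Each result pairs an English tag with a confidence score, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")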

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.9
outdoor 97.9
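
Microsoft's two tags match the Computer Vision "analyze" operation with the Tags feature requested. A sketch of the REST call as it stood around the time these tags were created (the API version, region, key, and file name are all placeholder assumptions):

import requests

URL = "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"
HEADERS = {
    "Ocp-Apim-Subscription-Key": "YOUR_VISION_API_KEY",
    "Content-Type": "application/octet-stream",
}

with open("crossville_tennessee.jpg", "rb") as f:  # illustrative path
    resp = requests.post(URL, headers=HEADERS,
                         params={"visualFeatures": "Tags"}, data=f.read())
resp.raise_for_status()

# Confidences come back on a 0-1 scale; scale by 100 to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")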

Color Analysis

Face analysis

AWS Rekognition

Age 27-44
Gender Female, 51.1%
Surprised 38.2%
Disgusted 12.8%
Calm 9.8%
Confused 17.5%
Angry 10.3%
Happy 2.3%
Sad 9.1%

AWS Rekognition

Age 20-38
Gender Female, 63.3%
Angry 15.8%
Sad 54.3%
Calm 11%
Confused 4.7%
Surprised 4.4%
Happy 4.2%
Disgusted 5.5%

AWS Rekognition

Age 26-44
Gender Male, 51.8%
Surprised 5.2%
Happy 1.1%
Angry 11.6%
Sad 4%
Calm 73.2%
Confused 2.1%
Disgusted 2.8%

AWS Rekognition

Age 35-52
Gender Male, 98%
Happy 3.5%
Confused 10.3%
Sad 20.8%
Angry 25.2%
Calm 28.9%
Disgusted 4.5%
Surprised 6.8%

AWS Rekognition

Age 27-44
Gender Male, 54.7%
Happy 45%
Disgusted 54.8%
Angry 45.1%
Surprised 45%
Sad 45.1%
Calm 45%
Confused 45%

AWS Rekognition

Age 26-43
Gender Female, 53.1%
Happy 45.1%
Disgusted 45.4%
Surprised 45.3%
Calm 50.8%
Angry 45.5%
Sad 47.7%
Confused 45.2%
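
The six AWS Rekognition blocks above have the shape of DetectFaces output with all facial attributes requested: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal Python sketch under the same assumptions as before (illustrative file name):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("crossville_tennessee.jpg", "rb") as f:  # illustrative path
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # types come back as CALM, SAD, etc.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")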

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 47
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male
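
The Microsoft Cognitive Services estimates are consistent with the Face API detect operation with age and gender attributes requested. Microsoft has since restricted these attributes, so this sketch reflects the API as it stood when the data was generated; region, key, and file name are placeholders:

import requests

URL = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
HEADERS = {
    "Ocp-Apim-Subscription-Key": "YOUR_FACE_API_KEY",
    "Content-Type": "application/octet-stream",
}

with open("crossville_tennessee.jpg", "rb") as f:  # illustrative path
    resp = requests.post(URL, headers=HEADERS,
                         params={"returnFaceAttributes": "age,gender"},
                         data=f.read())
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")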

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
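
Unlike the other services, Google Vision reports likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. These rows correspond to the face_detection method of the google-cloud-vision Python client; a minimal sketch, assuming application credentials are configured and using an illustrative file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # reads GOOGLE_APPLICATION_CREDENTIALS

with open("crossville_tennessee.jpg", "rb") as f:  # illustrative path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes one likelihood enum per attribute.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)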

Feature analysis

Amazon

Person 99.8%
Shoe 98.4%
Hat 93.2%

Categories