Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935, printed later

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3427

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Apparel 99.9
Clothing 99.9
Human 99.8
Person 99.8
Person 99.6
Person 99.5
Person 99.5
Hat 99.3
Shoe 97.9
Footwear 97.9
Shoe 97.2
Shoe 94.5
Sun Hat 93.5
Hat 93
Shoe 92.2
Shoe 91
Sitting 89.8
Suit 65.8
Overcoat 65.8
Coat 65.8
Hat 65.1
Face 64.8
Road 62.5
Photography 62.1
Photo 62.1
People 58.2
Portrait 57.7

Clarifai
created on 2018-03-23

people 99.9
adult 99.3
two 99.1
man 98.6
lid 98.3
group 96.8
woman 96.5
veil 96.5
three 94.8
wear 94.8
one 94.8
sit 93.1
group together 90.4
four 85.3
transportation system 85.1
administration 84.8
leader 84.2
recreation 84
portrait 83.4
elderly 82.5

Imagga
created on 2018-03-23

man 41.6
male 29.2
person 28.6
mask 23.2
people 22.9
adult 19.4
worker 14.2
sport 12.9
men 12.9
clothing 12.2
hat 12
danger 11.8
work 11.8
portrait 11.6
stick 11.6
businessman 11.5
helmet 11.4
hand 11.4
construction 11.1
industry 11.1
safety 11
street 11
holding 10.7
working 10.6
sitting 10.3
city 10
criminal 9.8
outdoors 9.7
equipment 9.6
day 9.4
protection 9.1
human 9
job 8.8
crime 8.8
business 8.5
black 8.4
security 8.3
occupation 8.2
ski mask 8.2
active 8.1
looking 8
guy 7.9
outdoor 7.6
hockey stick 7.6
fun 7.5
one 7.5
suit 7.4
player 7.3
laptop 7.3
dirty 7.2
building 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.7
outdoor 99.7
people 70.8
group 59.4

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 35-52
Gender Male, 92.5%
Disgusted 0.7%
Sad 3.9%
Calm 86.3%
Surprised 4.3%
Happy 1.1%
Confused 1.8%
Angry 1.8%

AWS Rekognition

Age 35-52
Gender Male, 98.3%
Happy 16%
Confused 6.4%
Sad 45.3%
Angry 5.4%
Calm 13.9%
Disgusted 6%
Surprised 7%

AWS Rekognition

Age 26-43
Gender Female, 53.8%
Happy 7%
Disgusted 8.8%
Angry 8.9%
Surprised 10.5%
Sad 19.5%
Calm 32.6%
Confused 12.7%

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 99.3%
Shoe 97.9%