Human Generated Data

Title

Untitled (Kyoto, Japan)

Date

March 14, 1960-April 22, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3200

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Apparel 99.7
Clothing 99.7
Person 98.9
Human 98.9
Person 98.3
Person 97
Plant 92.7
Tree 92.7
Shorts 86.7
Sleeve 85.3
Long Sleeve 72.1
Helmet 62.3
Coat 59.6
Shirt 59.1
People 58.3
Overcoat 56.2
Female 55

Clarifai
created on 2018-03-23

people 99.8
adult 98.8
group 97.4
man 96.7
group together 96.4
two 96.2
wear 95.9
woman 95.8
many 92.2
several 91.7
three 89.5
child 88.8
four 81.4
administration 80.7
military 79.7
vehicle 79
portrait 78.5
leader 78.2
one 78
war 77.5

Imagga
created on 2018-03-23

man 27.5
person 22
steel drum 21.5
percussion instrument 20.8
people 20.6
musical instrument 19.2
male 18.5
outdoor 17.6
park 17.3
chair 16.1
adult 15.7
outdoors 15.1
old 14.6
tree 14.6
seat 12
outside 12
clothing 11.7
city 11.6
trees 10.7
autumn 10.5
forest 10.4
bench 10.4
portrait 10.3
men 10.3
love 10.3
lifestyle 10.1
building 9.8
barrow 9.5
work 9.5
grass 9.5
sitting 9.4
wheelchair 9.3
two 9.3
religion 9
snow 8.9
couple 8.7
vehicle 8.6
season 8.6
winter 8.5
traditional 8.3
seller 8.2
hat 8.1
world 8.1
farmer 7.8
architecture 7.8
wheeled vehicle 7.8
handcart 7.8
cold 7.7
summer 7.7
field 7.5
parent 7.5
house 7.5
lady 7.3
life 7.3
protection 7.3
danger 7.3
mother 7.2
worker 7.2
suit 7.2
looking 7.2
smile 7.1
family 7.1
to 7.1
working 7.1
spring 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

tree 100
outdoor 99.8
ground 95.5
person 93.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 50.2%
Calm 53.7%
Angry 45.3%
Disgusted 45.2%
Sad 45.3%
Surprised 45.3%
Happy 45.2%
Confused 45.1%

AWS Rekognition

Age 12-22
Gender Male, 52.9%
Surprised 45.1%
Disgusted 45%
Confused 45.1%
Sad 51.8%
Angry 45.3%
Happy 45%
Calm 47.6%

Feature analysis

Amazon

Person 98.9%
Helmet 62.3%

Captions