Human Generated Data

Title

Untitled (clergymen walking down town street outside church)

Date

1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11094
Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99
Person 99
Person 98.9
Person 98.9
Person 98.9
Person 98.8
Person 97.8
Person 97.2
Person 96.2
Transportation 95.1
Automobile 95.1
Vehicle 95.1
Car 95.1
Road 93.8
Apparel 91.2
Clothing 91.2
Nature 87.7
Outdoors 86.9
Person 77.4
Pedestrian 77
Urban 75.2
Person 74.7
Field 72.8
People 70.8
Countryside 70.7
Building 61.5
Tarmac 59.7
Asphalt 59.7
Gravel 58.6
Dirt Road 58.6
Coat 58.5
Vegetation 57.7
Plant 57.7
Adventure 57.6
Leisure Activities 57.6
Shorts 57.4
Grassland 57.1

Clarifai
created on 2019-03-25

people 99.9
group together 99.5
group 98.8
adult 98.4
man 96.1
war 96
many 95.7
woman 94.2
child 93.7
two 93.1
military 92.6
administration 91.7
vehicle 91.7
three 91.5
transportation system 89.3
one 87
wear 84.9
several 84.7
four 84.1
boy 78

Imagga
created on 2019-03-25

pen 27.7
outdoor 26
man 24.2
enclosure 23.3
outdoors 22.5
plow 22.3
grass 20.6
tool 19.9
people 19
person 18.7
male 18.4
field 18.4
summer 18
landscape 17.1
sky 16.6
sport 16.3
countryside 15.5
hiking 15.4
park 14.8
farm 14.3
outside 13.7
backpack 13.7
active 13.7
activity 13.4
structure 13.2
boy 13
walking 12.3
sun 12.1
old 11.8
rural 11.5
autumn 11.4
forest 11.3
travel 11.3
stretcher 10.8
leisure 10.8
mountain 10.7
farmer 10.6
agriculture 10.5
farming 10.4
hill 10.3
land 10.3
day 10.2
dairy 10.2
lifestyle 10.1
natural 10
environment 9.9
recreation 9.9
trees 9.8
fall 9.1
vacation 9
child 8.9
country 8.8
hike 8.8
spring 8.6
litter 8.6
happiness 8.6
sunny 8.6
men 8.6
adventure 8.5
adult 8.4
cowboy 8.3
stone 8.2
kid 8
tree 7.8
track 7.8
tourist 7.6
walk 7.6
path 7.6
happy 7.5
mountains 7.4
morning 7.2
road 7.2
conveyance 7.2
meadow 7.2
work 7.1
plant 7.1

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

outdoor 99.1
black 71.7
white 70.5
black and white 70.5
monochrome 23.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Female, 51.9%
Sad 47.3%
Calm 47%
Happy 47.4%
Surprised 45.9%
Angry 45.8%
Disgusted 46.2%
Confused 45.5%

AWS Rekognition

Age 60-80
Gender Female, 54.3%
Happy 49%
Angry 45.5%
Sad 48.1%
Confused 45.2%
Disgusted 45.3%
Calm 46.6%
Surprised 45.4%

AWS Rekognition

Age 19-36
Gender Female, 50.8%
Disgusted 45.1%
Confused 45.3%
Surprised 45.7%
Angry 49.1%
Happy 45.1%
Sad 48.3%
Calm 46.2%

AWS Rekognition

Age 35-52
Gender Male, 53.3%
Disgusted 45%
Angry 45.7%
Sad 49.2%
Happy 45.1%
Calm 49.6%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 14-25
Gender Male, 50.5%
Calm 46.7%
Sad 46.6%
Disgusted 46.8%
Happy 47.2%
Confused 45.4%
Angry 46.8%
Surprised 45.6%

AWS Rekognition

Age 26-43
Gender Female, 50%
Disgusted 45.3%
Surprised 45.7%
Calm 50.6%
Happy 45.3%
Sad 46.4%
Confused 46.1%
Angry 45.7%

Feature analysis

Amazon

Person 99%
Car 95.1%

Text analysis

Amazon

38
KODK--2OEL--IRW