Human Generated Data

Title

Untitled (archbishop walking through two lines of men and women on lawn)

Date

1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11017

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.6
Person 99.6
Person 99.3
Apparel 99.2
Clothing 99.2
Person 99
Person 98.8
Person 98.6
Person 98.6
Person 98.3
Person 97.7
Person 97.3
Person 97.1
Person 96.6
Person 94.2
Person 92.8
Urban 92.4
Dress 90.1
Face 87.1
Female 86.9
Building 86.9
Person 86.6
Person 79.6
Person 77.4
Outdoors 75.6
People 73.3
Child 72.2
Kid 72.2
Person 71.5
Girl 70.2
Shelter 67.3
Nature 67.3
Countryside 67.3
Rural 67.3
Hat 63.1
Town 62.9
City 62.9
Woman 62
Portrait 61.4
Photography 61.4
Photo 61.4
Crowd 58.8
Smile 58.4
Street 57.9
Road 57.9
Shorts 55.9
Person 44.4
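
The Amazon tags above read like the output of AWS Rekognition's label detection: one confidence score per label, plus one row per detected instance of countable labels such as Person. A minimal sketch of how such tags could be produced with boto3, assuming configured AWS credentials and a hypothetical local file photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))
        # Repeated "Person" rows like those above come from per-instance detections.
        for instance in label.get("Instances", []):
            print(label["Name"], round(instance["Confidence"], 1))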

Clarifai
created on 2019-03-25

people 100
group 99.9
many 99.7
group together 99.5
child 98.4
adult 96.8
several 96
woman 94.5
wear 94.2
administration 94.1
man 93.4
recreation 93.2
crowd 89.7
outfit 87.8
boy 87.7
vehicle 86.6
five 83.1
spectator 82.6
leader 81.3
war 80.7
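
The Clarifai concepts above could come from the v2 prediction endpoint. A hedged sketch using the plain REST API; the API key, image URL, and model identifier are all placeholders:

    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder key
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    # Concept values are 0-1; the list above shows them scaled to percentages.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))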

Imagga
created on 2019-03-25

percussion instrument 98
musical instrument 90.8
drum 87.7
steel drum 34
people 16.7
traditional 15.8
old 15.3
wind instrument 13.4
architecture 13.3
accordion 13.2
adult 12.4
dress 11.7
man 11.4
celebration 11.2
art 11.1
culture 11.1
holiday 10.7
costume 10.6
keyboard instrument 10.5
religion 9.8
couple 9.6
building 9.5
party 9.4
decoration 9.4
city 9.1
new 8.9
color 8.9
women 8.7
men 8.6
statue 8.5
clothing 8.5
garden 8.4
summer 8.4
human 8.2
style 8.1
history 8
water 8
day 7.8
male 7.8
scene 7.8
portrait 7.8
kin 7.7
outside 7.7
festival 7.6
two 7.6
hand 7.6
head 7.5
religious 7.5
hat 7.5
monument 7.5
tradition 7.4
music 7.2
romantic 7.1
face 7.1
travel 7
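
The Imagga tags above match the shape of Imagga's /v2/tags endpoint, which authenticates with an API key and secret over HTTP Basic auth. A minimal sketch; the credentials and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
    )
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))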

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

outdoor 85.1
old 41.2
clothes 35.9
person 35.9
black and white 35
child 20.2
street 13.1
man 9.4
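
The Microsoft tags above are the kind returned by Azure's Computer Vision analyze call. A minimal sketch against the REST API; the endpoint region, key, image URL, and API version are assumptions:

    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com"  # assumed region
    resp = requests.post(
        endpoint + "/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
        json={"url": "https://example.com/photo.jpg"},  # placeholder URL
    )
    # Confidences are 0-1; the list above shows them scaled to percentages.
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))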

Face analysis

Amazon

AWS Rekognition

Age 57-77
Gender Male, 54.7%
Calm 49%
Confused 47.5%
Sad 47.2%
Angry 45.5%
Surprised 45.4%
Disgusted 45.4%
Happy 45.1%

AWS Rekognition

Age 45-66
Gender Male, 50.3%
Sad 49.8%
Happy 49.8%
Calm 49.7%
Angry 49.6%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 29-45
Gender Male, 50.6%
Angry 48.6%
Happy 47.5%
Sad 46%
Surprised 45.9%
Calm 45.9%
Confused 45.7%
Disgusted 45.5%

AWS Rekognition

Age 27-44
Gender Male, 53.6%
Calm 49.5%
Sad 47.8%
Angry 46.1%
Confused 45.7%
Happy 45.4%
Surprised 45.4%
Disgusted 45.2%

AWS Rekognition

Age 23-38
Gender Male, 52.7%
Happy 47.5%
Calm 47%
Sad 46.6%
Surprised 46.1%
Disgusted 46%
Confused 45.9%
Angry 45.9%
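
Each block above (age range, gender with confidence, seven emotion scores) matches the shape of Rekognition's detect_faces response when all facial attributes are requested. A minimal sketch, with the file name again hypothetical:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print("Age", f'{age["Low"]}-{age["High"]}')
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        for emotion in face["Emotions"]:
            print(emotion["Type"].capitalize(), round(emotion["Confidence"], 1))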

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

KODVK--2EL--IW
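
The string above is kept verbatim as returned by the service; it appears to be a noisy read of a film edge marking. Rekognition's text detection, which produces this kind of output, looks roughly like the following sketch (file name hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections are whole strings; WORD detections are their parts.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])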