Human Generated Data

Title

Untitled (group of men on porch holding cigars and wearing decorative ribbons on lapel)

Date

c. 1907

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3858

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 99.4
Clothing 99.1
Apparel 99.1
Person 98.9
Helmet 91.6
Person 87.1
Footwear 83.9
Shoe 83.9
Person 82.8
Overcoat 77.1
Coat 77.1
Helmet 74.8
Shoe 73.3
Stage 73.1
Suit 66.4
Hat 64.7
Shoe 59.8
Meal 58.1
Food 58.1
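
These label/confidence pairs match the output shape of AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated with boto3; the file name, region, and confidence threshold are assumptions, not part of this record:

```python
# Sketch: reproduce Rekognition-style labels for a local image scan.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("durette_porch_group.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed threshold; the list above bottoms out near 58
)

# Each label carries a name and a 0-100 confidence score, matching
# rows like "Person 99.5" and "Helmet 91.6" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```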

Clarifai
created on 2019-06-01

people 99.7
group together 99.1
many 98.6
wear 98.3
adult 97.5
group 97.4
outfit 95.1
several 94.7
man 94.3
woman 92.8
music 88.4
administration 85.6
five 82.9
veil 77.7
leader 76.2
uniform 73.2
recreation 71.8
indoors 70.7
facial expression 69.4
education 68.8
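
These concepts resemble output from Clarifai's v2 general model. A hedged sketch of the REST call; the API key and image URL are placeholders, and the model ID shown is believed to be Clarifai's public general model:

```python
# Sketch: query the Clarifai v2 predict endpoint for concept scores.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                    # placeholder
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scaled by 100 they match
# percentages like "people 99.7" and "group together 99.1" above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```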

Imagga
created on 2019-06-01

military uniform 48.6
uniform 41.1
kin 35.9
clothing 29.5
man 21.5
people 20.6
covering 20.6
consumer goods 19.6
city 16.6
person 15.8
male 15.6
nurse 13.2
black 12.6
photographer 12.1
urban 11.4
adult 11
old 10.4
world 10.4
historic 10.1
family 9.8
portrait 9.7
scene 9.5
commodity 9.5
men 9.4
youth 9.4
mask 9.2
travel 9.1
gun 8.9
architecture 8.6
business 8.5
monument 8.4
dark 8.3
silhouette 8.3
vintage 8.3
life 8.2
dirty 8.1
history 8
women 7.9
military 7.7
war 7.7
statue 7.6
fashion 7.5
happy 7.5
style 7.4
street 7.4
protection 7.3
dress 7.2
transportation 7.2
religion 7.2
posing 7.1
happiness 7
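
Tags of this form come from Imagga's v2 tagging endpoint. A minimal sketch, assuming placeholder credentials and image URL:

```python
# Sketch: fetch Imagga tags for an image by URL.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)
response.raise_for_status()

# Imagga reports a 0-100 confidence with an English tag name,
# e.g. "military uniform 48.6" as in the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```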

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 97.6
person 96
window 95.2
man 90
footwear 71.7
black and white 61.5
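
The Microsoft tags correspond to the Azure Computer Vision tag operation (v2.0 was current in 2019). A sketch with placeholder endpoint and key:

```python
# Sketch: call the Azure Computer Vision v2.0 tag endpoint.
import requests

AZURE_ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                            # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v2.0/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/photo.jpg"},
)
response.raise_for_status()

# Confidences are 0-1; scaled by 100 they match "clothing 97.6" etc.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```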

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.1%
Angry 45.7%
Surprised 45.5%
Calm 49%
Sad 46.6%
Confused 45.3%
Disgusted 45.3%
Happy 47.6%

AWS Rekognition

Age 23-38
Gender Female, 53%
Disgusted 45.3%
Sad 46.2%
Confused 45.2%
Happy 45.8%
Angry 45.4%
Surprised 45.5%
Calm 51.7%

AWS Rekognition

Age 35-52
Gender Female, 51%
Happy 46.5%
Calm 45.8%
Angry 45.8%
Surprised 45.6%
Disgusted 46.7%
Confused 45.6%
Sad 49%

AWS Rekognition

Age 35-52
Gender Female, 54.4%
Disgusted 48.4%
Sad 46.6%
Calm 47.5%
Confused 45.3%
Surprised 45.4%
Happy 46.1%
Angry 45.8%

AWS Rekognition

Age 35-52
Gender Female, 52.9%
Happy 45.8%
Disgusted 47.3%
Angry 46.5%
Surprised 45.6%
Sad 46.3%
Calm 48.2%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Sad 45.7%
Angry 45.2%
Disgusted 45.4%
Surprised 45.2%
Happy 45.2%
Calm 53.2%
Confused 45.1%

AWS Rekognition

Age 35-52
Gender Male, 51.1%
Angry 46.4%
Surprised 45.4%
Disgusted 45.9%
Sad 48.6%
Calm 48%
Happy 45.4%
Confused 45.3%

AWS Rekognition

Age 20-38
Gender Male, 53.9%
Disgusted 46.3%
Calm 48.9%
Confused 45.3%
Sad 46.4%
Surprised 45.5%
Angry 45.8%
Happy 46.8%
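
The per-face age, gender, and emotion estimates above match the output of Rekognition's DetectFaces operation with full attributes enabled. A minimal sketch; the file name and region are assumptions:

```python
# Sketch: reproduce per-face attribute estimates with Rekognition.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("durette_porch_group.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

# One FaceDetail per detected face, mirroring the eight blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```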

Feature analysis

Amazon

Person 99.5%
Helmet 91.6%
Shoe 83.9%

Categories