Human Generated Data

Title

Untitled (studio portrait of candlepin bowling team)

Date

c. 1905-1915

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3931

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 98
Person 98
Apparel 97.1
Clothing 97.1
Person 91.6
Person 88.9
Person 87.7
Person 84.4
Coat 81.7
Scientist 78
Person 75.8
Lab Coat 71.7
People 67.8
Accessory 66.8
Tie 66.8
Accessories 66.8
Tie 64.8

Clarifai
created on 2019-06-01

people 99.9
adult 98.4
wear 98.2
group 96.3
scientist 95.6
man 95.6
medical practitioner 88.8
uniform 86.6
administration 85.1
one 84.8
medicine 84.3
coat 83.1
military 81.6
portrait 81.4
outerwear 81
war 79.7
several 75.7
group together 75.3
retro 73.9
science 72.6

Imagga
created on 2019-06-01

lab coat 91.5
coat 71.7
garment 37.4
clothing 26
picket fence 19.9
people 19
fence 16.5
adult 15.6
old 15.3
building 14.3
city 13.3
person 13.2
black 12.6
art 12.6
business 12.1
male 12
men 12
barrier 11.8
happiness 11.7
groom 11.7
man 11.4
day 11
architecture 10.9
nurse 10.8
religion 10.7
light 10.7
urban 10.5
couple 10.4
wall 10.3
grunge 10.2
color 10
face 9.9
vintage 9.9
attractive 9.8
businessman 9.7
sculpture 9.7
scene 9.5
water 9.3
portrait 9.1
dirty 9
history 8.9
fountain 8.8
happy 8.8
women 8.7
love 8.7
antique 8.6
ancient 8.6
historical 8.5
texture 8.3
structure 8.2
retro 8.2
group 8.1
wet 8
obstruction 8
smile 7.8
stone 7.6
window 7.6
instrument 7.5
monument 7.5
historic 7.3
businesswoman 7.3
aged 7.2
suit 7.2
indoors 7

Google
created on 2019-06-01

Photograph 96.7
Snapshot 82.9
Team 59.4
Uniform 52.8
Gentleman 51.3

Microsoft
created on 2019-06-01

wall 99.6
text 97.1
sketch 95.2
drawing 94.8
person 83.6
black 77
man 76.9
black and white 71.5
white 65.2
clothing 52.8

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 77.4%
Angry 2.8%
Happy 4.8%
Sad 9.7%
Disgusted 1%
Confused 1.8%
Calm 76.5%
Surprised 3.5%

AWS Rekognition

Age 19-36
Gender Female, 60.4%
Disgusted 3.7%
Calm 71%
Angry 2.1%
Sad 11.7%
Surprised 3.5%
Happy 6.5%
Confused 1.6%

AWS Rekognition

Age 16-27
Gender Female, 65.5%
Disgusted 1.2%
Happy 0.5%
Surprised 0.9%
Sad 1.1%
Angry 0.8%
Confused 0.4%
Calm 95%

AWS Rekognition

Age 17-27
Gender Male, 53.6%
Confused 45.1%
Happy 45.2%
Surprised 45.3%
Angry 45.1%
Disgusted 45.1%
Calm 53.9%
Sad 45.4%

AWS Rekognition

Age 35-52
Gender Female, 52.3%
Happy 45.6%
Calm 50.1%
Angry 45.7%
Confused 45.1%
Surprised 45.1%
Sad 48.3%
Disgusted 45%

Feature analysis

Amazon

Person 98%
Tie 66.8%

Categories

Imagga

interior objects 99.3%

Text analysis

Amazon

L