Human Generated Data

Title

Untitled (studio portrait of three elderly men and one elderly woman)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6014

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Person 99.2
Person 99.2
Person 97.8
Person 96.9
Person 96.3
Person 96.2
Furniture 93.8
Chair 93.4
Performer 90.8
Stage 86.4
Clothing 84
Apparel 84
Lighting 82.9
Sitting 73.9
Face 71.7
Overcoat 66.3
Coat 66.3
Leisure Activities 64.6
Suit 63.4
Photo 60.8
Photography 60.8
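
These label-and-confidence pairs match the output format of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be reproduced, assuming configured AWS credentials; the region, file name, and thresholds below are assumptions, not part of this record:

    import boto3

    # Hypothetical client; the region is an assumption.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("durette_studio_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,       # roughly the length of the list above
        MinConfidence=60,   # the lowest confidence shown above is 60.8
    )

    # Print "Label confidence" pairs in the same shape as the Tags list.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")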

Clarifai
created on 2019-11-16

people 99.8
group 98.4
music 95.2
movie 94.4
adult 94.3
woman 92.7
man 92.5
room 92.4
stage 92.2
group together 91.7
many 89.3
wear 87.5
musician 87.2
outfit 84.6
theater 82.1
indoors 82
portrait 80.6
leader 77.3
singer 76.9
child 75.4

Imagga
created on 2019-11-16

sax 66.8
wind instrument 35.4
musical instrument 29.5
man 24.9
male 21.3
person 20.7
people 19
black 18.7
business 17.6
silhouette 17.4
accordion 16.5
adult 16.3
men 16.3
brass 14.4
keyboard instrument 13.7
businessman 12.4
office 12.1
light 12
music 11.8
human 11.2
women 11.1
symbol 10.8
dark 10
hand 9.9
portrait 9.7
chair 9.7
group 9.7
body 9.6
window 9.4
grunge 9.4
cornet 9.4
modern 9.1
fashion 9
posing 8.9
job 8.8
working 8.8
hair 8.7
sitting 8.6
dance 8.5
suit 8.4
one 8.2
style 8.2
stage 8.1
metal 8
sexy 8
interior 8
professional 7.9
love 7.9
art 7.8
model 7.8
party 7.7
crowd 7.7
world 7.7
youth 7.7
performance 7.7
studio 7.6
relaxation 7.5
entertainment 7.4
indoor 7.3
dirty 7.2
lifestyle 7.2
clothing 7.2
night 7.1

Google
created on 2019-11-16

(no tags returned)

Microsoft
created on 2019-11-16

clothing 97.8
text 88.5
standing 86.9
black and white 86.6
person 86.5
man 85
concert 79.3
musical instrument 76.6
footwear 60.4
music 54.8
posing 41.9
store 37.5

Color Analysis

(no color data preserved in this record)

Face analysis

AWS Rekognition

Age 43-61
Gender Male, 54.8%
Sad 45.2%
Disgusted 45%
Confused 45.1%
Surprised 45.3%
Angry 45%
Calm 54.2%
Happy 45%
Fear 45.1%

AWS Rekognition

Age 13-23
Gender Male, 52.8%
Confused 45%
Calm 55%
Sad 45%
Surprised 45%
Happy 45%
Disgusted 45%
Fear 45%
Angry 45%

AWS Rekognition

Age 15-27
Gender Male, 54.1%
Confused 45%
Surprised 45%
Happy 45%
Calm 51.3%
Disgusted 45%
Sad 45%
Angry 48.6%
Fear 45%

AWS Rekognition

Age 38-56
Gender Male, 54.9%
Disgusted 45%
Happy 45%
Surprised 45%
Sad 45.3%
Fear 45%
Calm 54.5%
Angry 45.1%
Confused 45%

AWS Rekognition

Age 32-48
Gender Male, 54.9%
Happy 45.3%
Sad 49.1%
Disgusted 45.1%
Fear 45.6%
Calm 48%
Angry 46.5%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 44-62
Gender Male, 51.9%
Confused 45%
Happy 46.7%
Disgusted 45%
Calm 53.2%
Angry 45%
Sad 45.1%
Surprised 45%
Fear 45%
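
The six blocks above have the shape of AWS Rekognition's DetectFaces output: one block per detected face, each with an age range, a gender estimate, and per-emotion confidences. A minimal sketch of the call, under the same hypothetical file and region assumptions as the DetectLabels sketch above:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("durette_studio_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions per face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")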

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
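
The verbal ratings above correspond to the Likelihood enum returned by Google Cloud Vision's face detection endpoint. A minimal sketch, assuming application default credentials and the same hypothetical local file as the sketches above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("durette_studio_portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation carries likelihood enums such as VERY_UNLIKELY,
    # which render as "Very unlikely" in the listing above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)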

Feature analysis

Amazon

Person 99.2%
Chair 93.4%
Suit 63.4%

Categories

Imagga

interior objects 99.2%