Human Generated Data

Title

Untitled (two photographs: man in easy chair by fireplace, reading newspaper; teenager in performance outfit perched on table)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6112

Machine Generated Data

Tags (values are model confidence scores, 0-100)

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Footwear 97.4
Apparel 97.4
Shoe 97.4
Clothing 97.4
Advertisement 84.4
Poster 84.4
Collage 82.8
Shoe 78.1
Leisure Activities 59.6
Sleeve 58.5
Brick 58.3
Musical Instrument 55.7
Musician 55.7
Coat 55.7
Overcoat 55.7
Shorts 55.4
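
These labels match the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such tags could be reproduced with boto3 follows; the local file path is a placeholder, and nothing in the record indicates how the image was actually submitted.

# Sketch: Amazon-style label tags via AWS Rekognition (boto3).
# The image file path is a placeholder, not part of this record.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder image
    image_bytes = f.read()

# MinConfidence=55 roughly matches the lowest score shown above (55.4).
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')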

Clarifai
created on 2019-11-16

people 99.6
man 97.4
adult 96.4
street 95.5
one 94.6
monochrome 94.2
woman 92.3
gun 88
movie 87.3
wear 87.1
military 86.1
war 85.7
indoors 85.6
two 84.9
group 84.4
uniform 83.4
offense 83
music 82.3
boy 81
vehicle 80.7
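
Clarifai's raw API reports concept values on a 0-1 scale, so the 0-100 figures above imply scaling. A sketch against Clarifai's v2 REST predict endpoint, assuming the public general model; the API key and model ID are placeholders.

# Sketch: concept tags from Clarifai's v2 predict endpoint.
# API key and model ID are placeholders; values are scaled to 0-100.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model

with open("photo.jpg", "rb") as f:  # placeholder image
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')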

Imagga
created on 2019-11-16

background 28.3
black 25.5
musical instrument 24.9
newspaper 21.5
screen 21.4
accordion 20.2
man 19.5
product 18.2
keyboard instrument 16.4
display 15.9
male 15.6
wind instrument 14.8
creation 14.8
person 14.2
model 14
people 13.9
adult 13.7
silhouette 13.2
style 12.6
dark 12.5
fashion 12.1
portrait 11.6
electronic device 11.3
art 11.3
sexy 11.2
body 11.2
grunge 11.1
one 9.7
sport 9.2
sunset 9
device 8.8
clothing 8.4
studio 8.4
sensuality 8.2
dirty 8.1
water 8
posing 8
hair 7.9
face 7.8
play 7.8
attractive 7.7
window 7.6
guitar 7.5
music 7.2
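
Imagga exposes auto-tagging through its /v2/tags endpoint with HTTP basic auth. A sketch under that assumption, with placeholder credentials and image URL:

# Sketch: Imagga /v2/tags endpoint with HTTP basic auth.
# Credentials and the image URL are placeholders.
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholders
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder
    auth=auth,
)
resp.raise_for_status()

for t in resp.json()["result"]["tags"]:
    print(f'{t["tag"]["en"]} {t["confidence"]:.1f}')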

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.3
clothing 98.4
person 97.8
black and white 92.3
street 88.3
man 86.2
footwear 79
monochrome 67.3
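
These tags plausibly come from the Azure Computer Vision analyze endpoint (v2.0 was current at the 2019 creation date). A sketch with placeholder endpoint and key; the API returns confidences in 0-1, scaled here to match the 0-100 values above.

# Sketch: Azure Computer Vision v2.0 "analyze" call requesting Tags.
# Endpoint region and subscription key are placeholders.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
KEY = "YOUR_SUBSCRIPTION_KEY"                            # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')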

Face analysis

AWS Rekognition

Age 9-19
Gender Male, 52.2%
Calm 54.6%
Disgusted 45%
Surprised 45%
Fear 45%
Happy 45%
Angry 45%
Confused 45%
Sad 45.3%

AWS Rekognition

Age 22-34
Gender Female, 53.8%
Confused 45.1%
Surprised 45.1%
Happy 45%
Calm 47.9%
Disgusted 45%
Sad 50.7%
Angry 45.1%
Fear 46.2%
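
The two records above match the shape of Rekognition's DetectFaces output: an age range, a gender with confidence, and one confidence per emotion. A boto3 sketch with a placeholder image file:

# Sketch: AWS Rekognition face attributes via boto3.
# Attributes=["ALL"] returns AgeRange, Gender, and per-emotion
# confidences like the two face records above.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')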

Microsoft Cognitive Services

Age 27
Gender Female
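
An age and gender pair like this matches the 2019-era Azure Face API detect call; Microsoft has since retired these attributes, so the sketch below is historical, with placeholder endpoint and key.

# Sketch (historical): Azure Face API v1.0 detect with age/gender
# attributes, which were available in 2019 but are now retired.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_FACE_API_KEY"                                # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')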

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
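
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the values above. A sketch using the google-cloud-vision client and a placeholder file path:

# Sketch: Google Cloud Vision face detection (google-cloud-vision >= 2.0).
# Likelihoods print as raw enum names such as VERY_UNLIKELY.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)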

Feature analysis

Amazon

Person 99.5%
Shoe 97.4%
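
These feature-analysis entries appear to correspond to Rekognition labels that carry object Instances (bounding boxes); Person and Shoe are among the label types that return them. A sketch reusing detect_labels with a placeholder image:

# Sketch: object instances from the same DetectLabels response used
# for the tags above; only labels with Instances are printed.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder image
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'box=({box["Left"]:.2f}, {box["Top"]:.2f}, '
              f'{box["Width"]:.2f}, {box["Height"]:.2f})')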