Human Generated Data

Title

Untitled (two photographs: vignetted studio portrait of girl with hands clasped under chin; child dressed as angel holding flowers next to table)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5981

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 96.7
Finger 71
Face 67
Sleeve 65.3
Clothing 65.3
Apparel 65.3
Portrait 58.5
Photo 58.5
Photography 58.5
Priest 56
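
The labels above are the kind of output the Amazon Rekognition DetectLabels API returns for a scanned print. A minimal boto3 sketch follows; the region and file name are assumptions, not details taken from this record.

import boto3

# Assumed region and local file name for the scanned print.
client = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_print.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels reports label names with confidence scores on a 0-100 scale,
# comparable to the values listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")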

Clarifai
created on 2019-11-16

people 99.9
monochrome 98
woman 96.7
movie 96.7
portrait 96.3
adult 95.7
music 94.1
man 93.2
theater 92.7
opera 92.4
two 92
wear 91.8
child 91.4
street 91
actress 90.2
stage 89.5
musician 89.3
room 88.3
wedding 85.8
singer 85.7
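
Clarifai concept lists like the one above can be reproduced through its v2 prediction endpoint. The sketch below is an outline only: the API key, model id, and file name are placeholders, and the exact model used for this record is not stated.

import base64
import requests

API_KEY = "<clarifai-api-key>"          # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed general model id

with open("durette_studio_print.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Concept values come back in [0, 1]; scaled by 100 they line up with the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")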

Imagga
created on 2019-11-16

groom 31.3
person 21.8
world 21.7
man 21.5
people 20.6
male 16.3
adult 14.3
portrait 14.2
musical instrument 14.2
love 14.2
dark 14.2
old 13.2
couple 13.1
body 12.8
black 12
silhouette 11.6
sexy 11.2
fashion 10.6
wind instrument 10.5
model 10.1
attractive 9.8
hair 9.5
accordion 9.4
sensual 9.1
sensuality 9.1
human 9
one 9
history 8.9
romantic 8.9
style 8.9
room 8.8
home 8.8
happy 8.8
happiness 8.6
architecture 8.6
men 8.6
art 8.5
barbershop 8.3
vintage 8.3
historic 8.3
lady 8.1
family 8
posing 8
mask 7.9
holiday 7.9
bride 7.8
scene 7.8
keyboard instrument 7.7
two 7.6
studio 7.6
rain 7.5
passion 7.5
light 7.5
chair 7.5
kin 7.4
dress 7.2
romance 7.1
interior 7.1
travel 7
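
The Imagga tags above follow the shape of its v2 tagging endpoint, which returns a confidence (0-100) per tag. A minimal sketch, assuming a hosted copy of the image and placeholder credentials:

import requests

API_KEY = "<imagga-api-key>"        # placeholder
API_SECRET = "<imagga-api-secret>"  # placeholder
IMAGE_URL = "https://example.org/durette_studio_print.jpg"  # assumed hosted scan

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth with key/secret
)
response.raise_for_status()

# Each entry pairs a confidence score with a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")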

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 96.1
clothing 95.8
human face 93.1
black and white 92.4
indoor 91.7
text 58.4
portrait 52.8
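
Tags of this shape can be produced with the Azure Computer Vision Analyze Image REST call. A minimal sketch follows; the endpoint host, key, and file name are placeholders, and the service reports confidence in [0, 1], so the values above correspond to confidence times 100.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder resource
KEY = "<subscription-key>"                                        # placeholder credential

with open("durette_studio_print.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Scale the 0-1 confidences to match the percentage-style values listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")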

Color Analysis

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 92.8%
Fear 0.6%
Confused 6%
Happy 6.3%
Angry 2.2%
Disgusted 2.4%
Calm 72.6%
Surprised 2.3%
Sad 7.6%

AWS Rekognition

Age 2-8
Gender Female, 51.8%
Fear 45.1%
Angry 45.3%
Calm 46.7%
Happy 45%
Sad 52.8%
Disgusted 45%
Confused 45.1%
Surprised 45%
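
The two readings above (an age range, a gender call, and an emotion distribution for each detected face) match the shape of the Rekognition DetectFaces response when all attributes are requested. A minimal boto3 sketch, with the region and file name assumed:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("durette_studio_print.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, and emotions for every face found.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")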

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
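
Google Vision reports per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores, one block per detected face. A minimal sketch with the google-cloud-vision client (2.x interface assumed; the file name is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # credentials are read from the environment

with open("durette_studio_print.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets (VERY_UNLIKELY .. VERY_LIKELY), matching the wording above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)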

Feature analysis

Amazon

Person 99.4%

Categories