Human Generated Data

Title

Untitled (studio portrait of three young women seated and standing)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6012

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99
Person 99
Person 98.9
Person 98.3
Clothing 95.1
Apparel 95.1
Person 86.1
Sleeve 83.3
Sitting 72.3
Chair 69.9
Furniture 69.9
Head 67
Female 65.3
Portrait 65
Photography 65
Photo 65
Face 65
Evening Dress 64.8
Fashion 64.8
Gown 64.8
Robe 64.8
Advertisement 64
Art 63.6
Poster 59.8
Collage 59.8
Mannequin 59
Long Sleeve 57
Coat 55.2

Clarifai
created on 2019-11-16

people 100
group 99.2
adult 98.8
woman 98.5
wear 97.3
man 95.9
portrait 95.6
actress 91.6
child 91.6
facial expression 91.5
movie 91.4
music 89.5
outfit 88.9
indoors 88.4
three 84.3
room 84.1
education 83.8
musician 83.3
two 83.1
four 78.5

Imagga
created on 2019-11-16

man 24.8
people 24
person 21.9
kin 21.4
male 20.6
adult 18.6
television 18.1
world 17.6
black 17.1
portrait 16.8
businessman 16.8
love 16.6
business 16.4
couple 15.7
office 15.3
window 14.5
happy 14.4
bride 13.6
dress 13.5
child 11.8
happiness 11.7
room 11.6
men 11.2
groom 10.9
smile 10.7
expression 10.2
telecommunication system 10.2
lady 9.7
sexy 9.6
boy 9.6
smiling 9.4
executive 9.4
wedding 9.2
clothing 9.1
vintage 9.1
attractive 9.1
human 9
group 8.9
brunette 8.7
light 8.7
lifestyle 8.7
chair 8.6
suit 8.5
face 8.5
career 8.5
pretty 8.4
family 8
looking 8
youth 7.7
professional 7.7
one 7.5
silhouette 7.4
holding 7.4
indoor 7.3
body 7.2
romance 7.1
hair 7.1
posing 7.1
job 7.1
monitor 7.1
together 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 99.2
person 98.6
clothing 96.5
human face 93.3
text 92
woman 89.2
indoor 87.2
smile 84
dress 80.6
gallery 75
posing 54.3
blackboard 50.4

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 12-22
Gender Male, 52.8%
Calm 53.3%
Sad 45.1%
Fear 45.1%
Confused 45.1%
Angry 45.2%
Happy 45.2%
Disgusted 45.1%
Surprised 45.8%

AWS Rekognition

Age 18-30
Gender Female, 54.3%
Sad 45%
Surprised 45.1%
Confused 45%
Angry 45%
Calm 54.8%
Fear 45%
Happy 45%
Disgusted 45%

AWS Rekognition

Age 16-28
Gender Female, 52.8%
Angry 45.1%
Happy 45.1%
Fear 45%
Disgusted 45%
Sad 45%
Calm 54.7%
Surprised 45%
Confused 45%

AWS Rekognition

Age 13-25
Gender Female, 96.9%
Disgusted 0.1%
Sad 1%
Confused 0.5%
Happy 0.4%
Fear 0.1%
Surprised 0.2%
Calm 97.3%
Angry 0.4%

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 69.9%