Human Generated Data

Title

Untitled (two photographs: vignetted studio portrait of woman; studio portrait of four young girls, seated and standing)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5998

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 99.2
Clothing 99.2
Apparel 99.2
Person 99.1
Person 99.1
Person 98.7
Shorts 94.8
Face 89.1
Sleeve 84
Performer 76.6
Female 75.3
People 66.8
Advertisement 65.3
Collage 63.3
Poster 63.3
Long Sleeve 63.2
Portrait 61.7
Photo 61.7
Photography 61.7
Woman 61.4
Head 56.3
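
The Amazon tag list above pairs each detected label with a confidence score, which is the shape of output returned by AWS Rekognition's label-detection call. Below is a minimal sketch using boto3, assuming configured AWS credentials; the file name, region, and thresholds are placeholders rather than the museum's actual pipeline settings.

```python
import boto3

# Assumes AWS credentials are configured in the environment.
# "photograph.jpg" is a placeholder file name, not the museum's asset.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # roughly matches the number of tags shown above
    MinConfidence=50.0,  # drop low-confidence labels
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.4", mirroring the tag/confidence pairs above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```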

Clarifai
created on 2019-11-16

people 100
woman 99.4
group 99.4
adult 99.1
man 97.7
group together 95.5
music 94.3
child 93.1
three 91.1
two 90.3
wear 89.3
family 89.3
four 89.2
room 88.7
actress 87.8
indoors 87.4
musician 87.4
several 85.8
monochrome 85.5
actor 85.4
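
The Clarifai concepts above (name plus confidence) correspond to a predict call against a general image-recognition model. The sketch below is a hedged illustration against Clarifai's v2 REST API; the API key, model identifier, and file name are placeholders, since the exact model used for this record is not documented here.

```python
import base64
import requests

# Placeholders: API key, model id, and file name are assumptions for illustration.
API_KEY = "CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"

with open("photograph.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Prints e.g. "people 100.0", matching the concept/confidence list above
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```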

Imagga
created on 2019-11-16

kin 29.8
man 22.9
world 21.4
people 19
black 18.7
person 18.3
adult 17.5
male 17
portrait 16.8
sexy 14.5
bride 13.7
couple 13.1
silhouette 12.4
room 12.1
love 11.8
happy 11.3
attractive 11.2
home 11.2
youth 11.1
fashion 10.6
hair 10.3
happiness 10.2
face 9.9
businessman 9.7
groom 9.7
dark 9.2
business 9.1
pretty 9.1
dress 9
window 8.9
lady 8.9
brunette 8.7
light 8.7
barbershop 8.6
model 8.6
elegance 8.4
future 8.4
wedding 8.3
vintage 8.3
human 8.2
style 8.2
symbol 8.1
family 8
posing 8
women 7.9
smile 7.8
sensuality 7.3
art 7.2
body 7.2
negative 7.2
night 7.1
blackboard 7.1
shop 7.1
mother 7
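
The Imagga keywords above come from an auto-tagging endpoint that scores each tag. A hedged sketch of Imagga's v2 tagging API with HTTP basic authentication follows; the key, secret, and image URL are placeholders.

```python
import requests

# Placeholders: credentials and image URL are assumptions for illustration.
API_KEY = "IMAGGA_API_KEY"
API_SECRET = "IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photograph.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Prints e.g. "kin 29.8", matching the tag/confidence pairs above
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```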

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

human face 98.8
person 98.4
clothing 97.9
text 93.8
smile 79.6
woman 76
black and white 70.4
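
The Microsoft tags above resemble the "Tags" feature of Azure Computer Vision's analyze endpoint. The REST sketch below is an assumption about how such tags could be requested; the endpoint, key, image URL, and API version path are all placeholders.

```python
import requests

# Placeholders: endpoint, key, image URL, and API version are assumptions.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "AZURE_CV_KEY"
IMAGE_URL = "https://example.org/photograph.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Prints e.g. "person 98.4", matching the tag/confidence list above
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```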

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 18-30
Gender Female, 97.8%
Calm 99.8%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%
Sad 0.1%
Confused 0%
Surprised 0%

AWS Rekognition

Age 5-15
Gender Female, 53%
Surprised 45%
Confused 45%
Calm 54.2%
Happy 45%
Sad 45.7%
Angry 45.1%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 12-22
Gender Female, 55%
Happy 45%
Angry 45%
Disgusted 45%
Calm 54.9%
Fear 45%
Surprised 45%
Confused 45%
Sad 45.1%

AWS Rekognition

Age 13-25
Gender Female, 51.9%
Surprised 45%
Confused 45.1%
Calm 52%
Happy 45%
Sad 47.7%
Angry 45.1%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 13-23
Gender Female, 54.4%
Angry 50.2%
Surprised 45.1%
Disgusted 45.4%
Confused 45.3%
Fear 45.2%
Calm 46.9%
Happy 45%
Sad 47%
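
Each AWS Rekognition block above reports an age range, a gender estimate, and per-emotion confidences for one detected face. A minimal boto3 sketch of the face-detection call that returns those attributes follows, assuming configured AWS credentials; the file name is a placeholder.

```python
import boto3

# Assumes AWS credentials are configured; "photograph.jpg" is a placeholder.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Prints e.g. "Calm 99.8%", matching the per-face blocks above
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```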

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 9
Gender Female
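
The Microsoft Cognitive Services entries above give a single age and gender estimate per face, in the style of the 2019-era Azure Face API. The REST sketch below is an assumption for illustration only; the endpoint, key, and image URL are placeholders, and these facial attributes have since been restricted by Microsoft.

```python
import requests

# Placeholders: endpoint, key, and image URL are assumptions for illustration.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "AZURE_FACE_KEY"
IMAGE_URL = "https://example.org/photograph.jpg"

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    # Prints e.g. "Age 30" and "Gender Female", matching the blocks above
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```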

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
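
The Google Vision blocks above report per-face likelihoods (Very unlikely, Unlikely, and so on) rather than numeric confidences. A sketch using the google-cloud-vision Python client (v2+) follows, assuming configured Google Cloud credentials; the file name is a placeholder.

```python
from google.cloud import vision

# Assumes Google Cloud credentials are configured; the file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    likelihoods = {
        "Surprise": face.surprise_likelihood,
        "Anger": face.anger_likelihood,
        "Sorrow": face.sorrow_likelihood,
        "Joy": face.joy_likelihood,
        "Headwear": face.headwear_likelihood,
        "Blurred": face.blurred_likelihood,
    }
    for attribute, value in likelihoods.items():
        # Prints e.g. "Surprise Very Unlikely", matching the per-face rows above
        print(attribute, vision.Likelihood(value).name.replace("_", " ").title())
```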

Feature analysis

Amazon

Person 99.4%

Categories