Human Generated Data

Title

Untitled (two women)

Date

c. 1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.783

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 97.8
Human 97.8
Person 87.3
Art 79.9
Home Decor 77.7
Painting 77.1
Face 73.8
Clothing 71.9
Apparel 71.9
Head 70.2
Female 69.2
Portrait 65.4
Photography 65.4
Photo 65.4
Hair 63
Text 62.4

Clarifai
created on 2023-10-15

people 99.9
portrait 99.9
two 98.8
adult 98.4
woman 98.2
art 97.4
painting 97.4
facial expression 97.3
wear 96.3
affection 95.9
one 95.3
retro 94.9
girl 94.5
son 94.4
actress 93.2
vintage 93
offspring 92.9
man 92.6
family 92
sepia 91.5

Imagga
created on 2021-12-14

portrait 39.5
adult 33.6
fashion 30.9
hair 30.9
attractive 30.8
model 30.4
sexy 29.7
pretty 28.7
person 27.3
people 26.8
brunette 23.5
face 23.5
love 22.1
happy 21.9
child 20
skin 19.5
style 19.3
couple 19.2
lady 18.7
sensual 18.2
sensuality 18.2
human 18
cute 17.2
black 16
posing 16
dress 15.4
bride 14.6
two 14.4
man 14.1
male 14
sibling 13.7
youth 13.6
make 13.6
gorgeous 13.6
body 13.6
smile 13.5
brother 13.5
women 13.5
long 12.9
elegance 12.6
seductive 12.4
makeup 12
wedding 12
romantic 11.6
lifestyle 11.6
studio 11.4
erotic 11.3
looking 11.2
lips 11.1
expression 11.1
dark 10.9
smiling 10.9
gown 10.7
embracing 10.7
glamor 10.5
boy 10.4
passion 10.3
elegant 10.3
happiness 10.2
pose 10
modern 9.8
clothing 9.8
lovely 9.8
one 9.7
mother 9.7
relaxation 9.2
hot 9.2
outdoor 9.2
home 8.9
hugging 8.8
look 8.8
bridal 8.8
standing 8.7
boyfriend 8.7
girlfriend 8.7
desire 8.7
eyes 8.6
room 8.6
parent 8.4
daughter 8.2
care 8.2
interior 8
sitting 7.7
luxury 7.7
casual 7.6
hand 7.6
joy 7.5
relationship 7.5
innocent 7.5
leisure 7.5
teenager 7.3

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 99.3
clothing 97.8
wall 97
smile 96.9
person 93.9
woman 92.6
posing 89.9
text 88.3
standing 82.9
black 77.7
portrait 75.6
dress 73.8
painting 73.5
picture frame 68.9
retro style 62.1
vintage clothing 62.1

Color Analysis

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 98.8%
Happy 97.5%
Confused 0.6%
Surprised 0.5%
Disgusted 0.4%
Angry 0.4%
Calm 0.3%
Fear 0.3%
Sad 0.1%

AWS Rekognition

Age 16-28
Gender Female, 97.8%
Happy 73.2%
Disgusted 8.3%
Calm 6.3%
Confused 4.8%
Angry 2.7%
Sad 2.3%
Fear 1.4%
Surprised 1%

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 42
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%
Painting 77.1%

Categories