Human Generated Data

Title

Untitled (two women in native costume)

Date

1870s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2567

Machine Generated Data

Tags

Amazon
created on 2021-12-15 (label, confidence in %)

Furniture 99.6
Person 97.7
Human 97.7
Clothing 97.2
Apparel 97.2
Person 94.5
Painting 85.1
Art 85.1
Bonnet 80.2
Hat 80.2
Crib 62.7
Cradle 56.3
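
These labels are the output of an object-detection call against the photograph. Below is a minimal sketch of how such label lists are produced with the AWS Rekognition DetectLabels API via boto3; the file name is hypothetical and AWS credentials are assumed to be configured in the environment:

import boto3

# Hypothetical file name; assumes AWS credentials are already configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("two_women_in_native_costume.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,  # the list above bottoms out around 56% confidence
    )

# Each label pairs a name with a 0-100 confidence score, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")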

Imagga
created on 2021-12-15 (tag, confidence in %)

dress 37
portrait 29.8
adult 27.6
mother 25.8
face 25.6
person 25.1
fashion 24.9
people 24.5
happy 22.6
costume 20.9
kin 18.5
pretty 18.2
sexy 17.7
model 17.1
parent 17.1
clothing 16.5
smile 16.4
attractive 16.1
happiness 15.7
lady 15.4
cute 15.1
hair 15.1
love 15
couple 14.8
cheerful 14.6
smiling 14.5
studio 14.4
bride 14.4
elegance 14.3
fun 13.5
old 13.2
lifestyle 13
child 13
wedding 12.9
two 12.7
women 12.7
style 12.6
posing 12.4
clothes 12.2
culture 12
makeup 11.9
indoor 11.9
traditional 11.6
gown 11.6
outfit 10.9
sensuality 10.9
holding 10.7
romantic 10.7
man 10.1
family 9.8
human 9.7
interior 9.7
brunette 9.6
celebration 9.6
hairstyle 9.5
art 9.5
sensual 9.1
make 9.1
luxury 8.6
antique 8.5
bouquet 8.5
color 8.3
daughter 8.3
vintage 8.3
children 8.2
groom 8.2
gorgeous 8.2
decoration 8
black 7.9
holiday 7.9
male 7.9
look 7.9
masquerade 7.9
bag 7.8
standing 7.8
ancient 7.8
carnival 7.8
husband 7.8
full length 7.8
party 7.7
elegant 7.7
glamor 7.7
marriage 7.6
joy 7.5
relationship 7.5
life 7.5
statue 7.4
closeup 7.4
20s 7.3
sculpture 7.3
girls 7.3
mask 7.3
detail 7.2
looking 7.2
home 7.2
together 7
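
Imagga's tags come from its REST tagging endpoint. A minimal sketch using the requests library; the API key, secret, and image URL below are placeholders, not values from this record:

import requests

# Placeholder credentials and image URL for Imagga's v2 tagging endpoint.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)

# Tags arrive with 0-100 confidence scores, matching the list above.
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))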

Google
created on 2021-12-15 (no tags returned)

Microsoft
created on 2021-12-15 (tag, confidence in %)

person 96.5
wall 96.3
clothing 94.3
dress 93.3
woman 89.6
old 83.5
human face 82.2
text 78.6
vintage clothing 65.1
victorian 56.3
retro style 55.6
smile 53.5
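
Microsoft's tags can be reproduced with the Azure Computer Vision SDK. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")

# The SDK reports confidence on a 0-1 scale; the page shows it as a percentage.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))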

Face analysis

AWS Rekognition

Age 18-30
Gender Male, 61.7%
Calm 97%
Angry 1.6%
Sad 0.8%
Confused 0.2%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Female, 99.8%
Calm 94.2%
Sad 2.8%
Angry 1%
Happy 0.8%
Confused 0.5%
Fear 0.5%
Surprised 0.1%
Disgusted 0.1%
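
Both estimates above follow the shape of Rekognition's DetectFaces response: an age range, a gender guess with confidence, and a list of independently scored emotions. A minimal sketch via boto3, again with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("two_women_in_native_costume.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Sorting by confidence puts the dominant emotion first, as shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")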

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 38
Gender Female
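
The single age and gender values above match what the Azure Face API returned at the time (Microsoft has since restricted access to these attributes). A minimal sketch, assuming the azure-cognitiveservices-vision-face package; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Face resource.
face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

faces = face_client.face.detect_with_url(
    url="https://example.org/photo.jpg",
    return_face_attributes=["age", "gender"],
)

# One age/gender pair per detected face, mirroring the two blocks above.
for face in faces:
    attrs = face.face_attributes
    print("Age", round(attrs.age))
    print("Gender", attrs.gender)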

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
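
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the rows above read "Very unlikely" and "Possible". A minimal sketch, assuming the google-cloud-vision package and configured credentials; the file name is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("two_women_in_native_costume.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One block of likelihoods per detected face, mirroring the two blocks above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)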

Feature analysis

Amazon

Person 97.7%
Painting 85.1%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 92.2%
a vintage photo of a man and woman posing for a picture 85.9%
a vintage photo of a man and a woman posing for a picture 85.8%
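
The ranked captions come from Azure Computer Vision's image description feature, which returns several candidate sentences, each with a confidence score. A minimal sketch with placeholder endpoint, key, and image URL:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.describe_image("https://example.org/photo.jpg", max_candidates=3)

# Candidates carry 0-1 confidence scores, shown above as percentages.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")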