Human Generated Data

Title

Untitled (debutante kneeling)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19309

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Clothing 99.3
Apparel 99.3
Evening Dress 96.9
Gown 96.9
Fashion 96.9
Robe 96.9
Person 96.2
Human 96.2
Female 89
Woman 79.3
Dress 78.8
Arcade Game Machine 77.4
Text 57.8
Furniture 56.7

Imagga
created on 2022-02-25

groom 77.7
bride 56
dress 48.8
wedding 47.8
love 29.2
fashion 28.6
adult 28.6
marriage 26.6
portrait 26.5
married 25.9
attractive 25.9
sexy 25.7
pretty 25.2
person 24.1
people 24
bouquet 23.6
model 22.5
happiness 21.9
happy 21.9
gorgeous 21.7
couple 19.2
gown 18.5
face 17.8
veil 17.6
celebration 17.5
hair 17.4
bridal 16.5
cute 16.5
wife 16.1
posing 16
women 15.8
flowers 15.6
cheerful 15.4
smiling 15.2
romance 15.2
elegance 15.1
one 14.9
boutique 14.2
brunette 13.9
standing 13.9
lady 13.8
studio 13.7
skin 13.5
smile 12.8
looking 12.8
sensuality 12.7
clothing 12.4
human 12
sensual 11.8
day 11.8
ceremony 11.6
newlywed 11.3
style 11.1
two 11
wed 10.8
eyes 10.3
elegant 10.3
flower 10
lovely 9.8
outdoors 9.7
engagement 9.6
body 9.6
seductive 9.6
husband 9.5
luxury 9.4
man 9.4
passion 9.4
lifestyle 9.4
future 9.3
makeup 9.2
indoor 9.1
make 9.1
romantic 8.9
matrimony 8.9
look 8.8
black 8.5
clothes 8.4
event 8.3
holding 8.3
nice 8.2
20s 8.2
fun 8.2
outside 7.7
joy 7.5
traditional 7.5
domestic 7.5
lips 7.4
pose 7.2
home 7.2
life 7.2
male 7.1
interior 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.3
dress 98
indoor 92.5
clothing 92
woman 91.7
person 91.4
wedding dress 90.8
bride 79.4
picture frame 13

Face analysis

Amazon

Google

AWS Rekognition

Age 21-29
Gender Female, 98.8%
Calm 46.8%
Sad 27.2%
Angry 12.2%
Confused 7.7%
Disgusted 2.3%
Surprised 2%
Fear 1.3%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft

a person standing in front of a television 44.8%
a person standing in a room 44.7%

Text analysis

Amazon

65
JAN
132

Google

132 JAN • 65
132
JAN
65