Human Generated Data

Title

Untitled (woman kneeling down to adjust bride's train)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12910

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Clothing 100
Apparel 100
Painting 97.7
Art 97.7
Veil 93.9
Human 92.3
Person 92.3
Gown 84.7
Fashion 84.7
Robe 80.7
Female 77.6
Wedding 74.3
Woman 67.2
Wedding Gown 66.8
Evening Dress 64.3

Clarifai
created on 2019-11-16

people 99.8
veil 99.6
woman 98.1
adult 97.5
wedding 96.1
princess 95.9
art 95.6
dress 95.6
two 95.4
man 94.1
one 93.6
seat 91.9
furniture 91.7
bride 89.2
portrait 88.5
room 87.8
gown 86.6
wear 85.8
groom 84.5
child 83.3

Imagga
created on 2019-11-16

dress 23.5
hair 20.6
adult 20.1
portrait 20.1
fashion 19.6
attractive 19.6
sexy 19.3
person 19.2
black 18.8
groom 18.5
face 18.5
people 17.8
pretty 17.5
bride 15.5
dark 14.2
model 13.2
makeup 12.8
man 12.8
sensual 12.7
love 12.6
sensuality 11.8
art 11.6
male 11.3
eyes 11.2
sitting 11.2
wedding 11
romantic 10.7
blond 10.4
body 10.4
support 10.3
lifestyle 10.1
happy 10
make 10
city 10
veil 9.8
night 9.8
evil 9.7
lady 9.7
couple 9.6
covering 9.4
book jacket 9.3
head 9.2
plastic bag 9
one 9
sculpture 8.8
costume 8.7
bookend 8.7
women 8.7
sad 8.7
happiness 8.6
skin 8.5
hair spray 8.4
device 8.3
room 8.3
vintage 8.3
human 8.2
jacket 8.2
bag 8.1
light 8
looking 8
posing 8
statue 7.9
witch 7.9
erotic 7.8
artistic 7.8
bridal 7.8
toiletry 7.7
expression 7.7
married 7.7
youth 7.7
hairstyle 7.6
two 7.6
style 7.4
indoor 7.3
architecture 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wedding dress 99.3
indoor 98.5
bride 97.9
dress 96
wall 95.6
text 89.5
woman 82.6
clothing 67.2
flower 65.5
person 61.7
wedding 57.8
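The services above each return flat lists of label/confidence pairs, and the same concept ("veil", "bride") often appears in more than one list with different scores. A minimal sketch of how such tags could be merged and thresholded (the `merge_tags` helper and the 90% cutoff are illustrative choices, not part of this record; the sample values are taken from the lists above):

```python
def merge_tags(services, threshold=90.0):
    """Return {label: best_confidence} for tags at or above the threshold,
    keeping the highest score when services disagree. Labels are
    lower-cased so 'Veil' and 'veil' collapse to one entry."""
    merged = {}
    for tags in services.values():
        for label, confidence in tags.items():
            if confidence >= threshold:
                key = label.lower()
                merged[key] = max(merged.get(key, 0.0), confidence)
    return merged

# A few of the tags recorded above, grouped by service.
tags_by_service = {
    "Amazon":    {"Clothing": 100.0, "Veil": 93.9, "Gown": 84.7},
    "Clarifai":  {"people": 99.8, "veil": 99.6, "woman": 98.1},
    "Microsoft": {"wedding dress": 99.3, "indoor": 98.5, "bride": 97.9},
}

merged = merge_tags(tags_by_service, threshold=90.0)
# "veil" appears in two services; the higher score (99.6) is kept,
# while "Gown" (84.7) falls below the 90% cutoff and is dropped.
```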

Face analysis

Amazon

Google

AWS Rekognition

Age 17-29
Gender Female, 51.7%
Calm 53%
Disgusted 45%
Surprised 45.1%
Fear 45%
Happy 45%
Angry 45.1%
Confused 45.4%
Sad 46.4%

AWS Rekognition

Age 22-34
Gender Female, 54.7%
Disgusted 45.1%
Fear 45.2%
Confused 45.3%
Calm 50.8%
Surprised 45.1%
Angry 45.4%
Sad 48.1%
Happy 45%

AWS Rekognition

Age 35-51
Gender Female, 50%
Disgusted 49.5%
Fear 49.6%
Surprised 49.5%
Angry 49.7%
Calm 49.8%
Happy 49.5%
Confused 49.5%
Sad 49.8%

AWS Rekognition

Age 31-47
Gender Male, 50.4%
Calm 49.9%
Happy 49.5%
Angry 49.6%
Disgusted 49.5%
Surprised 49.5%
Fear 49.5%
Sad 50%
Confused 49.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
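Each AWS Rekognition face record above reports a confidence for every emotion, and the scores sum to roughly 400% across eight emotions because each is an independent estimate; the reported mood is simply the highest-scoring one. A small sketch of that selection (the `dominant_emotion` helper is illustrative; the values are from the first face record above):

```python
def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Emotion confidences for the first AWS Rekognition face above.
face_1 = {
    "Calm": 53.0, "Disgusted": 45.0, "Surprised": 45.1, "Fear": 45.0,
    "Happy": 45.0, "Angry": 45.1, "Confused": 45.4, "Sad": 46.4,
}

name, score = dominant_emotion(face_1)
# Calm (53.0%) narrowly dominates; the other emotions cluster near 45%.
```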

Feature analysis

Amazon

Painting 97.7%
Person 92.3%

Captions

Microsoft

a person sitting on a bed 37%
a person sitting in a room 36.9%
a person sitting on a bed 36.8%

Text analysis

Amazon

LBVAE
LBVAE EIrw
EIrw

Google

VCLV MBVIE Lirw
VCLV
MBVIE
Lirw