Human Generated Data

Title

Untitled (two girls playing "dress up" next to bed)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14759

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-29

Room 98
Indoors 98
Human 97.1
Person 97.1
Person 96.9
Apparel 96.6
Clothing 96.6
Dressing Room 90.7
Evening Dress 86.2
Fashion 86.2
Robe 86.2
Gown 86.2
Female 70.5
Floor 68.4
Costume 66.1
Art 64.6
Painting 64.6
Furniture 63.2
Leisure Activities 60.5
Living Room 59.4
Woman 56.2
Flooring 55.9
Dance Pose 55.7

Imagga
created on 2022-01-29

adult 23.1
people 22.9
person 20.8
shop 19.5
portrait 18.8
man 17.5
barbershop 17.3
clothing 16.9
bride 16.3
home 15.9
dress 15.3
salon 15.2
happy 15
attractive 14.7
teacher 14.6
happiness 14.1
pretty 14
fashion 13.6
women 13.4
interior 13.3
wedding 12.9
sexy 12.8
window 12.8
house 12.5
professional 12.2
male 12.1
indoor 11.9
negative 11.8
mercantile establishment 11.7
cheerful 11.4
lady 11.4
educator 11.2
luxury 11.1
bouquet 10.8
couple 10.4
decoration 10.3
love 10.3
face 9.9
family 9.8
celebration 9.6
room 9.5
smiling 9.4
film 9.3
human 9
style 8.9
dancer 8.8
indoors 8.8
look 8.8
bridal 8.7
performer 8.6
gift 8.6
men 8.6
model 8.6
smile 8.5
art 8.5
two 8.5
sensual 8.2
light 8
posing 8
hair 7.9
life 7.9
brunette 7.8
black 7.8
gown 7.8
place of business 7.8
elegant 7.7
married 7.7
elegance 7.6
domestic 7.5
future 7.4
design 7.3
business 7.3
lifestyle 7.2
looking 7.2
photographic paper 7.2
holiday 7.2
costume 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 97.9
dress 90.5
clothing 89.5
dance 89.2
person 85.5
woman 84.6
window 82.9
wedding dress 77.5
gallery 56.3
bride 53
room 45.1
clothes 21.3

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Female, 64.5%
Calm 71.3%
Happy 12.6%
Surprised 11.8%
Fear 1.1%
Sad 1.1%
Disgusted 1%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 25-35
Gender Female, 95.1%
Calm 67.9%
Surprised 13.6%
Fear 8.3%
Sad 4.3%
Happy 1.7%
Disgusted 1.7%
Angry 1.4%
Confused 1%

Feature analysis

Amazon

Person 97.1%
Painting 64.6%

Captions

Microsoft

a group of people standing in a room 75.4%
a group of people standing next to a window 51.7%
a group of people standing in front of a window 49.8%