Human Generated Data

Title

Untitled (bride next to table holding gifts)

Date

1950

People

Artist: Clement McLarty, American, active 1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19717


Machine Generated Data

Tags

Amazon
created on 2022-03-05

Room 99.5
Indoors 99.5
Living Room 97.4
Person 95.7
Human 95.7
Furniture 93.8
Clothing 93.6
Apparel 93.6
Dressing Room 91.8
Interior Design 90.7
Person 83.3
Evening Dress 80.4
Fashion 80.4
Robe 80.4
Gown 80.4
Bedroom 79.9
Couch 77.5
Person 76.2
Person 66.2
Female 56.1
Person 49.2

Clarifai
created on 2023-10-22

people 99.9
adult 98.9
woman 98.7
wear 98.5
actress 98.3
one 97.5
monochrome 96.7
furniture 95.8
dress 94.8
room 93.6
portrait 91.4
two 90.3
group 89.9
fashion 88.9
veil 87.8
seat 87.5
dressing room 87.3
sit 86.9
retro 86.0
chair 84.3

Imagga
created on 2022-03-05

groom 45.6
salon 35.8
bride 32.9
wedding 29.4
kin 29.4
people 29
love 27.6
couple 27
dress 26.2
bouquet 23
celebration 21.5
happy 20.7
women 20.6
person 20.5
happiness 20.4
party 18.9
adult 18
home 17.5
marriage 17.1
interior 16.8
gown 15.8
indoors 15.8
fashion 15.1
indoor 14.6
men 14.6
family 14.2
man 14.1
cheerful 13.8
mother 13.7
ceremony 13.6
male 13.6
table 13.5
elegance 13.4
together 13.1
holiday 12.9
two 12.7
bridal 12.6
decoration 12.5
married 12.5
wife 12.3
lady 12.2
room 12
portrait 11.6
romance 11.6
husband 11.6
restaurant 11.5
dinner 11.5
old 11.1
romantic 10.7
flowers 10.4
life 10.4
wine 10.2
drink 10
wed 9.8
smiling 9.4
rose 9.4
smile 9.3
event 9.2
flower 9.2
attractive 9.1
new 8.9
style 8.9
matrimony 8.9
commitment 8.8
engagement 8.7
lifestyle 8.7
gift 8.6
sitting 8.6
luxury 8.6
face 8.5
senior 8.4
pretty 8.4
church 8.3
tradition 8.3
chair 8.3
vintage 8.3
holding 8.3
clothing 8.2
celebrate 8.1
decor 8
hair 7.9
day 7.8
veil 7.8
hope 7.7
formal 7.6
hand 7.6
clothes 7.5
fun 7.5
future 7.4
glass 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.0
person 93.1
clothing 82.8
furniture 70.2
table 60.0
clothes 16.8
cluttered 10.8

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 99.3%
Happy 76.7%
Surprised 20.8%
Fear 0.9%
Calm 0.7%
Disgusted 0.4%
Angry 0.2%
Sad 0.2%
Confused 0.1%

AWS Rekognition

Age 22-30
Gender Male, 58.1%
Happy 38.0%
Calm 33.1%
Sad 10.1%
Confused 7.8%
Angry 4.4%
Surprised 2.3%
Fear 2.3%
Disgusted 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 95.7%
Person 83.3%
Person 76.2%
Person 66.2%
Person 49.2%

Captions

Microsoft
created on 2022-03-05

an old photo of a person 89.4%
old photo of a person 89%
a person standing in a room 88.9%

Text analysis

Amazon

9
KODA
699
Ls
SAFE

Google

CO
CO