Human Generated Data

Title

Untitled (two women and bride eating)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21680

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.2
Human 99.2
Clothing 96.4
Apparel 96.4
Person 94.3
Person 91.8
Female 75.3
People 63.4
Room 59.2
Indoors 59.2
Woman 58.4
Evening Dress 55.3
Fashion 55.3
Gown 55.3
Robe 55.3
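
A minimal sketch of how object and scene tags like the ones above can be requested from Amazon Rekognition via boto3; the image file name, MaxLabels, and MinConfidence values are placeholders, not taken from this record.

```python
import boto3

# Rekognition client; AWS credentials and region are read from the environment.
client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("untitled_two_women_and_bride_eating.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # the lowest score listed above is 55.3
)

# Print each label with its confidence, mirroring the "Person 99.2" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```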

Imagga
created on 2022-03-11

shower cap 47.6
cap 38
headdress 29.4
people 26.8
man 24.9
clothing 24.7
dress 22.6
person 21.6
couple 20.9
portrait 20.7
bride 20.1
male 19.9
adult 19.2
love 18.9
surgeon 17.5
wedding 17.5
salon 17.4
patient 17.3
happy 15.7
gown 15.6
fashion 15.1
medical 14.1
doctor 14.1
groom 14
hospital 13.4
men 12.9
smile 12.8
face 12.8
veil 12.7
health 12.5
nurse 12.5
covering 12.2
happiness 11.7
bridal 11.7
pretty 11.2
room 11.1
women 11.1
professional 11.1
consumer goods 10.8
husband 10.7
attractive 10.5
old 10.4
senior 10.3
family 9.8
medicine 9.7
together 9.6
home 9.6
hair 9.5
uniform 9.5
bouquet 9.4
costume 9.3
religion 9
lady 8.9
sculpture 8.8
elderly 8.6
sitting 8.6
illness 8.6
marriage 8.5
wife 8.5
human 8.2
care 8.2
style 8.2
worker 8
indoors 7.9
flowers 7.8
surgery 7.8
ceremony 7.8
art 7.7
married 7.7
two 7.6
traditional 7.5
mask 7.5
holding 7.4
church 7.4
sexy 7.2
lifestyle 7.2
looking 7.2
celebration 7.2
team 7.2
romantic 7.1
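
The Imagga tags above follow the same name-plus-score pattern. A rough sketch, assuming Imagga's v2 tagging endpoint with HTTP Basic authentication; the API key, secret, image URL, and the exact response shape are assumptions.

```python
import requests

API_KEY = "<imagga-api-key>"      # placeholder credentials
API_SECRET = "<imagga-api-secret>"
IMAGE_URL = "https://example.org/untitled_two_women_and_bride_eating.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry is assumed to carry an English tag name and a confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```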

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 92.9
person 92.3
human face 91.1
clothing 87.4
musical instrument 85.5
woman 76.6
guitar 72.2
black and white 62.9
old 41.3

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 87.7%
Surprised 64.6%
Sad 14.7%
Happy 10.9%
Calm 4%
Angry 2.7%
Confused 1.4%
Disgusted 1.3%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Female, 55.2%
Calm 85.9%
Sad 12.2%
Angry 0.5%
Fear 0.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%
Happy 0.2%
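
The two age/gender/emotion breakdowns above are per-face results. A minimal sketch of producing them with Rekognition's DetectFaces call; Attributes=["ALL"] is required to get age range, gender, and emotions, and the file name is a placeholder.

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_two_women_and_bride_eating.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default attribute set omits age, gender, and emotions
)

# One FaceDetails entry per detected face, matching the two blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```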

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
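
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch, assuming the google-cloud-vision 2.x client library and a hypothetical local file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_two_women_and_bride_eating.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; enum names such as VERY_UNLIKELY
# correspond to the buckets listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```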

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people posing for a photo 87.6%
a group of people posing for the camera 87.5%
a group of people posing for a picture 87.4%
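
Ranked caption candidates like the three above are what Azure Computer Vision's Describe Image operation returns. A minimal sketch over the REST endpoint, assuming a v3.2 resource; the endpoint and key are placeholders.

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

with open("untitled_two_women_and_bride_eating.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidence is returned in [0, 1]; scale it to match the percentages above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```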

Text analysis

Google

5.-
?
よ5.-?
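
The fragments above are the raw text that Google Vision's text detection found in the image. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_two_women_and_bride_eating.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual pieces.
for annotation in response.text_annotations:
    print(annotation.description)
```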