Human Generated Data

Title

Untitled (two women at tea for D.A.R. meeting, prints)

Date

c.1970, printed from earlier negative

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18585

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.5
Human 99.5
Apparel 99.2
Clothing 99.2
Person 95.1
Door 82.8
Evening Dress 77.6
Robe 77.6
Gown 77.6
Fashion 77.6
Footwear 75.2
Shoe 75.2
Overcoat 74.1
Coat 74.1
Flooring 63
Floor 61.2
Tuxedo 58.2
Suit 54.7

Imagga
created on 2022-02-25

clothing 27.2
people 26.7
fashion 25.6
man 23.5
person 23.3
adult 23
dress 22.6
garment 20.4
portrait 20
male 19.8
couple 16.5
kimono 16
black 15.5
attractive 15.4
smiling 15.2
lifestyle 15.2
happy 15
model 14
pretty 14
robe 14
women 13.4
happiness 12.5
lady 12.2
standing 12.2
sexy 12
skirt 11.9
two 11.8
musical instrument 11.4
elegant 10.3
street 10.1
accordion 10
smile 10
city 10
outdoor 9.9
costume 9.7
teenage 9.6
hair 9.5
culture 9.4
elegance 9.2
sarong 9.2
makeup 9.1
business 9.1
outdoors 8.9
together 8.8
outfit 8.7
full length 8.7
face 8.5
adults 8.5
clothes 8.4
covering 8.3
one 8.2
girls 8.2
sensual 8.2
style 8.1
urban 7.9
summer 7.7
outside 7.7
old 7.7
casual 7.6
human 7.5
leisure 7.5
holding 7.4
keyboard instrument 7.4
teen 7.3
wind instrument 7.3
cheerful 7.3
stylish 7.2
suit 7.2
cute 7.2
consumer goods 7.1
posing 7.1
love 7.1
indoors 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

person 99.3
text 98.3
floor 97.8
clothing 96.6
dress 92.3
indoor 92.1
standing 83.1
black and white 75.5
footwear 72.5
man 71.3
woman 70.7

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 99.8%
Happy 99.1%
Surprised 0.4%
Calm 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Sad 0%

AWS Rekognition

Age 16-24
Gender Female, 99.4%
Calm 43.7%
Happy 42.4%
Surprised 5.1%
Sad 3.9%
Confused 2.2%
Angry 1.5%
Disgusted 0.8%
Fear 0.4%

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 75.2%
Suit 54.7%

Captions

Microsoft

a man and a woman standing in a room 92%
a group of people standing in a room 91.9%
a group of people posing for the camera 89.5%