Human Generated Data

Title

Untitled (two women dressed formally passing a tea cup in domestic interior)

Date

May 2, 1955

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17980

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Apparel 99.9
Clothing 99.9
Person 98.3
Human 98.3
Chair 89.9
Furniture 89.9
Home Decor 84.1
Robe 83.4
Fashion 83.4
Gown 78.4
Evening Dress 77.9
Dress 77.5
Overcoat 77.1
Suit 77.1
Coat 77.1
Face 76.4
Female 74.3
Sleeve 70.6
Door 70
Photo 65.4
Portrait 65.4
Photography 65.4
Wood 62.1
Long Sleeve 61.6
Floor 57.6
Woman 57.4
Shirt 57.2

Imagga
created on 2022-03-04

people 29.6
man 28.2
male 26.4
person 25.7
adult 22.9
shop 22.6
portrait 20.1
happy 19.4
family 17.8
business 17.6
home 15.9
smile 15
life 14.5
fashion 13.6
interior 13.3
clothing 13.2
newspaper 13.2
men 12.9
two 12.7
women 12.6
black 12.6
child 12.3
couple 12.2
grandma 11.7
lifestyle 11.6
indoors 11.4
standing 11.3
human 11.2
old 11.1
grandfather 10.9
smiling 10.8
dress 10.8
mother 10.7
professional 10.7
to 10.6
room 10.5
boy 10.4
teacher 10.3
senior 10.3
corporate 10.3
sale 10.2
handsome 9.8
attractive 9.8
lady 9.7
group 9.7
style 9.6
looking 9.6
clinic 9.6
store 9.4
shopping 9.2
patient 9.2
office 9
kid 8.9
businessman 8.8
together 8.8
product 8.7
expression 8.5
father 8.5
casual 8.5
holding 8.3
worker 8
urban 7.9
happiness 7.8
face 7.8
model 7.8
mercantile establishment 7.7
pretty 7.7
retail 7.6
guy 7.5
care 7.4
indoor 7.3
new 7.3
color 7.2
work 7.1
medical 7.1
modern 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 98.9
clothing 93.8
outdoor 89.2
text 81.3
black and white 66
woman 50.1

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 83%
Calm 96.4%
Happy 1.8%
Sad 0.8%
Confused 0.4%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a group of people standing next to a man 86.7%
a man standing next to a woman 82.2%
a group of people standing next to a man in a suit and tie 82.1%