Human Generated Data

Title

Untitled (two women with snowman)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15776

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Apparel 99.8
Clothing 99.8
Human 99.6
Person 99.6
Person 99.5
Person 98.6
Female 90.8
Woman 78
Face 76.1
Robe 73.6
Fashion 73.6
Coat 73.1
Suit 73.1
Overcoat 73.1
Gown 71.6
Photo 65.4
Photography 65.4
People 65.3
Portrait 64.5
Evening Dress 59.8
Sleeve 58.5
Standing 55.6
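
The Amazon tags above are automated label-detection output. A minimal sketch of how such label/confidence pairs can be requested, assuming AWS Rekognition via boto3 with a placeholder S3 bucket and object key (the museums' actual pipeline is not documented here):

import boto3

# Sketch only: bucket, key, and region are placeholders, not the museums' storage.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.15776.jpg"}},
    MinConfidence=55,  # the list above bottoms out around 55%
)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")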

Imagga
created on 2022-02-05

brass 66.1
cornet 61.1
wind instrument 55.9
musical instrument 39.3
man 26.2
horn 24.2
male 24.1
device 21.9
person 19.8
sport 18.9
people 18.4
sax 17.9
stage 17.7
adult 14.9
silhouette 14.9
active 14.5
instrumentality 14.3
black 13.8
dance 12.5
businessman 12.4
men 12
fun 12
couple 11.3
motion 11.1
action 11.1
style 11.1
trombone 10.8
outdoor 10.7
body 10.4
platform 10.4
sky 10.2
exercise 10
dress 9.9
business 9.7
artifact 9.5
party 9.5
happy 9.4
energy 9.2
power 9.2
leisure 9.1
portrait 9.1
fashion 9
suit 9
sunset 9
symbol 8.7
hair 8.7
light 8.7
happiness 8.6
art 8.5
attractive 8.4
dark 8.4
clothing 8.3
performer 8.3
human 8.2
freedom 8.2
one 8.2
outdoors 8.2
activity 8.1
success 8
posing 8
lifestyle 7.9
summer 7.7
bride 7.7
elegance 7.6
free 7.5
player 7.3
group 7.3
pose 7.2
professional 7.2
fitness 7.2
love 7.1
job 7.1
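
The Imagga tags follow the same tag/confidence pattern. A hedged sketch of a request to Imagga's v2 tagging endpoint, with placeholder credentials and image URL:

import requests

# Sketch only: API credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.15776.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")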

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.7
person 85.4
posing 76.7
drawing 72.8
black and white 72.5
art 58
old 42.1
net 17.4
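
The Microsoft tags are consistent with the Azure Computer Vision analyze endpoint. A minimal sketch, assuming the v3.2 REST API with a placeholder resource endpoint, key, and image URL:

import requests

# Sketch only: endpoint, key, and image URL are placeholders.
ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY",
             "Content-Type": "application/json"},
    json={"url": "https://example.org/4.2002.15776.jpg"},
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")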

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 60.2%
Happy 84.6%
Sad 4.4%
Confused 3.2%
Calm 2.8%
Fear 2%
Angry 1.4%
Surprised 1%
Disgusted 0.6%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 71.5%
Surprised 21.1%
Happy 5.5%
Disgusted 0.6%
Fear 0.5%
Confused 0.3%
Angry 0.3%
Sad 0.2%
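
The two AWS Rekognition face records above (age range, gender, emotions) correspond to the FaceDetails structure returned by face detection. A minimal sketch, assuming boto3 with a placeholder S3 object:

import boto3

# Sketch only: bucket, key, and region are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.15776.jpg"}},
    Attributes=["ALL"],  # required for age range, gender, and emotions
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")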

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
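
The Google Vision rows report per-face likelihood categories rather than numeric scores. A minimal sketch, assuming the google-cloud-vision client library and a placeholder local file:

from google.cloud import vision

# Sketch only: the local file path is a placeholder.
client = vision.ImageAnnotatorClient()
with open("4.2002.15776.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)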

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man and a woman posing for a photo 46.2%
a man and woman posing for a photo 39.4%
an old photo of a person 39.3%
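
The ranked Microsoft captions match the multi-candidate output of the Azure Computer Vision describe endpoint. A minimal sketch, assuming the v3.2 REST API with placeholder endpoint, key, and image URL:

import requests

# Sketch only: endpoint, key, and image URL are placeholders.
ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY",
             "Content-Type": "application/json"},
    json={"url": "https://example.org/4.2002.15776.jpg"},
)
resp.raise_for_status()
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")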