Human Generated Data

Title

Untitled (women with bride outside)

Date

1946

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19572

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.8
Apparel 99.8
Human 99.4
Person 99.4
Person 99.3
Person 98.1
Dress 97.9
Female 96
Person 94.1
Transportation 92.6
Vehicle 92.6
Car 92.6
Automobile 92.6
Woman 87.7
Evening Dress 82.3
Fashion 82.3
Gown 82.3
Robe 82.3
Machine 79.3
Wheel 79.3
Road 71.2
People 67.3
Photography 63.3
Portrait 63.3
Face 63.3
Photo 63.3
Girl 62.3
Tarmac 60.6
Asphalt 60.6
Person 51.7
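The Amazon list above contains repeated labels (five "Person" rows at different confidences, one per detected instance). A minimal sketch, not part of the museum's pipeline, of collapsing such a list to one row per label by keeping the highest confidence seen (the sample data below is a hypothetical excerpt of the tags above):

```python
def dedupe_labels(rows):
    """Collapse (label, confidence) rows, keeping the max confidence per label."""
    best = {}
    for name, conf in rows:
        if conf > best.get(name, 0.0):
            best[name] = conf
    # Return labels sorted by confidence, highest first.
    return sorted(best.items(), key=lambda kv: -kv[1])

# Hypothetical excerpt of the Rekognition tags listed above.
tags = [("Person", 99.4), ("Person", 99.3), ("Person", 98.1),
        ("Person", 94.1), ("Car", 92.6), ("Wheel", 79.3),
        ("Person", 51.7)]
print(dedupe_labels(tags))  # [('Person', 99.4), ('Car', 92.6), ('Wheel', 79.3)]
```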

Imagga
created on 2022-03-05

robe 61.6
garment 45.7
clothing 44.8
people 22.9
adult 22
person 21.9
covering 21.5
fashion 15.8
consumer goods 15.1
dress 14.5
portrait 14.2
man 13.4
happy 13.2
face 12.8
costume 12.6
happiness 12.5
old 11.8
lifestyle 11.6
sky 11.5
male 11.3
shovel 11.1
pillory 10.3
women 10.3
bag 10.2
holiday 10
urban 9.6
instrument of punishment 9.4
wall 9.4
hat 9.2
outdoor 9.2
active 9
couple 8.7
umbrella 8.6
clothes 8.4
black 8.4
attractive 8.4
joy 8.3
city 8.3
mask 8.3
fun 8.2
one 8.2
instrument 8.2
style 8.2
lady 8.1
color 7.8
outside 7.7
pretty 7.7
culture 7.7
youth 7.7
winter 7.7
beach 7.6
elegance 7.6
leisure 7.5
tradition 7.4
snow 7.3
indoor 7.3
sexy 7.2
hair 7.1
working 7.1
musical instrument 7
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 97.7
black and white 94.3
text 88.6
clothing 86.1
street 82.3
person 81.3
vehicle 72.3
monochrome 70.7
car 56.2
old 47.2

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 92.1%
Surprised 96.9%
Calm 2.5%
Fear 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Sad 0%
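Rekognition reports a full distribution over emotions rather than a single label; the dominant emotion is simply the highest-scoring entry. A small sketch, using the scores from the reading above (the `emotions` dict is an assumed flattening of the API response, not the raw Rekognition output shape):

```python
# Emotion scores copied from the AWS Rekognition reading above.
emotions = {"Surprised": 96.9, "Calm": 2.5, "Fear": 0.2, "Happy": 0.1,
            "Confused": 0.1, "Disgusted": 0.1, "Angry": 0.0, "Sad": 0.0}

# The dominant emotion is the key with the largest score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Surprised
```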

Google Vision (Face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Car 92.6%
Wheel 79.3%

Captions

Microsoft

a group of people in an old photo of a person 68.8%
an old photo of a person 68.7%
a group of people that are standing in the snow 54.4%

Text analysis

Amazon

7