Human Generated Data

Title

Untitled (three girls with bibles, candles and rosaries)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1559

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.5
Person 99.1
Apparel 97.1
Clothing 97.1
Nature 89.9
Outdoors 83.4
People 72.6
Female 71.3
Coat 66.5
Girl 64.4
Snow 63.9
Fog 60.4
Ice 59.3
Fashion 55.2
Robe 55.2

Imagga
created on 2021-12-14

kin 41
negative 39.1
bride 31.7
film 30.9
people 29.6
portrait 26.5
photographic paper 23.9
couple 23.5
wedding 23
adult 22.7
dress 22.6
happiness 21.9
person 21.7
happy 20.7
love 19.7
fashion 18.8
groom 17.9
male 17.1
family 16.9
mother 16.6
marriage 16.1
attractive 16.1
photographic equipment 15.9
smiling 15.9
smile 15.7
man 15.6
cheerful 15.4
pretty 15.4
married 14.4
women 14.2
men 13.7
human 13.5
day 12.6
bouquet 12.5
child 12.1
bridal 11.7
romance 11.6
posing 11.6
lady 11.4
face 11.4
old 11.1
two 11
joy 10.9
cute 10.8
nurse 10.6
together 10.5
wife 10.4
home 10.4
parent 10.2
lifestyle 10.1
clothing 10.1
holiday 10
pose 10
husband 9.8
romantic 9.8
celebration 9.6
flowers 9.6
hair 9.5
casual 9.3
gown 8.9
group 8.9
sexy 8.8
art 8.8
looking 8.8
indoors 8.8
model 8.6
togetherness 8.5
black 8.4
elegance 8.4
joyful 8.3
room 8.2
blond 8.2
sculpture 7.9
veil 7.8
ceremony 7.8
sitting 7.7
engagement 7.7
innocence 7.7
loving 7.6
one 7.5
outdoors 7.5
holding 7.4
church 7.4
life 7.3
snow 7.1
modern 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.1
window 95.5
clothing 95.1
person 93.8
posing 87.2
old 78.7
drawing 66.1
sketch 61.8

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Female, 80%
Calm 73.5%
Sad 10.4%
Confused 6.7%
Happy 5.3%
Surprised 2.2%
Fear 0.9%
Angry 0.6%
Disgusted 0.3%

AWS Rekognition

Age 27-43
Gender Female, 68%
Calm 51.2%
Sad 42.5%
Happy 2.3%
Confused 1%
Disgusted 0.9%
Angry 0.9%
Surprised 0.6%
Fear 0.6%

AWS Rekognition

Age 22-34
Gender Female, 91.8%
Happy 53.9%
Calm 23.1%
Sad 13%
Angry 4.5%
Fear 2.5%
Confused 1.6%
Surprised 0.8%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 85.9%
a vintage photo of a group of people posing for a picture 85.8%
a group of people posing for a photo 85.7%

Text analysis

Amazon

FILM
AGEA NITRATE FILM
NITRATE
AGEA
JNJ

Google

AGFANITRATE
AGFANITRATE FILM
FILM