Human Generated Data

Title

Untitled (Seminole mother and child posing behind bucket)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5211

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Face 99.5
Person 98.4
Blonde 95.2
Female 95.2
Teen 95.2
Woman 95.2
Girl 95.2
Kid 95.2
Child 95.2
Person 92.7
Head 86.1
Smile 85.7
Outdoors 79.8
Photography 79.5
Portrait 79.5
Photo 79.5
Play 76.9
Bucket 76.1
Clothing 75.8
Apparel 75.8
Costume 72.1
Hat 70.5
People 70
Dress 68.1
Nature 65.6
Sand 63.1
Baby 62.3
Boy 61.5
Water 60.3
Soil 58.2
Laughing 55.3

Imagga
created on 2022-01-23

negative 89.5
film 69.9
photographic paper 54
photographic equipment 36
wedding 33.1
groom 31.9
bride 31
dress 24.4
celebration 21.5
love 19.7
veil 17.6
people 16.2
marriage 15.2
wife 15.2
couple 14.8
person 14.6
ceremony 14.5
married 14.4
happiness 14.1
flowers 13.9
day 13.3
bouquet 12.7
religion 12.5
clothes 12.2
human 11.2
church 11.1
event 11.1
clothing 11.1
traditional 10.8
fashion 10.5
two 10.2
city 10
face 9.9
sculpture 9.9
adult 9.9
wed 9.8
bridal 9.7
portrait 9.7
women 9.5
men 9.4
architecture 9.4
holiday 9.3
tradition 9.2
male 9.2
marble 9
romantic 8.9
decoration 8.9
dishwasher 8.8
man 8.8
husband 8.7
engagement 8.7
elegance 8.4
attractive 8.4
technology 8.2
history 8
romance 8
looking 8
gown 7.9
art 7.9
cute 7.9
roses 7.8
old 7.7
one 7.5
cheerful 7.3
building 7.3
life 7.2
suit 7.2
glass 7.1

Microsoft
created on 2022-01-23

text 99.7
book 97.3
outdoor 92
clothing 64.3
drawing 62.4
person 59
sketch 57
old 52.1

Face analysis

Amazon

AWS Rekognition

Age 13-21
Gender Female, 71%
Calm 30.5%
Surprised 29.7%
Sad 29.2%
Happy 4.2%
Fear 2.1%
Angry 1.7%
Disgusted 1.5%
Confused 1%

AWS Rekognition

Age 39-47
Gender Male, 98.9%
Sad 51.5%
Confused 30.3%
Calm 7.3%
Happy 4.2%
Fear 3.6%
Surprised 1.7%
Disgusted 1%
Angry 0.5%

Feature analysis

Amazon

Person 98.4%
Hat 70.5%

Captions

Microsoft

an old photo of a person 74.1%
old photo of a person 71.4%
an old photo of a girl 50.4%

Text analysis

Amazon

22444.
RODVR-SVEELA