Human Generated Data

Title

Untitled (young boy watching two babies in a small crib)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7830

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 100
Person 99.4
Human 99.4
Cradle 89.1
Crib 59.1
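
The label/confidence pairs above (scores on a 0-100 scale) are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch, assuming boto3 is installed with credentials configured and using "photo.jpg" as a placeholder for the image file:

# Sketch: retrieving label/confidence pairs like the Amazon tags above
# via AWS Rekognition DetectLabels. Assumes boto3 credentials are set up;
# "photo.jpg" is a placeholder filename, not the actual museum file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,        # cap the number of returned labels
    MinConfidence=50.0,  # drop low-confidence guesses
)

# Each label carries a name and a 0-100 confidence score,
# matching the "Furniture 100", "Person 99.4", ... listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')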

Clarifai
created on 2023-10-25

people 99.9
child 99.7
one 97.3
monochrome 95.8
portrait 94.1
wear 93
woman 93
baby 92.1
two 91.1
nostalgia 90.4
adult 89
art 88.3
actress 88.2
dress 88.2
group 88.2
leader 88
man 87.3
offspring 86.7
son 85.9
sit 85.3

Imagga
created on 2022-01-09

furniture 25.5
baby bed 22.5
dress 18.1
cradle 17.8
wedding 17.5
furnishing 17.1
person 17
love 16.6
bride 16.3
people 16.2
celebration 14.3
adult 14.3
portrait 14.2
child 14.2
face 12.8
male 12.8
home 12.8
smiling 12.3
man 12.1
decoration 11.8
happiness 11.7
married 11.5
holiday 11.5
cheerful 11.4
couple 11.3
happy 11.3
table 11
bouquet 10.9
traditional 10.8
dining table 10.8
art 10.5
senior 10.3
family 9.8
flowers 9.6
sculpture 9.5
marriage 9.5
sitting 9.4
clothing 9.3
smile 9.3
balcony 9.2
old 9
wed 8.8
life 8.8
women 8.7
grandfather 8.5
wife 8.5
travel 8.4
fun 8.2
outdoors 8.2
one 8.2
lady 8.1
interior 8
lifestyle 7.9
gown 7.9
indoors 7.9
veil 7.8
grandma 7.8
architecture 7.8
groom 7.7
chair 7.6
church 7.4
indoor 7.3
color 7.2
romance 7.1
to 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.4
wall 96.6
black and white 88
indoor 87.2
person 76.3
clothing 65.6
clothes 38.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 1-7
Gender Male, 52.2%
Sad 71.8%
Calm 26.3%
Confused 1.2%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 20-28
Gender Male, 85.8%
Calm 77.2%
Sad 13.1%
Angry 2.8%
Happy 2.7%
Disgusted 1.3%
Fear 1.2%
Confused 1.1%
Surprised 0.7%
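
The two face records above (age range, gender, and per-emotion confidences) follow the shape of Rekognition's DetectFaces response. A sketch under the same assumptions as before (boto3 configured, placeholder filename):

# Sketch: reading age, gender, and emotion estimates like the two face
# records above from AWS Rekognition DetectFaces. Assumes boto3
# credentials; "photo.jpg" is a placeholder filename.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 1, "High": 7}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 52.2}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')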

Feature analysis

Amazon

Person 99.4%

Categories

Captions

Microsoft
created on 2022-01-09

a man sitting on a bed 33.8%
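
The caption and score above correspond to Microsoft's Computer Vision "describe image" operation, which returns candidate captions with confidences in [0, 1]. A sketch assuming the older azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL below are placeholders:

# Sketch: requesting an image caption such as "a man sitting on a bed 33.8%"
# from Microsoft Computer Vision (Describe Image). The endpoint, key, and
# image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

description = client.describe_image("https://example.com/photo.jpg", max_candidates=3)

# Each candidate caption has a confidence in [0, 1]; printed here as a percentage.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")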

Text analysis

Amazon

399
399 80.
80.
JJS
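
The strings above ("399", "80.", "JJS", apparently edge markings on the negative) are line- and word-level detections of the kind returned by Rekognition's DetectText operation. A minimal sketch under the same boto3 assumptions, with "photo.jpg" again a placeholder:

# Sketch: reading line- and word-level text detections (such as "399",
# "80.", "JJS" above) from AWS Rekognition DetectText. Assumes boto3
# credentials; "photo.jpg" is a placeholder filename.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Results include both LINE and WORD detections; the listing above mixes the two.
for detection in response["TextDetections"]:
    print(f'{detection["Type"]}: {detection["DetectedText"]} '
          f'({detection["Confidence"]:.1f}%)')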

Google

YTコヨA°2-XAGOM 399 8 0. 丁丁S
YT
コヨ
A
°
2
-
XAGOM
399
8
0.
丁丁
S