Human Generated Data

Title

Untitled (young boy watching one baby while woman feeds another baby)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7831

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 99.9
Person 99.1
Human 99.1
Person 99
Chair 83.3
Person 77.2
Cradle 71

Clarifai
created on 2023-10-25

people 99.7
woman 97.1
monochrome 96.9
child 96.7
adult 95.9
man 95.6
two 92.4
group 90.4
sit 86.9
indoors 86.7
family 82.1
wear 81.8
one 80.3
vehicle 80
chair 79.5
three 78.6
elderly 78.1
transportation system 77.1
recreation 76.9
fun 75

Imagga
created on 2022-01-09

senior 33.7
people 31.2
person 30.7
man 30.2
home 29.5
couple 27.9
male 27.8
happy 23.2
elderly 23
grandma 23
indoors 22.8
mature 22.3
adult 21.8
together 21
smiling 21
love 20.5
portrait 20
sitting 19.7
lifestyle 19.5
retirement 18.2
retired 17.4
old 17.4
family 16.9
older 16.5
married 16.3
mother 15.5
patient 15.1
room 14.9
camera 14.8
cheerful 14.6
indoor 14.6
happiness 14.1
casual 13.5
husband 13.4
day 13.3
women 12.6
pensioner 12.6
enjoying 12.3
clothing 12.2
smile 12.1
face 12.1
fun 12
outdoors 11.9
kin 11.7
men 11.2
aged 10.9
having 10.6
age 10.5
one 10.4
wife 10.4
looking 10.4
child 10.3
grandfather 10
60s 9.8
look 9.6
bride 9.6
case 9.3
sick person 9.3
leisure 9.1
dress 9
health 9
daughter 9
70s 8.8
interior 8.8
medical 8.8
1 8.7
attractive 8.4
parent 8.4
house 8.4
wedding 8.3
holding 8.2
care 8.2
alone 8.2
playing 8.2
relaxing 8.2
lady 8.1
active 8.1
gray hair 7.9
pension 7.9
seniors 7.9
half length 7.8
hospital 7.8
daylight 7.8
nurse 7.7
only 7.6
loving 7.6
two 7.6
laughing 7.6
fashion 7.5
horizontal 7.5
relaxed 7.5
glasses 7.4
laptop 7.3
computer 7.2
romance 7.1
table 7
chair 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.1
person 91.1
man 90.8
statue 88.9
black and white 86.4
indoor 86.1
clothing 81.2

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Female, 98.8%
Sad 67.5%
Calm 31%
Confused 0.6%
Fear 0.2%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 11-19
Gender Female, 84.3%
Sad 71.8%
Calm 20.5%
Happy 2.1%
Angry 1.8%
Confused 1.7%
Disgusted 0.8%
Surprised 0.8%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Text analysis

Amazon

39977
39977 A :
JIS
A :

Google

39977 A: 23 コA2-XATON
39977
A:
23
A2
-
XATON