Human Generated Data

Title

Untitled (couple seated next to baby with toys)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5008

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99.2
Person 97.3
Chair 89.8
Furniture 89.8
Face 88.5
People 80.6
Clothing 80.5
Apparel 80.5
Room 78.9
Indoors 78.9
Chess 77.7
Game 77.7
Female 70.6
Portrait 69.1
Photography 69.1
Photo 69.1
Meal 68.6
Food 68.6
Girl 60.6
Bed 60.5
Table 60.2

Clarifai
created on 2023-10-26

people 99.6
child 99.1
group 98
man 96.6
family 96.6
woman 95.5
sit 94.9
son 94.7
adult 94.6
offspring 93.9
indoors 92.2
sibling 91.1
three 90.5
two 87.4
four 83.8
room 80.9
portrait 80.2
nostalgia 80
monochrome 79.4
chair 79.2

Imagga
created on 2022-01-22

person 35.9
people 27.9
adult 22.2
man 20.9
patient 17.8
male 17.1
portrait 16.8
negative 16.8
nurse 15.9
couple 14.8
men 14.6
science 14.2
medical 14.1
lady 13.8
film 13.6
professional 13.5
women 13.4
senior 13.1
sexy 12.8
fashion 12.8
face 12.8
mother 12.8
love 12.6
black 12.6
laboratory 12.5
happiness 12.5
happy 12.5
bride 12.5
art 12.4
instrument 12.1
sitting 12
human 12
world 11.9
health 11.8
dress 10.8
team 10.7
lab 10.7
chemistry 10.6
indoors 10.5
equipment 10.2
smile 10
photographic paper 9.8
costume 9.7
scientific 9.7
medicine 9.7
chemical 9.6
style 9.6
home 9.6
hairstyle 9.5
planner 9.5
doctor 9.4
model 9.3
coat 9.2
clothing 9.1
modern 9.1
silhouette 9.1
pretty 9.1
student 9.1
stylish 9
family 8.9
chemist 8.8
life 8.8
look 8.8
hair 8.7
smiling 8.7
test 8.7
work 8.6
party 8.6
research 8.6
salon 8.4
attractive 8.4
hospital 8.4
old 8.4
sick person 8.3
wedding 8.3
groom 8.2
technology 8.2
case 8.1
grandma 8.1
worker 8.1
kin 8
celebration 8
working 7.9
clinic 7.9
biotechnology 7.8
scientist 7.8
technician 7.8
education 7.8
ceremony 7.8
play 7.8
grunge 7.7
youth 7.7
biology 7.6
elegance 7.6
joy 7.5
make 7.3
music 7.2
child 7.1
posing 7.1
together 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.2
text 98.4
clothing 95.7
man 95.6
window 84.7
human face 84
posing 57.4
woman 53.6
smile 53.5

Color Analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 53.8%
Happy 84.7%
Calm 7.9%
Sad 2.6%
Surprised 2.3%
Disgusted 0.7%
Confused 0.7%
Angry 0.6%
Fear 0.4%

AWS Rekognition

Age 25-35
Gender Female, 100%
Happy 59.6%
Surprised 14.6%
Sad 10.5%
Disgusted 5.8%
Calm 4.1%
Angry 2%
Fear 1.9%
Confused 1.5%

AWS Rekognition

Age 6-16
Gender Female, 81.9%
Calm 84.3%
Surprised 10.4%
Sad 1.9%
Happy 1%
Confused 0.9%
Angry 0.5%
Fear 0.5%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Chess 77.7%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

ar
11424.
11424

Google

ar 1424- MAMTZA TIN24,
ar
1424-
MAMTZA
TIN24,