Human Generated Data

Title

Untitled (baby seated on chair with phone to ear)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5253

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Furniture 100
Chair 100
Couch 97.5
Person 95.7
Human 95.7
Person 93.9
Armchair 91.3

Imagga
created on 2022-02-05

sofa 35.5
adult 34.7
couch 31.9
home 28.7
person 28.4
armchair 25.9
happy 25.7
sitting 24.9
people 24.5
attractive 22.4
smile 20.7
lifestyle 20.2
man 20.2
house 19.2
male 18.5
leisure 18.3
smiling 18.1
room 17.4
cheerful 17.1
seat 17
portrait 16.8
interior 16.8
looking 16.8
pretty 16.8
furniture 16.8
fashion 16.6
happiness 16.5
indoor 16.4
one 16.4
chair 16.1
laptop 15.8
couple 15.7
model 15.6
casual 15.2
indoors 14.9
body 14.4
hair 14.3
lady 13.8
relaxing 13.6
relax 13.5
fun 13.5
love 13.4
business 13.4
resting 13.3
sexy 12.8
work 12.7
women 12.7
living 12.3
boy 12.2
human 12
child 11.9
suit 11.8
television 11.7
professional 11.7
jeans 11.5
office 11.4
computer 11.3
enjoy 11.3
joy 10.9
comfort 10.6
technology 10.4
convertible 10.4
studio couch 10.1
cute 10
cradle 9.8
handsome 9.8
comfortable 9.5
husband 9.5
sit 9.5
lying 9.4
youth 9.4
rest 9.3
20s 9.2
family 8.9
worker 8.9
device 8.7
luxury 8.6
elegant 8.6
wife 8.5
erotic 8.5
modern 8.4
floor 8.4
dress 8.1
baby bed 8.1
blond 8.1
kid 8
job 8
working 8
businessman 7.9
together 7.9
expression 7.7
notebook 7.6
relaxation 7.5
hot 7.5
passion 7.5
dark 7.5
executive 7.5
style 7.4
inside 7.4
black 7.3
alone 7.3
sensuality 7.3
romance 7.1
posing 7.1
face 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 97.4
indoor 94.4
furniture 90.4
person 89.5
black and white 84.7
clothing 66.5
chair 64.7

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 96%
Happy 87.3%
Calm 6.3%
Sad 3.3%
Confused 0.9%
Surprised 0.8%
Disgusted 0.5%
Angry 0.5%
Fear 0.4%

Feature analysis

Amazon

Couch 97.5%
Person 95.7%

Captions

Microsoft

a group of people sitting at a table 83.1%
a group of people sitting around a table 83%
a group of people sitting on a table 76.6%

Text analysis

Amazon

MJIR
ATDA
MJIR ЭТАЯТІЙ ATDA
ЭТАЯТІЙ

Google

TARTIA ATDA
ATDA
TARTIA