Human Generated Data

Title

Untitled (woman brushing her hair and woman drinking in bed on circus train)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7089

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 97.8
Human 97.8
Clothing 91.6
Apparel 91.6
Person 90.5
People 65.3
Face 62.3
Leisure Activities 62.3
Dance 56.2
Photography 56.2
Photo 56.2
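
The Amazon tags above look like the label-and-confidence output of AWS Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be regenerated with boto3; the file name and region are placeholders, not values taken from this record.

    # Minimal sketch: reproduce Rekognition-style labels for a local image file.
    # The file name "steinmetz_circus_train.jpg" and the AWS region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_circus_train.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Each label carries a name and a confidence score, matching the
    # "Person 97.8", "Clothing 91.6", ... entries listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")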

Imagga
created on 2022-02-05

person 30.5
adult 25.7
people 24.5
man 22.8
male 19.8
musical instrument 18.3
lifestyle 17.3
senior 15
portrait 14.9
device 14.3
hair 14.3
computer 13.8
face 13.5
room 13.4
mature 13
patient 12.8
wind instrument 12.8
black 12.6
bowed stringed instrument 12.6
stringed instrument 12.6
fashion 11.3
happy 11.3
human 11.2
looking 11.2
attractive 11.2
body 11.2
women 11.1
casual 11
hand 10.6
indoors 10.5
one 10.4
sexy 10.4
health 10.4
business 10.3
men 10.3
violin 10.2
professional 10.2
communication 10.1
smile 10
working 9.7
technology 9.6
sitting 9.4
model 9.3
phone 9.2
studio 9.1
aged 9
case 9
laptop 8.8
happiness 8.6
elderly 8.6
brass 8.5
modern 8.4
indoor 8.2
sick person 8.2
office 8.2
equipment 8.1
businessman 7.9
network 7.9
old 7.7
teacher 7.6
club 7.5
care 7.4
back 7.3
sensuality 7.3
music 7.2
smiling 7.2
chair 7.1
work 7.1
monitor 7.1
microphone 7
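
The Imagga tags follow the same name-plus-score pattern. The sketch below queries what is assumed to be Imagga's v2 tagging endpoint using the requests library; the endpoint path, response shape, credentials, and image URL are all assumptions to check against Imagga's current documentation.

    # Rough sketch of querying Imagga's image-tagging API (assumed v2 endpoint).
    # The API key/secret and the image URL are placeholders.
    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder credential
    IMAGGA_SECRET = "your_api_secret"  # placeholder credential
    image_url = "https://example.org/steinmetz_circus_train.jpg"  # placeholder URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",  # assumed endpoint path
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    # Assumed response shape:
    # {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}, ...]}}
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")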

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 99.6
text 96.5
black and white 77.4
concert 60.3
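
The Microsoft tags are consistent with Azure Computer Vision's image-analysis output. A rough sketch against the v3.2 Analyze Image REST endpoint follows; the resource endpoint, subscription key, image URL, and API version are placeholders or assumptions.

    # Rough sketch: image tagging via the Azure Computer Vision Analyze Image REST API.
    # The endpoint, key, image URL, and API version are placeholders/assumptions.
    import requests

    AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    AZURE_KEY = "your_subscription_key"                                     # placeholder
    image_url = "https://example.org/steinmetz_circus_train.jpg"            # placeholder

    resp = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
    )
    resp.raise_for_status()

    # Tag confidences come back in the 0-1 range; scale to percentages to
    # match the "person 99.6", "text 96.5", ... figures above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")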

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 97.1%
Calm 64%
Surprised 15.6%
Happy 8.5%
Sad 5.3%
Confused 2.7%
Disgusted 1.3%
Angry 1.3%
Fear 1.2%

AWS Rekognition

Age 23-33
Gender Male, 75.5%
Calm 88.2%
Sad 8.6%
Happy 1.6%
Surprised 0.4%
Confused 0.4%
Disgusted 0.4%
Angry 0.3%
Fear 0.1%
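
The two face records above (age range, gender, per-emotion scores) have the structure returned by Rekognition's DetectFaces call when all attributes are requested. A minimal sketch, again with a placeholder file name and region:

    # Minimal sketch: per-face age range, gender, and emotion scores via Rekognition.
    # The file name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_circus_train.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions, not just boxes
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are scored individually; sort to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")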

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
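
The Google Vision rows report likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client; the file name is a placeholder.

    # Minimal sketch: Google Cloud Vision face detection, reporting likelihood buckets
    # (VERY_UNLIKELY ... VERY_LIKELY) as in the rows above. The file name is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_circus_train.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)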

Feature analysis

Amazon

Person 97.8%

Text analysis

Amazon

16218
MJI7
MJI7 ЭТАЯТIИ ARAA
ЭТАЯТIИ
ARAA
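
The Amazon text detections above (including the mirrored film-edge markings read as "ЭТАЯТIИ") have the shape of output produced by Rekognition's DetectText call. A minimal sketch with a placeholder file name:

    # Minimal sketch: OCR of film-edge markings via Rekognition text detection.
    # The file name is a placeholder for the photograph above.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_circus_train.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Rekognition returns both LINE and WORD detections; the list above mixes the two.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")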

Google

MJI7
16218.
AR
3TARTIN
16218. 16218. MJI7 3TARTIN AR