Human Generated Data

Title

Untitled (woman putting make-up on young performer)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7700

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.4
Person 98.5
Worker 97.8
Person 96.4
Hairdresser 90.9
Person 85.7
Female 83.2
People 77.1
Clothing 76
Apparel 76
Person 75.9
Photography 65.6
Photo 65.6
Woman 65.4
Face 63.8
Hair 63.7
Girl 62.6
Portrait 62.5
Shoe 59.1
Footwear 59.1
Kid 57
Child 57
Dress 56.6

Clarifai
created on 2023-10-26

people 99.8
child 99.5
group 99.2
group together 97.8
monochrome 97.6
three 97.3
man 95.9
family 95.7
sibling 95.4
four 95.1
adult 94.9
woman 94.9
offspring 94.5
music 94.4
son 94
several 93.7
two 93.3
musician 91.5
recreation 90.8
facial expression 84.3

Imagga
created on 2022-01-09

person 37.3
patient 35.4
people 32.3
salon 29.7
man 28.9
nurse 28.8
male 22.7
crutch 21.4
adult 20.3
medical 20.3
case 20.2
sick person 19.6
hospital 19.3
staff 18.5
health 18.1
senior 17.8
smiling 16.6
kin 16.4
family 16
couple 15.7
portrait 15.5
happy 15
women 14.2
love 14.2
home 13.5
clinic 13.4
illness 13.3
old 13.2
care 13.2
stick 12.4
professional 12.4
lifestyle 12.3
mature 12.1
men 12
two 11.8
room 11.8
dress 11.7
husband 11.4
doctor 11.3
human 11.2
sitting 11.2
happiness 11
equipment 10.7
smile 10.7
indoors 10.5
together 10.5
wife 10.4
work 10.2
face 9.9
mother 9.8
lady 9.7
retired 9.7
medicine 9.7
retirement 9.6
bride 9.6
married 9.6
hair 9.5
groom 9.2
wedding 9.2
pretty 9.1
wheelchair 9
worker 8.9
chair 8.7
elderly 8.6
help 8.4
looking 8
interior 8
mask 7.9
look 7.9
holiday 7.9
surgery 7.8
attractive 7.7
costume 7.6
hand 7.6
bouquet 7.5
fashion 7.5
traditional 7.5
outdoors 7.5
instrument 7.4
father 7.2
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.7
text 95.4
clothing 88.8
woman 78.7
footwear 72.7
dance 50.6
clothes 18.6

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 53%
Calm 99.6%
Sad 0.2%
Happy 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 63%
Calm 92.7%
Sad 6.5%
Confused 0.4%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 45-51
Gender Male, 82.2%
Calm 98.6%
Sad 0.5%
Angry 0.3%
Happy 0.2%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 59.1%

Text analysis

Amazon

28437.

Google

乙Eわ82
E
82