Human Generated Data

Title

Untitled (woman applying make-up to young clown)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7708

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.4
Human 98.4
Person 97.7
Clothing 96.9
Apparel 96.9
Chair 93
Furniture 93
Female 91.6
Indoors 88.9
Face 88.8
Woman 77.8
Dress 75.5
Bedroom 75.3
Room 75.3
Tie 73.7
Accessories 73.7
Accessory 73.7
Table 72.6
Girl 70.5
Building 70.2
Housing 70.2
Bed 66.9
Pants 66.4
Photography 66.3
Photo 66.3
Portrait 66.1
People 65.3
Kid 64.5
Child 64.5
Hug 64.2
Man 63.1
Skin 61.5
Plant 61.2
Shorts 57.2
Play 57.1
Baby 56.3
Outdoors 56
Person 44.1

Imagga
created on 2022-01-09

person 31.2
man 30.2
people 26.2
adult 21.1
male 20
nurse 18.8
patient 18.5
men 18
hospital 15.1
medical 15
brass 14.5
professional 14.4
home 14.4
room 13.8
mask 13.7
portrait 13.6
lifestyle 13
work 12.6
worker 12.5
wind instrument 12.3
face 12.1
human 12
health 11.8
medicine 11.4
indoors 11.4
black 11.4
doctor 11.3
dress 10.8
uniform 9.9
interior 9.7
style 9.6
couple 9.6
musical instrument 9.6
device 9.5
surgeon 9.5
life 9.2
occupation 9.2
equipment 9.2
care 9.1
team 9
instrument 8.7
women 8.7
happiness 8.6
art 8.5
old 8.4
photographer 8.3
fashion 8.3
vintage 8.3
sport 8.2
music 8.2
technology 8.2
sick person 8
smiling 8
look 7.9
case 7.9
smile 7.8
play 7.8
modern 7.7
guy 7.6
illness 7.6
happy 7.5
senior 7.5
fun 7.5
traditional 7.5
leisure 7.5
newspaper 7.4
retro 7.4
new 7.3
clothing 7.2
family 7.1
to 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.1
person 97.6
wedding dress 84.7
drawing 83.2
newspaper 82.6
sketch 82
clothing 76.4
bride 74.7
woman 70.1
black and white 66.3
human face 59.9

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 79.9%
Sad 49.6%
Calm 47.4%
Surprised 0.8%
Fear 0.8%
Confused 0.5%
Disgusted 0.4%
Happy 0.3%
Angry 0.3%

Feature analysis

Amazon

Person 98.4%
Tie 73.7%

Captions

Microsoft

a group of people standing in a newspaper 38.8%
a group of people standing on a newspaper 34.8%
a group of people standing next to a newspaper 34.7%

Text analysis

Amazon

28436.
YTERA?
YTERA? TATION
TATION