Human Generated Data

Title

Untitled (woman applying make-up to girl in a rabbit costume)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7647

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 98.5
Footwear 98.5
Clothing 98.5
Apparel 98.5
Shoe 98.5
Person 98.4
Furniture 82.5
Chair 77.7
Art 73.2
Female 67.7
Girl 60.2
Sitting 59.6
Drawing 56.9
Play 55.4
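
The Amazon tags above are label-detection output, each paired with a confidence score. As a rough illustration only, here is a minimal sketch of how such labels could be retrieved with the AWS Rekognition detect_labels API via boto3; the image file name, minimum-confidence threshold, and AWS configuration are assumptions, not details documented in this record.

```python
# Minimal sketch (assumed setup): fetch label tags like the Amazon list above.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are already configured

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # guessed cutoff; the lowest label in this record is ~55%
    )

for label in response["Labels"]:
    # Each label has a name and a confidence score, e.g. "Person 99.5"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```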

Clarifai
created on 2023-10-25

people 99.9
group together 97.9
group 97.4
man 97
two 96.7
adult 96.5
woman 94.5
wear 94.4
several 93.5
recreation 90.8
art 88.8
many 88.3
child 88
three 87.8
four 87.6
music 85.2
one 84.1
street 83
monochrome 80.1
actor 76.8

Imagga
created on 2022-01-08

person 19.2
salon 18.9
man 17.5
art 14.5
hairdresser 13.9
adult 13.6
portrait 13.6
face 13.5
people 13.4
statue 13
lifestyle 13
device 13
dress 12.6
black 12
body 12
building 11.9
sculpture 11.8
sexy 11.2
hair 11.1
fashion 10.5
human 10.5
old 10.4
model 10.1
male 9.9
health 9.7
patient 9.7
women 9.5
head 9.2
relaxation 9.2
makeup 9.1
girls 9.1
lady 8.9
room 8.8
indoors 8.8
mask 8.5
smile 8.5
monument 8.4
house 8.4
city 8.3
retro 8.2
sensuality 8.2
style 8.2
clothing 8
interior 8
look 7.9
architecture 7.8
ancient 7.8
shower 7.8
men 7.7
culture 7.7
bride 7.7
skin 7.6
one 7.5
vintage 7.4
equipment 7.4
training 7.4
light 7.3
gorgeous 7.2
religion 7.2
history 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

drawing 92.2
sketch 88.8
text 88.7
person 87.6
cartoon 81.4
clothing 67.1
old 48.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 92.2%
Calm 97.9%
Angry 0.6%
Sad 0.6%
Confused 0.4%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 27-37
Gender Male, 68.6%
Calm 77.5%
Surprised 16.4%
Sad 1.8%
Disgusted 1.5%
Confused 1.2%
Happy 0.6%
Fear 0.5%
Angry 0.4%
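
The face-analysis entries above (age range, gender, and per-emotion confidences) match the shape of AWS Rekognition detect_faces output. A minimal sketch, assuming the same hypothetical local image file and configured AWS credentials:

```python
# Minimal sketch (assumed setup): face attributes like the entries above.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 20, "High": 28}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 92.2}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```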

Feature analysis

Amazon

Person 99.5%
Shoe 98.5%

Categories

Captions

Text analysis

Amazon

28439A
YT37A8-
YT37A8- YACOX
YACOX
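
The strings above are OCR hits, likely on edge markings of the negative. As a sketch only, comparable raw text detections could be obtained from AWS Rekognition's detect_text; the file name is again hypothetical.

```python
# Minimal sketch (assumed setup): raw text detections like "28439A" above.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Each detection is a LINE or WORD with the recognized string and a confidence.
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))
```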

Google

28439
A.
YT37A2- AGOX 28439 A.
YT37A2-
AGOX