Human Generated Data

Title

Untitled (actress applying make-up, Hedgerow Theater, PA)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12029

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.6
Furniture 86.3
Face 77.8
Clothing 75.4
Apparel 75.4
Shelf 65.3
Hair 61.2
Room 59
Indoors 59
Drawing 58.9
Art 58.9
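
The label/score pairs above match the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced with boto3 (the region and file name are assumptions; the museum's actual pipeline is not documented here):

```python
import boto3

# Assumed region and local file name for the photograph.
client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_12029.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print name/confidence pairs in the same "Label score" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```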

Clarifai
created on 2023-10-26

people 99.9
two 99
monochrome 98.3
child 97.5
adult 97.2
one 95.1
woman 94.9
portrait 91.4
man 91.1
administration 91.1
three 90.1
group 90
offspring 89.9
room 88
mirror 87.5
education 86.9
indoors 85.5
family 85.3
music 84.4
scientist 83.6
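
Concepts like these can be fetched from Clarifai's v2 predict REST endpoint. A hedged sketch using requests, assuming the general image-recognition model; the model ID, API key, and image URL are placeholders:

```python
import requests

# Placeholder credentials and image URL; substitute real values.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"
IMAGE_URL = "https://example.org/steinmetz_12029.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```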

Imagga
created on 2022-01-15

man 34.9
hand blower 32.3
device 30.3
dryer 27.8
people 26.2
blower 25.8
male 25.6
hairdresser 25.6
appliance 24.6
call 24
happy 23.8
home 22.3
work 21.2
smiling 21
person 20.2
adult 20.1
indoors 16.7
worker 16
working 15
durables 14.7
smile 14.2
office 14
job 13.3
child 13.2
inside 12.9
sitting 12.9
professional 12.8
senior 12.2
computer 12.2
business 12.1
lifestyle 11.6
to 11.5
face 11.4
fun 11.2
men 11.2
chair 11
machine 10.7
equipment 10.7
family 10.7
room 10.6
cheerful 10.6
portrait 10.4
mature 10.2
communication 10.1
house 10
laptop 10
interior 9.7
looking 9.6
repair 9.6
seat 9.6
happiness 9.4
cute 9.3
health 9
human 9
labor 8.8
casual 8.5
phone 8.3
painter 8.1
handsome 8
hospital 8
kid 8
mother 7.9
businessman 7.9
standing 7.8
education 7.8
project 7.7
expression 7.7
vehicle 7.5
car 7.5
holding 7.4
technology 7.4
support 7.4
training 7.4
occupation 7.3
black 7.2
women 7.1
television 7.1
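
Imagga exposes tagging through a REST endpoint with HTTP Basic auth. A sketch assuming the /v2/tags endpoint; credentials and image URL are placeholders:

```python
import requests

# Placeholder credentials and image URL.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_12029.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags arrive sorted by confidence, matching the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```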

Google
created on 2022-01-15

(no tags recorded)

Microsoft
created on 2022-01-15

person 97.8
indoor 97.7
wall 95.3
text 93.5
human face 86.5
clothing 80.5
black and white 74.4
computer 65.3
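
The Microsoft tags match the shape of Azure Computer Vision's tag feature. A sketch against the v3.2 REST analyze endpoint; the endpoint host, key, and image URL are placeholders:

```python
import requests

# Placeholder endpoint, key, and image URL.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/steinmetz_12029.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Azure reports confidence on a 0-1 scale; rescale to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```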

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 96.9%
Calm 98.7%
Sad 0.4%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%
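
The age range, gender estimate, and emotion distribution above correspond to the face attributes returned by Rekognition's DetectFaces API. A boto3 sketch (region and file name are assumptions):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_12029.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Each emotion is scored separately; together they sum to roughly 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```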

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
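
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these rows read "Very unlikely" instead of scores. A sketch with the google-cloud-vision client (file name assumed):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_12029.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```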

Feature analysis

Amazon

Person 99.2%

Categories

(none recorded)

Text analysis

Amazon

RAD
RAD YE3A
YE3A
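
These strings match the output of Rekognition's DetectText API, which returns whole LINE entries alongside their component WORD entries; that is why "RAD YE3A" appears together with "RAD" and "YE3A". A boto3 sketch (region and file name are assumptions):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_12029.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Type is either LINE (a full line) or WORD (a token within a line).
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])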

Google

DELEMDEB 2VEEIA BV2E
DELEMDEB
2VEEIA
BV2E
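
The Google strings would come from Cloud Vision's text detection, using the same client as the face example above (file name assumed):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_12029.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```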