Human Generated Data

Title

Untitled (costumed women applying make-up in front of a mirror)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5633

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence %)

Amazon
created on 2021-12-15

Person 99.1
Human 99.1
Person 97.6
Person 96.9
Crowd 94
Person 92.4
Person 92.4
Person 92.2
Chair 91.8
Furniture 91.8
Person 90.9
Clothing 89.2
Apparel 89.2
Person 88.5
Person 83.3
People 80
Festival 73.7
Funeral 73.1
Person 72.4
Person 45.3
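
These label/confidence pairs have the shape of the output of AWS Rekognition's DetectLabels operation. A minimal sketch of reproducing such a list with boto3, assuming a local copy of the photograph (the file name is hypothetical):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.5633.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,      # the list above has 20 entries
        MinConfidence=40,  # low floor; the weakest tag above is Person 45.3
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

The repeated Person rows above most likely come from the per-instance detections in each label's Instances field, one per person found in the frame.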

Clarifai
created on 2023-10-15

people 99.7
many 98
group 94.8
vehicle 94.5
group together 94.3
man 94.2
music 93.2
crowd 93.1
adult 92.7
transportation system 92
sitting 91.9
monochrome 91
parade 89.9
woman 88.8
child 85.1
war 84.2
administration 83.3
musician 83.1
military 82.9
party 82.7
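
These tags resemble the concepts returned by Clarifai's general image recognition model. A sketch against Clarifai's v2 REST API, assuming a placeholder access token and a hypothetical public image URL; the exact endpoint and model ID can vary by account setup:

import requests

CLARIFAI_PAT = "YOUR_PAT"                        # placeholder credential
IMAGE_URL = "https://example.org/steinmetz.jpg"  # hypothetical image URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts in 0-1; the list above shows them scaled to percent.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")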

Imagga
created on 2021-12-15

salon 36.5
person 33.7
man 30.2
male 25.5
people 25.1
patient 21.9
adult 20.2
business 17
shop 16.9
blackboard 15.4
office 15.2
medical 15
professional 14.6
men 14.6
work 14.4
businessman 14.1
room 13.9
hospital 13.3
working 13.2
medicine 13.2
case 12.9
doctor 12.2
computer 12
mercantile establishment 12
technology 11.9
health 11.8
equipment 11.6
smiling 11.6
smile 11.4
education 11.2
human 11.2
modern 11.2
chair 11.2
clinic 10.7
job 10.6
iron lung 10.5
indoors 10.5
illness 10.5
black 10.2
sick person 10.1
occupation 10.1
music 10
group 9.7
instrument 9.5
worker 9.2
student 9.2
portrait 9
team 9
happy 8.8
women 8.7
corporate 8.6
senior 8.4
respirator 8.4
barbershop 8.4
table 8.3
care 8.2
desk 8.2
place of business 8.1
science 8
looking 8
lifestyle 7.9
play 7.8
sitting 7.7
musical 7.7
casual 7.6
studio 7.6
hand 7.6
device 7.6
executive 7.5
meeting 7.5
nurse 7.4
alone 7.3
indoor 7.3
laptop 7.3
restaurant 7.2
surgeon 7.1
musician 7.1
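
Imagga's tagging endpoint returns the same shape of data: a ranked tag list with confidences already on a 0-100 scale. A sketch assuming Imagga's v2 REST API, with placeholder credentials and a hypothetical image URL:

import requests

IMAGGA_KEY = "YOUR_API_KEY"                      # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"                # placeholder
IMAGE_URL = "https://example.org/steinmetz.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with API key/secret
    timeout=30,
)
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")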

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.5
person 92.7
black and white 80.2
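
Microsoft's tags match what the Azure Computer Vision Analyze Image operation returns when asked for the Tags feature. A sketch against the v3.2 REST endpoint, assuming a placeholder resource endpoint and subscription key; Azure reports confidences in 0-1, shown above as percentages:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/steinmetz.jpg"                 # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")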

Face analysis

Amazon

AWS Rekognition (face 1)

Age 25-39
Gender Male, 76.7%
Calm 66%
Sad 17.5%
Confused 5.9%
Surprised 3.3%
Angry 3.3%
Happy 2%
Disgusted 1.2%
Fear 0.9%

AWS Rekognition (face 2)

Age 25-39
Gender Female, 88.5%
Sad 45.3%
Calm 33.6%
Happy 10.4%
Confused 3.5%
Fear 3.2%
Angry 2.6%
Disgusted 0.9%
Surprised 0.5%

AWS Rekognition (face 3)

Age 14-26
Gender Male, 75.8%
Calm 73.6%
Sad 19.8%
Angry 2.2%
Happy 1.8%
Fear 1.6%
Confused 0.6%
Surprised 0.3%
Disgusted 0.1%

AWS Rekognition (face 4)

Age 24-38
Gender Female, 90.7%
Calm 79.9%
Sad 9.6%
Happy 7.8%
Surprised 1.4%
Confused 0.6%
Angry 0.5%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition (face 5)

Age 25-39
Gender Female, 86.5%
Calm 81.9%
Sad 11.5%
Angry 2%
Happy 1.5%
Confused 1%
Disgusted 1%
Surprised 0.7%
Fear 0.3%

AWS Rekognition (face 6)

Age 22-34
Gender Male, 97%
Calm 50.4%
Sad 48.3%
Confused 0.5%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition (face 7)

Age 22-34
Gender Male, 81.2%
Calm 89.5%
Surprised 3.4%
Confused 2.8%
Sad 1.6%
Happy 1.2%
Angry 1%
Fear 0.5%
Disgusted 0.1%
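
Each numbered block above corresponds to one face returned by Rekognition's DetectFaces operation, which only reports age range, gender, and emotions when all attributes are requested. A minimal sketch with boto3, reusing the hypothetical image file from the labeling example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.5633.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # AgeRange, Gender, and Emotions require ALL
    )

for face in response["FaceDetails"]:
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Worth noting: these are model guesses about appearance, not ground truth. Several of the costumed women in the photograph are classified Male with high confidence.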

Feature analysis

Amazon

Person 99.1%

Text analysis

Amazon

13875
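
The number above is what Rekognition's DetectText operation read in the image. A sketch, again assuming the hypothetical local file:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5633.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text piecewise
        print(detection["DetectedText"])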

Google

13875. 3875.
13875.
3875.
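
Google's list follows Cloud Vision's text-detection convention: the first annotation is the full detected text and the remaining annotations are its individual segments, which is why "13875. 3875." is followed by "13875." and "3875." on their own lines. A sketch with the google-cloud-vision client, assuming a hypothetical public image URL:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/steinmetz.jpg")  # hypothetical
)
response = client.text_detection(image=image)

for annotation in response.text_annotations:
    # First entry: the full text block; later entries: individual segments.
    print(annotation.description)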