Human Generated Data

Title

Untitled (make-up artist working on Mask & Wig performer)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7492

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99
Human 99
Person 98.4
Clothing 88.8
Apparel 88.8
Person 87.9
Person 84.4
Face 83.3
Furniture 74.5
Chair 73.8
Tank 71.6
Military 71.6
Transportation 71.6
Military Uniform 71.6
Armored 71.6
Vehicle 71.6
Army 71.6
Indoors 65.9
Meal 65.6
Food 65.6
People 63.9
Room 63.2
Screen 60.8
Electronics 60.8
Flooring 60.2
Finger 59.7
Table 59.6
Monitor 59.3
Display 59.3
Photography 57.4
Photo 57.4
Man 57.2
Female 55.6
Person 49.9
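
The label/confidence pairs above are the kind of output AWS Rekognition returns from its label-detection call. Below is a minimal sketch (not the museums' actual pipeline) using boto3; the file name is a placeholder, and the confidence scores are percentages, as in the list above.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # "steinmetz_7492.jpg" is a placeholder file name, not an actual asset path
    with open("steinmetz_7492.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # Rekognition confidences are 0-100, e.g. "Person 99"
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')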

Clarifai
created on 2023-10-25

people 99.8
adult 98.3
man 98
two 97.6
monochrome 95.3
woman 91.7
group 89.4
group together 85.8
three 83.6
music 79.7
military 79.6
furniture 79.6
actor 78
war 74.3
administration 73.7
sitting 73.7
wear 72.4
chair 70.6
art 69.8
piano 69.3
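
Concept lists like the Clarifai tags above can be requested through Clarifai's v2 predict endpoint. This is a hedged sketch only: the API key, image URL, and model id ("general-image-recognition") are placeholders, and the exact model behind this record is not documented here. Concept values come back on a 0-1 scale and are shown above as percentages.

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"        # placeholder credential
    MODEL_ID = "general-image-recognition"   # assumed general model, not confirmed

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    )
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))  # e.g. "people 99.8"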

Imagga
created on 2022-01-08

man 41.7
person 39
male 35.5
office 33.3
people 32.9
adult 28
business 27.9
computer 27.6
professional 25.4
laptop 25.3
sitting 24.9
happy 23.8
desk 22.9
work 22.8
smiling 21.7
working 20.3
businessman 20.3
lifestyle 20.2
disk jockey 19.7
technology 19.3
smile 19.2
corporate 18.9
men 18
table 17.2
job 15.9
indoors 15.8
room 15.8
broadcaster 15.7
businesswoman 15.4
cheerful 15.4
looking 15.2
handsome 15.2
musical instrument 15
monitor 14.6
modern 14
worker 14
portrait 13.6
casual 13.6
suit 13.5
executive 13.3
talking 13.3
percussion instrument 13.3
group 12.9
communicator 12.8
women 12.7
communication 12.6
mature 12.1
one 11.9
indoor 11.9
equipment 11.8
happiness 11.8
classroom 11.6
television 11.4
blackboard 11.3
education 11.3
pretty 11.2
attractive 11.2
home 11.2
phone 11.1
team 10.8
nurse 10.6
teacher 10.5
chair 10.5
success 10.5
businesspeople 10.4
meeting 10.4
teamwork 10.2
electronic equipment 10.2
horizontal 10
human 9.7
colleagues 9.7
class 9.6
30s 9.6
employee 9.6
student 9.5
day 9.4
notebook 9.4
coffee 9.3
device 9.2
house 9.2
successful 9.1
school 9
couple 8.7
standing 8.7
workplace 8.6
face 8.5
keyboard 8.4
senior 8.4
hand 8.4
color 8.3
20s 8.2
confident 8.2
together 7.9
boy 7.8
mid adult 7.7
career 7.6
sit 7.6
manager 7.4
camera 7.4
patient 7.4
occupation 7.3
alone 7.3
lady 7.3
board 7.2
black 7.2
to 7.1
interior 7.1
paper 7.1
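
The Imagga tags above correspond to the response of Imagga's /v2/tags endpoint. A minimal sketch follows; the API key, secret, and image URL are placeholders.

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "api_key", "api_secret"  # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},  # placeholder URL
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    )
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))  # e.g. "man 41.7"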

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

Color analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 67%
Surprised 90.1%
Happy 5.4%
Fear 2.2%
Calm 0.9%
Sad 0.4%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%
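
The age range, gender, and emotion scores above come from AWS Rekognition's face-attribute analysis. A minimal boto3 sketch is shown below; the file name is a placeholder, and Attributes=["ALL"] is what makes age, gender, and emotions appear in the response.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_7492.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include AgeRange, Gender, Emotions in FaceDetails
        )

    for face in response["FaceDetails"]:
        print("Age", face["AgeRange"]["Low"], "-", face["AgeRange"]["High"])
        print("Gender", face["Gender"]["Value"], f'{face["Gender"]["Confidence"]:.0f}%')
        for emotion in face["Emotions"]:  # e.g. Surprised 90.1%
            print(emotion["Type"].capitalize(), f'{emotion["Confidence"]:.1f}%')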

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
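
The two Google Vision blocks above report per-face likelihood ratings, most likely one block for each detected face. A minimal sketch with the google-cloud-vision client is shown below; it assumes Google Cloud credentials are configured, and the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_7492.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:  # one annotation per detected face
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)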

Feature analysis

Amazon

Person 99%
Tank 71.6%

Text analysis

Amazon

8638
8
MU7YT37A2
MU7YT37A2 A70A
8633
A70A
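
The strings above are raw OCR detections of the kind returned by AWS Rekognition's text-detection call. A minimal sketch follows; the file name is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_7492.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:  # both LINE and WORD entries are returned
        print(detection["Type"], detection["DetectedText"])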

Google

MIR
YT33A
8638 MIR YT33A 2 A73A
8638
2
A73A
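
The Google entries above are OCR output of the kind produced by Google Cloud Vision's text detection. A minimal sketch is shown below; credentials and the file name are placeholders.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_7492.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    if response.text_annotations:
        print(response.text_annotations[0].description)   # full detected text block
        for annotation in response.text_annotations[1:]:  # individual tokens
            print(annotation.description)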