Human Generated Data

Title

Untitled (women purchasing scarves)

Date

c.1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4516

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Clothing 98.5
Apparel 98.5
Person 97.3
Person 93.4
Clinic 81
Hat 79.2
Photography 64.6
Face 64.6
Portrait 64.6
Photo 64.6
Shoe 58.9
Footwear 58.9
Furniture 57.6
Chair 57.6
Doctor 55

Imagga
created on 2022-01-23

coat 45.4
man 44.4
person 40.5
lab coat 38.8
surgeon 38.1
medical 38
doctor 34.8
professional 34
people 31.8
male 30.5
laboratory 28
work 27.5
medicine 26.4
health 26.4
lab 26.2
hospital 24.7
scientist 23.5
clinic 23.5
worker 23.2
working 23
chemistry 22.2
adult 21.7
student 21.2
research 20.9
science 20.5
biology 19.9
patient 19.8
team 19.7
chemical 19.5
office 18.5
test 18.3
education 18.2
scientific 17.4
indoors 16.7
assistant 16.5
occupation 16.5
technology 16.3
technician 15.7
nurse 15.6
equipment 15.5
men 15.5
development 15.2
job 15
researcher 14.8
chemist 14.8
smiling 14.5
human 14.3
businessman 14.1
garment 13.6
business 13.4
care 13.2
happy 13.2
senior 13.1
portrait 12.9
microscope 12.8
looking 12.8
negative 12.7
exam 12.5
instrument 12.4
profession 12.4
study 12.1
teamwork 12.1
sitting 12
biotechnology 11.8
specialist 11.3
film 11
room 10.9
experiment 10.7
illness 10.5
uniform 10.5
computer 10.4
desk 10.4
glasses 10.2
microbiology 9.9
clothing 9.7
concentration 9.7
home 9.6
face 9.2
painter 9
color 8.9
biochemistry 8.9
optical 8.7
table 8.7
confident 8.2
observation 7.9
surgery 7.8
analysis 7.8
expertise 7.8
sick 7.7
photographic paper 7.7
tube 7.7
notes 7.7
elderly 7.7
serious 7.6
indoor 7.3
laptop 7.3
lifestyle 7.2
school 7.2
bright 7.2
paper 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.8
drawing 90.8
person 87.1
sketch 87.1
clothing 81.1
human face 66.6
posing 58.7
clothes 16

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 83.7%
Sad 90%
Calm 6.6%
Angry 0.9%
Confused 0.8%
Happy 0.6%
Disgusted 0.4%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 30-40
Gender Female, 93.8%
Happy 35.3%
Calm 27.9%
Sad 25.6%
Surprised 4.2%
Fear 3.9%
Angry 1.2%
Confused 1.2%
Disgusted 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Hat 79.2%

Captions

Microsoft

a group of people posing for a photo 78.6%
a group of people posing for the camera 78.5%
a group of people posing for a picture 78.4%

Text analysis

Amazon

ANCE
21437.
FRANCE
2437
REVEN
send

Google

FRAN
2437
ANCE
3RA2-WAMT2A
3RA2-WAMT2A 2437 ANCE FRAN 21437.
21437.