Human Generated Data

Title

Untitled (three men playing cards on cardboard box aboard train)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5218

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.7
Person 99.2
Person 97.7
Person 95.4
Hat 95.2
Clothing 95.2
Apparel 95.2
Clinic 95
Person 91
Operating Theatre 66.9
Hospital 66.9
Outdoors 65.6
Nature 62.4
Portrait 59.4
Photography 59.4
Face 59.4
Photo 59.4
Doctor 58.4
Sailor Suit 55.6
Hat 50.3

Clarifai
created on 2023-10-26

people 99.5
group 97.8
veil 97.6
man 97
monochrome 95.9
adult 95.2
group together 92.6
indoors 87
woman 85
vehicle 81.8
many 77.5
lid 75.9
technology 75.2
education 74.9
interaction 74.4
science 72
wear 69.7
transportation system 68.5
three 66.5
actor 64.9

Imagga
created on 2022-01-23

film 62
photographic paper 46.4
x-ray film 43.4
negative 34.5
photographic equipment 31
technology 27.4
equipment 27
medical 26.5
medicine 25.5
professional 25.4
people 24.5
person 22.8
work 22
doctor 20.7
business 20
man 19.5
computer 19.3
science 18.7
laboratory 18.3
hospital 18.1
working 17.7
adult 17.5
monitor 17
television 16.9
lab 15.5
health 15.3
worker 15.1
male 14.9
hand 14.4
instrument 14.3
human 14.2
care 14
screen 13.5
office 13
student 12.7
chemical 12.6
modern 12.6
scientific 12.6
test 12.5
job 12.4
research 12.4
biology 12.3
digital 12.1
looking 12
occupation 11.9
coat 11
scientist 10.8
chemistry 10.6
businessman 10.6
laptop 10.3
men 10.3
women 10.3
telecommunication system 10
display 9.6
nurse 9.5
newspaper 9.1
black 9
team 9
biotechnology 8.8
glass 8.7
symbol 8.7
electronic equipment 8.7
touch 8.7
profession 8.6
illness 8.6
development 8.6
network 8.5
portrait 8.4
communication 8.4
future 8.4
data 8.2
happy 8.1
button 7.9
clinic 7.9
microbiology 7.9
chemist 7.9
face 7.8
stethoscope 7.7
window 7.6
uniform 7.6
keyboard 7.5
light 7.3
patient 7.3
smile 7.1
information 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
clothing 69.2
player 67.1
person 57.5
old 44.6

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Calm 59.1%
Happy 13.1%
Sad 9%
Confused 8.4%
Angry 4.1%
Disgusted 3%
Surprised 2.4%
Fear 0.8%

AWS Rekognition

Age 30-40
Gender Female, 68.9%
Calm 87.7%
Sad 4.8%
Happy 3.3%
Confused 1.3%
Angry 1.1%
Surprised 0.9%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Female, 52.3%
Calm 97.2%
Sad 2.6%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Female, 58.5%
Calm 51.7%
Sad 32.1%
Angry 13.1%
Confused 0.8%
Happy 0.8%
Surprised 0.7%
Disgusted 0.6%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 95.2%

Text analysis

Amazon

16276
15276
LITW
16276. LITW
16276.
is

Google

K276 16271
K276
16271