Human Generated Data

Title

Untitled (two women and child on counter with toys)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4517

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99
Person 99
Person 98.6
Person 98.5
Apparel 96.3
Clothing 96.3
Person 84.4
Suit 75.3
Overcoat 75.3
Coat 75.3
People 68.3
Gown 63.3
Fashion 63.3
Footwear 61.5
Shoe 61.5
Photo 60.2
Photography 60.2
Female 59.2
Robe 59.1
Worker 57.7
Face 57.1

Imagga
created on 2022-01-23

man 36.3
male 33.3
work 30.2
sax 28.2
person 27.3
people 26.8
worker 23.6
men 23.2
job 21.2
adult 20.6
professional 20.4
business 20
construction 18
businessman 17.6
shop 16.3
wind instrument 16.1
medical 15.9
engineer 15.7
architect 15.4
looking 15.2
musical instrument 15
equipment 14.9
occupation 14.7
industry 14.5
laboratory 14.5
newspaper 14.3
working 13.3
doctor 13.2
builder 13
human 12.7
lab 12.6
technology 12.6
building 12.5
patient 12.1
product 11.7
portrait 11.6
office 11.4
research 11.4
plan 11.3
brass 11.1
architecture 10.9
mercantile establishment 10.9
chemistry 10.6
group 10.5
engineering 10.5
development 10.5
senior 10.3
room 10.2
team 9.9
chemical 9.8
old 9.7
medicine 9.7
success 9.7
test 9.6
project 9.6
case 9.6
chart 9.6
serious 9.5
education 9.5
pencil 9.5
writing 9.5
instrument 9.3
safety 9.2
barbershop 9.1
clinic 9.1
industrial 9.1
student 9.1
science 8.9
scientist 8.8
smiling 8.7
customer 8.6
uniform 8.5
biology 8.5
site 8.4
design 8.4
creation 8.4
manager 8.4
health 8.3
holding 8.2
coat 8
designing 7.9
black 7.8
table 7.8
assistant 7.8
scientific 7.7
designer 7.7
sitting 7.7
helmet 7.7
architectural 7.7
profession 7.6
casual 7.6
percussion instrument 7.5
happy 7.5
new 7.3
lifestyle 7.2
place of business 7.2
drawing 7.2
smile 7.1
paper 7.1
foreman 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98
text 97.1
clothing 92.8
drawing 92.7
man 81.9
black and white 76.3
sketch 72.4
people 67.3
human face 63.9
old 47.6

Face analysis

Amazon

Google

AWS Rekognition

Age 18-26
Gender Female, 72.8%
Surprised 56%
Calm 42.2%
Fear 0.7%
Sad 0.4%
Angry 0.2%
Disgusted 0.2%
Happy 0.1%
Confused 0.1%

AWS Rekognition

Age 36-44
Gender Female, 99.3%
Calm 85.3%
Sad 13.4%
Fear 0.3%
Surprised 0.3%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Shoe 61.5%

Captions

Microsoft

a group of people standing around each other 80.4%
a group of people standing in a room 80.3%
a group of people standing next to a man 79.2%

Text analysis

Amazon

Wartime
this
New
GER
in
21439,
in Wartime this New Way
21439.
Way
star
-
YOURLES -
YOURLES

Google

Wartime
tal *AGON-YT3RA2- MAMT2A sin Wartime this New Way
New
Way
tal
sin
*AGON-YT3RA2-
MAMT2A
this