Human Generated Data

Title

Untitled (woman reading to children in classroom)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7697

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Classroom 99.8
School 99.8
Room 99.8
Indoors 99.8
Furniture 99.7
Chair 99.7
Person 99.6
Human 99.6
Person 99.1
Person 98.5
Person 98.1
Person 97.6
Person 97.5
Person 96.5
Person 91.8
Person 88.1
Interior Design 85.4
Person 81.6
Person 77.7
Person 74.2
People 71.3
Person 68.4
Workshop 58.7
Shoe 58.7
Clothing 58.7
Apparel 58.7
Footwear 58.7
Kindergarten 57.5
Person 49.7
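
The Amazon tags above are consistent with output from the AWS Rekognition label-detection endpoint. A minimal sketch of how such a list could be reproduced with boto3, assuming configured AWS credentials and a hypothetical local copy of the image (the filename is a placeholder, not part of the record):

    import boto3

    # Rekognition client; assumes AWS credentials are configured locally.
    client = boto3.client("rekognition")

    # Hypothetical filename standing in for the museum image.
    with open("steinmetz_classroom.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels down to roughly the lowest confidence shown above (49.7).
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=45,
    )

    # Print "Name Confidence" pairs in the same format as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')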

Clarifai
created on 2023-10-25

people 99.9
group 98.9
many 98.6
adult 98.4
man 97.4
group together 96.4
administration 96.3
woman 95.7
chair 95.3
sit 95
education 94.7
leader 94.1
child 92.9
sitting 92.1
furniture 91.7
monochrome 91.4
teacher 89.8
school 88.9
room 88.6
war 88.5

Imagga
created on 2022-01-09

people 36.8
person 36.8
classroom 36.2
man 33.6
room 33.4
senior 31.8
blackboard 26.2
male 24.1
old 23
school 22
teacher 21.9
adult 21.8
elderly 21
retirement 20.2
nurse 19.6
retired 19.4
sitting 18.9
mature 18.6
home 18.3
student 17.7
portrait 16.8
smiling 16.6
happy 16.3
men 15.4
looking 15.2
work 14.5
class 14.5
indoors 14
table 14
education 13.8
computer 13.7
group 13.7
together 13.1
lifestyle 13
indoor 12.8
business 12.7
laptop 12.6
holding 12.4
chair 12.2
couple 12.2
office 11.9
to 11.5
casual 11
70s 10.8
smile 10.7
hand 10.6
working 10.6
human 10.5
hands 10.4
health 10.4
happiness 10.2
pensioner 10.1
hospital 9.9
teaching 9.7
medical 9.7
building 9.7
older 9.7
studying 9.6
day 9.4
camera 9.2
care 9
job 8.8
lesson 8.8
patient 8.2
child 8.2
board 8.1
life 8.1
family 8
hair 7.9
60s 7.8
desk 7.7
professional 7.6
relaxed 7.5
percussion instrument 7.4
occupation 7.3
cheerful 7.3
businesswoman 7.3
aged 7.2
color 7.2
worker 7.2
holiday 7.2
women 7.1
kid 7.1
musical instrument 7.1
modern 7

Google
created on 2022-01-09

Photograph 94.2
Black 89.6
Black-and-white 86.1
Style 84
Adaptation 79.3
Font 77.5
Monochrome 76.7
Monochrome photography 76
Snapshot 74.3
Chair 69.5
Event 69.5
Room 68.4
Vintage clothing 68.3
Art 64.8
Photo caption 64.5
History 64.1
Stock photography 63.6
Child 60.6
Class 59.6
Suit 58.3

Microsoft
created on 2022-01-09

text 97.9
person 96.8
black and white 63.8
clothing 63.5

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Female, 95%
Calm 99.5%
Sad 0.3%
Surprised 0.1%
Happy 0.1%
Angry 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 23-33
Gender Male, 99.1%
Calm 97.9%
Sad 1.3%
Confused 0.4%
Happy 0.2%
Disgusted 0.1%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 11-19
Gender Female, 99.5%
Calm 50.5%
Sad 48.7%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 94.4%
Calm 100%
Sad 0%
Angry 0%
Confused 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 97.4%
Calm 98.7%
Sad 0.9%
Confused 0.2%
Surprised 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 94.1%
Calm 96%
Sad 2.1%
Confused 0.9%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 26-36
Gender Male, 93.6%
Calm 97%
Sad 1.3%
Happy 0.8%
Confused 0.4%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0%
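
The per-face age ranges, gender estimates, and emotion scores above correspond to the fields returned by Rekognition's face-detection endpoint. A sketch under the same assumptions as above, reusing the hypothetical client and image bytes:

    # Reuses client and image_bytes from the label-detection sketch.
    faces = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # "ALL" is needed for age, gender, and emotions
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are not guaranteed to be sorted; order them to mirror
        # the highest-first listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')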

Feature analysis

Amazon

Chair 99.7%
Person 99.6%
Shoe 58.7%

Text analysis

Amazon

24524.
KODVK
VT27A2
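
The strings above read like film edge markings picked up by OCR ("KODVK" is plausibly a misread KODAK edge print, though that is an inference, not part of the record). Rekognition exposes this through its text-detection endpoint; a minimal sketch, again reusing the hypothetical client and image bytes:

    # Reuses client and image_bytes from the sketches above.
    text = client.detect_text(Image={"Bytes": image_bytes})

    # LINE entries group detected words; printing them yields a short
    # list like the one above.
    for detection in text["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])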

Google

24524.
24524.