Human Generated Data

Title

Untitled (four Mask & Wig performers seated at table on stage)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7497

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 99.3
Person 99
Person 92.1
Sitting 89.7
Crowd 85
Person 79.2
Chair 72.4
Furniture 72.4
People 70.6
Clothing 65.4
Apparel 65.4
Female 65.1
Photography 63.2
Photo 63.2
Leisure Activities 63.1
Floor 62.9
Person 62.4
Person 60.6
Drawing 56.7
Art 56.7
Audience 56.3
Text 55.3
Flooring 55.1

Imagga
created on 2022-01-08

brass 30.9
musical instrument 30.5
blackboard 28.9
man 27.5
wind instrument 25.1
male 24.1
people 22.9
person 22.8
group 20.9
business 20
classroom 20
room 18.3
marimba 18
percussion instrument 18
businessman 16.8
adult 16.3
job 15
chair 14.7
men 14.6
teacher 14.1
work 13.5
silhouette 13.2
women 12.6
planner 11.7
sky 11.5
restaurant 11
building 10.9
office 10.4
board 9.9
cheerful 9.7
professional 9.6
boy 9.6
education 9.5
table 9.5
student 9.4
meeting 9.4
day 9.4
smiling 9.4
happy 9.4
two 9.3
city 9.1
black 9
team 9
interior 8.8
couple 8.7
happiness 8.6
sitting 8.6
communication 8.4
holding 8.2
human 8.2
window 8.2
lifestyle 7.9
indoors 7.9
students 7.8
teaching 7.8
glass 7.8
travel 7.7
employee 7.7
modern 7.7
career 7.6
house 7.5
leisure 7.5
cafeteria 7.4
inside 7.4
water 7.3
school 7.3
engineer 7.3
smile 7.1
summer 7.1
working 7.1
architecture 7
together 7

Google
created on 2022-01-08

Chair 84.3
Font 80.3
Adaptation 79.3
Rectangle 76.7
Art 76.6
Tints and shades 74.4
Snapshot 74.3
Vintage clothing 72.8
Room 68.8
Sitting 67
Event 65.5
History 65.2
Stock photography 64.7
Visual arts 64.5
Suit 64.2
Table 64.1
Monochrome 61.3
Illustration 59.1
Photo caption 56.9
Painting 53.3

Microsoft
created on 2022-01-08

text 98.7
clothing 92
outdoor 91.1
person 87.5
white 64.3
woman 58.1
man 55.8
drawing 55.7
group 55.6
blackboard 51.9

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 96.2%
Calm 99.9%
Surprised 0%
Happy 0%
Disgusted 0%
Confused 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 90.3%
Calm 98%
Angry 0.7%
Happy 0.4%
Surprised 0.2%
Disgusted 0.2%
Sad 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Surprised 61%
Sad 13%
Confused 12.6%
Calm 7.9%
Disgusted 2.6%
Fear 1.3%
Happy 0.8%
Angry 0.8%

AWS Rekognition

Age 27-37
Gender Male, 93.7%
Calm 98%
Surprised 1%
Sad 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Confused 0.1%
Happy 0.1%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in a room 92.5%
a group of people standing in front of a building 85.1%
a group of people in a room 85%

Text analysis

Amazon

8630.
8
8630
MJI7
MJI7 YESTAD BADE
BADE
YESTAD

Google

8630.
2
MJ
8630. 8630. 8630. MJ YT3RA 2 A73A
YT3RA
A73A