Human Generated Data

Title

Untitled (men seated behind table on stage in full auditorium, Old Queens 1766 banner)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4782

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Advertisement 96
Poster 94
Human 84.3
Person 84.3
Text 83.5
Indoors 71.2
Person 66.9
Billboard 66
Crowd 62.8
Room 59.3
Collage 58.8
Person 54.9
Person 46.3
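
The label/score pairs above match the shape of AWS Rekognition's DetectLabels output, where each label carries a confidence value on a 0-100 scale. A minimal sketch of how such tags could be retrieved with boto3; the bucket, object key, and thresholds are placeholders, not the museum's actual pipeline:

```python
import boto3

# Sketch only: request labels for an image stored in S3.
# Bucket and object names are hypothetical.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4782.jpg"}},
    MaxLabels=20,
    MinConfidence=40.0,  # drop very low-confidence labels
)

for label in response["Labels"]:
    # Confidence is reported on a 0-100 scale, matching the values listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```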

Imagga
created on 2022-01-29

blackboard 73.7
classroom 47.8
room 37.2
old 28.6
vintage 27.3
grunge 26.4
aged 23.5
texture 21.5
antique 19.9
retro 19.7
structure 19.4
city 15.8
ancient 15.6
dirty 15.4
art 15.3
grungy 14.2
architecture 14.1
rough 13.7
dark 13.4
damaged 13.4
paper 13.3
university 13.2
black 13.2
design 12.9
building 12.9
frame 12.5
border 11.8
pattern 11.6
material 11.6
urban 11.4
text 11.3
travel 11.3
wall 11.1
graphic 10.9
board 10.8
landscape 10.4
brown 10.3
sky 10.2
card 10.2
letter 10.1
space 10.1
color 10
wallpaper 10
memorial 9.9
man 9.4
tourism 9.1
decoration 9.1
night 8.9
chalk 8.8
textured 8.8
stage 8.8
messy 8.7
rust 8.7
worn 8.6
old fashioned 8.6
web site 8.4
grain 8.3
school 8.2
paint 8.1
film 7.9
artistic 7.8
empty 7.7
stain 7.7
downtown 7.7
weathered 7.6
billboard 7.6
ocean 7.5
backdrop 7.4
historic 7.3
landmark 7.2
platform 7.1
sea 7
modern 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.9
black and white 79.1
handwriting 70.8
crowd 0.9

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 82.3%
Calm 82.3%
Confused 9.1%
Disgusted 2.3%
Surprised 2.1%
Happy 1.6%
Fear 1.1%
Angry 0.8%
Sad 0.7%

AWS Rekognition

Age 23-31
Gender Female, 70.2%
Sad 46%
Angry 14.5%
Calm 11.5%
Happy 10.4%
Disgusted 7.1%
Fear 3.9%
Confused 3.4%
Surprised 3.2%

AWS Rekognition

Age 6-14
Gender Male, 68%
Confused 64.9%
Calm 13.1%
Surprised 7.5%
Disgusted 5.9%
Sad 5.1%
Angry 1.9%
Fear 1.1%
Happy 0.5%

AWS Rekognition

Age 20-28
Gender Male, 91.3%
Calm 73.7%
Sad 16.8%
Disgusted 3.1%
Angry 2.1%
Surprised 2.1%
Fear 1.2%
Confused 0.8%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 91.6%
Happy 36.6%
Calm 30.4%
Sad 23%
Confused 5.8%
Fear 1.9%
Disgusted 1%
Angry 0.8%
Surprised 0.6%

AWS Rekognition

Age 11-19
Gender Male, 92.4%
Calm 49.8%
Sad 34.3%
Confused 8.2%
Angry 2.4%
Happy 2.1%
Fear 1.2%
Disgusted 1.1%
Surprised 0.9%

AWS Rekognition

Age 12-20
Gender Female, 69.8%
Calm 60%
Sad 34.2%
Fear 1.5%
Confused 1.4%
Surprised 1%
Angry 0.9%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 14-22
Gender Male, 93.7%
Sad 32.7%
Calm 20.1%
Disgusted 16.2%
Confused 10.1%
Surprised 7.5%
Angry 7%
Fear 4.8%
Happy 1.6%

AWS Rekognition

Age 14-22
Gender Male, 78.9%
Calm 90.5%
Sad 6.3%
Happy 0.9%
Angry 0.8%
Fear 0.5%
Confused 0.4%
Surprised 0.3%
Disgusted 0.2%
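
Each block above corresponds to one face returned by AWS Rekognition's DetectFaces call with full attributes, which reports an estimated age range, a gender guess with confidence, and a set of emotion probabilities per detected face. A minimal sketch, assuming the image is read from a local file (the filename is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder filename; any JPEG or PNG bytes work.
with open("steinmetz-4782.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions for every face.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; list them from most to least likely.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```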

Feature analysis

Amazon

Poster 94%
Person 84.3%

Captions

Microsoft

graphical user interface 52.8%
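
The caption above is the kind of result Azure's Computer Vision "describe image" operation returns: a short natural-language caption with a confidence score. A minimal sketch using the older azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders, and this is an assumption about how such a caption could be produced, not a description of the museum's actual workflow:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for a Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Placeholder filename; the service returns ranked caption candidates.
with open("steinmetz-4782.jpg", "rb") as image:
    description = client.describe_image_in_stream(image, max_candidates=3)

for caption in description.captions:
    # Confidence is on a 0-1 scale; 0.528 corresponds to the 52.8% shown above.
    print(f"{caption.text} {caption.confidence:.3f}")
```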

Text analysis

Amazon

1766
Old
ET
Queens
ILLUSTRA
SOL
SOL IUSTITIAE Old ET OCCIDENTEM Queens ILLUSTRA
OCCIDENTEM
IUSTITIAE
OIIH
OIL
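
The strings above (including OCR noise such as "OIIH" and "OIL") are consistent with AWS Rekognition's DetectText output, which returns both whole detected lines and individual words from an image. A minimal sketch with a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz-4782.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns LINE and WORD detections; banner text appears as both.
for detection in response["TextDetections"]:
    print(f'{detection["Type"]}: {detection["DetectedText"]} '
          f'({detection["Confidence"]:.1f}%)')
```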

Google

ILLUSTRA
1766
SOL ET OCCIDENTEM ILLUSTRA Old Queens 1766 areptu a is
SOL
Queens
ET
areptu
a
is
OCCIDENTEM
Old