Human Generated Data

Title

Untitled (double exposure: woman backstage with make-up artist and women performing)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5675

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.4
Person 99.4
Art 89.1
Musician 78.4
Musical Instrument 78.4
Leisure Activities 72.1
Drawing 72
Sketch 68.2
Chair 59.5
Furniture 59.5

Imagga
created on 2021-12-15

chair 37
musical instrument 22.5
interior 22.1
furniture 21.8
seat 18.3
device 16.6
keyboard instrument 16.3
house 15.9
work 15.8
accordion 15.6
room 15.4
building 15.1
table 14.2
old 13.9
modern 13.3
light 12.7
architecture 12.5
industry 11.9
glass 11.7
machine 11.6
wind instrument 11.4
home 11.2
person 10.9
sky 10.8
man 10.7
urban 10.5
inside 10.1
wood 10
city 10
barber chair 9.8
structure 9.8
business 9.7
metal 9.7
people 9.5
sitting 9.4
decoration 9.4
floor 9.3
relaxation 9.2
vintage 9.1
design 9
worker 9
technology 8.9
sun 8.9
equipment 8.8
wooden 8.8
electric chair 8.7
comfortable 8.6
culture 8.5
elegance 8.4
style 8.2
night 8
restaurant 7.9
indoors 7.9
day 7.8
male 7.8
scene 7.8
men 7.7
factory 7.7
luxury 7.7
outside 7.7
construction 7.7
wall 7.7
living 7.6
relax 7.6
iron 7.6
shopping cart 7.5
china 7.5
traditional 7.5
window 7.4
office 7.4
street 7.4
antique 7.4
lifestyle 7.2
instrument of execution 7.2
textile machine 7.2
lamp 7.1
summer 7.1
steel 7.1
working 7.1

Google
created on 2021-12-15

Chair 84.7
Art 81.5
Painting 80.6
Table 76
Drawing 69.8
Illustration 68.9
Room 68.3
Visual arts 67.8
Stock photography 65.4
Sitting 63.8
Font 60
Monochrome 59.7
Vintage clothing 53.7
Rectangle 52.4
Machine 50.3

Microsoft
created on 2021-12-15

text 99
man 92.5
drawing 87.7
outdoor 87.5
black and white 85.6
person 85.4
clothing 67.5
sketch 64.4
old 53.4

Face analysis

Amazon

AWS Rekognition

Age 4-14
Gender Female, 54.6%
Calm 72.7%
Happy 14.3%
Sad 5.7%
Angry 5.2%
Surprised 0.9%
Fear 0.6%
Disgusted 0.3%
Confused 0.3%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

an old photo of a man 87.6%
a man sitting in front of a building 70.6%
old photo of a man 70.5%

Text analysis

Amazon

129271.
T-T-WN
13927Y
13929Y.
УТЗЗА-ИАМТ2А

Google

13927Y MAMT2A TT WN ALTE
WN
MAMT2A
13927Y
TT
ALTE