Human Generated Data

Title

Untitled (miniature bride and groom placeholders from wedding in Germantown, PA)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12158

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.2
Person 99.2
Person 98.7
Indoors 98.3
Interior Design 98.3
Person 95.8
Person 95.4
Person 93.9
Person 93.1
Handrail 91.8
Banister 91.8
Staircase 91.7
Person 91.4
Furniture 85
Person 85
Person 80.4
Face 78.2
Person 78
Stage 77.1
Person 76.6
Chair 76.4
Crowd 75
Person 70.2
People 68.3
Person 68
Room 67
Building 64.4
Person 62.1
Person 60.2
Female 59.2
Person 58.5
Audience 58.4
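Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. As a minimal sketch (the museum's actual display pipeline is not documented here; the threshold value is an assumption based on the lowest scores shown), label/confidence pairs of this kind can be filtered and ranked before display:

```python
# Hypothetical sketch: filter machine-generated labels by confidence.
# The 60.0 threshold below is illustrative, not the museum's actual cutoff.

def filter_labels(labels, min_confidence=55.0):
    """Keep (name, confidence) pairs at or above min_confidence,
    sorted from highest to lowest confidence."""
    kept = [(name, conf) for name, conf in labels if conf >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

# A few of the Amazon tags listed above:
amazon_tags = [("Human", 99.2), ("Staircase", 91.7),
               ("Audience", 58.4), ("Female", 59.2)]
print(filter_labels(amazon_tags, min_confidence=60.0))
# [('Human', 99.2), ('Staircase', 91.7)]
```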

Imagga
created on 2022-01-22

support 17.5
man 17.2
step 17
people 16.7
city 14.1
device 13.2
architecture 12.1
business 11.5
person 11.1
men 11.1
old 11.1
building 10.9
male 10.6
interior 10.6
urban 10.5
success 10.4
life 10.1
wall 10
travel 9.8
structure 9.7
black 9.6
shop 9.5
art 9.2
modern 9.1
adult 9.1
dancer 9
design 9
shoe shop 8.7
portrait 8.4
performer 8.2
competition 8.2
dress 8.1
decoration 8
light 8
stairs 7.9
shelf 7.8
scene 7.8
run 7.7
athlete 7.7
track 7.7
outdoor 7.6
sport 7.6
house 7.5
group 7.2
fitness 7.2
station 7.1

Google
created on 2022-01-22

Photograph 94.2
Black 89.9
Black-and-white 86.5
Stairs 84.6
Style 84.1
Monochrome photography 78.5
Rectangle 78.4
Art 78.3
Building 76.4
Monochrome 76.4
Snapshot 74.3
Flooring 69
Room 67.4
Event 66.3
Stock photography 65.8
Visual arts 65.6
Font 62.8
Wood 55.1
Door 54.6
Photographic paper 53.5

Microsoft
created on 2022-01-22

text 99.4
black and white 75.8
person 70.3
watching 42.5

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 90.6%
Calm 98.9%
Surprised 0.6%
Sad 0.2%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Male, 96.3%
Sad 73.4%
Calm 16.4%
Confused 6.4%
Happy 1.4%
Disgusted 1%
Angry 0.5%
Fear 0.5%
Surprised 0.5%

AWS Rekognition

Age 25-35
Gender Male, 56.5%
Sad 69%
Calm 13.9%
Happy 8.9%
Confused 3.5%
Disgusted 1.4%
Surprised 1.4%
Angry 1.1%
Fear 0.8%

AWS Rekognition

Age 26-36
Gender Female, 53.6%
Calm 67.3%
Surprised 14.8%
Sad 8.6%
Angry 4%
Confused 1.9%
Disgusted 1.8%
Happy 1.1%
Fear 0.4%
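Each face block above reports an age range, a gender estimate, and a full emotion distribution whose percentages sum to roughly 100. A minimal sketch, assuming Rekognition-style output (the dictionary shape here is illustrative, not the raw API response), of reducing such a distribution to its dominant emotion:

```python
# Illustrative sketch: pick the dominant emotion from a
# Rekognition-style confidence distribution (percentages).

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

# The first face analysis listed above:
face_1 = {"Calm": 98.9, "Surprised": 0.6, "Sad": 0.2, "Confused": 0.2,
          "Angry": 0.1, "Happy": 0.1, "Disgusted": 0.0, "Fear": 0.0}
print(dominant_emotion(face_1))
# ('Calm', 98.9)
```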

Feature analysis

Amazon

Person 99.2%
Staircase 91.7%

Captions

Microsoft

a group of people standing in front of a crowd 53.5%
a group of people posing for a photo 53.4%
a group of people watching a band on stage in front of a crowd 40.5%

Text analysis

Amazon

Sarry
Jane
Bats
Jack
20032.
dreve
nipper
20032,
Ne
Ne from
there
alice
from
Chucky
too
Jeause
usuey
Junice

Google

Carry
20032. Carry
20032.