Human Generated Data

Title

Untitled (two women sitting and talking, Pennsylvania)

Date

1942, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.262

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.2
Human 98.2
Apparel 95.8
Clothing 95.8
Person 95.4
Furniture 89.8
Couch 74
Art 73.5
Painting 73.5
Hat 60
Lamp 56.1
Coat 55.8
Room 55.7
Living Room 55.7
Indoors 55.7

Clarifai
created on 2023-10-25

people 100
adult 99
woman 99
two 98
man 96.9
group 96.6
portrait 96.2
wear 95.5
furniture 93.9
seat 93.2
offspring 93.2
leader 92.6
gown (clothing) 91.9
three 88.9
facial expression 86
veil 85.9
actress 84.8
chair 84.6
art 84.3
sit 84.2

Imagga
created on 2022-01-08

statue 21
old 16.7
sculpture 16.4
art 16.3
man 15.4
religion 15.2
black 15.2
clothing 14.3
history 13.4
people 13.4
city 13.3
catholic 11.9
vintage 11.6
religious 11.2
monument 11.2
church 11.1
architecture 10.9
garment 10.8
holy 10.6
faith 10.5
building 10.5
historical 10.3
historic 10.1
robe 9.9
tourism 9.9
god 9.6
person 9.4
male 9.2
shop 9.1
covering 8.8
antique 8.7
artistic 8.7
saint 8.7
ancient 8.6
detail 8
marble 7.9
adult 7.8
travel 7.7
culture 7.7
style 7.4
tourist 7.4
window 7.3
portrait 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 93.5
clothing 92
person 87.1
black and white 78.8
white 67.6
human face 61.1
furniture 59.8
old 56.2
clothes 25.9

Face analysis

AWS Rekognition

Age 68-78
Gender Female, 61.8%
Calm 71%
Surprised 17.4%
Confused 8.6%
Fear 0.9%
Angry 0.9%
Sad 0.5%
Disgusted 0.3%
Happy 0.3%

Microsoft Cognitive Services

Age 71
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%
Painting 73.5%

Captions

Microsoft
created on 2022-01-08

an old photo of a person 92.2%
an old photo of a person 89.8%
old photo of a person 89.7%

Text analysis

Amazon

THAT
day
SIDE
ATIN
-

Google

SIDE
SIDE