Human Generated Data

Title

Untitled (portrait of woman with three girls on living room couch, Philadelphia)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12073

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.3
Apparel 99.3
Human 99.2
Person 99.2
Person 98.9
Furniture 96.4
Couch 95.5
Person 93.6
Footwear 87.1
Shoe 87.1
Female 81.1
Home Decor 71.6
Shorts 68.3
Person 64.6
Floor 63.8
Woman 63.7
Chair 62.9
Indoors 61.7
Room 61.7
Living Room 61.7
Sitting 60.5
Girl 57.2
Person 50.7

Imagga
created on 2022-01-15

chair 28.8
architecture 21.2
seat 20.5
building 20.2
old 19.5
wheelchair 16.8
people 16.7
church 16.6
religion 16.1
man 16.1
city 15.8
salon 14.7
ancient 14.7
monument 14
historic 13.7
history 13.4
person 13.4
tourism 13.2
male 13
cathedral 12.5
hairdresser 12.4
adult 12.2
furniture 11.9
interior 11.5
portrait 11
window 10.8
arch 10.7
barber chair 10.6
travel 10.6
culture 10.3
street 10.1
house 10
mother 10
tourist 10
landmark 9.9
couple 9.6
fashion 9
outdoors 9
dress 9
home 8.8
love 8.7
kin 8.6
shop 8.6
men 8.6
business 8.5
art 8.5
throne 8.4
stone 8.4
religious 8.4
black 8.4
famous 8.4
exterior 8.3
room 8.1
life 8.1
groom 8
luxury 7.7
wall 7.7
column 7.5
family 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.5
person 96.7
black and white 95.9
street 90.3
outdoor 85.9
monochrome 83.4
clothing 80.5
furniture 63.8

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 55.4%
Calm 97.9%
Happy 0.8%
Sad 0.5%
Surprised 0.2%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Female, 61.1%
Calm 57.2%
Sad 39.0%
Fear 1.0%
Disgusted 1.0%
Happy 0.9%
Angry 0.4%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 27-37
Gender Male, 91.6%
Calm 99.4%
Fear 0.1%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Angry 0.0%
Surprised 0.0%

Feature analysis

Amazon

Person 99.2%
Shoe 87.1%

Captions

Microsoft

a group of people standing in front of a building 85%
a group of people in front of a building 84.9%
a group of people standing on top of a building 80.2%

Text analysis

Amazon

6720
A7DA
MUIT A7DA
MUIT

Google

A73A
MJI7 YT37A2 A73A
MJI7
YT37A2