Human Generated Data

Title

Untitled (portrait of woman with three girls on living room couch, Philadelphia)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12074

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 99.3
Person 99.2
Clothing 94.6
Apparel 94.6
Shoe 93.8
Footwear 93.8
Female 93.5
Tie 92.3
Accessories 92.3
Accessory 92.3
People 91.5
Furniture 88.3
Dress 86.7
Shoe 84.1
Person 82.1
Woman 79.6
Family 75.8
Art 72.9
Girl 71.9
Face 70.8
Photography 69
Photo 69
Portrait 68.5
Sitting 58.6
Couch 57.8
Shorts 55.3
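
The label and confidence pairs above match the shape of Amazon Rekognition's DetectLabels response. A minimal boto3 sketch that would produce a comparable list follows; the file name and MinConfidence threshold are illustrative assumptions, not values recorded with this object.

# Sketch: Rekognition-style label tags for a local image file.
# Assumes AWS credentials are configured; file name and threshold are illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out near 55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')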

Clarifai
created on 2023-10-26

people 99.9
adult 98.5
woman 97.9
two 96.9
man 96
wear 94.8
group 90.6
monochrome 90.6
wedding 87.3
actress 85.7
leader 84.2
sit 83.3
three 83.1
furniture 82.8
indoors 82.7
portrait 82.2
child 82.1
home 81.8
administration 81.7
street 81.6
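
The Clarifai concepts above (name plus confidence) are the kind of output returned by Clarifai's general image-recognition model. Below is a hedged sketch against Clarifai's public v2 REST predict endpoint; the API key, model id, and image URL are placeholders, and the exact model used for this record is not documented here.

# Sketch: concept tags from Clarifai's v2 predict endpoint.
# API key, model id, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed model id
IMAGE_URL = "https://example.org/steinmetz_portrait.jpg"  # hypothetical

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')  # value is 0-1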

Imagga
created on 2022-01-15

salon 50.4
chair 41.8
seat 27.7
barber chair 25.5
man 24.2
people 24
person 23.6
portrait 20.1
adult 18.7
male 18.5
room 18.1
furniture 17.1
old 16.7
wheelchair 16.3
home 15.9
hairdresser 15.8
sitting 14.6
interior 14.1
indoors 14
fashion 13.6
family 13.3
happy 13.2
senior 13.1
shop 12.9
smile 12.8
smiling 12.3
lifestyle 12.3
indoor 11.9
health 11.8
lady 11.4
couple 11.3
mature 11.2
women 11.1
love 11
teacher 10.6
patient 10.3
men 10.3
professional 10.1
inside 10.1
dress 9.9
luxury 9.4
architecture 9.4
mother 9.3
house 9.2
life 9.2
aged 9
medical 8.8
looking 8.8
illness 8.6
barbershop 8.3
vintage 8.3
care 8.2
worker 8.2
educator 7.9
happiness 7.8
face 7.8
ancient 7.8
model 7.8
elderly 7.7
sofa 7.7
two 7.6
casual 7.6
window 7.5
one 7.5
style 7.4
furnishing 7.4
business 7.3
mercantile establishment 7.2
history 7.2
posing 7.1
work 7.1
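
The Imagga tags above resemble responses from Imagga's /v2/tags endpoint, which authenticates an API key/secret pair over HTTP Basic auth. A sketch under those assumptions; credentials and image URL are placeholders.

# Sketch: Imagga-style tags for an image URL via the /v2/tags endpoint.
# Key, secret, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_portrait.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')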

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 98.2
text 97
outdoor 88
clothing 82.5
furniture 82.1
wedding dress 77.8
posing 53.8
bride 53.3
black and white 51.7
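
Microsoft's tags are consistent with the Azure Computer Vision Analyze Image API (Tags visual feature), which scores on a 0-1 scale. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

# Sketch: Azure Computer Vision tags for an image URL (v3.2 Analyze API).
# Endpoint, key, and image URL are placeholders/assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.org/steinmetz_portrait.jpg"  # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')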

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 93%
Calm 94.4%
Surprised 3%
Happy 1%
Sad 0.9%
Confused 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Female, 90.3%
Sad 71%
Happy 23.7%
Calm 3.2%
Surprised 0.5%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 27-37
Gender Female, 94.2%
Calm 99.3%
Happy 0.4%
Sad 0.1%
Fear 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 30-40
Gender Male, 99.6%
Calm 97.2%
Sad 1%
Happy 0.7%
Disgusted 0.5%
Confused 0.2%
Surprised 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 16-22
Gender Female, 83.5%
Calm 85.6%
Sad 5.7%
Surprised 2.2%
Happy 2%
Fear 1.9%
Disgusted 1.1%
Angry 0.9%
Confused 0.5%
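
Each AWS Rekognition block above (age range, gender, ranked emotions) corresponds to one entry in the FaceDetails list returned by Rekognition's DetectFaces API when called with Attributes=["ALL"]. A minimal boto3 sketch; the file name is illustrative.

# Sketch: Rekognition face attributes (age range, gender, emotions)
# comparable to the blocks above. File name is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')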

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
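
The Google Vision blocks report per-face likelihood enums (VERY_UNLIKELY renders above as "Very unlikely"). A minimal sketch with the google-cloud-vision client; the file name is illustrative.

# Sketch: Google Cloud Vision face-likelihood ratings for each detected face,
# matching the fields above. File name is illustrative.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_portrait.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)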

Feature analysis

Amazon

Person 99.4%
Shoe 93.8%
Tie 92.3%

Text analysis

Amazon

6721
A70A
MJI7 YE3 A70A
MJI7
YE3
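
Fragments like these are typical of Amazon Rekognition's DetectText output, which reports each detected LINE and its constituent WORDs as separate entries, so the same characters can appear more than once. A minimal boto3 sketch; the file name is illustrative.

# Sketch: Rekognition text detection; LINE and WORD entries are both reported,
# which is why fragments repeat above. File name is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')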

Google

O MJI3 YT37A2 A73A
O
MJI3
YT37A2
A73A
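
Google's list follows the Cloud Vision OCR convention: the first text annotation holds the full detected string and the remaining entries are the individual words. A minimal sketch; the file name is illustrative.

# Sketch: Google Cloud Vision OCR; the first annotation is the full text,
# the rest are individual words, mirroring the list above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_portrait.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

for annotation in response.text_annotations:
    print(annotation.description)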