Human Generated Data

Title

Untitled (woman on steps holding child and bag of groceries)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8788

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.9
Apparel 99.9
Person 96.6
Human 96.6
Dress 90.8
Person 90.3
Footwear 87.3
Female 86
Shoe 82.3
Evening Dress 80.3
Fashion 80.3
Robe 80.3
Gown 80.3
Costume 79.8
Woman 69.3
Face 67.4
Door 66.3
Portrait 65.1
Photography 65.1
Photo 65.1
Person 63.1
Girl 61.8
Plant 61.2
Outdoors 58.6
Home Decor 57.6
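The Amazon tags above are label–confidence pairs, with scores given as percentages. As a minimal illustrative sketch (assuming the pairs are simply transcribed into memory as shown; this is not how the museum's pipeline stores them), labels can be filtered by a confidence threshold:

```python
# A subset of the Amazon tags above, transcribed as
# (label, confidence-percent) pairs for illustration only.
tags = [
    ("Clothing", 99.9), ("Apparel", 99.9), ("Person", 96.6),
    ("Human", 96.6), ("Dress", 90.8), ("Footwear", 87.3),
    ("Shoe", 82.3), ("Woman", 69.3), ("Door", 66.3),
    ("Outdoors", 58.6),
]

def confident_labels(tags, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_labels(tags))
# → ['Clothing', 'Apparel', 'Person', 'Human', 'Dress']
```

Lower thresholds admit weaker guesses such as "Woman" (69.3) and "Outdoors" (58.6), which is why the raw list mixes confident and speculative labels.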

Clarifai
created on 2023-10-25

people 99.9
one 98.3
woman 98.1
adult 97.9
step 94.8
wear 94.7
wedding 93.8
dress 92.9
two 92.8
child 92.6
portrait 90.4
actress 87.7
home 86.1
veil 84.9
bride 84.8
administration 83
offspring 82.6
street 81.9
leader 81.3
man 77.6

Imagga
created on 2022-01-09

cleaner 27.9
fashion 26.4
person 24.5
portrait 23.3
dress 21.7
sexy 20.9
adult 20.2
model 20.2
attractive 18.2
black 17.3
style 17.1
lady 17
hair 15.9
pretty 14.7
sensuality 14.5
posing 14.2
people 13.9
city 13.3
man 12.8
elegance 12.6
clothing 12.4
human 12
body 12
street 12
one 11.9
sensual 11.8
dark 10.9
interior 10.6
urban 10.5
luxury 10.3
outdoor 9.9
holding 9.9
crutch 9.8
women 9.5
wall 9.4
happy 9.4
face 9.2
statue 9.2
vintage 9.1
outdoors 9.1
dirty 9
suit 9
mask 9
architecture 8.8
cute 8.6
passion 8.5
musical instrument 8.5
danger 8.2
device 7.9
male 7.9
vogue 7.7
old 7.7
staff 7.6
clothes 7.5
emotion 7.4
makeup 7.3
industrial 7.3
art 7.3
stylish 7.2
lifestyle 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.1
outdoor 92.4
black and white 90.1
clothing 89.9
person 88.1
dress 77.8
woman 77.5
statue 52.7

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 98.9%
Calm 54.3%
Sad 19.3%
Happy 7.7%
Surprised 6%
Fear 5.8%
Confused 3.6%
Angry 2.4%
Disgusted 0.9%

AWS Rekognition

Age 6-12
Gender Female, 99.7%
Calm 98.1%
Happy 0.7%
Surprised 0.5%
Fear 0.2%
Angry 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0%
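Each AWS Rekognition face entry reports one confidence score per emotion, and the listings above are already sorted by score. A minimal sketch (scores transcribed from the adult face above into a plain dict, purely for illustration) of picking the dominant emotion:

```python
# Emotion scores for the adult face (age 29-39) above,
# transcribed as percentages for illustration only.
emotions = {
    "Calm": 54.3, "Sad": 19.3, "Happy": 7.7, "Surprised": 6.0,
    "Fear": 5.8, "Confused": 3.6, "Angry": 2.4, "Disgusted": 0.9,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Calm
```

Note that "Calm" wins with only 54.3%, whereas the child's face is scored 98.1% Calm; the spread across emotions is as informative as the top label.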

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.6%
Shoe 82.3%

Text analysis

Amazon

39
39 305.
305.
8SA

Google

8SA YTA2-XA ఆొంక్ అ
8SA
YTA2-XA
ఆొంక్