Human Generated Data

Title

Untitled (men and women seated on steps with photograph)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8587

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Apparel 99.9
Clothing 99.9
Person 98.6
Human 98.6
Person 97.6
Person 95.7
Person 94.4
Evening Dress 84.3
Robe 84.3
Gown 84.3
Fashion 84.3
Female 80.6
Room 69.4
Indoors 69.4
Woman 68.5
Dress 65.9
Suit 65.5
Coat 65.5
Overcoat 65.5
Wedding 61.9
Photo 60.2
Portrait 60.2
Face 60.2
Photography 60.2
Wedding Gown 57.8
Footwear 56.5
Shoe 56.5
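
The Amazon label scores above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming a hypothetical local copy of the image (the filename and region are placeholders, not part of this record):

import boto3  # AWS SDK for Python

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder path; the source image itself is not included in this record.
with open("steinmetz_8587.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # roughly matches the lowest score listed above (56.5)
)

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')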

Clarifai
created on 2023-10-25

people 99.5
monochrome 97.5
man 97.5
woman 94.5
indoors 94.1
two 93.1
adult 92.6
group 91.6
wear 86
music 86
actor 84.7
profile 81.8
fashion 80.5
three 78.2
dancing 78.1
singer 77.9
street 77.8
wedding 76.7
shadow 76
actress 75.5

Imagga
created on 2022-01-09

groom 31.8
man 26.2
hairdresser 25.8
people 24.5
bride 23
adult 23
dress 21.7
couple 20.9
person 20.3
wedding 20.2
male 19.5
two 17.8
happy 17.5
fashion 16.6
love 16.6
portrait 16.2
women 15.8
attractive 15.4
pretty 14.7
men 14.6
happiness 14.1
shop 13.5
black 13.5
sexy 12.8
smile 12.8
face 12.8
business 12.8
romantic 11.6
style 11.1
youth 11.1
suit 11
professional 10.8
together 10.5
marriage 10.4
hands 10.4
bouquet 10.4
room 10.3
life 10.2
model 10.1
clothing 10.1
salon 9.9
family 9.8
corporate 9.4
lifestyle 9.4
mother 9.4
elegance 9.2
modern 9.1
human 9
interior 8.8
building 8.8
looking 8.8
child 8.8
indoors 8.8
ceremony 8.7
husband 8.7
hair 8.7
flowers 8.7
clothes 8.4
window 8.4
traditional 8.3
sensual 8.2
office 8
home 8
businessman 7.9
cute 7.9
urban 7.9
day 7.8
bridal 7.8
boutique 7.8
musical instrument 7.6
retail 7.6
wife 7.6
passion 7.5
future 7.4
shopping 7.3
cheerful 7.3
lady 7.3
girls 7.3
celebration 7.2
romance 7.1
look 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.1
person 98.8
clothing 96.1
black and white 86.7
musical instrument 81.1
man 72.4
guitar 68.4
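
The Microsoft scores above correspond to tag output from the Azure Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, assuming a hypothetical endpoint, key, and image URL (none of these values come from the record):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "<subscription-key>"                                         # placeholder
image_url = "https://example.org/steinmetz_8587.jpg"               # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
analysis = client.analyze_image(image_url, visual_features=[VisualFeatureTypes.tags])

# Print "tag confidence" pairs comparable to the list above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")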

Color Analysis

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 96.3%
Happy 84.8%
Surprised 6.7%
Fear 2.4%
Disgusted 1.8%
Sad 1.4%
Confused 1.1%
Angry 1.1%
Calm 0.6%

AWS Rekognition

Age 49-57
Gender Female, 99.6%
Surprised 71.8%
Calm 25%
Happy 1%
Sad 1%
Angry 0.5%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Happy 89.5%
Confused 4.5%
Surprised 2%
Disgusted 1.1%
Calm 1%
Sad 0.8%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Male, 86.5%
Happy 58.4%
Surprised 12.8%
Fear 10.4%
Sad 8.8%
Confused 3.5%
Calm 2.5%
Disgusted 1.8%
Angry 1.8%
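
The four AWS Rekognition face entries above (age range, gender, and emotion scores) match the shape of DetectFaces output when all attributes are requested. A minimal sketch, again assuming a hypothetical local image file:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8587.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort to mirror the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')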

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
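
The Google Vision entries above report likelihood buckets (Very unlikely, Unlikely, Possible, and so on) rather than numeric scores; that is how the Cloud Vision face detection API expresses joy, sorrow, anger, surprise, headwear, and blur. A minimal sketch with the google-cloud-vision client, assuming a hypothetical local image file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_8587.jpg", "rb") as f:  # placeholder path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood enum names map to the buckets shown above (VERY_UNLIKELY, POSSIBLE, ...).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)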

Feature analysis

Amazon

Person 98.6%

Text analysis

Amazon

17764.
haLLI

Google

17764. 17764.
17764.
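
The strings above ("17764." and the partial "haLLI") appear to be raw OCR readings of text visible somewhere in the image. Text like this can be read back with AWS Rekognition's DetectText operation; a minimal sketch, assuming the same hypothetical image file as before:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8587.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections roughly correspond to the strings listed above;
# each LINE is also broken down into individual WORD detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')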