Human Generated Data

Title

Untitled (sample wedding album)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.431

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Clothing 100
Apparel 100
Hat 99.5
Person 99.4
Human 99.4
Person 97.8
Veil 84.9
Hat 71.7
Fashion 63.2
Robe 63.2
Gown 62.2
Wedding 55.7
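
The scores above appear to be confidence values on a 0-100 scale. Below is a minimal sketch of how label/confidence pairs like these are typically obtained from the Amazon Rekognition DetectLabels API, assuming boto3 is installed and AWS credentials are configured; the file name and threshold are hypothetical, and this is not necessarily the pipeline that produced this record.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Read the image and send it to Rekognition; labels scored below the
    # confidence threshold are not returned.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a 0-100 confidence score.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, score in detect_labels("wedding_album_page.jpg"):  # hypothetical file
    print(f"{name} {score:.1f}")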

Clarifai
created on 2019-11-10

people 99.9
adult 99
group 99
two 98.8
woman 98.5
three 96.6
veil 95.9
portrait 95.1
actress 95.1
four 94.2
wear 93.6
man 92.6
facial expression 91.2
administration 90.9
furniture 89.5
group together 89.3
leader 88.6
music 88
movie 87.1
lid 86.5

Imagga
created on 2019-11-10

ruler 71
portrait 31.1
adult 26
person 25.9
fashion 21.9
bride 21.1
people 20.1
model 19.4
dress 19
wedding 18.4
sexy 17.7
love 17.4
man 16.8
attractive 16.8
style 15.6
pretty 14.7
lady 14.6
gown 13.9
covering 13.5
face 13.5
posing 13.3
couple 13.1
male 12.8
veil 12.7
elegance 12.6
old 12.5
human 12
happy 11.9
hair 11.9
groom 11.7
vintage 11.6
cloak 11.2
expression 11.1
statue 10.9
religion 10.8
one 10.4
church 10.2
sensuality 10
pose 10
retro 9.8
clothing 9.8
bridal 9.7
sensual 9.1
hat 9
antique 8.9
romantic 8.9
look 8.8
culture 8.5
black 8.4
monument 8.4
girls 8.2
make 8.2
looking 8
sculpture 8
cute 7.9
brunette 7.8
smile 7.8
luxury 7.7
metropolitan 7.7
cathedral 7.7
jacket 7.7
god 7.7
studio 7.6
catholic 7.5
decoration 7.3
art 7.3
history 7.2
lovely 7.1
architecture 7
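
A rough sketch of how tag scores like the Imagga list above can be requested from Imagga's REST tagging endpoint, using the requests library with an API key/secret pair; the endpoint path and response shape shown here are assumptions based on Imagga's public v2 API and may differ from the version used for this record.

import requests

API_KEY = "your_imagga_key"       # placeholder credentials
API_SECRET = "your_imagga_secret"

def imagga_tags(image_path):
    # Upload the image to the v2 tagging endpoint with HTTP basic auth.
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )
    response.raise_for_status()
    data = response.json()
    # Assumed response shape: result.tags -> [{"confidence": ..., "tag": {"en": ...}}, ...]
    return [(t["tag"]["en"], t["confidence"]) for t in data["result"]["tags"]]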

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

person 99.6
human face 95.3
fashion accessory 95
cellphone 91.3
dress 90.4
wedding dress 89.5
phone 88.5
clothing 88.5
woman 88
bride 81
text 74.7
hat 67.8
fashion 55.7
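
A comparable sketch for the Microsoft tags above, calling the Azure Computer Vision REST tagging operation directly with requests; the endpoint host, API version, and subscription key below are placeholders. The service reports confidences on a 0-1 scale, so they are multiplied by 100 here to match the figures above. Illustrative only, not the documented pipeline for this record.

import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "your_key"                                    # placeholder

def azure_tags(image_path):
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    # Confidence comes back as 0-1; scale to percent for readability.
    return [(t["name"], t["confidence"] * 100) for t in response.json()["tags"]]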

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 98.3%
Angry 0.7%
Disgusted 0.3%
Happy 0.3%
Confused 0.8%
Surprised 0.5%
Fear 0.5%
Calm 90.5%
Sad 6.6%

AWS Rekognition

Age 48-66
Gender Female, 98.8%
Surprised 1.6%
Disgusted 1.1%
Sad 47.6%
Fear 4.9%
Happy 0.3%
Calm 36.8%
Angry 1.9%
Confused 5.9%
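
Age ranges, gender estimates, and emotion percentages of the kind shown in the two AWS Rekognition blocks above are what the Rekognition DetectFaces API returns when all facial attributes are requested. A minimal sketch, again assuming boto3 and configured AWS credentials; the file name is hypothetical.

import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    # Attributes=["ALL"] asks for AgeRange, Gender, Emotions, and other attributes.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are scored independently; sort to show the strongest first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')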

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 60
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
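
The Google Vision rows above report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of how they can be read with the google-cloud-vision client library, assuming the library is installed and credentials are configured; attribute names follow the FaceAnnotation message, and the enum access style is the v2-era client API.

from google.cloud import vision

def face_likelihoods(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        content = f.read()
    response = client.face_detection(image=vision.Image(content=content))
    results = []
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum; .name gives e.g. VERY_UNLIKELY.
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results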

Feature analysis

Amazon

Hat 99.5%
Person 99.4%

Categories

Imagga

paintings art 98.3%
people portraits 1.2%