Human Generated Data

Title

Untitled (woman standing up at table)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.432.28

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.6
Person 99.6
Person 99.4
Person 98.9
Clothing 97
Apparel 97
People 92.9
Pet 87.6
Animal 87.6
Cat 87.6
Mammal 87.6
Tie 77.4
Accessories 77.4
Accessory 77.4
Female 73.3
Face 69
Family 66.9
Fashion 63
Robe 63
Gown 63
Evening Dress 63
Flower 60.7
Plant 60.7
Blossom 60.7
Art 58.2
Pattern 57.8
Floral Design 57.8
Graphics 57.8
Dress 57.3
Photo 56.8
Photography 56.8
Portrait 56.8
Woman 55.9
Person 47.8
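
The Amazon list above pairs each label with a confidence score, in the shape returned by a service such as AWS Rekognition's `DetectLabels`. A minimal sketch of how such a list could be rendered from a response of that shape (the `sample` dict is a hand-made stand-in, not live API output):

```python
# Hypothetical sketch: render Rekognition-style labels as the
# "Name Confidence" lines shown above. The response dict is a
# hand-made stand-in, not a live DetectLabels result.

def format_labels(response, min_confidence=50.0):
    """Return 'Name Confidence' lines, sorted by descending confidence."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response.get("Labels", [])
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    # :g drops a trailing ".0", matching entries like "Clothing 97"
    return [f"{name} {round(conf, 1):g}" for name, conf in labels]

sample = {"Labels": [
    {"Name": "Person", "Confidence": 99.6},
    {"Name": "Clothing", "Confidence": 97.0},
    {"Name": "Cat", "Confidence": 87.6},
]}
print("\n".join(format_labels(sample)))
```

Note that the same object can appear under several synonymous labels (Pet / Animal / Cat / Mammal all at 87.6 above), since label taxonomies return ancestors alongside the detected class.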

Clarifai
created on 2019-03-25

people 100
group 98.7
actress 98.2
adult 98.1
portrait 97.5
woman 96.9
two 96.2
wear 95.3
man 91.8
three 91.7
wedding 90.4
furniture 90.1
leader 89.7
facial expression 89.4
four 88.7
family 88.6
child 88
administration 86.7
several 85.8
offspring 85.4

Imagga
created on 2019-03-25

groom 62.6
kin 39.1
portrait 27.2
person 23
bride 22.1
people 21.8
statue 21.2
old 20.9
adult 20.7
dress 19.9
couple 19.2
love 19
mother 18.5
wedding 18.4
sculpture 18.2
face 17.1
bow tie 16
two 15.3
women 15
male 15
ancient 14.7
fashion 14.3
man 14.1
art 13.7
happiness 13.3
attractive 13.3
happy 13.2
lady 13
sexy 12.9
home 12.8
pretty 12.6
religion 12.6
monument 12.1
necktie 12.1
veil 11.8
history 11.6
hair 11.1
antique 10.8
romantic 10.7
family 10.7
marble 10.7
married 10.6
bouquet 10.4
celebration 10.4
culture 10.3
decoration 10.2
stone 10.1
model 10.1
clothing 9.9
human 9.8
bridal 9.7
historical 9.4
architecture 9.4
parent 9.3
elegance 9.2
historic 9.2
detail 8.9
looking 8.8
catholic 8.8
sepia 8.8
smiling 8.7
day 8.6
sibling 8.4
church 8.3
vintage 8.3
style 8.2
aged 8.2
world 8.1
garment 8.1
lifestyle 8
smile 7.8
tenderness 7.8
color 7.8
child 7.8
men 7.7
god 7.7
marriage 7.6
females 7.6
head 7.6
senior 7.5
mature 7.4
tourism 7.4
room 7.3
girls 7.3
black 7.2
gown 7.2
posing 7.1
together 7
costume 7
look 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 99.8
indoor 87.2
old 75.8
posing 75.2
group 72.4
people 66.2
child 22.4
boy 14.9
family 7.7

Color Analysis

Face analysis

AWS Rekognition

Age 20-38
Gender Female, 99.9%
Calm 0.1%
Sad 0.3%
Confused 0.5%
Disgusted 0.5%
Angry 0.6%
Happy 96.7%
Surprised 1.3%

AWS Rekognition

Age 26-43
Gender Male, 99.9%
Happy 1.8%
Confused 35.2%
Disgusted 11.2%
Angry 14.7%
Calm 11.9%
Surprised 19.3%
Sad 6%

AWS Rekognition

Age 20-38
Gender Female, 87.9%
Sad 27.7%
Calm 42%
Disgusted 6.5%
Happy 4.3%
Angry 6.9%
Surprised 7.4%
Confused 5.2%

AWS Rekognition

Age 15-25
Gender Female, 99.9%
Happy 1.7%
Sad 9.3%
Surprised 3.9%
Disgusted 2.2%
Angry 32.8%
Calm 40.3%
Confused 9.8%
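
Each AWS Rekognition block above lists per-emotion confidence scores, in the shape of the `Emotions` array returned by a `DetectFaces` call with all attributes enabled. A minimal sketch of picking the dominant emotion from data of that shape (the `face` dict is a hand-made stand-in mirroring the first block above):

```python
# Hypothetical sketch: pick the highest-scoring emotion from a
# Rekognition-style Emotions array. The face dict is a stand-in
# mirroring the first AWS Rekognition block above, not live output.

def dominant_emotion(face):
    """Return (type, confidence) of the highest-scoring emotion."""
    best = max(face["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

face = {"Emotions": [
    {"Type": "HAPPY", "Confidence": 96.7},
    {"Type": "SURPRISED", "Confidence": 1.3},
    {"Type": "ANGRY", "Confidence": 0.6},
    {"Type": "CALM", "Confidence": 0.1},
]}
print(dominant_emotion(face))
```

The scores are a distribution over emotion classes, so a decisive face (96.7% Happy) and an ambiguous one (35.2% Confused in the second block) look very different even though both report a "top" emotion.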

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Cat 87.6%
Tie 77.4%

Categories

Imagga

paintings art 93.9%
people portraits 5.8%