Human Generated Data

Title

Untitled (mother with three children, reading book)

Date

1920

People

Artist: Hamblin Studio (American, active 1930s)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2141

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 100
Apparel 100
Person 99.3
Human 99.3
Person 98.6
Robe 96.8
Fashion 96.8
Person 96.5
Gown 96
Female 90.2
Person 89.1
Dress 89
Wedding 85.2
Bridegroom 79.4
Evening Dress 79.3
Face 77.6
Woman 77
Wedding Gown 76.7
Suit 73.9
Coat 73.9
Overcoat 73.9
People 68.9
Bride 66.1
Photography 62.4
Photo 62.4
Girl 60.3
Leisure Activities 60.1
Person 59.4
Dance Pose 56.2
Plant 55.9
Flower 55.9
Blossom 55.9
Kimono 55.7
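
The scores above are confidence percentages, and the list has the shape of Amazon Rekognition label-detection output. The record does not say how the tags were produced, so the following is only a minimal boto3 sketch of a call that would yield a comparable tag/confidence list; the file path is a placeholder, and MinConfidence is set near the lowest score shown above.

import boto3

# Detect labels in a local copy of the photograph (path is a placeholder).
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # lowest confidence in the list above is ~55.7
    )

# Print each label with its confidence, matching the layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")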

Clarifai
created on 2023-10-25

people 99.8
wedding 98.3
group 97
man 95.8
woman 95.1
veil 95
monochrome 94.2
bride 93.5
family 92.6
adult 92.4
wear 91.7
child 90.1
groom 90
princess 89.2
actress 89.1
three 87.3
several 87.3
two 87
dress 86.8
four 86
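
The Clarifai scores above read as percentages of the 0-1 concept values the service returns. A hedged sketch against Clarifai's public v2 REST predict endpoint and its general image-recognition model follows; the access token, image URL, and model ID are placeholders/assumptions, not details taken from this record.

import requests

CLARIFAI_PAT = "YOUR_PAT"  # placeholder personal access token
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concept values are 0-1; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")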

Imagga
created on 2021-12-14

negative 26.9
dress 26.2
portrait 24.6
bride 22.4
people 22.3
fashion 21.9
film 21.2
person 21
attractive 18.2
lady 17.9
dancer 17.8
photographic paper 16.4
adult 16
happiness 15.7
wedding 15.6
dance 15.5
elegance 15.1
cute 15.1
pretty 14.7
art 14.6
clothing 13.8
sexy 13.7
style 13.3
posing 13.3
model 13.2
groom 13.1
costume 12.8
happy 11.9
man 11.7
statue 11.4
old 11.1
performer 11
decoration 10.9
photographic equipment 10.9
bouquet 10.8
face 10.7
male 10.6
marriage 10.4
celebration 10.4
women 10.3
love 10.3
gorgeous 10
gown 9.9
interior 9.7
couple 9.6
flowers 9.6
hair 9.5
luxury 9.4
holiday 9.3
smile 9.3
joy 9.2
sculpture 9
human 9
mother 8.9
child 8.9
romantic 8.9
body 8.8
home 8.8
look 8.8
smiling 8.7
seductive 8.6
studio 8.4
traditional 8.3
vintage 8.3
holding 8.3
historic 8.2
makeup 8.2
indoor 8.2
girls 8.2
sensuality 8.2
make 8.2
antique 8.1
looking 8
lifestyle 7.9
gift 7.7
blond 7.7
modern 7.7
married 7.7
world 7.7
tourism 7.4
stylish 7.2
history 7.2
family 7.1
indoors 7
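
Imagga reports 0-100 confidences directly. A minimal sketch against Imagga's v2 /tags endpoint follows, assuming HTTP Basic auth with an API key/secret pair and a publicly reachable image URL (all placeholders):

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

# Each result carries a localized tag name and a 0-100 confidence.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")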

Google
created on 2021-12-14

(no tags returned)

Microsoft
created on 2021-12-14

text 97.6
clothing 90.4
sketch 87.8
person 85.5
window 84.8
dress 81.2
woman 77.9
player 76.1
black 68.8
old 65.7
drawing 63.6
white 63.3
footwear 62
human face 57.3
vintage 27.9
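
These tags match the shape of Azure Computer Vision tagging output. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_KEY"),          # placeholder key
)

# tag_image returns tags with 0-1 confidences; scale to percent as above.
result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")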

Color Analysis

(none recorded)

Face analysis

AWS Rekognition

Age 25-39
Gender Female, 62%
Sad 65.8%
Calm 29.6%
Surprised 1.8%
Confused 1.3%
Angry 0.7%
Fear 0.5%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 78.8%
Calm 59.8%
Sad 33.5%
Happy 3.6%
Confused 1.5%
Fear 0.7%
Angry 0.4%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 29-45
Gender Female, 77.7%
Calm 98.5%
Sad 1%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 17-29
Gender Female, 94.6%
Calm 72.3%
Happy 16%
Surprised 6.3%
Confused 3%
Angry 0.8%
Disgusted 0.8%
Sad 0.7%
Fear 0.2%
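
Each block above is one detected face, with an estimated age range, a gender guess with its confidence, and an emotion distribution summing to roughly 100%. A minimal boto3 sketch of the Rekognition face-detection call that produces this structure follows (file path is a placeholder):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the four blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Age {age['Low']}-{age['High']}  "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%  "
          f"{top['Type'].capitalize()} {top['Confidence']:.1f}%")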

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
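
The four blocks above are per-face results: Google Vision reports likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch with the google-cloud-vision client follows (file path is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# One FaceAnnotation per detected face; enum names map to the buckets above.
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(face.surprise_likelihood.name, face.anger_likelihood.name,
          face.sorrow_likelihood.name, face.joy_likelihood.name,
          face.headwear_likelihood.name, face.blurred_likelihood.name)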

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

paintings art 100%