Human Generated Data

Title

Untitled (Moore family women in matching dresses, holding flowers)

Date

c. 1938

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21843

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-03-11

Clothing 98.6
Apparel 98.6
Dress 98.2
Person 97.6
Human 97.6
Person 97.3
Person 95.5
Person 91.3
Person 89.3
Person 87.6
Person 85.4
Stage 83.7
Female 80.4
Evening Dress 73.6
Robe 73.6
Fashion 73.6
Gown 73.6
Leisure Activities 70.7
Person 68.3
People 66.7
Person 65.1
Girl 64
Face 62.3
Wedding 62
Dance Pose 61.8
Bridesmaid 61.5
Costume 60.3
Woman 60.3
Chair 57.1
Furniture 57.1
Dance 56.9
Room 55.2
Indoors 55.2
Person 51.1
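
The Amazon tags above are typical of the Rekognition DetectLabels API, which returns label names with confidence scores from 0 to 100. A minimal sketch with boto3; the file name is a placeholder and AWS credentials are assumed to be configured:

    import boto3

    client = boto3.client("rekognition")

    # The image file name is hypothetical; any local JPEG/PNG bytes work.
    with open("moore_family.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=50.0,  # the list above bottoms out near 50
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')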

Clarifai
created on 2023-10-22

people 99.9
group 99.7
adult 97.7
group together 97.4
dancer 96.4
actress 96.3
portrait 96.3
music 95.3
woman 95
veil 93.8
wear 93.8
wedding 93.8
man 93.3
costume 93
dress 92.8
actor 92.1
outfit 92.1
singer 92
many 91.2
musician 90.9
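
Clarifai's concepts come from its predict endpoint; values are returned in the 0-1 range and shown above as percentages. A sketch against the REST API; the model id, key placeholder, and image URL are assumptions, not values taken from this record:

    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_PAT"},  # placeholder key
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Convert the 0-1 value to the percentage scale used above.
        print(concept["name"], round(concept["value"] * 100, 1))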

Imagga
created on 2022-03-11

groom 53
fountain 32.4
dress 26.2
boutique 24.8
people 23.4
structure 21.3
bride 21.1
person 20.3
wedding 20.2
dancer 17.3
love 16.6
couple 16.5
adult 16.3
marriage 16.1
celebration 15.9
happy 15
fashion 14.3
performer 14.3
outdoors 14.2
day 14.1
happiness 14.1
two 13.5
women 13.4
park 13.2
man 12.8
outdoor 12.2
scene 12.1
party 12
waterfall 11.6
ceremony 11.6
face 11.4
water 11.3
bouquet 11.3
entertainer 10.7
married 10.5
life 9.9
romantic 9.8
interior 9.7
portrait 9.7
together 9.6
wife 9.5
men 9.4
motion 9.4
glass 9.3
holiday 9.3
elegance 9.2
joy 9.2
attractive 9.1
summer 9
romance 8.9
color 8.9
gown 8.8
bridal 8.7
flowers 8.7
wall 8.5
clothing 8.5
walking 8.5
stream 8.5
travel 8.4
pretty 8.4
city 8.3
landscape 8.2
new 8.1
lifestyle 7.9
costume 7.9
black 7.9
urban 7.9
teacher 7.8
male 7.8
luxury 7.7
old 7.7
husband 7.6
stone 7.6
human 7.5
holding 7.4
church 7.4
cheerful 7.3
lady 7.3
group 7.2
dance 7.2
art 7.2
dinner dress 7.2
river 7.1
spring 7.1
season 7
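
Imagga's tags come from its /v2/tags endpoint, which takes an image URL and HTTP basic auth. A sketch; the credentials and URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))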

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

dress 98.2
wedding dress 96.4
bride 92.3
text 91.8
clothing 89.5
person 85.9
woman 83.2
dance 82.2
wedding 72.1
flower 60.6
group 58.4
old 50.5
clothes 37
line 18.5
several 10.5
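
The Microsoft tags are the kind returned by the Azure Computer Vision tag operation. A sketch with the Python SDK; the endpoint, key, and image URL are placeholders, and the SDK reports confidence in the 0-1 range, shown above as percentages:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),
    )
    result = client.tag_image("https://example.com/photo.jpg")
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))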

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 95.6%
Happy 93.4%
Calm 3.8%
Sad 1.2%
Confused 0.5%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 87.8%
Calm 53%
Happy 39.5%
Sad 1.8%
Fear 1.4%
Surprised 1.4%
Confused 1.3%
Disgusted 1%
Angry 0.7%

AWS Rekognition

Age 35-43
Gender Male, 52.6%
Happy 98.6%
Sad 0.4%
Calm 0.4%
Surprised 0.2%
Fear 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 40-48
Gender Female, 84.2%
Happy 97.3%
Surprised 1.7%
Calm 0.4%
Fear 0.3%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Confused 0.1%

AWS Rekognition

Age 37-45
Gender Male, 85.8%
Happy 56.6%
Calm 36.3%
Surprised 4%
Confused 0.8%
Disgusted 0.7%
Sad 0.6%
Fear 0.6%
Angry 0.3%

AWS Rekognition

Age 48-54
Gender Female, 78.6%
Happy 49.6%
Sad 21.6%
Surprised 14.6%
Fear 5.8%
Calm 3.7%
Confused 2.2%
Disgusted 1.6%
Angry 1%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Surprised 62.1%
Happy 11.5%
Sad 10.8%
Calm 5.7%
Confused 3%
Disgusted 2.4%
Angry 2.3%
Fear 2.2%

AWS Rekognition

Age 47-53
Gender Male, 96.1%
Happy 86.3%
Calm 11.8%
Sad 1.1%
Surprised 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 29-39
Gender Female, 82.3%
Calm 85.5%
Happy 11.8%
Disgusted 0.9%
Sad 0.5%
Confused 0.5%
Fear 0.3%
Surprised 0.3%
Angry 0.2%
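
Each AWS Rekognition block above matches one entry in the FaceDetails array returned by DetectFaces when full attributes are requested: an estimated age range, a gender guess with its confidence, and a distribution over eight emotions. A minimal sketch; the file name is hypothetical:

    import boto3

    client = boto3.client("rekognition")
    with open("moore_family.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]          # e.g. {"Low": 35, "High": 43}
        gender = face["Gender"]         # {"Value": ..., "Confidence": ...}
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}',
              f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%',
              f'{top["Type"].capitalize()} {top["Confidence"]:.1f}%')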

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
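
The Google Vision results are qualitative because its face annotations express each attribute as a five-step Likelihood enum rather than a percentage. A sketch with the google-cloud-vision client; credentials are assumed to be configured and the file name is hypothetical:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("moore_family.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each likelihood is an enum: VERY_UNLIKELY .. VERY_LIKELY.
        print("Joy", face.joy_likelihood.name,
              "Headwear", face.headwear_likelihood.name,
              "Blurred", face.blurred_likelihood.name)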

Feature analysis

Amazon

Person
Person 97.6%
Person 97.3%
Person 95.5%
Person 91.3%
Person 89.3%
Person 87.6%
Person 85.4%
Person 68.3%
Person 65.1%
Person 51.1%
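
The per-person confidences above come from the Instances array that DetectLabels attaches to object labels such as Person, one entry per detected bounding box. A self-contained sketch; the file name is hypothetical:

    import boto3

    client = boto3.client("rekognition")
    with open("moore_family.jpg", "rb") as f:
        labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

    for label in labels:
        if label["Name"] == "Person":
            for inst in label.get("Instances", []):
                # Each instance carries its own bounding box and confidence.
                print(f'Person {inst["Confidence"]:.1f}%')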

Text analysis

Amazon

VCEV
SVEELA
VCEV SVEELA ЫГИ
ЫГИ
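
These tokens are raw output of the kind produced by the Rekognition DetectText API, most likely misreads of handwriting or studio markings in the photograph; they are kept exactly as returned. A minimal sketch; the file name is hypothetical:

    import boto3

    client = boto3.client("rekognition")
    with open("moore_family.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for det in response["TextDetections"]:
        # LINE entries aggregate WORD entries, which is why tokens repeat above.
        print(det["Type"], det["DetectedText"])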