Human Generated Data

Title

Untitled (bride and flower girl)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17272

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-02-26

Clothing 99.6
Apparel 99.6
Person 98.4
Human 98.4
Person 90.3
Helmet 89.5
Female 86.8
Evening Dress 86.2
Fashion 86.2
Gown 86.2
Robe 86.2
Costume 82.6
Face 81.7
Dress 70.5
Woman 67.7
Portrait 65.8
Photography 65.8
Photo 65.8
People 61.3
Cloak 58.6
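
The Amazon tags above are the kind of labels AWS Rekognition's DetectLabels operation returns. A minimal sketch with boto3, assuming AWS credentials are configured and using a hypothetical local file name:

```python
# Minimal sketch: reproduce Rekognition-style label tags with boto3.
# Assumes AWS credentials are configured; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # the lowest tag listed above is Cloak at 58.6
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```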

Clarifai
created on 2023-10-29

people 99.8
child 98.7
portrait 98
wear 97.6
two 97.5
monochrome 96.3
woman 96.3
veil 96.2
adult 95.9
dress 94.2
baby 93.6
wedding 92.7
family 91.8
girl 90.3
street 90.1
son 89.6
actress 89.4
offspring 88
art 86.1
man 84.4
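
The Clarifai concepts above could be reproduced with Clarifai's v2 prediction API. A minimal sketch against the REST endpoint; the API key, model ID, and file name are placeholders, and exact model IDs and auth details vary by account setup:

```python
# Minimal sketch: request Clarifai-style concept tags via the v2 REST API.
# The API key, model ID, and file name are placeholders/assumptions.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # hypothetical
MODEL_ID = "general-image-recognition"  # assumed public general model

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
resp.raise_for_status()

# Clarifai returns concept values on a 0-1 scale; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```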

Imagga
created on 2022-02-26

statue 43.7
cloak 33.5
covering 33.3
sculpture 30.2
architecture 25.8
old 20.9
ancient 20.8
city 19.9
cemetery 18.3
travel 18.3
history 17.9
building 17.5
stone 17.3
monument 16.8
culture 15.4
religion 15.2
historic 14.7
tourism 14
clothing 13.7
art 13.5
landmark 13.5
man 12.9
historical 11.3
famous 11.2
tourist 10.4
religious 10.3
traditional 10
weapon 9.8
mask 9.8
antique 9.6
bearskin 9.5
people 9.5
black 9.3
temple 9.2
bronze 9.1
portrait 9.1
fashion 9
dress 9
hat 8.9
love 8.7
palace 8.7
pedestal 8.6
capital 8.5
figure 8.5
male 8.5
protective covering 7.8
face 7.8
attractive 7.7
outdoor 7.6
place 7.4
style 7.4
carving 7.2
structure 7.1
life 7
coat 7
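
The Imagga tags above follow the shape of Imagga's /v2/tags endpoint. A minimal sketch, assuming API credentials and a publicly reachable image URL (both placeholders):

```python
# Minimal sketch: fetch Imagga-style tags from the /v2/tags endpoint.
# Credentials and the image URL are placeholders; the response layout
# assumed here is Imagga's documented {"result": {"tags": [...]}} shape.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # hypothetical
IMAGGA_SECRET = "YOUR_API_SECRET"  # hypothetical
IMAGE_URL = "https://example.org/untitled_bride_and_flower_girl.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```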

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 96.1
hydrant 94.3
fire 85.5
black and white 82
black 81.8
text 79.7
clothing 77.2
person 70.5
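
The Microsoft tags above match the Tags feature of Azure Computer Vision's image analysis API. A minimal sketch against the v3.2 REST endpoint, with the endpoint, key, and file name as placeholders:

```python
# Minimal sketch: Azure Computer Vision image tagging via the v3.2 REST API.
# Endpoint, key, and file name are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # hypothetical
KEY = "YOUR_AZURE_KEY"                                          # hypothetical

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
resp.raise_for_status()

# Azure reports confidences on a 0-1 scale; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```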

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 80.9%
Happy 61.1%
Calm 32.4%
Surprised 3.7%
Sad 1%
Disgusted 0.6%
Fear 0.6%
Angry 0.4%
Confused 0.2%

AWS Rekognition

Age 27-37
Gender Female, 62.5%
Calm 57%
Happy 23.6%
Surprised 15.4%
Angry 1.7%
Disgusted 1.1%
Confused 0.5%
Sad 0.4%
Fear 0.3%
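
The age, gender, and emotion estimates above are the fields AWS Rekognition's DetectFaces returns for each detected face. A minimal sketch with boto3, assuming AWS credentials and a hypothetical file name:

```python
# Minimal sketch: per-face age/gender/emotion estimates with Rekognition
# DetectFaces. Assumes AWS credentials; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required for age, gender, and emotion fields
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```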

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
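
The Google Vision results above are likelihood ratings rather than percentages. A minimal sketch of face detection with the google-cloud-vision client, assuming Google credentials are configured and using a hypothetical file name:

```python
# Minimal sketch: Google Cloud Vision face detection, which reports
# likelihoods (VERY_UNLIKELY ... VERY_LIKELY) rather than percentages.
# Assumes Google credentials are configured; file name is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```

The VERY_UNLIKELY through VERY_LIKELY enum names correspond to the "Very unlikely" wording shown above.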

Feature analysis

Amazon

Person 98.4%
Person 90.3%
Helmet 89.5%
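
The per-object confidences above (two Person detections and one Helmet) correspond to the Instances that Rekognition attaches to object labels in the DetectLabels response. A minimal sketch, again with boto3 and a hypothetical file name:

```python
# Minimal sketch: per-instance confidences (e.g. two Person detections)
# come from the Instances field on Rekognition labels.
# Assumes AWS credentials; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_bride_and_flower_girl.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at '
              f'({box["Left"]:.2f}, {box["Top"]:.2f}, '
              f'{box["Width"]:.2f}x{box["Height"]:.2f})')
```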

Categories

Text analysis

Amazon

T37A2-
T37A2- "AGOX
"AGOX

Google

ヨヨA2
ヨヨ
A2
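
The detected strings above (likely film-edge markings) could be reproduced with Rekognition's DetectText and Google Vision's text detection. A minimal sketch covering both, assuming credentials for each service and a hypothetical file name:

```python
# Minimal sketch: OCR with Rekognition DetectText and Google Vision
# text detection. Assumes credentials for both services; the file name
# is hypothetical.
import boto3
from google.cloud import vision

IMAGE_PATH = "untitled_bride_and_flower_girl.jpg"

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Amazon: returns LINE and WORD detections, each with a confidence.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')

# Google: the first annotation is the full text; the rest are individual words.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```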