Human Generated Data

Title

Untitled (three women, two standing, one seated, full-length)

Date

c.1856 - c.1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3864

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Clothing 99.8
Apparel 99.8
Person 99
Human 99
Person 98.5
Person 97
Costume 92.9
Painting 89.6
Art 89.6
Face 80.2
Fashion 72.5
Overcoat 72.3
Coat 72.3
Portrait 71.7
Photography 71.7
Photo 71.7
Performer 70.8
Robe 69.4
Gown 65.6
Hat 61.5
Cloak 58.1
Sleeve 57

Clarifai
created on 2021-04-03

people 99.6
child 98.9
portrait 98.3
son 97.5
art 97.2
wear 96.7
man 94.3
baby 93.9
veil 93.8
group 93.7
woman 93.3
adult 92.7
painting 92.3
two 91.4
lid 90
boy 89.8
family 88
one 86.7
religion 85.9
room 84.7

Imagga
created on 2021-04-03

kin 42.2
person 19
world 18.7
dark 18.4
black 17.6
man 17.5
people 17.3
one 16.4
religion 16.1
old 16
vintage 14.9
window 14.6
art 14.4
male 13.5
portrait 12.9
military uniform 12.8
light 12
body 12
adult 11.7
clothing 11.4
human 11.2
ancient 11.2
sexy 11.2
uniform 10.8
fashion 10.5
attractive 10.5
love 10.3
church 10.2
antique 9.5
architecture 9.5
historical 9.4
culture 9.4
dress 9
night 8.9
interior 8.8
room 8.8
cell 8.8
hair 8.7
covering 8.7
passion 8.5
fire 8.4
makeup 8.2
sensuality 8.2
lady 8.1
fantasy 8.1
posing 8
sculpture 7.9
darkness 7.8
face 7.8
model 7.8
prayer 7.7
wall 7.7
saint 7.7
mystery 7.7
statue 7.6
erotic 7.6
religious 7.5
style 7.4
symbol 7.4
history 7.2
device 7

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

text 98.5
clothing 97
human face 95.9
person 95.3
indoor 89.9
retro 70.5
woman 68.7
door 52
picture frame 8.3
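Each tagging service above pairs a label with a confidence score, flattened here as "label score" lines. A minimal sketch of parsing such lines into structured pairs and filtering by a confidence cutoff — the `parse_tags`/`confident` helpers and the 90-point threshold are illustrative choices, not part of any vendor API:

```python
# Sample lines copied from the Amazon tag list above.
raw = """Clothing 99.8
Apparel 99.8
Person 99
Painting 89.6
Sleeve 57"""

def parse_tags(text):
    """Split each line at the last space: the label itself may contain
    spaces (e.g. "military uniform 12.8" in the Imagga list)."""
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def confident(tags, threshold=90.0):
    # threshold is an arbitrary cutoff chosen for illustration
    return [label for label, score in tags if score >= threshold]

print(confident(parse_tags(raw)))  # labels scoring at least 90
```

Splitting on the last space rather than the first keeps multi-word labels intact.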

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 95%
Calm 96.6%
Sad 1.3%
Confused 0.5%
Angry 0.4%
Disgusted 0.4%
Happy 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 32-48
Gender Female, 60%
Calm 89%
Sad 5.3%
Confused 1.4%
Angry 1.1%
Fear 1%
Happy 0.9%
Disgusted 0.8%
Surprised 0.5%

AWS Rekognition

Age 20-32
Gender Male, 62.3%
Calm 51.9%
Happy 23.1%
Sad 16.2%
Fear 2.7%
Surprised 2.4%
Angry 1.7%
Confused 1.4%
Disgusted 0.6%
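Each AWS Rekognition face result above reports a full emotion distribution rather than a single label. A small sketch of reducing one distribution to its dominant emotion — the dict copies the scores from the first face analysis above, and `dominant_emotion` is a hypothetical helper, not a Rekognition API call:

```python
# Emotion scores from the first AWS Rekognition face result above.
face_1 = {
    "Calm": 96.6, "Sad": 1.3, "Confused": 0.5, "Angry": 0.4,
    "Disgusted": 0.4, "Happy": 0.3, "Surprised": 0.3, "Fear": 0.2,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_1))  # ('Calm', 96.6)
```

Note that the third face's distribution is far less peaked (Calm 51.9% vs. Happy 23.1%), so a dominant-label reduction discards real ambiguity there.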

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely
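Unlike Rekognition, Google Vision reports face attributes as ordinal likelihood strings rather than percentages. Mapping those strings onto integers makes the three faces comparable; the ordering follows Vision's Likelihood scale, while `face_3` copies the third result above and `at_least` is an illustrative helper:

```python
# Ordinal ranking of Google Vision likelihood strings.
LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

# The third Google Vision face above: Joy and Blurred are "Unlikely".
face_3 = {
    "Surprise": "Very unlikely", "Anger": "Very unlikely",
    "Sorrow": "Very unlikely", "Joy": "Unlikely",
    "Headwear": "Very unlikely", "Blurred": "Unlikely",
}

def at_least(face, attribute, level):
    """True if the attribute's likelihood meets or exceeds `level`."""
    return LIKELIHOOD[face[attribute]] >= LIKELIHOOD[level]

print(at_least(face_3, "Joy", "Unlikely"))    # True
print(at_least(face_3, "Anger", "Possible"))  # False
```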

Feature analysis

Amazon

Person 99%
Painting 89.6%
