Human Generated Data

Title

Untitled (family portrait with three figures standing and seated, heavy overpainting)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2901

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Painting 99.1
Art 99.1
Person 98.6
Person 98.5
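
Each machine-generated tag above is a label followed by a confidence score in percent. As a minimal sketch of how such lists are typically consumed, the snippet below parses "label score" lines and keeps only high-confidence tags; the sample data is copied from the Amazon list above, and the 99.0 threshold is an arbitrary assumption:

```python
# Parse "label confidence" lines (as in the Amazon tag list above)
# and keep only tags at or above a confidence threshold.
def filter_tags(lines, threshold=99.0):
    tags = []
    for line in lines:
        label, score = line.rsplit(" ", 1)  # split off the trailing confidence
        tags.append((label, float(score)))
    return [(label, score) for label, score in tags if score >= threshold]

amazon_tags = [
    "Person 99.6",
    "Human 99.6",
    "Painting 99.1",
    "Art 99.1",
    "Person 98.6",
    "Person 98.5",
]
print(filter_tags(amazon_tags))
# → [('Person', 99.6), ('Human', 99.6), ('Painting', 99.1), ('Art', 99.1)]
```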

Clarifai
created on 2023-10-25

people 99.5
painting 98.8
art 98.2
man 97.9
religion 94.9
woman 93.6
adult 93.2
group 91.7
lid 91.2
illustration 91.2
print 90.9
priest 89.1
elderly 88.6
three 88.4
Renaissance 88.4
wear 88.3
two 87.7
child 86.7
saint 81.9
winter 81.1

Imagga
created on 2022-01-09

metropolitan 50.2
wicker 22.4
people 20.1
man 18.8
male 18.4
vestment 17.6
portrait 16.8
person 16.6
culture 16.2
work 16.1
clothing 15.8
face 14.9
gown 14.9
couple 13.9
happy 13.8
rattan 13.6
costume 13.5
smiling 13
sitting 12.9
old 12.5
product 12.5
traditional 12.5
holiday 12.2
men 12
adult 11.7
religion 11.6
mother 11.2
dress 10.8
switch 10.8
outerwear 10.6
cheerful 10.6
together 10.5
outdoors 10.4
religious 10.3
senior 10.3
fashion 9.8
father 9.6
love 9.5
lifestyle 9.4
colorful 9.3
church 9.2
art 9.2
lady 8.9
family 8.9
pray 8.7
scene 8.7
creation 8.6
happiness 8.6
faith 8.6
child 8.4
tradition 8.3
tourism 8.2
instrument of punishment 8.2
history 8
home 8
oriental 8
women 7.9
look 7.9
travel 7.7
golden 7.7
holy 7.7
god 7.7
outdoor 7.6
historical 7.5
vintage 7.4
gold 7.4
color 7.2
smile 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 97.9
text 97.8
person 97.1
painting 96.2
old 90.2
woman 84.4
human face 80.7
drawing 67.4
clothes 20.7

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 55.5%
Calm 77.9%
Disgusted 5.3%
Fear 5.1%
Confused 4.4%
Sad 3.5%
Angry 1.5%
Surprised 1.4%
Happy 0.9%

AWS Rekognition

Age 19-27
Gender Female, 100%
Calm 78.6%
Sad 18.3%
Confused 1.2%
Angry 0.7%
Fear 0.5%
Surprised 0.3%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 61-71
Gender Male, 100%
Calm 96.9%
Sad 2.8%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Happy 0%
Fear 0%
Surprised 0%
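
Each AWS Rekognition block above reports a per-face emotion distribution; the face's dominant emotion is simply the highest-scoring entry. A small illustrative sketch, using the scores from the third face above (the helper name is hypothetical, not part of the Rekognition API):

```python
# Emotion scores (percent) copied from the third AWS Rekognition face above.
emotions = {
    "Calm": 96.9, "Sad": 2.8, "Confused": 0.1, "Disgusted": 0.1,
    "Angry": 0.0, "Happy": 0.0, "Fear": 0.0, "Surprised": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # → ('Calm', 96.9)
```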

Microsoft Cognitive Services

Age 68
Gender Male

Microsoft Cognitive Services

Age 10
Gender Male

Microsoft Cognitive Services

Age 15
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
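
Unlike the other services, Google Vision reports likelihood buckets rather than percentages. To compare them, the buckets can be mapped to an ordinal scale; the ordering below follows Vision's standard likelihood values, while the specific integer weights are an assumption for illustration:

```python
# Ordinal ranking of Google Vision likelihood buckets, lowest to highest.
LIKELIHOOD_RANK = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

def more_likely(a, b):
    """True if bucket a indicates a higher likelihood than bucket b."""
    return LIKELIHOOD_RANK[a] > LIKELIHOOD_RANK[b]

# The second face's "Headwear Unlikely" outranks the others' "Very unlikely".
print(more_likely("Unlikely", "Very unlikely"))  # → True
```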

Feature analysis

Amazon

Person 99.6%
Painting 99.1%
