Human Generated Data

Title

Untitled (unidentified woman carrying two wicker containers)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.37

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.1
Human 99.1
Art 94.5
Painting 90.1
Sport 60
Sports 60
Lute 57
Musical Instrument 57

Clarifai
created on 2023-10-28

people 99.7
art 99.3
portrait 99.1
one 98.8
painting 98.5
man 97.9
adult 96.7
wear 96.5
print 95.2
retro 92.9
woman 91.9
old 90.3
two 87.7
antique 87.6
music 87.2
vintage 85.4
sepia 84.5
no person 83.9
illustration 82.6
sepia pigment 81.7

Imagga
created on 2022-02-25

statue 55
sculpture 33.9
column 28.8
art 24.8
religion 19.7
architecture 18.7
stone 17.9
old 17.4
ancient 16.4
fountain 16
marble 15.8
body 15.2
religious 15
monument 14.9
structure 14.1
person 13.3
detail 12.9
building 12.7
dress 12.6
sketch 12.5
historical 12.2
travel 12
portrait 11.6
history 11.6
cemetery 11.4
face 11.4
sexy 11.2
people 11.2
figure 11.1
carving 10.9
antique 10.8
catholic 10.7
lady 10.5
god 10.5
wall 10.3
culture 10.3
historic 10.1
fashion 9.8
carved 9.8
human 9.7
adult 9.7
black 9.6
man 9.4
model 9.3
representation 9.3
tourism 9.1
drawing 9
famous 8.4
city 8.3
decoration 8.3
girls 8.2
landmark 8.1
device 7.8
attractive 7.7
head 7.6
elegance 7.6
cross 7.5
traditional 7.5
symbol 7.4
church 7.4
exterior 7.4
hair 7.1
love 7.1
male 7.1

Microsoft
created on 2022-02-25

text 98.5
drawing 97.6
painting 92.3
sketch 90
indoor 88.1
person 78.2
clothing 65.3
old 63.4
cartoon 56.2
square 17.9

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 99.9%
Calm 81.8%
Fear 5.2%
Confused 3.6%
Disgusted 3%
Angry 2.8%
Surprised 2.1%
Sad 1.1%
Happy 0.3%

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.1%

Captions

Microsoft
created on 2022-02-25

an old photo of a vase 28.2%
an old photo of a box 28.1%
an old photo of a bird 25.5%

Text analysis

Amazon

G
R
G R Buttras
Buttras

Google

GR Buttras
GR
Buttras