Human Generated Data

Title

Untitled (family in front of fence and trees)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1224

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.6
Human 98.6
Person 98.5
Person 97.5
People 96.2
Family 93.2
Person 89.4
Face 76.4
Painting 68.4
Art 68.4
Photography 62.9
Photo 62.9
Text 58.4
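
The Amazon labels above are the kind of output returned by AWS Rekognition's DetectLabels operation, which reports each label name with a 0-100 confidence score. A minimal sketch of how such tags could be reproduced with boto3, assuming a local copy of the photograph and configured AWS credentials (file name and region are placeholders):

    import boto3

    # Assumes AWS credentials are configured; the region is an example choice.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # only return labels scored at 50% or higher
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")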

Clarifai
created on 2023-10-26

people 99.9
child 99.6
portrait 99.5
son 99.4
group 99.4
family 98
offspring 97.7
three 96.9
adult 96.5
four 95.8
sepia 95.4
two 95.1
sibling 94.7
man 94.7
administration 94
documentary 92.3
baby 92
wear 92
monochrome 90.9
nostalgia 89
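
The Clarifai concepts above resemble output from Clarifai's general image-recognition model, which scores each concept from 0 to 1 (shown here as a percentage). A rough sketch against the v2 REST API; the API key, model id, and image URL are all placeholders, and the exact endpoint and auth scheme may differ by account and API version:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
    MODEL_ID = "general-image-recognition"       # assumed public model id
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}",
                 "Content-Type": "application/json"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    # Each concept carries a name and a 0-1 score.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")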

Imagga
created on 2022-01-23

kin 83.5
statue 38.3
sculpture 30.1
art 23.7
old 23
ancient 21.6
culture 19.7
religion 18.8
architecture 18.8
stone 18.6
world 18.5
history 17.9
vintage 16.5
historical 16
monument 15.9
cemetery 15.6
marble 15.5
religious 15
antique 14.7
face 13.5
people 12.3
memorial 12.2
historic 11.9
portrait 11.7
god 11.5
building 11.1
carving 11
museum 10.8
man 10.8
male 10.6
travel 10.6
detail 10.5
one 10.5
famous 10.2
symbol 10.1
head 10.1
figure 10.1
city 10
decoration 9.4
fountain 9.3
grave 9.1
tourism 9.1
black 9
renaissance 8.8
closeup 8.8
catholic 8.8
person 8.7
sepia 8.7
church 8.3
paintings 7.8
roman 7.8
death 7.7
saint 7.7
traditional 7.5
mask 7.3
painter 7.1
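
The Imagga tags above match the shape of Imagga's /v2/tags endpoint, which scores each tag from 0 to 100. A sketch using HTTP basic auth with an API key and secret, tagging an image by URL; all credentials and the URL are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"              # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"        # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs a confidence score with a localized tag name.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")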

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.6
clothing 98.3
person 97.8
human face 96.7
baby 96.3
toddler 94
old 89.1
electronics 85.9
posing 83.8
smile 81.7
picture frame 74.1
boy 67
white 65.2
child 63.1
image 42.4
vintage 26.9
display 26.7
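
The Microsoft tags above look like output from the Azure Computer Vision Analyze Image operation with the Tags visual feature, which reports confidence from 0 to 1 (shown here as a percentage). A sketch against the v3.2 REST endpoint, assuming an Azure resource endpoint and subscription key (both placeholders):

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"                     # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")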

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 63%
Calm 52.4%
Angry 37.1%
Confused 3.5%
Sad 3%
Disgusted 1.4%
Fear 1.2%
Happy 0.8%
Surprised 0.6%
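
The age range, gender estimate, and emotion scores above are the fields AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal boto3 sketch, assuming a local copy of the photograph and configured AWS credentials (file name and region are placeholders):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")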

Feature analysis

Amazon

Person 98.6%
Painting

Categories