Human Generated Data

Title

Untitled (two children eating watermelon)

Date

1970s copy negative from a c. 1935 negative

People

Artist: C. Bennette Moore, American, 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21763

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-03-11

Hat 99.9
Clothing 99.9
Apparel 99.9
Person 98.7
Human 98.7
Person 95.9
Hat 95.5
Collage 78.3
Advertisement 78.3
Poster 78.3
Bonnet 72.9
Sun Hat 67.7
Face 66.4
Icing 63.8
Food 63.8
Dessert 63.8
Cake 63.8
Cream 63.8
Creme 63.8
Art 62.9
Kid 61.2
Child 61.2
Baby 56.2
Portrait 55.2
Photography 55.2
Photo 55.2
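
These labels follow the shape of the AWS Rekognition DetectLabels response (a label name with a confidence score on a 0-100 scale). A minimal sketch of how comparable tags could be generated with boto3; the S3 bucket and object name are hypothetical placeholders:

import boto3

rekognition = boto3.client("rekognition")

# DetectLabels returns up to MaxLabels labels at or above MinConfidence.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scan.jpg"}},
    MaxLabels=30,
    MinConfidence=55,
)

# Print each label with its confidence, matching the "Hat 99.9" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")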

Clarifai
created on 2023-10-22

people 99.7
group 99.3
food 96.6
lid 96.6
wear 94.1
adult 94.1
vehicle 93.4
no person 93.1
three 92.8
one 92.6
monochrome 92.1
collage 91.9
illustration 91.7
many 91.2
two 90.8
watercraft 90.8
man 90.8
child 89.9
movie 89.3
furniture 88.9
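
A comparable sketch for the Clarifai tags, assuming the v2 REST endpoint and the public general image-recognition model; the API key, model ID, and image URL are placeholders, and Clarifai's 0-1 concept values are scaled to percentages to match the display above:

import requests

CLARIFAI_KEY = "YOUR_API_KEY"               # placeholder credential
IMAGE_URL = "https://example.org/scan.jpg"  # placeholder image location

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with values in [0, 1]; scale to match "people 99.7" above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")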

Imagga
created on 2022-03-11

negative 44.7
film 35.4
photographic paper 26.6
newspaper 20.2
photographic equipment 17.8
people 16.2
product 15.8
man 15.5
statue 14.5
creation 12.8
outdoors 12.7
sculpture 11.9
person 11.7
perfume 11.6
sky 11.5
black 10.8
male 10.7
symbol 10.1
religion 9.8
landscape 9.7
art 9.5
toiletry 9.4
background 9.3
park 9.2
old 9
adult 9
light 8.7
love 8.7
water 8.7
travel 8.4
religious 8.4
portrait 8.4
dark 8.3
screen 8.3
color 8.3
human 8.2
clothing 8
face 7.8
architecture 7.8
summer 7.7
glass 7.7
spirituality 7.7
world 7.6
television 7.6
head 7.6
power 7.5
serene 7.5
fashion 7.5
happy 7.5
monument 7.5
lifestyle 7.2
night 7.1
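
For the Imagga tags, a sketch assuming the public /v2/tags endpoint with HTTP basic authentication; the key, secret, and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_KEY"                     # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"
IMAGE_URL = "https://example.org/scan.jpg"  # placeholder image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Imagga reports confidence on a 0-100 scale, as shown above (e.g. "negative 44.7").
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")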

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 99.9
book 90.7
window 80.4
screenshot 74.2
hat 62.8
old 48.4
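
The Microsoft tags match the output of the Azure Computer Vision analyze operation with the Tags visual feature; a sketch assuming the v3.2 REST API, with the endpoint, key, and image URL as placeholders:

import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder resource endpoint
KEY = "YOUR_KEY"                                          # placeholder credential
IMAGE_URL = "https://example.org/scan.jpg"                # placeholder image location

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidence is returned in [0, 1]; scale to match "text 99.9" above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")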

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Female, 97.4%
Sad 30.1%
Angry 28.7%
Calm 14%
Disgusted 9.7%
Surprised 9.2%
Happy 4.4%
Confused 2.1%
Fear 1.7%

AWS Rekognition

Age 2-10
Gender Male, 98.1%
Calm 99.7%
Happy 0.2%
Sad 0%
Disgusted 0%
Angry 0%
Surprised 0%
Confused 0%
Fear 0%
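
The age range, gender, and emotion scores above follow the AWS Rekognition DetectFaces response format. A minimal sketch, again with a placeholder S3 location:

import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds age range, gender, and emotions to each FaceDetail.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scan.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions carry 0-100 confidences; list them from most to least likely.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")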

Feature analysis

Amazon

Hat 99.9%
Hat 95.5%
Person 98.7%
Person 95.9%

Categories

Imagga

paintings art 94.1%
text visuals 1.7%
food drinks 1.6%
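
These category scores resemble output from Imagga's categorization endpoint; a sketch assuming the /v2/categories route and the "personal_photos" categorizer (the categorizer ID, credentials, and image URL are all placeholders/assumptions):

import requests

IMAGGA_KEY = "YOUR_KEY"                     # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"
IMAGE_URL = "https://example.org/scan.jpg"  # placeholder image location

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each category has an English name and a 0-100 confidence.
for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")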

Captions

Microsoft
created on 2022-03-11

an old photo of a person 69.1%
old photo of a person 65.9%
a screen shot of a person 65.8%
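
Captions like these correspond to the Azure Computer Vision describe operation; a sketch assuming the v3.2 REST API, with the same kind of placeholder endpoint, key, and image URL as above:

import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder resource endpoint
KEY = "YOUR_KEY"                                          # placeholder credential
IMAGE_URL = "https://example.org/scan.jpg"                # placeholder image location

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Each candidate caption has a text string and a [0, 1] confidence.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")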

Text analysis

Amazon

5
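
The detected text ("5") is the kind of result returned by the AWS Rekognition DetectText API; a minimal sketch with a placeholder S3 location:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scan.jpg"}},
)

# Report whole detected lines (Rekognition also returns individual WORD detections).
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")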