Human Generated Data

Title

Untitled (children in field holding vegetables)

Date

early 20th century

People

Artist: Caufield and Shook, American, 1903-1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.165

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.5
Person 99.3
Outdoors 99.2
Person 99
Person 98.8
Nature 97.6
Person 97.6
Person 94.5
Person 92.4
Person 91.5
Person 90.1
Countryside 89.6
Garden 84.3
Person 82.1
Rural 78.9
Building 78.7
Gardener 77.8
Worker 77.8
Gardening 77.8
Yard 77.1
Urban 69.6
Plant 61.8
Housing 61.5
Agriculture 60.9
Field 60.9
People 60.4

Clarifai
created on 2023-10-15

people 99.9
child 99.8
group 99.6
home 99.5
family 98.8
group together 97.8
boy 97.5
adult 97.2
girl 97
monochrome 97
woman 95.7
man 94.4
portrait 93.8
son 93.7
house 92.3
many 92.3
war 89.3
campsite 88.6
dog 87.8
offspring 86

Imagga
created on 2021-12-14

kin 50.6
old 18.1
people 17.3
man 12.8
building 12.6
groom 11.2
musical instrument 11.2
person 11.1
tree 10.8
vintage 10.7
child 10
adult 9.9
religion 9.9
bride 9.6
love 9.5
dark 9.2
landscape 8.9
home 8.8
couple 8.7
forest 8.7
life 8.6
two 8.5
travel 8.4
house 8.4
outdoor 8.4
church 8.3
outdoors 8.2
new 8.1
track 7.9
black 7.8
male 7.8
antique 7.8
future 7.4
structure 7.4
park 7.4
barbershop 7.4
protection 7.3
dress 7.2
holiday 7.2
trees 7.1
to 7.1
day 7.1
architecture 7
sky 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

sky 99.7
outdoor 97.2
person 92
clothing 86.5
standing 85.5
house 76.4
old 73
white 71.2
black 69.4
farm 69.4
black and white 66.5
group 63.6
woman 51.9
crowd 2.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 16-28
Gender Female, 97.7%
Happy 99.7%
Calm 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%
Fear 0%
Sad 0%

Feature analysis

Amazon

Person
Person 99.8%

Text analysis

Amazon

OPPRE
peesa