Human Generated Data

Title

Untitled (children in field holding vegetables)

Date

early 20th century

People

Artist: Caufield and Shook, American, 1903–1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.165

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.5
Person 99.3
Outdoors 99.2
Person 99
Person 98.8
Nature 97.6
Person 97.6
Person 94.5
Person 92.4
Person 91.5
Person 90.1
Countryside 89.6
Garden 84.3
Person 82.1
Rural 78.9
Building 78.7
Gardener 77.8
Worker 77.8
Gardening 77.8
Yard 77.1
Urban 69.6
Plant 61.8
Housing 61.5
Agriculture 60.9
Field 60.9
People 60.4
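
The label/confidence pairs above match the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such a list can be reproduced, assuming configured AWS credentials and a hypothetical local copy of the photograph at photo.jpg (the record does not document the museum's actual pipeline):

```python
# Minimal sketch: reproduce an Amazon-style tag list with AWS Rekognition.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # the list above has roughly 28 labels
    MinConfidence=60.0,  # the lowest confidence shown above is ~60.4
)

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```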

Clarifai
created on 2023-10-15

people 99.9
child 99.8
group 99.6
home 99.5
family 98.8
group together 97.8
boy 97.5
adult 97.2
girl 97
monochrome 97
woman 95.7
man 94.4
portrait 93.8
son 93.7
house 92.3
many 92.3
war 89.3
campsite 88.6
dog 87.8
offspring 86
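
The Clarifai concepts above follow the shape of Clarifai's v2 predict endpoint. A minimal sketch, assuming the public general image-recognition model; the model id, API key, and image URL are all placeholders:

```python
# Minimal sketch: Clarifai-style concept tags via the v2 predict REST
# endpoint. MODEL_ID, API_KEY, and IMAGE_URL are placeholders; the record
# does not say which Clarifai model or credentials were actually used.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public general model
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with 0-1 scores; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```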

Imagga
created on 2021-12-14

kin 50.6
old 18.1
people 17.3
man 12.8
building 12.6
groom 11.2
musical instrument 11.2
person 11.1
tree 10.8
vintage 10.7
child 10
adult 9.9
religion 9.9
bride 9.6
love 9.5
dark 9.2
landscape 8.9
home 8.8
couple 8.7
forest 8.7
life 8.6
two 8.5
travel 8.4
house 8.4
outdoor 8.4
church 8.3
outdoors 8.2
new 8.1
track 7.9
black 7.8
male 7.8
antique 7.8
future 7.4
structure 7.4
park 7.4
barbershop 7.4
protection 7.3
dress 7.2
holiday 7.2
trees 7.1
to 7.1
day 7.1
architecture 7
sky 7
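
The Imagga tags above match the response of Imagga's /v2/tags endpoint, which reports confidences on a 0-100 scale. A minimal sketch, with placeholder credentials and image URL:

```python
# Minimal sketch: Imagga-style tags via the /v2/tags endpoint with HTTP
# Basic auth. Key, secret, and image URL are placeholders.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")          # placeholders
params = {"image_url": "https://example.org/photo.jpg"}   # placeholder

response = requests.get("https://api.imagga.com/v2/tags",
                        auth=auth, params=params)
response.raise_for_status()

# Each entry pairs a tag (by language) with a 0-100 confidence.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```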

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

sky 99.7
outdoor 97.2
person 92
clothing 86.5
standing 85.5
house 76.4
old 73
white 71.2
black 69.4
farm 69.4
black and white 66.5
group 63.6
woman 51.9
crowd 2.1
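
The Microsoft tags above follow the shape of Azure Computer Vision's Analyze Image API with the Tags visual feature. A minimal sketch, assuming the v3.2 REST endpoint with placeholder resource, key, and image URL; Azure reports 0-1 confidences, which the list above appears to show as percentages:

```python
# Minimal sketch: Microsoft-style tags via Azure Computer Vision's Analyze
# Image API (v3.2). Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder
)
response.raise_for_status()

# Azure confidences are 0-1; scale to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```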

Face analysis

AWS Rekognition

Age 16-28
Gender Female, 97.7%
Happy 99.7%
Calm 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%
Fear 0%
Sad 0%

AWS Rekognition

Age 3-11
Gender Female, 95.8%
Happy 98.8%
Calm 0.6%
Confused 0.2%
Angry 0.2%
Sad 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 13-23
Gender Female, 64.5%
Happy 98.9%
Calm 0.2%
Fear 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%

AWS Rekognition

Age 10-20
Gender Female, 97.5%
Happy 37.2%
Sad 29.2%
Fear 21.8%
Disgusted 3%
Confused 2.8%
Surprised 2.6%
Angry 2.1%
Calm 1.3%

AWS Rekognition

Age 17-29
Gender Male, 99.7%
Angry 52.9%
Calm 28.1%
Sad 13.8%
Happy 2.4%
Fear 1.6%
Confused 0.8%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 23-35
Gender Female, 81.8%
Confused 49%
Sad 17.2%
Happy 16.8%
Fear 8.4%
Calm 4%
Disgusted 2.3%
Surprised 1.5%
Angry 0.9%

AWS Rekognition

Age 17-29
Gender Male, 87.7%
Calm 49%
Surprised 19.5%
Sad 11.5%
Angry 10%
Confused 4.1%
Happy 2.6%
Fear 1.8%
Disgusted 1.6%

AWS Rekognition

Age 26-42
Gender Female, 54.1%
Happy 39.5%
Sad 17.6%
Calm 16.5%
Angry 15.5%
Fear 7.4%
Disgusted 1.8%
Confused 1%
Surprised 0.8%

AWS Rekognition

Age 2-8
Gender Male, 62.4%
Sad 41%
Calm 31.5%
Happy 22.2%
Fear 2%
Angry 1.9%
Confused 0.8%
Surprised 0.4%
Disgusted 0.3%

AWS Rekognition

Age 30-46
Gender Female, 61.5%
Calm 59%
Fear 15.2%
Sad 12.7%
Happy 5.4%
Angry 2.6%
Confused 2.3%
Surprised 2.2%
Disgusted 0.6%
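
Each per-face block above (age range, gender with confidence, ranked emotions) matches AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again with a placeholder image path:

```python
# Minimal sketch: per-face age/gender/emotion blocks like those above via
# AWS Rekognition DetectFaces. "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, '
          f'{face["Gender"]["Confidence"]:.1f}%')
    # The record lists emotions highest-confidence first; sort to match.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```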

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
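
The Google Vision blocks above are face-annotation likelihoods; the Vision API grades each attribute on a five-step scale from VERY_UNLIKELY to VERY_LIKELY rather than giving a numeric score. A minimal sketch using the google-cloud-vision client, with a placeholder image path:

```python
# Minimal sketch: Google Vision face-annotation likelihoods like those
# above. "photo.jpg" is a placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```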

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

OPPRE
peesa
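
The two detected strings above look like word-level OCR hits, consistent with AWS Rekognition's DetectText API. A minimal sketch, with a placeholder image path:

```python
# Minimal sketch: word-level text detections like "OPPRE" / "peesa" via
# AWS Rekognition DetectText. "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; keep only words,
# matching the shape of the list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```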