Human Generated Data

Title

Untitled (soldier photographing children)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2202

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Person 99.2
Person 99.1
Hat 79.8
Face 74.6
Shoe 72.6
Footwear 72.6
Overcoat 71.9
Coat 71.9
Shoe 71.6
Sun Hat 68.3
Suit 60.9
Shoe 55.5

Clarifai
created on 2023-10-15

people 99.9
child 99.5
monochrome 99.3
street 99.2
two 98.8
portrait 98.5
sepia 98.1
lid 97.5
girl 97.0
boy 96.8
documentary 96.5
adult 96.2
son 96.1
man 95.8
nostalgia 95.3
wear 95.0
woman 94.9
veil 92.3
baby 91.7
vintage 91.7

Imagga
created on 2021-12-15

sculpture 28.3
statue 26.2
architecture 21.2
marble 19.3
monument 18.7
column 18.2
history 17.9
art 17.4
old 17.4
people 15.1
building 14.7
culture 14.5
historical 14.1
ancient 13.8
family 13.3
person 13.1
portrait 12.9
stone 12.7
tourism 12.4
barbershop 12.2
male 12.1
face 12.1
home 12
historic 11.9
house 11.7
religion 11.6
city 11.6
room 11.3
detail 11.3
world 11.1
man 10.8
window 10.7
shop 10.3
love 10.3
balcony 10.2
antique 10
travel 9.9
bride 9.6
couple 9.6
god 9.6
decoration 9.4
happiness 9.4
father 9.2
mother 9
landmark 9
adult 8.7
two 8.5
famous 8.4
church 8.3
wedding 8.3
grandfather 8.3
happy 8.1
aged 8.1
kin 8
groom 7.9
catholic 7.8
architectural 7.7
buildings 7.6
religious 7.5
mercantile establishment 7.4
lady 7.3
indoor 7.3
dress 7.2

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

clothing 96.4
person 95
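
Each service above reports a label with a percent confidence, and the same label can appear more than once when multiple instances are detected (for example, three separate Shoe detections in the Amazon list). A minimal sketch of filtering such a list by a confidence threshold while keeping the highest score per label; the sample data is copied from the Amazon list above, and the threshold value is an arbitrary choice for illustration:

```python
# Tags copied from the Amazon (Rekognition) list above: (label, confidence).
amazon_tags = [
    ("Clothing", 99.9), ("Apparel", 99.9), ("Person", 99.6),
    ("Human", 99.6), ("Person", 99.2), ("Person", 99.1),
    ("Hat", 79.8), ("Face", 74.6), ("Shoe", 72.6),
    ("Footwear", 72.6), ("Overcoat", 71.9), ("Coat", 71.9),
    ("Shoe", 71.6), ("Sun Hat", 68.3), ("Suit", 60.9),
    ("Shoe", 55.5),
]

def top_tags(tags, threshold=70.0):
    """Return {label: best_score} for labels at or above the threshold,
    keeping only the highest score seen for each label."""
    best = {}
    for label, score in tags:
        if score >= threshold and score > best.get(label, 0.0):
            best[label] = score
    return best

print(top_tags(amazon_tags))
```

With the 70.0 cutoff shown, low-confidence labels such as Suit (60.9) drop out, and the three Shoe detections collapse to a single entry at 72.6.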

Color Analysis

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 80.2%
Calm 91.1%
Surprised 5.9%
Sad 1.5%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Happy 0.3%
Fear 0.1%

AWS Rekognition

Age 4-12
Gender Female, 75.7%
Calm 99.6%
Sad 0.2%
Surprised 0%
Angry 0%
Fear 0%
Confused 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 42-60
Gender Male, 88.3%
Calm 96.6%
Angry 2.2%
Sad 0.4%
Happy 0.2%
Fear 0.2%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
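
Each AWS Rekognition face record above distributes confidence across eight emotions; the face's dominant emotion is simply the highest-scoring entry. A small sketch of selecting it, using the values reported for the first face (age 2-8):

```python
# Emotion confidences for the first detected face (age 2-8), as listed above.
face_emotions = {
    "Calm": 91.1, "Surprised": 5.9, "Sad": 1.5, "Disgusted": 0.4,
    "Angry": 0.4, "Confused": 0.3, "Happy": 0.3, "Fear": 0.1,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # -> ('Calm', 91.1)
```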

Microsoft Cognitive Services

Age 4
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 72.6%

Categories

Imagga

interior objects 100%