Human Generated Data

Title

Untitled (children outside playhouse)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2201

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.6
Human 98.6
Person 98.1
Person 96.2
Porch 94.2
Outdoors 88.5
Person 87.6
Face 76.6
Nature 75.8
Shelter 72.9
Building 72.9
Rural 72.9
Countryside 72.9
Patio 71.3
Meal 68.1
Food 68.1
Plant 67.7
People 64.1
Wood 63.9
Furniture 62
Grass 59.6
Table 57.8
Long Sleeve 56.8
Clothing 56.8
Apparel 56.8
Sleeve 56.8
Bed 55.7
Pergola 55.6

Clarifai
created on 2023-10-15

child 99.9
people 99.9
family 99.7
boy 99.1
girl 98.5
home 98.4
son 98.2
group 97.8
woman 97.5
adult 96.4
man 96.2
furniture 95.6
two 94.9
portrait 94.7
offspring 93.6
house 93.6
baby 93.5
sibling 92.9
porch 90.5
room 90.4

Imagga
created on 2021-12-15

swing 54.3
child 45.9
mechanical device 42.5
plaything 42.5
resort area 35.8
mechanism 31.6
area 29.5
region 21.3
architecture 17.9
happy 16.3
portrait 16.2
happiness 15.7
location 15.1
house 15
old 14.6
people 14.5
family 14.2
building 13.8
smile 13.5
person 13.2
smiling 13
park 12.6
adult 12.3
mother 12
sitting 12
man 11.4
outdoors 11.2
structure 10.6
fun 10.5
home 10.4
outdoor 9.9
vintage 9.9
history 9.8
couple 9.6
youth 9.4
face 9.2
male 9.2
childhood 8.9
sky 8.9
love 8.7
water 8.7
outside 8.5
winter 8.5
tree 8.5
pretty 8.4
summer 8.4
city 8.3
girls 8.2
little 7.9
parent 7.9
roof 7.9
wall 7.9
scene 7.8
travel 7.7
bride 7.7
sculpture 7.6
daughter 7.6
two 7.6
pillory 7.6
fashion 7.5
vacation 7.4
children 7.3
dress 7.2
holiday 7.2
romantic 7.1

Google
created on 2021-12-15

Black 89.6
Dress 85.3
Tree 83.4
Table 80.9
Plant 80.7
Suit 80
House 78.5
Tints and shades 77.2
Jacket 75.7
Vintage clothing 72.4
Chair 69.3
Room 69
Sitting 66.9
Cottage 65.5
History 64.8
Monochrome 63.3
Wood 62.7
Child 60.4
Monochrome photography 59.4
Building 57.4

Microsoft
created on 2021-12-15

outdoor 99.4
clothing 93.1
person 92.6
furniture 87.3
house 86.1
table 70
chair 61.9
old 60.4
child 52.8
toddler 50.7

Face analysis

AWS Rekognition

Age 1-5
Gender Female, 99.3%
Calm 74%
Surprised 25.1%
Happy 0.6%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 2-8
Gender Male, 94.2%
Calm 87.5%
Sad 4.3%
Surprised 3%
Angry 2.7%
Confused 1.1%
Fear 0.8%
Happy 0.6%
Disgusted 0.1%

AWS Rekognition

Age 3-9
Gender Female, 95.1%
Calm 73%
Happy 14%
Sad 8.4%
Surprised 1.9%
Angry 0.9%
Fear 0.7%
Confused 0.7%
Disgusted 0.4%

AWS Rekognition

Age 13-25
Gender Female, 77.3%
Calm 96.7%
Sad 1.7%
Surprised 0.5%
Happy 0.5%
Angry 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
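
The face records above follow the shape of AWS Rekognition's `detect_faces` response (`AgeRange`, `Gender`, `Emotions`). A minimal sketch of rendering one such record into the format shown; since the live API call needs AWS credentials, a hypothetical `FaceDetail` dict mirroring the first record stands in for the response.

```python
def format_face(detail):
    """Render one Rekognition FaceDetail like the records above."""
    age = detail["AgeRange"]
    lines = [f"Age {age['Low']}-{age['High']}"]
    gender = detail["Gender"]
    lines.append(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed highest-confidence first, as in the records above
    for emotion in sorted(detail["Emotions"], key=lambda e: -e["Confidence"]):
        conf = round(emotion["Confidence"], 1)
        lines.append(f"{emotion['Type'].capitalize()} {conf:g}%")
    return "\n".join(lines)

# Hypothetical FaceDetail matching the first record above
sample = {
    "AgeRange": {"Low": 1, "High": 5},
    "Gender": {"Value": "Female", "Confidence": 99.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 74.0},
        {"Type": "SURPRISED", "Confidence": 25.1},
        {"Type": "HAPPY", "Confidence": 0.6},
    ],
}
print(format_face(sample))
```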

Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Categories