Human Generated Data

Title

Untitled (three girls in front of Christmas tree)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21650

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-05

Dress 99.9
Clothing 99.9
Apparel 99.9
Blonde 99.8
Female 99.8
Girl 99.8
Teen 99.8
Kid 99.8
Woman 99.8
Human 99.8
Child 99.8
Person 99.6
Person 99.1
Shoe 98
Footwear 98
Face 96.8
Chair 91.4
Furniture 91.4
Plant 90.6
Play 88.2
Costume 86.6
Person 81.8
Indoors 81.6
Room 81.2
Person 81.1
Shoe 78.9
Tree 75
Grass 74.9
Shoe 74.3
Table 72.7
Portrait 72.5
Photography 72.5
Photo 72.5
Floor 71.3
Living Room 66.5
Outdoors 65.8
Flower 63.4
Blossom 63.4
People 62.5
Dining Table 60.2
Baby 58.1
Sitting 57
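
These label/confidence pairs match the output format of Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be retrieved with boto3 (the file name "photo.jpg" and the confidence floor are assumptions, not values recorded on this page):

    import boto3

    # Assumed local copy of the photograph; Rekognition also accepts S3 objects.
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumed floor; the list above bottoms out around 57
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')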

Clarifai
created on 2023-10-22

child 99.7
people 99.6
education 96.8
school 95.2
son 94.4
boy 94.1
group 94
teacher 93.5
adult 92.4
two 92.2
girl 91.2
woman 89.7
chair 83.5
man 81.7
family 79.3
three 79.1
monochrome 78.1
elementary school 77.8
table 75.2
wear 75.2
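
Clarifai's general model returns concept/confidence pairs like the list above (raw values are 0-1, scaled here to percentages). A sketch using the clarifai-grpc client, assuming the public general-image-recognition model and a personal API key; newer SDK releases may additionally require a UserAppIDSet:

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_API_KEY"),)  # assumed credential

    with open("photo.jpg", "rb") as f:  # assumed local file
        image_bytes = f.read()

    request = service_pb2.PostModelOutputsRequest(
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(base64=image_bytes)))],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)
    for concept in response.outputs[0].data.concepts:
        print(concept.name, round(concept.value * 100, 1))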

Imagga
created on 2022-03-05

people 27.3
male 25.6
man 24.9
person 22.7
adult 21.6
portrait 21.3
city 19.9
women 19.8
attractive 18.2
fashion 17.3
urban 15.7
happy 15.7
men 15.4
pretty 15.4
musical instrument 15.4
blond 15.1
together 14.9
lifestyle 14.4
hairdresser 13.9
black 13.6
smile 12.8
youth 12.8
casual 12.7
two 12.7
family 12.4
business 12.1
building 12.1
sitting 12
human 12
child 11.8
clothing 11.5
lady 11.4
couple 11.3
standing 11.3
group 11.3
indoors 10.5
outdoors 10.5
life 10.5
shop 10.3
love 10.3
street 10.1
modern 9.8
looking 9.6
hair 9.5
face 9.2
shopping 9.2
posing 8.9
home 8.8
model 8.6
outside 8.6
walking 8.5
school 8.5
togetherness 8.5
park 8.4
house 8.4
wind instrument 8.3
one 8.2
handsome 8
kid 8
smiling 8
mother 7.9
crutch 7.9
look 7.9
boy 7.8
chair 7.7
travel 7.7
device 7.7
old 7.7
outdoor 7.6
walk 7.6
store 7.6
buy 7.5
senior 7.5
mature 7.4
teamwork 7.4
inside 7.4
teenager 7.3
girls 7.3
children 7.3
sexy 7.2
dress 7.2
world 7.2
interior 7.1
happiness 7
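
Imagga exposes its tagger as a REST endpoint, so plain HTTP suffices. A sketch with the requests library, posting the image file directly to the v2 tags endpoint (credentials and file name are placeholders):

    import requests

    API_KEY = "YOUR_KEY"        # assumed credentials
    API_SECRET = "YOUR_SECRET"

    with open("photo.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))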

Google
created on 2022-03-05
(no tags recorded)

Microsoft
created on 2022-03-05
(no tags recorded)

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 6-12
Gender Male, 97.1%
Calm 82.3%
Sad 10.8%
Confused 2.7%
Angry 1.4%
Surprised 1.3%
Disgusted 0.7%
Happy 0.5%
Fear 0.3%

AWS Rekognition

Age 2-10
Gender Female, 72.7%
Calm 98.6%
Fear 0.3%
Sad 0.3%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%
Angry 0.1%
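
The age range, gender, and emotion scores above follow the shape of Rekognition's DetectFaces response. A sketch with boto3 (Attributes=["ALL"] is required to get the full attribute set; the file name is an assumption):

    import boto3

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # default attributes omit age, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Highest-scoring emotion first, as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')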

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
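
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely" and "Possible". A sketch with the google-cloud-vision client, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # assumed local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # One block per detected face, mirroring the three blocks above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)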

Feature analysis

Amazon

Person 99.6%
Person 99.1%
Person 81.8%
Person 81.1%
Shoe 98%
Shoe 78.9%
Shoe 74.3%
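
The repeated Person and Shoe rows are per-instance detections: DetectLabels attaches an Instances list (a bounding box with its own confidence) to labels it can localize. A sketch of pulling those out:

    import boto3

    with open("photo.jpg", "rb") as f:  # assumed local file
        image_bytes = f.read()

    client = boto3.client("rekognition")
    response = client.detect_labels(Image={"Bytes": image_bytes})
    for label in response["Labels"]:
        for instance in label["Instances"]:
            # One line per detected occurrence, e.g. four Person boxes above.
            print(f'{label["Name"]} {instance["Confidence"]:.1f}%')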

Text analysis

Amazon

-
TEA
MISS
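
The fragments above ("-", "TEA", "MISS") are raw OCR tokens of the kind Rekognition's DetectText returns. A sketch that prints line-level detections (WORD entries repeat the same fragments at finer granularity):

    import boto3

    with open("photo.jpg", "rb") as f:  # assumed local file
        image_bytes = f.read()

    client = boto3.client("rekognition")
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')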