Human Generated Data

Title

Untitled (three girls playing in front of Christmas tree)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21647

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Play 98.6
Person 98.5
Dress 97.9
Clothing 97.9
Apparel 97.9
Female 95.1
Person 91.7
Face 91.7
Furniture 90.1
Shoe 89.6
Footwear 89.6
Blonde 89.1
Kid 89.1
Girl 89.1
Woman 89.1
Teen 89.1
Child 89.1
Water 88.5
Chair 86.8
Person 86.2
Floor 77.4
Outdoors 74.1
Room 73.1
Indoors 73.1
Plant 72.6
Portrait 71.1
Photography 71.1
Photo 71.1
Living Room 70.7
Tree 69.9
Table 68.7
Baby 61.6
Nature 60
Costume 56.5
Boy 56.1
Dining Table 55.9
Flooring 55.3
Shoe 52.5

Clarifai
created on 2023-10-22

people 99.6
child 99.1
two 95.2
group 92.7
boy 91.7
adult 90.9
woman 90.7
son 88.5
family 87.2
education 86.7
recreation 85.3
three 83.1
man 82.3
wear 80.5
furniture 80
group together 78.6
girl 78.6
monochrome 75.5
room 75
offspring 74.8

Imagga
created on 2022-03-05

man 25.5
adult 24.5
people 22.9
male 22
portrait 21.3
person 21
musical instrument 21
black 18.1
sexy 16.9
percussion instrument 16.6
face 14.2
attractive 14
pretty 13.3
happy 13.2
model 12.4
device 12.4
lifestyle 12.3
smile 12.1
child 12.1
fashion 12.1
marimba 11.8
music 11.8
together 11.4
instrument 11.3
human 11.2
hair 11.1
youth 11.1
couple 10.5
style 10.4
guitar 10.4
women 10.3
family 9.8
lady 9.7
mask 9.5
musician 9.4
two 9.3
blond 9.3
life 9.2
sensual 9.1
gun 8.9
school 8.4
playing 8.2
dress 8.1
group 8.1
looking 8
world 8
hairdresser 7.9
love 7.9
look 7.9
urban 7.9
work 7.8
brunette 7.8
play 7.8
men 7.7
modern 7.7
casual 7.6
joy 7.5
mother 7.4
weapon 7.2
home 7.2
to 7.1

Google
created on 2022-03-05

Black-and-white 85.9
Gesture 85.3
Style 84.1
Art 77.8
Sunglasses 76.3
Monochrome 73.9
Monochrome photography 73.5
Font 70.1
Vintage clothing 69.1
Curtain 67.6
Chair 67.4
Eyewear 67
Boot 64.6
Sitting 64.4
Visual arts 64.2
Toddler 63.6
Room 62.8
Stock photography 61.7
Fur 60.2
Illustration 58.6

Microsoft
created on 2022-03-05

black and white 95.1
text 94.4
person 92
footwear 90
clothing 87.2
furniture 79.7
monochrome 73
street 70.9
table 58.3

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 96.9%
Happy 98.5%
Calm 0.4%
Surprised 0.3%
Angry 0.2%
Sad 0.2%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Person 98.5%
Person 91.7%
Person 86.2%
Shoe 89.6%
Shoe 52.5%

Captions

Microsoft
created on 2022-03-05

a person wearing a costume 68.8%
a person wearing a costume 63.2%
a person wearing a costume 51.5%

Text analysis

Amazon

"
-
p
" -11-22
- and p
and
-11-22