Human Generated Data

Title

Untitled (woman with dog in front of Christmas tree)

Date

1948, printed later

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.398

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Tree 98.9
Plant 98.9
Person 98.8
Human 98.8
Person 93.9
Living Room 93.6
Indoors 93.6
Room 93.6
Couch 85.8
Furniture 85.8
Ornament 76.1
Clothing 58.7
Apparel 58.7
Portrait 58.4
Photography 58.4
Face 58.4
Photo 58.4
Christmas Tree 53.6
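
Label/confidence pairs of this shape are what Amazon Rekognition's DetectLabels API returns. A minimal sketch of such a call, assuming configured AWS credentials and a hypothetical filename:

```python
# Sketch of an Amazon Rekognition DetectLabels call that yields
# label/confidence pairs like the list above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_christmas_tree.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,  # the list above bottoms out near 53.6
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

DetectLabels also returns bounding-box Instances for labels such as Person, which is likely the source of the Feature analysis entry at the end of this record.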

Clarifai
created on 2023-10-25

people 99.9
monochrome 97.9
adult 95.8
child 94.5
group 94.3
two 93.9
woman 93.7
portrait 91.4
street 90.5
man 90.4
group together 89.7
guitar 89.5
music 89.4
art 87.2
furniture 84.8
war 84.3
actress 82.6
one 82.3
wear 82
documentary 81.5
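
Tags like the Clarifai list can be reproduced by posting an image to Clarifai's v2 predict endpoint. A sketch, assuming the public general-image-recognition model, a hypothetical API key, and a placeholder image URL:

```python
# Sketch of a Clarifai v2 predict request (model ID, key, and URL are assumptions).
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"  # hypothetical credential
MODEL_ID = "general-image-recognition"  # Clarifai's public general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
response.raise_for_status()

# Concept scores come back in 0-1; the list above reports percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```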

Imagga
created on 2022-01-08

mother 22.8
tricycle 20.4
parent 20.3
wheeled vehicle 19.4
old 18.1
building 16
city 15.8
child 15.8
vehicle 15.7
swing 15.6
kin 15.5
street 14.7
people 14.5
chair 14.1
architecture 14.1
tree 13.8
man 13.4
travel 13.4
park 13.3
plaything 12.8
portrait 12.3
outdoors 12.1
mechanical device 12
person 11.9
world 10.9
road 10.8
outdoor 10.7
male 10.5
snow 10.4
antique 10.4
ancient 10.4
conveyance 10.4
seat 10.3
wheelchair 9.4
winter 9.4
dark 9.2
vintage 9.1
dress 9
mechanism 8.9
urban 8.7
scene 8.7
wall 8.6
cold 8.6
tourist 8.6
black 8.4
adult 8.4
window 8.4
sky 8.3
tourism 8.2
furniture 8.2
dirty 8.1
light 8
holiday 7.9
forest 7.8
father 7.8
walk 7.6
stone 7.6
dad 7.4
tradition 7.4
lifestyle 7.2
transportation 7.2
night 7.1
interior 7.1
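
Imagga exposes its tagger as a REST endpoint with HTTP Basic authentication. A sketch, assuming hypothetical credentials and a placeholder image URL:

```python
# Sketch of an Imagga /v2/tags request (credentials and URL are placeholders).
import requests

IMAGGA_KEY, IMAGGA_SECRET = "api_key", "api_secret"  # hypothetical credentials

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```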

Google
created on 2022-01-08

(no tags returned)

Microsoft
created on 2022-01-08

person 89.5
clothing 88.5
christmas tree 87.1
black and white 79.8
text 73.4
group 55.9
people 55.2
woman 53.7
posing 39.3
clothes 18.5
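
The Microsoft tags are consistent with the Azure Computer Vision image-tagging API. A sketch using the Python SDK, with endpoint, key, and filename as placeholders:

```python
# Sketch of an Azure Computer Vision tagging call via the Python SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),  # hypothetical credential
)

with open("sullivan_christmas_tree.jpg", "rb") as f:  # hypothetical filename
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # SDK confidences are 0-1; the list above reports percentages.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```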

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 99.8%
Happy 96.2%
Angry 1.2%
Surprised 0.8%
Confused 0.6%
Disgusted 0.6%
Fear 0.3%
Calm 0.2%
Sad 0.2%
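
Age, gender, and emotion estimates of this shape come from Rekognition's DetectFaces API with all facial attributes requested. A sketch, with a hypothetical filename:

```python
# Sketch of the Rekognition DetectFaces call behind the estimates above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_christmas_tree.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```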

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
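
Google Vision's face detection returns one annotation per detected face, reporting likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which is why two Google Vision blocks appear above. A sketch using the google-cloud-vision client, with a hypothetical filename:

```python
# Sketch of a Google Cloud Vision face-detection call; likelihoods come back
# as enum values such as VERY_LIKELY rather than confidence scores.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sullivan_christmas_tree.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:  # one entry per detected face
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```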

Feature analysis

Amazon

Person 98.8%