Human Generated Data

Title

Untitled (woman in plaid dress, holding dog, in front of Christmas tree, surrounded by toys)

Date

c. 1950

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18365

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Tree 99.7
Plant 99.7
Person 99.2
Human 99.2
Dog 86.6
Mammal 86.6
Animal 86.6
Canine 86.6
Pet 86.6
Ornament 82.2
Person 82.2
Vegetation 79.6
Dog 72.3
Person 71.3
Portrait 62.7
Photography 62.7
Photo 62.7
Face 62.7
Leisure Activities 62.4
Clothing 60.6
Apparel 60.6
Christmas Tree 60
Woodland 58.1
Nature 58.1
Outdoors 58.1
Land 58.1
Forest 58.1
Costume 56

Clarifai
created on 2023-10-22

people 99.9
group 98
group together 97.3
adult 97.1
woman 96.1
man 95
child 93.8
administration 93.5
furniture 91.6
two 90.5
war 88.2
wear 88.2
vehicle 87.9
many 87.2
actress 86.6
leader 86.4
recreation 85.4
actor 85.3
home 84.6
military 84.6

Imagga
created on 2022-03-04

fountain 62.8
structure 41.4
swing 23.6
mechanical device 17.2
travel 16.9
park 16.8
plaything 15.9
snow 15.4
old 15.3
building 14.3
architecture 14.1
tree 13.1
outdoors 12.8
mechanism 12.8
man 12.8
city 12.5
palanquin 12.3
people 12.3
child 12
outdoor 11.5
winter 11.1
danger 10.9
water 10.7
forest 10.4
litter 10.1
holiday 10
light 10
day 9.4
dark 9.2
mother 9.1
landscape 8.9
cold 8.6
parent 8.6
season 8.6
culture 8.5
conveyance 8.5
vintage 8.3
tourism 8.2
religion 8.1
sun 8
trees 8
weather 7.9
autumn 7.9
art 7.9
ancient 7.8
portrait 7.8
play 7.7
grunge 7.7
woods 7.6
walk 7.6
tourist 7.6
fun 7.5
crutch 7.5
color 7.2

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 96.1
christmas tree 94.1
black and white 84.2
white 62.2
old 59.4

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 95.9%
Calm 91.8%
Surprised 5.7%
Happy 0.6%
Sad 0.6%
Disgusted 0.4%
Confused 0.4%
Fear 0.2%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Person 82.2%
Person 71.3%
Dog 86.6%
Dog 72.3%
Christmas Tree