Human Generated Data

Title

Untitled (woman and two babies, sitting in front of Christmas tree with toys)

Date

1948

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18360

Machine Generated Data

Tags (each label is followed by the service's confidence score, 0–100)

Amazon
created on 2022-03-04

Tree 99.7
Plant 99.7
Person 97.7
Human 97.7
Person 95.3
Ornament 93.3
Person 93
Christmas Tree 92.8
Dog 87.2
Mammal 87.2
Animal 87.2
Canine 87.2
Pet 87.2
Dog 68.1
Person 43.3
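
Label lists like the Amazon block above are typically produced by an image-labeling API such as AWS Rekognition's DetectLabels, which returns each label with a confidence score. A minimal sketch of how such a list could be rendered, assuming Rekognition-style label dictionaries (the sample values are taken from the list above; the commented-out service call requires AWS credentials and is not run here):

```python
# Sketch: rendering Rekognition-style labels as "Name Confidence" lines,
# matching the tag list format above. Image source and thresholds are
# illustrative assumptions, not the museum's actual pipeline.
def to_tag_lines(labels, min_confidence=40.0):
    """Keep labels at or above min_confidence, sorted by descending
    confidence, formatted one per line as 'Name Confidence'."""
    kept = [l for l in labels if l["Confidence"] >= min_confidence]
    kept.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {round(l['Confidence'], 1)}" for l in kept]

# Real service call (requires AWS credentials; not run here):
# import boto3
# rekognition = boto3.client("rekognition")
# response = rekognition.detect_labels(
#     Image={"Bytes": open("photo.jpg", "rb").read()},
#     MinConfidence=40,
# )
# print("\n".join(to_tag_lines(response["Labels"])))

# Local demonstration with values from the list above:
sample = [
    {"Name": "Tree", "Confidence": 99.7},
    {"Name": "Person", "Confidence": 43.3},
    {"Name": "Dog", "Confidence": 87.2},
]
print("\n".join(to_tag_lines(sample)))
```

Note that the same object can appear more than once (e.g. "Dog 87.2" and "Dog 68.1" above) because the service reports one entry per detected instance or bounding box.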

Clarifai
created on 2023-10-22

people 99.8
group 99.1
child 99
group together 98.9
many 96.5
woman 94
furniture 93.7
adult 93.5
recreation 92.8
home 91.9
several 91.8
man 91.3
two 91.1
family 90.5
three 89.4
five 87.6
wear 87.2
four 86.7
canine 85.8
administration 85.7

Imagga
created on 2022-03-04

swing 52.8
mechanical device 44.2
plaything 42.8
mechanism 32.9
tricycle 29.7
wheelchair 26.4
chair 26.2
wheeled vehicle 25.7
seat 20.9
vehicle 18.1
man 16.8
people 15.6
old 15.3
conveyance 13.8
adult 13.6
person 13.5
city 13.3
furniture 12.9
light 12.7
male 12.1
building 12
travel 12
window 11
black 10.8
urban 10.5
portrait 10.3
architecture 10.1
dirty 9.9
tree 9.2
street 9.2
child 9.1
lady 8.9
posing 8.9
sexy 8.8
standing 8.7
cold 8.6
men 8.6
walk 8.6
grunge 8.5
dark 8.3
park 8.3
fashion 8.3
danger 8.2
style 8.2
happy 8.1
work 8.1
world 8.1
life 7.9
day 7.8
wall 7.7
outdoors 7.7
youth 7.7
silhouette 7.4
holding 7.4
retro 7.4
dress 7.2
smile 7.1
women 7.1
night 7.1
sky 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

black and white 93.3
text 91.6
person 72
old 52.2

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 63.6%
Calm 56.3%
Happy 24%
Surprised 15.7%
Fear 1.4%
Angry 1%
Sad 0.7%
Disgusted 0.6%
Confused 0.2%
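
The AWS Rekognition block above matches the shape of a DetectFaces response with `Attributes=["ALL"]`: an age range, a gender estimate with confidence, and a list of emotions, each with its own confidence. A hedged sketch of rendering one face detail in that format (sample values are taken from the block above; the commented-out call requires AWS credentials):

```python
# Sketch: formatting a Rekognition FaceDetail like the block above.
# The dictionary shape follows the DetectFaces response; the image
# source is a hypothetical placeholder.
def summarize_face(detail):
    """Render age range, gender, and emotions sorted by descending
    confidence, one per line."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']}%",
    ]
    emotions = sorted(
        detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    lines += [f"{e['Type'].capitalize()} {e['Confidence']}%" for e in emotions]
    return lines

# Real service call (requires AWS credentials; not run here):
# import boto3
# resp = boto3.client("rekognition").detect_faces(
#     Image={"Bytes": image_bytes}, Attributes=["ALL"])
# for detail in resp["FaceDetails"]:
#     print("\n".join(summarize_face(detail)))

sample_detail = {
    "AgeRange": {"Low": 21, "High": 29},
    "Gender": {"Value": "Male", "Confidence": 63.6},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 24.0},
        {"Type": "CALM", "Confidence": 56.3},
    ],
}
print("\n".join(summarize_face(sample_detail)))
```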

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
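
Unlike Rekognition's numeric confidences, Google Vision face annotations report categorical likelihoods; the labels above ("Very unlikely", "Possible") are the display names of the API's Likelihood enum. A sketch of mapping enum values to those labels, with the mapping mirroring the real `vision.Likelihood` ordering and the face dictionary standing in for a real annotation object (the commented-out call requires the google-cloud-vision client and credentials):

```python
# Sketch: Google Vision Likelihood enum values mapped to the display
# names used in the blocks above. The sample face dict is a stand-in
# for a real FaceAnnotation protobuf.
LIKELIHOOD_NAMES = {
    0: "Unknown",
    1: "Very unlikely",
    2: "Unlikely",
    3: "Possible",
    4: "Likely",
    5: "Very likely",
}

def describe_face(face):
    """Format one face annotation in the order shown above."""
    fields = [
        ("Surprise", face["surprise_likelihood"]),
        ("Anger", face["anger_likelihood"]),
        ("Sorrow", face["sorrow_likelihood"]),
        ("Joy", face["joy_likelihood"]),
        ("Headwear", face["headwear_likelihood"]),
        ("Blurred", face["blurred_likelihood"]),
    ]
    return [f"{name} {LIKELIHOOD_NAMES[value]}" for name, value in fields]

# Real service call (requires google-cloud-vision and credentials; not run):
# from google.cloud import vision
# client = vision.ImageAnnotatorClient()
# response = client.face_detection(image=vision.Image(content=image_bytes))
# for face in response.face_annotations:
#     ...

# The fourth face above: everything "Very unlikely" except Blurred "Possible".
sample_face = {
    "surprise_likelihood": 1, "anger_likelihood": 1, "sorrow_likelihood": 1,
    "joy_likelihood": 1, "headwear_likelihood": 1, "blurred_likelihood": 3,
}
print("\n".join(describe_face(sample_face)))
```

Four separate Google Vision blocks appear above because the service returns one annotation per detected face in the image.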

Feature analysis

Amazon

Person
Dog
Person 97.7%
Person 95.3%
Person 93%
Person 43.3%
Dog 87.2%
Dog 68.1%
