Human Generated Data

Title

Untitled (two kids by Christmas tree)

Date

1944

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1611

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.3
Human 98.3
Person 86.1
Room 81.7
Indoors 81.7
Clothing 77.2
Apparel 77.2
People 70.4
Female 68
Urban 67.6
Skin 66.7
Furniture 66.2
Housing 64.8
Building 64.8
Tree 64.6
Plant 64.6
Girl 62
City 61.3
Town 61.3
Vase 60.4
Pottery 60.4
Jar 60.4
Bedroom 59.3
Text 57.2
Performer 56

Clarifai
created on 2023-10-15

people 99.4
child 97.6
adult 94.8
snow 94.2
monochrome 92.3
man 92.1
two 91
winter 88.1
family 87.8
woman 86.4
fun 85.9
tree 84.4
group 82.2
girl 81.2
portrait 77.9
desktop 77.5
chair 77.4
offspring 77.2
sit 76.5
love 74.3

Imagga
created on 2021-12-14

windowsill 40.8
sill 32.7
snow 32.6
winter 25.5
structural member 24.6
cold 20.6
support 18.3
man 16.1
people 15.6
tree 14.6
frost 14.4
ice 14.1
person 13.9
black 13.8
snowy 13.6
frozen 13.4
weather 12
grunge 11.9
device 11.5
season 10.9
park 10.8
male 10.7
scene 10.4
negative 10.2
happy 10
adult 9.9
portrait 9.7
men 9.4
holiday 8.6
smile 8.5
poster 8.5
silhouette 8.3
fun 8.2
pattern 8.2
landscape 8.2
light 8
happiness 7.8
drawing 7.7
wall 7.7
old 7.7
power 7.6
human 7.5
water 7.3
business 7.3
paint 7.2
dirty 7.2
frame 7.2
women 7.1
trees 7.1
cool 7.1
businessman 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

window 99.6
text 91.2
person 79.9
black and white 77.5
human face 55.9
old 55.9

Color Analysis

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 55.4%
Calm 98.6%
Happy 0.7%
Sad 0.5%
Angry 0.1%
Surprised 0.1%
Confused 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Categories

Imagga

paintings art 99.2%

Text analysis

Amazon

a