Human Generated Data

Title

Untitled (two kids by Christmas tree)

Date

1944

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1617
Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99
Human 99
Person 98.9
Furniture 93.8
Tree 90.6
Plant 90.6
Indoors 88.2
Clothing 86.1
Apparel 86.1
Room 86
Drawing 81.6
Art 81.6
Chair 78.9
Table 77.2
Sketch 71.8
Dining Table 70.2
Face 68.4
Bedroom 67.9
Living Room 65.9
People 63.8
Flower 61.6
Blossom 61.6
Female 58.1
Photography 57.1
Photo 57.1
Vase 55.1
Pottery 55.1
Jar 55.1
Girl 55

Clarifai
created on 2023-10-15

people 97.4
snow 93.1
wedding 92.4
winter 89.8
bride 89.3
man 87.2
desktop 87
adult 86
chair 84.7
veil 84.6
woman 84.1
Christmas 83.6
decoration 83
princess 82.3
art 80.1
retro 79.5
window 79
picture frame 78.9
group 78.8
sit 78.4

Imagga
created on 2021-12-14

sketch 37.3
negative 33.6
drawing 31.2
film 26.9
photographic paper 20.1
representation 18
grunge 17.9
design 17.4
art 15
blackboard 13.9
people 13.4
photographic equipment 13.4
style 13.3
pattern 13
snow 12.8
old 12.5
retro 12.3
silhouette 11.6
person 11.5
black 11.4
modern 10.5
business 10.3
graphic 10.2
tree 10
fashion 9.8
decoration 9.6
life 9.6
man 9.5
frame 9.4
vintage 9.1
texture 9
cold 8.6
stucco 8.6
adult 8.4
portrait 8.4
house 8.4
color 8.3
light 8
holiday 7.9
face 7.8
scene 7.8
paper 7.7
winter 7.7
floral 7.7
hand 7.6
elegance 7.6
decorative 7.5
chair 7.5
coffee 7.4
paint 7.2
dirty 7.2
interior 7.1
indoors 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 92.6
drawing 84.2
furniture 82.9
table 76.4
black and white 69.4
chair 66.6
vase 64.1
sketch 52.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Female, 51.8%
Fear 77.8%
Calm 10.3%
Sad 6.3%
Happy 2.9%
Angry 1.3%
Surprised 1.1%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Female, 97.7%
Happy 44.9%
Calm 35.5%
Sad 16.6%
Confused 0.9%
Fear 0.8%
Angry 0.7%
Surprised 0.5%
Disgusted 0.1%

Feature analysis

Amazon

Person 99%

Categories

Captions

Microsoft
created on 2021-12-14

an old photo of a person 36.9%
a group of people in a room 36.8%
an old photo of a person 35.2%