Human Generated Data

Title

Untitled (woman with dolls by Christmas tree)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1634

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Chair 96.6
Furniture 96.6
Human 95.2
Person 93.2
Face 90.3
Female 86.8
Dress 85.7
Door 78.4
Person 77.7
Suit 77.2
Coat 77.2
Overcoat 77.2
People 74.5
Costume 72.7
Portrait 71.3
Photography 71.3
Photo 71.3
Fashion 70.5
Gown 69.2
Robe 68.6
Girl 67.6
Woman 66.7
Plant 61.9
Kid 58.8
Child 58.8
Tree 58.3
Wedding 57.5
Table 56.7
Bridegroom 56.3
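
The confidence-scored labels above are the kind of output returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of producing such label/confidence pairs with boto3; the local file name photo.jpg, the label cap, and the confidence floor are illustrative assumptions rather than values taken from this record.

# Minimal sketch: list labels and confidences for one image with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,          # assumed cap on returned labels
    MinConfidence=55.0,    # assumed confidence floor
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))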

Clarifai
created on 2023-10-25

people 99.2
monochrome 98.3
musician 94.1
veil 92.6
sit 91.8
man 90.9
wear 90.3
art 90.2
dress 89.4
music 89.1
vintage 88.6
retro 88.2
adult 87.9
portrait 87.4
sitting 86
antique 85.8
old 85.8
woman 85.3
illustration 83.6
chair 83.3

Imagga
created on 2021-12-14

negative 39.5
film 30.4
photographic paper 23.5
person 20.8
portrait 18.1
people 17.8
adult 17.1
fashion 16.6
snow 15.7
photographic equipment 15.7
dress 15.4
wall 14.5
sexy 14.4
sketch 14.1
man 13.4
drawing 13.3
black 13.2
lady 13
winter 12.8
building 12.2
face 11.4
happiness 11
city 10.8
vintage 10.7
couple 10.4
one 10.4
cold 10.3
hair 10.3
love 10.3
male 10
cool 9.8
old 9.7
human 9.7
weather 9.7
happy 9.4
water 9.3
model 9.3
smile 9.3
posing 8.9
architecture 8.6
skin 8.5
pretty 8.4
alone 8.2
style 8.2
clothing 8.1
mother 8.1
work 8
newspaper 8
romantic 8
looking 8
art 7.9
urban 7.9
day 7.8
child 7.8
attractive 7.7
house 7.5
clothes 7.5
light 7.3
detail 7.2
cute 7.2
holiday 7.2
women 7.1
fountain 7.1
interior 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.8
black and white 89.6
window 87.9
clothing 81.9
person 77.3

Color Analysis

Face analysis

AWS Rekognition

Age 41-59
Gender Female, 74.7%
Calm 78.5%
Happy 11%
Sad 6.2%
Confused 1.4%
Surprised 1.1%
Angry 0.8%
Fear 0.5%
Disgusted 0.4%
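
The age range, gender estimate, and emotion scores above are the kind of output returned by Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming a hypothetical local file photo.jpg:

# Minimal sketch: read age, gender, and emotion estimates from DetectFaces.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")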

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
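
The likelihood ratings above correspond to the per-face annotation fields exposed by the Google Cloud Vision API. A minimal sketch using the Python client library, again assuming the hypothetical local file photo.jpg:

# Minimal sketch: print face-detection likelihoods with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)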

Feature analysis

Amazon

Person 93.2%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

SVEE
DEV SVEE
DEV
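
The raw strings above are line-level detections of the sort returned by Rekognition's DetectText operation. A minimal sketch, assuming the same hypothetical local file photo.jpg:

# Minimal sketch: print detected text lines with AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":   # skip individual WORD detections
        print(detection["DetectedText"])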