Human Generated Data

Title

Untitled (two women seated on bench in studio for portrait)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1585

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Dress 99
Person 99
Human 99
Person 98.5
Female 97.1
Woman 89
Face 87.4
People 81.5
Girl 73.7
Smile 73.1
Leisure Activities 72.9
Portrait 70.9
Photography 70.9
Photo 70.9
Dance Pose 63.1
Footwear 61.5
Skirt 60
Shoe 56.3
Coat 55.2
Overcoat 55.2
Suit 55.2

Clarifai
created on 2023-10-15

people 99.9
adult 99.3
print 98.9
woman 97.7
wear 96.7
two 96.5
monochrome 95.6
veil 95.4
man 94.9
portrait 94.9
etching 93.8
illustration 93.5
group 93.3
royalty 92.8
lid 92.7
canine 92.6
sit 89.6
affection 88.4
art 87
engraving 87

Imagga
created on 2021-12-14

negative 37.2
film 31.4
kin 23.5
photographic paper 22.1
portrait 20.1
dress 19.9
sexy 17.7
fountain 17.6
fashion 16.6
posing 15.1
people 15.1
model 14.8
photographic equipment 14.7
art 14.1
lady 13.8
structure 13.7
attractive 13.3
black 13.2
person 13
pretty 12.6
clothing 11.9
hair 11.9
statue 11.6
elegance 10.9
bride 10.8
vintage 10.7
sculpture 10.6
man 10.3
adult 9.9
love 9.5
head 9.2
face 9.2
sensuality 9.1
old 9.1
style 8.9
snow 8.9
outdoor 8.4
human 8.2
body 8
women 7.9
world 7.9
look 7.9
design 7.9
eyes 7.7
grunge 7.7
sport 7.6
city 7.5
park 7.4
decoration 7.4
retro 7.4
wedding 7.4
mother 7.3
makeup 7.3
pose 7.2
lifestyle 7.2
swimsuit 7.2
male 7.2
cute 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.5
sketch 96.6
drawing 93.6
clothing 92.8
person 91.6
posing 84
woman 74.3
human face 72.9
dress 71.7
old 61.9
footwear 61.9
smile 56.1
image 34.2
picture frame 6.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Female, 70.1%
Calm 90.5%
Surprised 6.1%
Sad 1.8%
Happy 0.5%
Confused 0.5%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 19-31
Gender Female, 90%
Angry 70.4%
Fear 9.5%
Calm 8.1%
Surprised 6%
Sad 2.6%
Happy 2.1%
Disgusted 0.7%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Imagga

paintings art 98.2%
people portraits 1.6%

Text analysis

Amazon

AGFANITRATEFICA

Google

TRATE
TRATE FILM
FILM