Human Generated Data

Title

Untitled (young girls dressed in costume)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1780

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Face 98.9
Human 98.9
Person 98.7
Smile 97.1
Person 93.9
Person 93.3
Person 89.1
Clothing 85.6
Apparel 85.6
People 82
Head 78.7
Sunglasses 77.8
Accessories 77.8
Accessory 77.8
Female 76.1
Photography 72.4
Photo 72.4
Portrait 72.4
Dog 67.6
Mammal 67.6
Animal 67.6
Canine 67.6
Pet 67.6
Poster 63.5
Advertisement 63.5
Woman 59.3
Drawing 58.2
Art 58.2
Girl 56.9
Performer 56.5

Clarifai
created on 2023-10-15

people 99.9
group 99.8
man 98.2
adult 97.5
wear 97.4
leader 94.9
outfit 93.9
portrait 92.6
print 92.4
veil 91.6
three 91.5
group together 91.3
illustration 90.8
theater 90.3
several 89.9
music 87
art 85.6
five 85.3
retro 83.9
exploration 80.9

Imagga
created on 2021-12-14

grunge 27.2
vintage 22.3
old 22.3
art 18.4
antique 17.3
man 16.8
decoration 15.6
black 15.6
texture 15.3
retro 14.7
drawing 14.7
kin 14.6
negative 14.2
person 12.6
structure 12.4
film 12.2
aged 11.8
frame 11.7
grungy 11.4
sketch 11.3
graphic 10.9
pattern 10.9
men 10.3
world 10.3
wall 10.3
people 10
paint 10
dirty 9.9
design 9.6
symbol 9.4
detail 8.8
paper 8.8
head 8.4
color 8.3
landscape 8.2
dress 8.1
religion 8.1
brass 7.9
business 7.9
textured 7.9
male 7.9
ancient 7.8
memorial 7.8
billboard 7.8
space 7.8
old fashioned 7.6
poster 7.6
graffito 7.5
rough 7.3
cemetery 7.3
border 7.2
face 7.1
architecture 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

drawing 99.3
posing 99.2
text 98.5
sketch 97.6
book 97.6
window 82.9
person 82.7
painting 81.7
clothing 74.6
cartoon 74.2
human face 66.3
old 41.7

Color Analysis

Face analysis
AWS Rekognition

Age 22-34
Gender Female, 83.7%
Calm 63%
Sad 25.7%
Surprised 3.1%
Happy 3%
Angry 2.8%
Fear 1.2%
Confused 1%
Disgusted 0.2%

AWS Rekognition

Age 19-31
Gender Female, 91.1%
Happy 79.5%
Calm 12.1%
Angry 6.2%
Surprised 1%
Sad 0.4%
Confused 0.4%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Female, 83.3%
Happy 33.9%
Calm 30.4%
Surprised 27.2%
Angry 4.3%
Fear 2.2%
Sad 1.3%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 26-40
Gender Female, 54.1%
Angry 93%
Sad 3%
Fear 2.5%
Calm 0.7%
Happy 0.5%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 22-34
Gender Female, 77.3%
Happy 39.5%
Sad 33.9%
Calm 15.6%
Angry 7.9%
Fear 1.5%
Surprised 0.7%
Confused 0.6%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Sunglasses 77.8%
Dog 67.6%
Poster 63.5%

Categories

Imagga

paintings art 96.4%
people portraits 2.2%