Human Generated Data

Title

Untitled (baton twirler)

Date

c. 1952

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16012

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Dance 98.3
Human 98.3
Dance Pose 96.3
Leisure Activities 96.3
Person 95.7
Ballet 95
Ballerina 86.5
Flooring 69.3

Clarifai
created on 2023-10-29

girl 98.9
woman 98.5
wood 98.3
young 97.4
wall 95.7
dancer 95
people 94.4
ballet 92.6
fashion 91.7
glamour 91.6
wooden 91.6
family 91
health 90.8
pretty 90.7
beautiful 89.7
one 88.8
dancing 88.1
floor 87.9
love 87.8
retro 87.6

Imagga
created on 2022-02-11

dancer 31.5
person 28.2
performer 23.6
people 20.6
paper 19.5
adult 19.2
book 19.1
fashion 17.3
attractive 16.1
entertainer 15.7
dress 15.4
old 15.3
body 15.2
vintage 13.2
model 13.2
portrait 12.9
sexy 12.8
grunge 12.8
face 12.1
empty 12
pretty 11.9
retro 11.5
lady 11.4
human 11.2
blank 11.1
style 11.1
hair 11.1
man 10.7
texture 10.4
casual 10.2
clothing 10.1
envelope 9.8
health 9.7
standing 9.6
antique 9.5
lifestyle 9.4
container 9.3
studio 9.1
exercise 9.1
design 9.1
pose 9.1
notebook 8.9
posing 8.9
healthy 8.8
art 8.7
male 8.6
elegance 8.4
relaxation 8.4
page 8.3
color 8.3
fitness 8.1
dirty 8.1
active 8.1
women 7.9
luxury 7.7
one 7.5
document 7.4
action 7.4
sport 7.4
fit 7.4
brown 7.4
note 7.3
figure 7.3
aged 7.2
smiling 7.2
wall 7.2
covering 7.1

Google
created on 2022-02-11

Leg 91.8
Wood 87.5
Entertainment 81.8
Performing arts 81.1
Rectangle 77.4
Fashion design 77.4
Hardwood 75.5
Street fashion 75.3
Waist 74.8
Thigh 74.6
Flooring 74.1
Dance 74
Magenta 73.4
Event 72.3
Human leg 69.2
Sportswear 67.1
Performance art 65.3
Peach 65.1
Font 63.2
Choreography 61.4

Microsoft
created on 2022-02-11

dance 97.9
person 94.4
clothing 94
text 87.3
footwear 86.4
girl 84.1
ballet 82.2
dress 66.4
dancing 52.7

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 100%
Fear 37.8%
Calm 30%
Happy 23.4%
Confused 3.7%
Surprised 1.6%
Sad 1.4%
Angry 1.3%
Disgusted 0.7%

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 95.7%

Categories

Imagga

interior objects 65.8%
paintings art 33.3%

Captions

Microsoft
created on 2022-02-11

a person wearing a costume 47.6%
a person taking a selfie 32.8%
a person posing for a picture 32.7%