Human Generated Data

Title

Untitled (drum majorette, twirling seen from front)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1683

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Costume 96.8
Human 96.1
Person 92.3
Dance 90.9
Dance Pose 86.6
Leisure Activities 86.6
Figurine 83.3
Clothing 66.5
Apparel 66.5
Head 60.4
Flooring 57.8
Girl 57.7
Female 57.7
Ballet 55.8

Clarifai
created on 2023-10-15

monochrome 98.6
people 98.4
man 97.5
ballet 96.6
fashion 96.5
woman 96.2
dancing 96
girl 94.5
music 94.4
portrait 93.1
dancer 92.9
adult 89.5
model 89.4
art 89
costume 88.9
ballerina 87.7
winter 87
snow 86.4
wear 86.3
one 85.7

Imagga
created on 2021-12-14

3d 34.9
man 29
silhouette 23.2
male 22.7
human 22.5
person 21.3
art 20.9
people 20.1
figure 18.5
anatomy 18.4
dancer 18.1
character 17.9
render 16.5
skeleton 15.6
men 15.5
dance 14.4
cartoon 14.3
science 14.2
graphic 13.9
body 13.6
black 13.3
sport 12.4
baron 11.6
medicine 11.5
action 11.1
back 11
pose 10.9
automaton 10.8
bone 10.7
performer 10.5
hand 10.1
fiction 9.8
skull 9.8
medical 9.7
performance 9.6
biology 9.5
fashion 9.1
design 9
bones 8.8
x ray 8.8
warrior 8.8
business 8.5
concepts 8
spine 7.8
dancing 7.7
muscle 7.7
health 7.6
elegance 7.6
athlete 7.4
sports 7.4
leg 7.4
object 7.3
metal 7.2
sexy 7.2
painting 7.2
conceptual 7.1

Microsoft
created on 2021-12-14

text 99.6
dance 96.7
clothing 88.1
person 86.9
human face 78.7
footwear 70.4
woman 64.5
high heels 53.9
posing 39.3
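
The tag lists above pair each label with a confidence score on a 0-100 scale. The Amazon tags are the kind of output returned by AWS Rekognition label detection; below is a minimal sketch of how comparable labels could be requested with boto3 (the bucket and file names are hypothetical):

import boto3

# Hypothetical sketch: ask Rekognition for labels on an image stored in S3
# and print each label with its confidence, similar to the Amazon list above.
rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "majorette.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")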

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 95.4%
Happy 97%
Calm 2.2%
Sad 0.3%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%
Disgusted 0%
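
The age range, gender, and emotion percentages above are the kind of values returned by AWS Rekognition face detection; a minimal sketch with boto3, again assuming a hypothetical image location:

import boto3

# Hypothetical sketch: detect faces and print age range, gender, and
# emotion confidences, mirroring the values listed above.
rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "majorette.jpg"}},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")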

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
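
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages; a minimal sketch with the google-cloud-vision client, assuming a hypothetical local file:

from google.cloud import vision

# Hypothetical sketch: run face detection and print the likelihood bucket
# for each attribute reported above.
client = vision.ImageAnnotatorClient()
with open("majorette.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)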

Feature analysis

Amazon

Person 92.3%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

ass

Google

asA
asA
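
The text analysis entries are short strings detected in the image itself. Fragments like these are the kind of output returned by AWS Rekognition text detection; a minimal sketch with boto3, once more using a hypothetical image location:

import boto3

# Hypothetical sketch: detect text in the image and print each detected
# string with its confidence.
rekognition = boto3.client("rekognition")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "majorette.jpg"}},
)
for detection in response["TextDetections"]:
    print(detection["DetectedText"], f"{detection['Confidence']:.1f}")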