Human Generated Data

Title

Untitled (girl holding baton to her right)

Date

1950

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1689

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99
Human 99
Clothing 93.6
Apparel 93.6
Shoe 93.4
Footwear 93.4
Dance Pose 90.2
Leisure Activities 90.2
Dance 86.5
Female 81.1
Dress 78.5
Girl 69.7
Costume 68.8
Outdoors 64.8
Skirt 61.4
Portrait 60.6
Photography 60.6
Face 60.6
Photo 60.6
Shoe 58.3
Ballet 56.7
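
The Amazon values above are label-detection confidence scores. A minimal sketch of how comparable labels could be generated with the AWS Rekognition API via boto3 follows; the image filename, region, and MinConfidence threshold are assumptions for illustration, not part of the original record.

import boto3

# Hypothetical local file standing in for the catalogued photograph.
IMAGE_PATH = "untitled_girl_holding_baton.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # assumed cutoff; the record lists labels down to roughly 56
    )

# Print label name and confidence, mirroring the "Person 99 / Human 99 ..." list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
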

Clarifai
created on 2023-10-15

monochrome 99.1
fashion 98.6
portrait 98.4
woman 98.1
one 97.7
people 97.6
girl 97.2
wear 96
adult 95.9
dancing 94.4
art 93
model 92.9
glamour 91.3
sexy 90.9
winter 90.9
dress 89.6
costume 89.3
man 88.3
music 88.2
retro 88.1

Imagga
created on 2021-12-14

brass 24
person 22.7
human 22.5
art 21.8
people 20.6
body 20
wind instrument 19.7
black 18.4
dance 18.3
man 17.6
male 17
adult 16.9
fashion 15.8
dancer 15.3
performer 15.2
silhouette 14.9
active 14.4
team 14.3
posing 14.2
group 13.7
fitness 13.5
musical instrument 12.9
sexy 12.8
style 12.6
anatomy 12.6
figure 12.3
attractive 11.9
health 11.8
3d 11.6
portrait 11
elegance 10.9
model 10.9
pose 10.9
skeleton 10.7
sport 10.7
light 10.7
cornet 10.5
men 10.3
exercise 10
biology 9.5
teamwork 9.3
leg 9.2
statue 9.1
dress 9
fun 9
women 8.7
hands 8.7
render 8.6
modern 8.4
pretty 8.4
studio 8.4
hand 8.4
science 8
medical 7.9
sculpture 7.9
outfit 7.9
skull 7.8
sitting 7.7
motion 7.7
stand 7.6
healthy 7.6
creation 7.5
bust 7.5
negative 7.4
lady 7.3
business 7.3
lifestyle 7.2
hair 7.1
happiness 7
costume 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 98.7
text 95.7
clothing 88.9
human face 88
dance 80.7
person 80.6
dress 54.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Female, 87.1%
Calm 48.3%
Happy 35.1%
Confused 6.1%
Surprised 4.2%
Fear 2%
Sad 2%
Disgusted 1.4%
Angry 0.9%
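
The age range, gender, and emotion percentages above correspond to AWS Rekognition face attributes. A minimal sketch of requesting them with detect_faces, assuming the same hypothetical image file as in the label example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_girl_holding_baton.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions, not just the default bounding box
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
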

Feature analysis

Amazon

Person 99%
Shoe 93.4%
Skirt 61.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2021-12-14

a person in a vase 42.6%
a person standing next to a vase 35.3%

Text analysis

Amazon

TOUCH
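
The detected word above is consistent with Rekognition text detection. A minimal sketch, again assuming the same hypothetical image file:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_girl_holding_baton.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Keep whole lines of text; Rekognition also returns individual WORD detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
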