Human Generated Data

Title

Untitled (two drum majorettes)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1684

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.4
Human 98.4
Clothing 94.9
Apparel 94.9
Dress 94.1
Person 93.4
Female 90.3
Dance 89.8
Costume 84.4
Shoe 78
Footwear 78
Dance Pose 77.5
Leisure Activities 77.5
Ballet 74.9
Girl 74.3
Skirt 66.2
Portrait 65.6
Face 65.6
Photography 65.6
Photo 65.6
Woman 64.5
Stage 63.7
Kid 56.6
Child 56.6
Ballerina 56.1
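
The label-and-confidence pairs above are the style of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call with boto3, assuming AWS credentials are configured and using a hypothetical local filename for the image:

```python
import boto3

# Sketch: request object/scene labels for a local image file.
# "photo.jpg" is a placeholder filename, not part of this record.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,       # cap the number of labels returned
        MinConfidence=55,   # drop low-confidence labels
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```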

Clarifai
created on 2023-10-15

monochrome 99.3
people 99.2
dancing 98.1
man 97.6
ballet 97.5
music 96.8
girl 96.2
woman 96.1
dancer 95.9
couple 95.7
wedding 95.1
ballerina 94.8
portrait 93
adult 91.4
black and white 90.9
fashion 90.4
dress 89.8
bride 89.7
love 88.6
model 87.9
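
Concept scores like these are what Clarifai's general image-recognition model produces. A rough sketch of a predict call against the v2 REST API; the access token, model ID, and image URL below are placeholders, and the exact endpoint form is an assumption:

```python
import requests

# Sketch of a Clarifai v2 predict request; all credentials and URLs are placeholders.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

# Each concept carries a name and a 0-1 score; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```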

Imagga
created on 2021-12-14

silhouette 38.9
man 31
art 30
dance 28.7
people 25.7
person 24
male 22
men 20.6
3d 17.8
team 17
group 16.9
human 16.5
sport 16.5
sorcerer 16.3
business 15.8
creation 15.1
dancer 14.8
black 14.7
graphics 14
outfit 13.6
silhouettes 13.6
figure 12.2
party 12
body 12
fashion 11.3
fun 11.2
teamwork 11.1
active 10.8
cartoon 10.7
design 10.7
bust 10.4
action 10.2
sculpture 10
businessman 9.7
success 9.7
graphic 9.5
symbol 9.4
adult 9.1
music 9
suit 9
performer 8.8
play 8.6
playing 8.2
dress 8.1
shadow 8.1
render 7.8
stage 7.8
crowd 7.7
performance 7.7
power 7.6
stock 7.5
training 7.4
graphic art 7.3
competition 7.3
game 7.1
women 7.1
boy 7.1
model 7
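
Imagga tags of this kind come from its auto-tagging endpoint. A sketch of the request, assuming the `/v2/tags` REST endpoint with HTTP basic authentication; the API key, secret, and image URL are placeholders:

```python
import requests

# Sketch of an Imagga auto-tagging request; credentials and image URL are placeholders.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence score with a tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```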

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

dance 98.4
text 98.1
wall 96.3
dress 85.7
posing 75.6
clothing 73.1
person 57.2
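
Tags in this form are what Microsoft's Azure Computer Vision image-analysis service returns. A sketch using the plain REST interface, assuming a v3.2 analyze endpoint with the Tags visual feature; the resource endpoint, key, and image URL are placeholders:

```python
import requests

# Sketch of an Azure Computer Vision analyze request; all values are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Tag confidences are 0-1 floats; scale to match the percentages above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```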

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 83%
Happy 90.3%
Calm 5.8%
Surprised 2.1%
Fear 0.9%
Sad 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 35-51
Gender Female, 97.5%
Calm 65.1%
Happy 21.7%
Confused 4.3%
Surprised 4.2%
Sad 2%
Fear 1.4%
Angry 0.7%
Disgusted 0.5%
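
The age range, gender, and emotion percentages above match the face attributes that Amazon Rekognition's DetectFaces operation reports when the full attribute set is requested. A minimal sketch with boto3; the filename is a placeholder:

```python
import boto3

# Sketch: request the full attribute set for each detected face.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of type/confidence pairs, one per emotion.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```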

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
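
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) per face rather than percentages. A sketch with the google-cloud-vision client that prints the same six attributes; the filename is a placeholder:

```python
from google.cloud import vision

# Sketch: detect faces and print the likelihood buckets shown above.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```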

Feature analysis

Amazon

Person 98.4%
Shoe 78%
Skirt 66.2%

Categories

Imagga

paintings art 100%