Human Generated Data

Title

Untitled (dance class; girls cross stage)

Date

1948

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.478

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Dance 99.2
Human 99.2
Ballet 98.2
Ballerina 97.1
Dance Pose 96.9
Leisure Activities 96.9
Person 96.3
Person 96.3
Person 96.1
Person 92.7
Person 92.3
Person 91
Person 89.1
Person 85.5
Stage 71.1
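
The Amazon labels above have the shape of output from Amazon Rekognition's label-detection API. Below is a minimal sketch, under the assumption that tags like these were produced by a call of this form; the image file name and the MinConfidence threshold are illustrative placeholders, not values taken from this record.

import boto3

# Minimal sketch: label detection with Amazon Rekognition (boto3).
# "dance_class.jpg" and MinConfidence=70 are assumed for illustration.
rekognition = boto3.client("rekognition")
with open("dance_class.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=70,
    )
for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # matching the "Dance 99.2", "Human 99.2", ... lines above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')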

Clarifai
created on 2023-10-26

people 99.5
child 99
group 99
dancer 98.8
group together 97.8
dancing 97.2
ballet 95.6
ballerina 94.5
ballet dancer 94.2
music 94.1
woman 93.4
monochrome 92.6
wear 92.4
son 92.3
family 92.1
rehearsal 92
girl 92
man 91.9
adult 89.7
boy 87.9
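
The Clarifai concepts above are the kind returned by Clarifai's general image-recognition model. A rough sketch of one way to request them over Clarifai's v2 REST API follows; the API key, model ID, and image URL are placeholders, and the request shape should be checked against Clarifai's current documentation rather than treated as definitive.

import requests

# Rough sketch of a Clarifai v2 model-outputs request. The key, model ID,
# and image URL are placeholders for illustration only.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general-model ID
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/dance.jpg"}}}]},
    timeout=30,
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Each concept pairs a name with a 0-1 confidence value.
    print(concept["name"], round(concept["value"] * 100, 1))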

Imagga
created on 2022-01-23

dancer 63.7
performer 49.3
person 44.1
entertainer 36.9
people 32.3
adult 30.9
women 24.5
portrait 21.3
male 20.6
sitting 19.7
fashion 19.6
attractive 19.6
man 18.8
happy 18.8
sexy 18.5
happiness 18
room 17.7
pretty 16.8
kin 15.9
indoors 15.8
fun 15.7
men 15.4
lady 14.6
smiling 14.5
smile 14.2
group 13.7
indoor 13.7
two 13.5
family 13.3
leisure 13.3
interior 13.3
model 13.2
together 13.1
couple 13.1
cheerful 13
youth 12.8
hair 12.7
professional 12.4
brunette 12.2
teacher 12
body 12
home 12
style 11.9
love 11.8
relaxation 11.7
lifestyle 11.6
blond 10.9
black 10.9
dress 10.8
casual 10.2
teen 10.1
cute 10
teenager 10
exercise 10
studio 9.9
handsome 9.8
human 9.7
world 9.7
teenage 9.6
relax 9.3
elegance 9.2
mother 9.2
sport 9.1
girls 9.1
clothing 9
looking 8.8
passion 8.5
house 8.4
joyful 8.3
20s 8.2
sensual 8.2
gorgeous 8.2
stylish 8.1
suit 8.1
team 8.1
child 8
businessman 7.9
business 7.9
look 7.9
urban 7.9
husband 7.7
couch 7.7
sit 7.6
legs 7.5
enjoy 7.5
friends 7.5
educator 7.5
one 7.5
inside 7.4
sensuality 7.3
posing 7.1
face 7.1
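
The Imagga tags above follow the format of Imagga's tagging endpoint, which pairs each English tag with a 0-100 confidence score. Below is a rough sketch with placeholder credentials and image URL; the endpoint and response shape should be verified against Imagga's documentation.

import requests

# Rough sketch of an Imagga /v2/tags request. The API key/secret and the
# image URL are placeholders for illustration only.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/dance.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    timeout=30,
)
for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence score.
    print(item["tag"]["en"], round(item["confidence"], 1))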

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

dance 99.2
dress 92.5
person 92.1
sport 85.7
ballet 84.4
clothing 84.3
woman 84.2
group 72.5
dancing 67.8
posing 61.6
dancer 53.3
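
The Microsoft tags above match the output of Azure Computer Vision's image-tagging operation. A rough sketch against the v3.2 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

import requests

# Rough sketch of an Azure Computer Vision v3.2 tag request. The endpoint,
# key, and image URL are placeholders for illustration only.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.com/dance.jpg"},
    timeout=30,
)
for tag in resp.json()["tags"]:
    # Each tag has a name and a 0-1 confidence value.
    print(tag["name"], round(tag["confidence"] * 100, 1))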

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 2-10
Gender Female, 100%
Confused 91.8%
Happy 5.4%
Calm 1%
Surprised 0.4%
Angry 0.4%
Disgusted 0.4%
Sad 0.3%
Fear 0.3%

AWS Rekognition

Age 1-7
Gender Male, 97.9%
Happy 76.7%
Sad 16.6%
Angry 2.2%
Calm 1.5%
Fear 1.4%
Surprised 0.7%
Disgusted 0.6%
Confused 0.3%

AWS Rekognition

Age 0-6
Gender Female, 100%
Happy 86.6%
Surprised 8.6%
Confused 2.4%
Calm 0.6%
Fear 0.5%
Disgusted 0.5%
Angry 0.4%
Sad 0.3%

AWS Rekognition

Age 20-28
Gender Female, 77.6%
Calm 96.3%
Sad 1.8%
Confused 0.7%
Surprised 0.6%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 0-6
Gender Male, 81%
Calm 93.2%
Surprised 3.6%
Confused 2%
Happy 0.4%
Fear 0.4%
Disgusted 0.3%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 2-8
Gender Female, 99.9%
Happy 50.5%
Calm 29.8%
Fear 9.5%
Sad 4.7%
Angry 2.4%
Disgusted 1.5%
Surprised 1%
Confused 0.6%

AWS Rekognition

Age 2-8
Gender Female, 61.9%
Calm 95.9%
Fear 2.5%
Surprised 0.5%
Confused 0.3%
Disgusted 0.2%
Sad 0.2%
Happy 0.2%
Angry 0.2%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Calm 98.3%
Sad 1.3%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Happy 0%

AWS Rekognition

Age 4-10
Gender Female, 78.1%
Calm 98.1%
Happy 0.6%
Sad 0.5%
Fear 0.4%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Male, 97.9%
Calm 40.5%
Surprised 22.9%
Fear 12.2%
Disgusted 7.2%
Angry 5.9%
Sad 5.2%
Confused 3.7%
Happy 2.4%
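
The age ranges, gender estimates, and emotion percentages in the blocks above follow the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch with boto3; the image file name is an assumption.

import boto3

# Minimal sketch: per-face attributes with Amazon Rekognition DetectFaces.
# "dance_class.jpg" is assumed for illustration.
rekognition = boto3.client("rekognition")
with open("dance_class.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Emotion types (HAPPY, CALM, ...) with 0-100 confidence scores.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')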

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
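
The likelihood ratings above (Very unlikely through Very likely) correspond to Google Cloud Vision's face-detection annotations. A minimal sketch, assuming a recent google-cloud-vision client and an illustrative image path:

from google.cloud import vision

# Minimal sketch: Google Cloud Vision face detection, printing the same
# likelihood fields listed above. "dance_class.jpg" is assumed.
client = vision.ImageAnnotatorClient()
with open("dance_class.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)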

Feature analysis

Amazon

Person
Person 96.3%
Person 96.3%
Person 96.1%
Person 92.7%
Person 92.3%
Person 91%
Person 89.1%
Person 85.5%
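
The per-person percentages above correspond to individual instances of Rekognition's Person label, which carry bounding boxes in the DetectLabels response. A minimal sketch that reuses the response variable from the label-detection sketch earlier in this record:

# Minimal sketch: pull individual "Person" instances (with bounding boxes)
# out of a Rekognition DetectLabels response, as in the list above.
person_label = next(
    (label for label in response["Labels"] if label["Name"] == "Person"),
    None,
)
if person_label:
    for instance in person_label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'Person {instance["Confidence"]:.1f}% at {box}')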

Categories

Imagga

pets animals 99.9%