Human Generated Data

Title

Untitled (children dancing in classroom)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17007

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.9
Apparel 99.9
Human 99.7
Person 99.7
Dress 99.2
Person 99.2
Female 99.1
Person 99
Person 98.7
Person 98.6
Person 98.6
Person 98.5
Dance Pose 97.6
Leisure Activities 97.6
Skirt 95.1
Person 94.6
Shorts 94.2
Woman 93.1
Footwear 85.3
Shoe 85.3
Stage 84.3
Shoe 81.8
Person 80.6
Girl 74.6
People 70.2
Portrait 68.7
Photography 68.7
Photo 68.7
Face 68.7
Coat 65.9
Suit 65.9
Overcoat 65.9
Person 60.9
Kid 59.8
Child 59.8
Floor 58.7
Shoe 58.4
Icing 55.7
Food 55.7
Dessert 55.7
Creme 55.7
Cream 55.7
Cake 55.7
Play 55.2
Performer 55.2
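
Tags like the list above can be regenerated with the AWS Rekognition DetectLabels API. Below is a minimal sketch using boto3; the file name is a placeholder for a local copy of the photograph, and the 55% threshold is an assumption inferred from the lowest confidence in the list, not part of the record.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name; substitute a local copy of the image.
with open("4.2002.17007.jpg", "rb") as f:
    image_bytes = f.read()

labels_response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55%
)

# Print each label with one decimal place, matching the format above.
for label in labels_response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```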

Imagga
created on 2022-02-26

dancer 47.6
performer 39.6
brass 37.1
people 29.6
wind instrument 29.2
person 28.5
silhouette 27.3
group 26.6
entertainer 24.7
sport 22.9
musical instrument 21.6
man 21.5
team 21.5
men 19.7
dance 19.3
adult 18.7
black 17.4
male 16.3
body 16
active 15.8
trombone 15.1
women 15
play 14.6
lifestyle 13
competition 12.8
figure 12.7
teacher 12.3
fashion 12.1
action 12
style 11.9
music 11.8
fitness 11.7
professional 11.7
recreation 11.6
player 11.6
human 11.2
fun 11.2
party 11.2
motion 11.1
art 11.1
teenager 10.9
exercise 10.9
athlete 10.8
concert 10.7
silhouettes 10.7
posing 10.7
happy 10.6
dancing 10.6
performance 10.5
sexy 10.4
bass 10.4
ball 10.2
girls 10
attractive 9.8
business 9.7
crowd 9.6
guitar 9.6
musician 9.2
stage 9.1
lady 8.9
singer 8.9
standing 8.7
happiness 8.6
sitting 8.6
training 8.3
pose 8.2
shadow 8.1
activity 8.1
together 7.9
educator 7.8
photographer 7.7
run 7.7
jump 7.7
youth 7.7
musical 7.7
athletic 7.7
outline 7.6
power 7.6
leisure 7.5
entertainment 7.4
light 7.3
teen 7.3
game 7.1
portrait 7.1
businessman 7.1
drawing 7
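
Imagga exposes tagging through a REST endpoint rather than an SDK. A hedged sketch using the requests library follows; the key/secret pair and file name are placeholders, and the response shape reflects Imagga's v2 tags endpoint.

```python
import requests

# Placeholder credentials; Imagga issues a key/secret pair
# used as HTTP basic auth.
API_KEY = "your_imagga_key"
API_SECRET = "your_imagga_secret"

with open("4.2002.17007.jpg", "rb") as f:  # placeholder file name
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries an English tag name and a 0-100 confidence.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```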

Microsoft
created on 2022-02-26

tennis 99.1
text 96.6
person 95.5
outdoor 94.8
clothing 91.1
footwear 89.5
dance 84.5
dress 83.7
woman 77.5
girl 53.3
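
The Microsoft tags come from the Azure Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
cv_client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-subscription-key>"),
)

with open("4.2002.17007.jpg", "rb") as f:  # placeholder file name
    result = cv_client.tag_image_in_stream(f)

# Confidences are returned in [0, 1]; scale to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```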

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 95.5%
Happy 87.1%
Calm 6.7%
Sad 3.7%
Fear 1%
Angry 0.5%
Confused 0.4%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 33-41
Gender Male, 100%
Surprised 63.8%
Sad 20.6%
Fear 7.2%
Happy 3.3%
Angry 1.8%
Disgusted 1.5%
Calm 0.9%
Confused 0.9%

AWS Rekognition

Age 20-28
Gender Female, 50.5%
Sad 88.1%
Calm 8.7%
Angry 1.4%
Fear 0.6%
Confused 0.4%
Surprised 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 18-24
Gender Male, 98.4%
Calm 69.7%
Disgusted 13.5%
Fear 4.4%
Surprised 4%
Sad 3.2%
Happy 1.9%
Angry 1.7%
Confused 1.7%
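
The four AWS Rekognition face records above (an age range, a gender estimate, and an emotion distribution per detected face) correspond to the DetectFaces API with all attributes requested. A sketch follows, reusing the rekognition client and image bytes from the labeling example.

```python
faces_response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in faces_response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; order by confidence as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```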

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
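
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition ones. A sketch using the google-cloud-vision client; the file name is again a placeholder.

```python
from google.cloud import vision

vision_client = vision.ImageAnnotatorClient()

with open("4.2002.17007.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

face_response = vision_client.face_detection(image=image)

# Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
for face in face_response.face_annotations:
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```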

Feature analysis

Amazon

Person 99.7%
Shoe 85.3%
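
These percentages match the corresponding entries in the Amazon tag list above, consistent with label instances: object classes for which Rekognition localizes individual bounding boxes. They come back in the same DetectLabels response used for the tags; a sketch of pulling them out follows.

```python
# Reuses labels_response from the tagging example above.
for label in labels_response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
              f'w={box["Width"]:.2f}, h={box["Height"]:.2f})')
```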

Captions

Microsoft

a group of people on a court with a racket 68.7%
a group of people standing on a court with a racket 67.1%
a group of people standing on a court 67%
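
Captions like these come from the Azure describe-image operation, which returns ranked caption candidates with confidences. A sketch follows, reusing the Azure client from the tagging example; max_candidates=3 is an assumption chosen to match the three captions shown.

```python
with open("4.2002.17007.jpg", "rb") as f:  # placeholder file name
    description = cv_client.describe_image_in_stream(f, max_candidates=3)

# Candidates arrive ranked; confidences are in [0, 1].
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```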

Text analysis

Amazon

15

Google

MJIA- -YTERA°2
MJIA-
-YTERA°2
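
Both text results come from OCR endpoints: Amazon's DetectText and Google Vision's text_detection (the Google result here is likely edge markings on the negative rather than legible text). A sketch of both calls, reusing the clients and image objects from the earlier examples.

```python
# Amazon: DetectText on the same image bytes.
aws_text = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in aws_text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])

# Google: the first annotation is the full detected string; the rest
# are individual words/segments with bounding polygons.
text_response = vision_client.text_detection(image=image)
if text_response.text_annotations:
    print(text_response.text_annotations[0].description)
```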