Human Generated Data

Title

Untitled (group shot of dance class)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18403

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.5
Human 99.5
Person 97.4
Person 97.2
Person 97
Person 94.2
Clothing 92.9
Apparel 92.9
Person 92.1
Helmet 90.7
Female 88
Person 87.7
Chair 83.1
Furniture 83.1
Person 77.7
Indoors 74.8
Girl 70
Room 67
Person 66.9
People 66.5
Floor 66
Woman 65.5
Person 64.5
Crowd 64.5
Dance Pose 64.2
Leisure Activities 64.2
Shorts 64.2
Face 63.1
Portrait 63
Photography 63
Photo 63
Suit 62.7
Coat 62.7
Overcoat 62.7
Kid 60.5
Child 60.5
Person 60.5
Play 59.8
Dance 58
Stage 58

Clarifai
created on 2023-10-22

people 99.2
group together 98.9
adult 96.6
child 96.6
many 96.3
adolescent 95.7
motion 95.7
group 95.5
action energy 94.5
fun 93.7
jumping 93.5
woman 93.3
recreation 92.6
competition 92.5
sports equipment 92
man 91.5
sport 90.6
several 89.1
boy 86.7
exercise 86.6

Imagga
created on 2022-03-04

person 44.5
planner 41.8
sport 40.5
ball 26.5
man 26.2
fun 24.7
action 24.1
active 23
athlete 23
people 21.7
male 21.3
jump 21.1
player 20.3
basketball 17.4
teenager 17.3
play 17.2
team 17
body 16.8
dance 16.4
exercise 16.3
happy 16.3
adult 16.3
fitness 16.3
fashion 15.8
athletic 15.3
equipment 15
style 14.8
men 14.6
jumping 14.5
dancer 13.8
lifestyle 13.7
motion 13.7
performer 13.4
game 13.4
black 12.6
basketball equipment 12.5
sports equipment 11.7
recreation 11.6
dancing 11.6
cool 11.5
outdoor 11.5
boy 11.3
sports 11.1
competition 11
playing 10.9
power 10.9
model 10.9
silhouette 10.8
activity 10.7
healthy 10.7
posing 10.7
studio 10.6
game equipment 10.6
performance 10.5
youth 10.2
freedom 10.1
smile 10
attractive 9.8
human 9.7
outdoors 9.7
run 9.6
artist 9.6
guy 9.4
fly 9.3
training 9.2
teen 9.2
leisure 9.1
one 9
stadium 8.9
sky 8.9
wheeled vehicle 8.9
group 8.9
aerobics 8.8
goal 8.6
happiness 8.6
portrait 8.4
summer 8.4
pose 8.1
spectator 8.1
sexy 8
women 7.9
grass 7.9
cute 7.9
together 7.9
art 7.8
kick 7.8
football 7.7
grunge 7.7
jeans 7.6
enjoying 7.6
elegance 7.5
legs 7.5
field 7.5
dark 7.5
joy 7.5
fit 7.4
street 7.4
girls 7.3
stylish 7.2
vehicle 7.2
park 7.1
face 7.1
runner 7
modern 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.2
cartoon 64.2
person 61.3

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 100%
Surprised 41.5%
Calm 39.4%
Sad 7%
Angry 3.6%
Happy 3.4%
Fear 2.5%
Disgusted 1.8%
Confused 0.9%

AWS Rekognition

Age 31-41
Gender Male, 99.6%
Calm 92.3%
Sad 1.8%
Happy 1.8%
Surprised 1.6%
Angry 0.8%
Disgusted 0.8%
Fear 0.5%
Confused 0.4%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Surprised 70%
Calm 15.1%
Happy 9.2%
Fear 2.4%
Sad 1.5%
Disgusted 0.7%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Surprised 60%
Calm 33.9%
Sad 1.8%
Happy 1.4%
Disgusted 0.9%
Fear 0.8%
Angry 0.8%
Confused 0.5%

AWS Rekognition

Age 18-26
Gender Male, 99.7%
Happy 63.6%
Surprised 24.8%
Calm 5.3%
Disgusted 1.6%
Sad 1.5%
Angry 1.4%
Confused 1.3%
Fear 0.5%

AWS Rekognition

Age 22-30
Gender Male, 98.8%
Calm 65%
Happy 19.1%
Sad 9.4%
Surprised 2.4%
Confused 2%
Disgusted 0.9%
Angry 0.8%
Fear 0.4%

AWS Rekognition

Age 1-7
Gender Male, 98.6%
Calm 68.8%
Happy 26.2%
Fear 3.3%
Surprised 0.4%
Sad 0.4%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Happy 63.7%
Sad 16.3%
Calm 14.4%
Fear 2.7%
Angry 1.2%
Surprised 0.8%
Confused 0.6%
Disgusted 0.4%

AWS Rekognition

Age 31-41
Gender Female, 99.1%
Happy 46.3%
Fear 28.2%
Surprised 18.7%
Calm 1.7%
Sad 1.5%
Angry 1.3%
Disgusted 1.2%
Confused 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Helmet
Chair
Person 99.5%
Person 97.4%
Person 97.2%
Person 97%
Person 94.2%
Person 92.1%
Person 87.7%
Person 77.7%
Person 66.9%
Person 64.5%
Person 60.5%
Helmet 90.7%
Chair 83.1%

Categories

Captions

Microsoft
created on 2022-03-04

a group of people jumping in the air 62%

Text analysis

Amazon

NACO
rap