Human Generated Data

Title

Moving Skip Rope

Date

1952

People

Artist: Harold Edgerton, American, 1903-1990

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of The Harold and Esther Edgerton Family Foundation, P1996.90

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 93.2
Stage 85.1
Leisure Activities 76.9
People 67.9
Dance 66.6
Painting 64.6
Art 64.6
Dance Pose 58.9
Person 42.7

Clarifai
created on 2023-10-15

art 99.7
wear 99
graphic 98
dancer 98
painting 97.9
illustration 97.6
cavalry 97.5
design 97
desktop 96.6
vector 95.6
animal 95.5
abstract 93.7
group 93.3
music 92.4
ballet 91.5
spider 91.4
people 91.3
human 90.6
dancing 90.3
mammal 90

Imagga
created on 2021-12-14

chandelier 100
lighting fixture 100
fixture 78.1
design 24.2
light 22
black 19.8
art 18.8
color 18.3
pattern 17.1
graphic 16.8
night 16
shape 14.9
texture 14.6
dark 14.2
curve 14
element 13.2
wallpaper 13
bright 12.9
star 12.6
glowing 12
motion 12
energy 11.8
holiday 11.5
digital 11.3
flame 11.2
smoke 11.1
glow 11.1
fireworks 10.8
colorful 10.7
explosion 10.6
backgrounds 10.5
festival 10.5
fractal 10.1
space 10.1
celebration 9.6
fire 9.4
display 9.3
lights 9.3
flower 9.2
swirl 9.2
decorative 9.2
celebrate 9
futuristic 9
fun 9
new 8.9
decoration 8.7
harvestman 8.6
illuminated 8.6
plant 8.5
floral 8.5
sky 8.3
arachnid 8.3
vintage 8.3
wire 8.2
retro 8.2
technology 8.2
science 8
explode 7.9
independence 7.8
line 7.8
render 7.8
wave 7.8
television 7.7
party 7.7
silhouette 7.4
backdrop 7.4
form 7.4
generated 7.4
effect 7.3
yellow 7.3
computer 7.2
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.3
drawing 95.4
sketch 92.3
black and white 85.8
art 65.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Female, 98.6%
Calm 52.9%
Happy 39.5%
Sad 5%
Angry 1%
Disgusted 0.8%
Fear 0.3%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 17-29
Gender Female, 71.7%
Calm 93.8%
Angry 1.7%
Sad 1.2%
Surprised 1.2%
Happy 0.8%
Fear 0.6%
Disgusted 0.5%
Confused 0.2%

AWS Rekognition

Age 22-34
Gender Female, 71.9%
Calm 97.4%
Sad 1.7%
Happy 0.3%
Surprised 0.3%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Confused 0%

AWS Rekognition

Age 22-34
Gender Female, 63.3%
Sad 51.8%
Calm 44.3%
Fear 1%
Surprised 1%
Happy 0.8%
Angry 0.5%
Confused 0.5%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 69.9%
Sad 33%
Happy 23%
Calm 21.8%
Fear 17.8%
Angry 2.2%
Surprised 1.7%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 32-48
Gender Female, 63.5%
Calm 48.9%
Sad 47.9%
Angry 2.3%
Fear 0.4%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 22-34
Gender Female, 84.1%
Sad 46%
Calm 30.2%
Happy 9.3%
Angry 7.9%
Fear 3.5%
Confused 1.4%
Surprised 1.3%
Disgusted 0.3%

AWS Rekognition

Age 25-39
Gender Female, 54.5%
Sad 82.5%
Calm 15.1%
Fear 1.1%
Surprised 0.5%
Angry 0.3%
Confused 0.3%
Happy 0.2%
Disgusted 0%

AWS Rekognition

Age 22-34
Gender Male, 73%
Calm 61.3%
Sad 35.6%
Angry 2%
Fear 0.3%
Happy 0.3%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Male, 58.9%
Surprised 47.1%
Calm 36.1%
Fear 5%
Sad 3.3%
Confused 3%
Angry 2.9%
Happy 1.8%
Disgusted 0.8%

AWS Rekognition

Age 26-42
Gender Female, 79.7%
Calm 60.5%
Sad 36.6%
Fear 0.9%
Happy 0.7%
Angry 0.6%
Surprised 0.6%
Confused 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 64.6%
Person 42.7%

Categories

Imagga

pets animals 59.8%
nature landscape 27.3%
paintings art 10.3%