Human Generated Data

Title

Untitled (woman leaping)

Date

1887

People

Artist: Eadweard Muybridge, British, 1830–1904

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.337

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 95.9
Person 95.9
Person 95.6
Person 95.5
Person 94.9
Person 94.2
Person 94.1
Person 92.6
Person 92.1
Person 91.4
Person 88.3
Person 86.1
Person 83.5
Text 82.6
Person 80.8
Person 80.1
Advertisement 76.5
Person 75.8
Person 74.5
Poster 73
Animal 72.6
Bird 72.6
Person 72
Person 70.7
Person 69.4
Art 68.3
Person 67.9
Person 67.3
Person 66.8
Person 66.7
Person 66.1
Person 65.7
Person 65.7
Person 64.8
Alphabet 61.9
Person 61.5
Word 58.3
Collage 56.4
Person 54.6

Imagga
created on 2022-01-15

abacus 52.5
equipment 43.7
calculator 41.4
film 32.1
device 27.4
negative 27.3
sequencer 24.7
old 23
equalizer 21.7
electronic equipment 21
apparatus 19.7
frame 18.6
retro 18
movie 17.4
paper 17.3
art 17.1
vintage 16.5
business 16.4
black 16.2
grunge 16.2
camera 15.7
technology 15.6
strip 15.5
digital 15.4
cinema 14.6
design 14.6
texture 13.9
graphic 13.9
border 13.6
pattern 13
board 12.5
dirty 11.7
roll 11.4
office 11.3
screen 11.2
money 11.1
object 11
filmstrip 10.8
slide 10.7
antique 10.7
tape 10.6
computer 10.5
damaged 10.5
blank 10.3
line 10.3
close 10.3
data 10
cinematography 9.9
photograph 9.8
information 9.7
photography 9.5
closeup 9.4
memory 9.4
space 9.3
finance 9.3
instrument 9.3
entertainment 9.2
rough 9.1
currency 9
bank 9
backgrounds 8.9
color 8.9
35mm 8.9
photographic 8.8
text 8.7
grungy 8.5
card 8.5
rich 8.4
dollar 8.4
track 7.9
3d 7.7
video 7.7
modern 7.7
edge 7.7
industry 7.7
pay 7.7
note 7.4
cash 7.3
wealth 7.2
reel 7.2
textured 7

Google
created on 2022-01-15

Rectangle 85.4
Font 82.6
Art 74.6
Symmetry 66
Metal 62.4
Visual arts 61
Event 58.6
History 56.9
Collection 53.9

Microsoft
created on 2022-01-15

text 98.7
old 45.6

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Female, 78.4%
Sad 90%
Calm 4.9%
Confused 1.6%
Angry 1.1%
Happy 0.8%
Fear 0.7%
Disgusted 0.6%
Surprised 0.4%

AWS Rekognition

Age 35-43
Gender Male, 79.8%
Calm 61.1%
Angry 19.3%
Sad 13.6%
Disgusted 3.2%
Happy 1.3%
Surprised 0.8%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 23-33
Gender Male, 66.7%
Calm 41.4%
Sad 24.4%
Confused 11.9%
Angry 7.4%
Disgusted 5.6%
Surprised 3.5%
Happy 3.2%
Fear 2.6%

AWS Rekognition

Age 22-30
Gender Male, 99%
Calm 95.6%
Surprised 1.3%
Sad 1%
Disgusted 0.7%
Angry 0.6%
Happy 0.4%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Calm 42.6%
Fear 19.8%
Happy 17.9%
Sad 13.2%
Confused 2.7%
Angry 2.1%
Disgusted 0.9%
Surprised 0.8%

AWS Rekognition

Age 20-28
Gender Male, 69.2%
Happy 37.2%
Angry 17.8%
Calm 17.1%
Sad 13.7%
Confused 5.6%
Surprised 3.7%
Fear 2.6%
Disgusted 2.3%

Feature analysis

Amazon

Person 95.9%
Bird 72.6%

Captions

Microsoft

an old photo of a person 53.2%
an old photo of a person 51.9%
old photo of a person 50.6%

Text analysis

Amazon

PLATE
by
ANIMAL
All
rights
EADWEARD
ANIMAL LOCOMOTION PLATE
LOCOMOTION
Copyright,
Copyright, 1887. by EADWEARD MOVERIDGE All rights receved
MOVERIDGE
1887.
120
receved

Google

ANIMAL
Copyright,
ANIMAL LOCOMOTION. PLATE. Copyright, 1887, by EADWEARD MUVBRIDGE. All rights reservd.
PLATE.
MUVBRIDGE.
by
All
rights
1887,
LOCOMOTION.
EADWEARD
reservd.