Human Generated Data

Title

Figures at Equestrian Event

Date

1896

People

Artist: Herbert Haseltine, American, 1877-1962

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mr. and Mrs. Frank D. Haines, 1972.64

Machine Generated Data

Tags

Amazon
created on 2020-05-01

Person 99.2
Human 99.2
Person 98.2
Person 98
Person 93.6
Animal 91.9
Horse 91.9
Mammal 91.9
Person 89
Book 86.1
Comics 86.1
Horse 84.6
Person 82.4
Art 81.8
Person 80.4
Horse 79.9
Drawing 76.8
Person 71.2
Text 71.2
Person 70.7
Person 66.2
Sketch 58.1
Poster 56.2
Advertisement 56.2
Person 45.7
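
The Amazon tags above are labels with confidence scores in the shape returned by AWS Rekognition's DetectLabels operation. As a minimal sketch of how such a response could be flattened into the "Name Confidence" lines shown here (the sample response below is illustrative, with made-up precision, not the actual API call or the full label set):

```python
# Illustrative DetectLabels-style response; a real call would be
# boto3 rekognition client.detect_labels(Image=...), which returns
# a dict with a "Labels" list of {"Name", "Confidence", ...} entries.
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.248},
        {"Name": "Horse", "Confidence": 91.917},
        {"Name": "Poster", "Confidence": 56.21},
    ]
}

def flatten_labels(response, min_confidence=50.0):
    """Return 'Name Confidence' lines, one decimal place,
    sorted by descending confidence, above a threshold."""
    rows = [
        (lbl["Name"], round(lbl["Confidence"], 1))
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    rows.sort(key=lambda r: -r[1])
    # :g drops a trailing ".0" so "98.0" prints as "98", matching the list above
    return [f"{name} {conf:g}" for name, conf in rows]

print(flatten_labels(sample_response))
```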

Clarifai
created on 2020-05-01

people 99.9
group 99.5
adult 98.7
man 98.4
illustration 98.2
print 97.7
art 96.1
woman 92.9
many 92.5
leader 92.1
wear 90.5
several 88.3
engraving 88.1
chalk out 87.6
administration 86.4
painting 86.3
veil 86.2
two 85.9
vintage 82.8
one 82.8

Imagga
created on 2020-05-01

drawing 42
envelope 33.7
sketch 29.1
design 25.4
frame 25
vintage 24.9
art 22.5
retro 21.3
cartoon 19.7
representation 19.4
container 19.4
grunge 18.8
paper 18.2
border 18.1
antique 17
decorative 16.7
card 16.6
pattern 16.4
floral 16.2
old 16.1
map 15.5
graphic 15.3
decoration 15
clip art 14.8
business 14.6
money 14.5
black 12.6
element 12.4
outline 12.3
style 11.1
banner 11
ornate 11
certificate 10.8
texture 10.4
icon 10.3
page 10.2
set 10.2
swirl 10.2
symbol 10.1
man 10.1
flower 10
silhouette 9.9
treasury 9.7
ornament 9.5
blank 9.4
bank 9.3
finance 9.3
travel 9.2
facility 9.1
sign 9
menu 8.8
ancient 8.7
book 8.4
dollar 8.4
elements 8.4
template 8.3
collection 8.1
currency 8.1
history 8.1
decor 8
leaf 7.8
stamp 7.7
geography 7.7
atlas 7.7
wallpaper 7.7
grungy 7.6
document 7.4
backdrop 7.4
cash 7.3
depository 7.3
holiday 7.2
financial 7.1
modern 7

Google
created on 2020-05-01

Microsoft
created on 2020-05-01

drawing 99.2
sketch 98.9
cartoon 93.7
text 91
illustration 82.3
person 59.9
clothing 55.8
picture frame 6.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Female, 52.9%
Angry 45.1%
Happy 45.4%
Confused 45.1%
Surprised 45.7%
Calm 53%
Fear 45.5%
Sad 45.3%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 50.8%
Happy 45%
Angry 45%
Surprised 45%
Confused 45%
Calm 55%
Sad 45%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 21-33
Gender Female, 53.6%
Fear 45.2%
Angry 51%
Confused 45.1%
Disgusted 45%
Calm 48.3%
Surprised 45.3%
Happy 45%
Sad 45.1%

AWS Rekognition

Age 18-30
Gender Female, 54.6%
Surprised 45.1%
Disgusted 45%
Happy 45.1%
Calm 54.1%
Fear 45.1%
Angry 45.3%
Sad 45.1%
Confused 45.1%

AWS Rekognition

Age 19-31
Gender Female, 51.8%
Calm 54.8%
Surprised 45%
Fear 45%
Disgusted 45%
Angry 45.1%
Sad 45%
Happy 45%
Confused 45%

AWS Rekognition

Age 9-19
Gender Male, 50.3%
Angry 49.5%
Sad 49.5%
Surprised 49.6%
Disgusted 49.5%
Happy 49.5%
Confused 49.5%
Calm 50.2%
Fear 49.6%

AWS Rekognition

Age 19-31
Gender Female, 50.3%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 49.6%
Surprised 49.5%
Angry 50.3%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Confused 50%
Calm 49.7%
Sad 49.5%
Disgusted 49.5%
Surprised 49.7%
Fear 49.5%
Angry 49.5%
Happy 49.5%

AWS Rekognition

Age 22-34
Gender Female, 50.1%
Happy 49.6%
Confused 50.2%
Calm 49.5%
Sad 49.6%
Surprised 49.5%
Disgusted 49.5%
Angry 49.6%
Fear 49.5%

AWS Rekognition

Age 14-26
Gender Female, 50.3%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Disgusted 49.9%
Confused 49.5%
Calm 49.9%
Happy 49.6%
Sad 49.5%

AWS Rekognition

Age 24-38
Gender Female, 50.2%
Disgusted 49.5%
Sad 49.5%
Confused 49.5%
Calm 49.5%
Fear 49.5%
Surprised 49.5%
Angry 50.4%
Happy 49.5%
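
Each face block above corresponds to one FaceDetail entry from AWS Rekognition's DetectFaces operation (age range, gender with confidence, and per-emotion confidences). A minimal sketch of how one such entry could be rendered into the lines shown, using an illustrative hand-built entry rather than a real API response:

```python
# Illustrative FaceDetail-style entry; a real call would be boto3
# rekognition client.detect_faces(Image=..., Attributes=["ALL"]),
# whose response contains a "FaceDetails" list of entries like this.
sample_face = {
    "AgeRange": {"Low": 18, "High": 30},
    "Gender": {"Value": "Female", "Confidence": 52.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 53.0},
        {"Type": "HAPPY", "Confidence": 45.4},
        {"Type": "ANGRY", "Confidence": 45.1},
    ],
}

def summarize_face(face):
    """Render a FaceDetail-style dict as the text lines used above."""
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    g = face["Gender"]
    lines.append(f"Gender {g['Value']}, {g['Confidence']:g}%")
    for emo in face["Emotions"]:
        # API emotion types are upper-case (e.g. "CALM"); title-case them
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:g}%")
    return lines

print("\n".join(summarize_face(sample_face)))
```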

Feature analysis

Amazon

Person 99.2%
Horse 91.9%
Poster 56.2%

Categories