Human Generated Data

Title

Plate Ten

Date

19th century

People

Artist: Eugène Lepoittevin, French, 1806–1870

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, George R. Nutter Fund, 1978, M15852

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Human 95.7
Person 91.5
Person 90.6
Painting 89.6
Art 89.6
Horse 87.3
Animal 87.3
Mammal 87.3
People 83.7
Vehicle 81.5
Transportation 81.5
Person 81.2
Person 77.7
Horse 74.9
Person 74.6
Person 72
Text 66.1
Person 65.9
Person 63.2
Bicycle 62.7
Bike 62.7

Clarifai
created on 2023-10-26

people 100
group 99.8
print 99.7
adult 99.5
art 98.8
furniture 98.3
woman 97.9
many 97.8
mammal 97.4
wear 97.4
two 96.9
man 96.8
canine 96.7
child 95.7
several 95.7
veil 95.5
illustration 95.4
one 94.9
seat 93.8
cavalry 93.6

Imagga
created on 2022-01-16

sketch 100
drawing 100
representation 76.5
vintage 40.5
retro 40.1
design 35
floral 32.3
decoration 31.7
art 30.2
frame 30
grunge 28.1
pattern 26.7
element 26.5
flower 26.1
graphic 25.5
decorative 24.2
silhouette 23.2
antique 21.9
ornament 20.7
leaf 20.2
old 19.5
banner 17.5
texture 17.4
border 17.2
plant 17.2
wallpaper 16.1
card 15.4
curl 15.2
decor 14.1
stamp 14.1
ornate 13.7
set 13.6
symbol 13.5
creative 13.2
style 12.6
backdrop 12.4
holiday 12.2
artistic 12.2
paper 11.8
swirl 11.1
black 10.8
collection 10.8
gold 10.7
scroll 10.5
ancient 10.4
foliage 10.1
color 10
cartoon 9.8
royal 9.7
curve 9.6
spring 9.4
season 9.4
elegance 9.2
letter 9.2
artwork 9.2
aged 9.1
dirty 9
deer 8.8
arabesque 8.6
die 8.6
grungy 8.5
bird 8.4
clip art 8.3
paint 8.1
painting 8.1
engraving 7.9
elegant 7.7
classic 7.4
elements 7.4
template 7.3
valentine 7.3
graffito 7

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

wall 98.5
gallery 97.2
drawing 96.5
cartoon 95.4
person 94.7
sketch 91.3
text 83.6
scene 73.9
clothing 73.8
posing 72
room 68
horse 65.2
different 44.6
old 41.8
several 11.5

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 99.2%
Calm 36.7%
Confused 32%
Surprised 17.8%
Happy 6.3%
Angry 3%
Sad 2.3%
Disgusted 1.2%
Fear 0.8%

AWS Rekognition

Age 20-28
Gender Male, 57.6%
Sad 61.4%
Calm 30.6%
Fear 3.8%
Surprised 1.7%
Confused 1.2%
Disgusted 0.5%
Happy 0.5%
Angry 0.4%

AWS Rekognition

Age 21-29
Gender Male, 100%
Angry 85.9%
Calm 7.2%
Fear 2.1%
Sad 1.9%
Surprised 1.6%
Disgusted 0.7%
Happy 0.4%
Confused 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.6%
Calm 88.2%
Happy 8.5%
Confused 1.1%
Surprised 1.1%
Fear 0.3%
Disgusted 0.3%
Sad 0.3%
Angry 0.2%

AWS Rekognition

Age 21-29
Gender Female, 98.9%
Fear 33.3%
Disgusted 22.4%
Sad 13.4%
Surprised 11.2%
Happy 8.3%
Calm 5.4%
Angry 3.5%
Confused 2.5%

AWS Rekognition

Age 13-21
Gender Female, 100%
Angry 27.5%
Surprised 27.4%
Fear 20.4%
Disgusted 10.1%
Calm 7.8%
Confused 3.7%
Sad 2.2%
Happy 1%

Feature analysis

Amazon

Person 91.5%
Painting 89.6%
Horse 87.3%

Categories

Imagga

paintings art 97.8%
interior objects 1.9%

Text analysis

Amazon

N° 10
10
2
32.
-
des - - Answer
des
Answer