Human Generated Data

Title

Subordination

Date

19th century

People

Artist: Emil Ebers, German, 1807-1884

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R6674

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Human 99.3
Person 99.3
Person 99.2
Art 97.2
Person 91.3
Person 85.9
Person 68.9
Painting 60
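Labels like those above are the kind of output returned by Amazon's Rekognition label-detection service, which reports a name and a confidence score per label. A minimal sketch of filtering such a result by confidence; the `response` dict here is a hand-built stand-in shaped like a Rekognition `DetectLabels` response, not the actual API payload for this image:

```python
# Stand-in for an AWS Rekognition DetectLabels response. A real call would be
# boto3.client("rekognition").detect_labels(Image=..., MinConfidence=...),
# which requires AWS credentials; the values below echo the tags listed above.
response = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.3},
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Art", "Confidence": 97.2},
        {"Name": "Painting", "Confidence": 60.0},
    ]
}

def labels_above(resp, threshold):
    """Return (name, confidence) pairs at or above the given confidence."""
    return [(label["Name"], label["Confidence"])
            for label in resp["Labels"]
            if label["Confidence"] >= threshold]

print(labels_above(response, 90.0))
```

Raising the threshold is how low-confidence guesses such as "Painting 60" get excluded from a display like this one.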

Clarifai
created on 2019-08-10

people 99.6
adult 99.1
art 99
painting 98.7
illustration 98.6
man 97
two 96.8
group 96.7
print 95.3
wear 95.3
woman 93.2
one 91.3
furniture 90.8
museum 90.8
veil 85.5
facial hair 84.7
chalk out 84.3
portrait 84.2
seat 83.8
religion 83

Imagga
created on 2019-08-10

sketch 100
drawing 100
representation 80.8
vintage 32.3
old 31.4
grunge 28.1
retro 22.9
art 21.5
antique 20.8
ancient 20.7
paper 18.8
decoration 18.3
history 17.9
texture 17.4
aged 17.2
stamp 16.9
graffito 16.1
design 15.9
frame 15
architecture 14.8
sculpture 13.5
wall 13
letter 12.8
historic 11.9
pattern 11.6
symbol 11.4
historical 11.3
travel 11.3
artwork 11
global 10.9
postage 10.8
states 10.6
world 10.6
mail 10.5
structure 10.5
united 10.5
bill 10.5
old fashioned 10.5
stone 10.2
money 10.2
card 10.2
note 10.1
border 9.9
dirty 9.9
landmark 9.9
tourism 9.9
currency 9.9
postmark 9.9
international 9.5
post 9.5
rusty 9.5
building 9.5
monument 9.3
decorative 9.2
cash 9.2
black 9
style 8.9
postal 8.8
culture 8.5
grungy 8.5
finance 8.4
wallpaper 8.4
famous 8.4
backgrounds 8.1
religion 8.1
detail 8
material 8
graphic 8
envelope 7.8
artistic 7.8
floral 7.7
north 7.6
damaged 7.6
statue 7.6
communication 7.6
earth 7.4
brown 7.4
paint 7.2
collection 7.2
bank 7.2
surface 7.1
country 7

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

gallery 100
room 100
scene 100
drawing 98.5
person 96.4
clothing 96
wall 95.4
indoor 93.5
sketch 91.7
old 84.1
art 73.1
text 72.1
woman 59.7
posing 49.8
vintage 40.2
painting 36.2

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 54.1%
Confused 45.1%
Happy 45.3%
Angry 45.3%
Sad 45.1%
Disgusted 47.5%
Calm 51.4%
Fear 45.1%
Surprised 45.2%

AWS Rekognition

Age 38-56
Gender Female, 50.2%
Angry 49.8%
Disgusted 49.5%
Sad 49.6%
Surprised 49.5%
Calm 49.5%
Fear 49.8%
Happy 49.7%
Confused 49.5%
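Each face record above reports a confidence score for every emotion, and the face's dominant emotion is simply the highest-scoring one. A sketch using the first face's scores, assuming the per-emotion percentages are comparable on a single scale:

```python
# Emotion confidences from the first face record above (percent values).
emotions = {
    "Confused": 45.1, "Happy": 45.3, "Angry": 45.3, "Sad": 45.1,
    "Disgusted": 47.5, "Calm": 51.4, "Fear": 45.1, "Surprised": 45.2,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # Calm
```

Note how close the scores are for the second face (all between 49.5% and 49.8%), so its dominant emotion is far less meaningful than the first face's "Calm" at 51.4%.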

Feature analysis

Amazon

Person 99.3%
Painting 60%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

Tbeorg.

Google

ers all
ers
all