Human Generated Data

Title

Architect Presenting his Plans for the Tower of Babel

Date

18th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Walter Bauer, 1965.569.3

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Art 95.3
Person 95
Human 95
Drawing 94
Sketch 87.3
Painting 82.8
Person 66.2
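The Amazon scores above follow the shape of an AWS Rekognition label-detection response, where each label carries a confidence percentage. A minimal sketch of how such a response maps onto the "Name Score" lines shown here (the sample response is hypothetical illustration data, not this image's actual API output):

```python
# Sketch: formatting AWS Rekognition label output into "Name Confidence" lines.
# A real call would be boto3.client("rekognition").detect_labels(
#     Image={"Bytes": image_bytes}); the dict below is a hypothetical sample.

sample_response = {
    "Labels": [
        {"Name": "Art", "Confidence": 95.3},
        {"Name": "Person", "Confidence": 95.0},
        {"Name": "Drawing", "Confidence": 94.0},
    ]
}

def format_labels(response):
    """Render each label as 'Name Score', trimming a trailing '.0'."""
    lines = []
    for label in response["Labels"]:
        score = f"{label['Confidence']:.1f}".rstrip("0").rstrip(".")
        lines.append(f"{label['Name']} {score}")
    return lines

print("\n".join(format_labels(sample_response)))
```

Each provider (Clarifai, Imagga, Google, Microsoft) returns an analogous list of concept/confidence pairs, which is why the sections below all share this flat "label score" layout.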

Clarifai
created on 2019-06-01

people 99.9
group 99.3
adult 99.1
art 99.1
print 97.8
veil 97.4
man 97
illustration 96.7
wear 95.9
portrait 94.7
interaction 92.8
leader 91.4
one 91
two 89.2
furniture 88.9
painting 88.9
woman 88.8
many 85.4
three 84.1
several 83.3

Imagga
created on 2019-06-01

sketch 100
drawing 79.4
representation 62.1
ancient 26
old 23.7
art 23.4
vintage 21.5
decoration 18.8
grunge 18.7
sculpture 18.5
history 17
architecture 16.6
retro 16.4
antique 15.8
money 15.3
texture 15.3
carving 15.3
currency 15.3
cash 14.6
finance 14.4
pattern 14.4
paper 14.3
design 14.2
monument 14
banking 12.9
bank 12.5
financial 12.5
artistic 12.2
bill 11.4
statue 11.4
detail 11.3
style 11.1
culture 11.1
stone 11.1
stucco 11
landmark 10.8
frame 10.8
wealth 10.8
marble 10.5
black 10.2
carved 9.8
pay 9.6
temple 9.5
historical 9.4
savings 9.3
dollar 9.3
religion 9
stamp 9
brown 8.8
textured 8.8
ornament 8.6
close 8.6
graffito 8.5
business 8.5
travel 8.5
rich 8.4
famous 8.4
decorative 8.4
figure 8.3
note 8.3
letter 8.3
wall 8.3
historic 8.3
investment 8.2
aged 8.1
structure 8.1
building 7.9
flower 7.7
tile 7.7
exchange 7.6
element 7.4
economy 7.4
exterior 7.4
graphic 7.3

Google
created on 2019-06-01

Drawing 94.6
Sketch 87.9
Text 85.2
Art 80
Illustration 76.8
Figure drawing 76
Artwork 74.8
Painting 57.1
Visual arts 55
History 54.1

Microsoft
created on 2019-06-01

drawing 99.7
sketch 99.7
child art 80.6
illustration 73.9
art 73.7
cartoon 62.9
stone 27.4
painting 20

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Male, 95.3%
Disgusted 8.8%
Sad 6.6%
Happy 7.8%
Surprised 5.9%
Calm 52.8%
Angry 6.3%
Confused 11.8%

AWS Rekognition

Age 20-38
Gender Female, 64.3%
Confused 8.9%
Surprised 8.5%
Calm 6.3%
Sad 39.2%
Happy 22.1%
Disgusted 5.6%
Angry 9.4%

AWS Rekognition

Age 26-44
Gender Female, 97.1%
Surprised 36.1%
Sad 14.4%
Happy 19.1%
Angry 8%
Disgusted 7.8%
Confused 9.4%
Calm 5.2%

AWS Rekognition

Age 20-38
Gender Female, 54.7%
Angry 45.1%
Surprised 45.2%
Disgusted 45.1%
Sad 45.2%
Calm 54.3%
Happy 45%
Confused 45.1%

AWS Rekognition

Age 23-38
Gender Female, 53.7%
Sad 51%
Angry 45.4%
Surprised 45.2%
Calm 47.8%
Disgusted 45.1%
Happy 45.2%
Confused 45.4%

AWS Rekognition

Age 19-36
Gender Female, 52.7%
Angry 45.1%
Sad 45.2%
Disgusted 45.1%
Happy 54.2%
Calm 45.1%
Surprised 45.3%
Confused 45.1%

AWS Rekognition

Age 48-68
Gender Male, 86.9%
Confused 0.4%
Surprised 0.5%
Calm 97.9%
Sad 0.6%
Happy 0.2%
Disgusted 0.2%
Angry 0.2%
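Each face record above mirrors the structure of an AWS Rekognition face-detail object: an estimated age range, a gender value with confidence, and a per-emotion confidence list. A minimal sketch of rendering one such record into the lines shown (the sample face is hypothetical, not taken from this image's actual API output):

```python
# Sketch: rendering an AWS Rekognition face record into the fields listed above.
# A real call would be boto3.client("rekognition").detect_faces(
#     Image=..., Attributes=["ALL"]); the dict below is a hypothetical sample.

sample_face = {
    "AgeRange": {"Low": 26, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 95.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 52.8},
        {"Type": "CONFUSED", "Confidence": 11.8},
    ],
}

def format_face(face):
    """Emit 'Age Low-High', 'Gender Value, X%', then one line per emotion."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    for emotion in face["Emotions"]:
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    return lines

print("\n".join(format_face(sample_face)))
```

Note that the emotion confidences are independent per-class scores, which is why the percentages in a single face record need not sum to 100.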

Feature analysis

Amazon

Person 95%

Captions

Microsoft

a painting on the wall 80.1%
a painting on a wall 79.6%
a painting of a person 71.6%