Human Generated Data

Title

Bacchanalian Scene

Date

1834

People

Artist: George Thomas Doo, British, 1800 - 1886

Artist after: Nicolas Poussin, French, 1594 - 1665

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R11971

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-28

Painting 96.7
Art 96.7
Person 95.3
Human 95.3
Person 90.1
Person 87.7
Person 80.4
Person 79
Person 70.7
Person 52.6
Person 47.9
Person 44.7

Imagga
created on 2022-01-28

vintage 45.5
old 44.6
grunge 41.7
envelope 40.4
blackboard 36.4
texture 34.1
antique 33.8
retro 29.5
aged 28.1
wall 26.6
container 25.2
ancient 24.2
dirty 23.5
rusty 22.9
paper 22.7
book jacket 20.2
frame 20
damaged 19.1
material 18.8
design 18.6
textured 17.5
structure 17
board 16.9
jacket 16.7
pattern 16.4
money 16.2
binding 16.2
chalkboard 14.7
blank 14.6
art 14.5
stamp 14.5
finance 14.4
wallpaper 13.8
rough 13.7
border 13.6
currency 13.5
grungy 13.3
grime 12.7
stain 12.5
worn 12.4
bill 12.4
surface 12.4
page 12.1
empty 12
wrapping 11.9
crumpled 11.7
rust 11.6
backdrop 11.5
parchment 11.5
aging 11.5
obsolete 11.5
concrete 11.5
black 11.4
floral 11.1
artwork 11
faded 10.7
backgrounds 10.5
business 10.3
brown 10.3
letter 10.1
cash 10.1
covering 9.9
bank 9.9
chalk 9.7
close 9.7
messy 9.7
detail 9.7
decay 9.6
text 9.6
spot 9.6
old fashioned 9.5
weathered 9.5
graphic 9.5
decoration 9.4
nobody 9.3
space 9.3
flower 9.2
grain 9.2
banking 9.2
book 9.1
drawing 9
gray 9
world 9
color 8.9
rotting 8.9
scratched 8.8
ragged 8.8
fracture 8.8
torn 8.7
spotted 8.7
communication 8.4
dollar 8.4
note 8.3
mottled 7.8
stains 7.8
states 7.7
geography 7.7
united 7.6
stone 7.6
textures 7.6
learn 7.6
map 7.5
savings 7.5
symbol 7.4
earth 7.3
message 7.3
global 7.3
financial 7.1
architecture 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

scene 99.8
room 99.7
gallery 99.6
person 92.2
indoor 89.9
old 88.4
clothing 70.8
white 70.7
text 62.8
drawing 62.3
posing 48.4
painting 38.3

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 100%
Happy 74.4%
Calm 18.5%
Angry 3.4%
Disgusted 1.6%
Surprised 0.9%
Confused 0.6%
Fear 0.4%
Sad 0.2%

AWS Rekognition

Age 10-18
Gender Male, 99.6%
Sad 81.6%
Calm 11.3%
Angry 3.6%
Disgusted 1.4%
Surprised 0.7%
Happy 0.5%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 1-7
Gender Female, 97.6%
Calm 98.6%
Angry 0.7%
Fear 0.2%
Surprised 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 19-27
Gender Female, 99.9%
Happy 54.1%
Sad 23.3%
Calm 8.3%
Disgusted 5%
Surprised 3.4%
Angry 3.2%
Fear 1.8%
Confused 1%

AWS Rekognition

Age 21-29
Gender Male, 54.6%
Angry 50.5%
Calm 25.2%
Happy 15.5%
Fear 3.4%
Disgusted 2.4%
Surprised 2.1%
Sad 0.7%
Confused 0.2%

Feature analysis

Amazon

Person 95.3%

Captions

Microsoft

a vintage photo of a painting 64.3%
an old photo of a painting 64.2%
a vintage photo of a painting on the wall 56.5%

Text analysis

Amazon

350
A
open-letter
A BACORANAGER KEENE
open-letter provide
4.00
BACORANAGER
provide
OFF 4.00
KEENE
OFF

Google

pam
rer
pam lllen rer
lllen