Human Generated Data

Title

The Maenads Attack Orpheus

Date

c. 1640

People

Artist: Johann Wilhelm Baur, German, 1607–1641

Classification

Prints

Machine Generated Data

Tags

Amazon

Human 98
Person 98
Person 97.6
Art 95.9
Painting 91.1
Person 85.9
Person 83.6
Person 72.4
Person 71.7
Mammal 69
Animal 69
Horse 69
Person 66.8
Person 61.2

Clarifai

people 100
print 99.7
engraving 99.5
group 99.1
adult 98.5
illustration 98.3
man 97.9
art 97.9
etching 95.6
military 95.1
weapon 94.9
many 94.2
soldier 93.9
interaction 91.3
pictorial 89.3
cavalry 89.2
war 89.1
wear 88.6
leader 88.5
skirmish 87.4

Imagga

grunge 48.6
texture 46.6
old 39.8
pattern 36.3
antique 35.3
aged 35.3
dirty 32.6
vintage 30.6
wallpaper 29.1
wall 27.3
retro 27.1
rough 25.6
paper 25.2
surface 24.7
material 23.2
weathered 22.8
textured 22.8
grungy 22.8
paint 22.7
design 22
backgrounds 21.9
canvas 21.8
art 20
backdrop 19.8
detail 19.3
aging 19.2
damaged 19.1
lace 18.7
snow 18.4
border 18.1
frame 17.8
structure 17.6
stain 17.3
empty 17.2
brown 17
messy 16.5
decay 16.4
parchment 16.3
rusty 16.2
ancient 15.6
burnt 15.6
blank 15.5
stained 15.4
stucco 14.7
graphic 14.6
torn 14.5
text 14
obsolete 13.4
worn 13.4
black 13.2
rust 12.5
drawing 12.4
color 12.3
space 11.7
dirt 11.5
close 11.4
sheet 11.3
decoration 11.1
weather 11.1
decorative 10.9
artistic 10.4
page 10.2
grain 10.2
stone 9.9
history 9.9
scratch 9.8
ice 9.7
edge 9.6
architecture 9.4
floral 9.4
crystal 9.3
metallic 9.2
letter 9.2
effect 9.1
gray 9
creative 8.8
materials 8.8
grime 8.8
mottled 8.8
crack 8.7
natural 8.7
detailed 8.7
age 8.6
flower 8.5
envelope 8.4
iron 8.4
note 8.3
industrial 8.2
closeup 8.1
metal 8.1
light 8
cracked 7.8
layers 7.7
spotted 7.7
construction 7.7
cardboard 7.7
spot 7.7
concrete 7.7
textures 7.6
poster 7.6
dark 7.5
document 7.4
style 7.4
book 7.3

Microsoft

tree 99.1
outdoor 98.2

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Male, 52%
Angry 45.5%
Sad 48.8%
Disgusted 45.3%
Surprised 45.3%
Happy 46.5%
Calm 48.5%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 50%
Sad 49.5%
Surprised 49.5%
Angry 49.6%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Disgusted 50.3%

AWS Rekognition

Age 27-44
Gender Female, 52.2%
Disgusted 45.8%
Happy 45.3%
Calm 46.7%
Confused 45.4%
Surprised 45.4%
Angry 46.3%
Sad 50.2%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Disgusted 45.6%
Angry 45.6%
Happy 45.5%
Calm 48.3%
Surprised 45.4%
Confused 45.5%
Sad 48.9%

AWS Rekognition

Age 26-43
Gender Male, 51.8%
Surprised 45.7%
Angry 45.9%
Confused 45.6%
Sad 45.4%
Disgusted 51.5%
Calm 45.8%
Happy 45.2%

Feature analysis

Amazon

Person 98%
Painting 91.1%
Horse 69%

Captions

Microsoft

a group of people standing next to a tree 66.4%
a close up of a tree 66.3%
a person standing next to a tree 63.4%

Text analysis

Amazon

o1