Human Generated Data

Title

Saint Verdiana Carrying Food to the Poor

Date

c. 1711

People

Artist: Giovanni Camillo Sagrestani, Italian, 1660–1731

Classification

Paintings

Machine Generated Data

Tags

Amazon

Person 99.4
Human 99.4
Art 97.1
Painting 97.1
Person 96.4
Person 96.4
Canine 83.2
Mammal 83.2
Pet 83.2
Dog 83.2
Animal 83.2
Person 73.6
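Rekognition returns one label entry per detected instance, which is why Person appears four times above at different confidences. The single values reported in the Feature analysis section (Person 99.4%, Painting 97.1%, Dog 83.2%) can be reproduced by keeping the highest confidence per label name. A minimal sketch over the raw pairs listed above (the `max_confidence` helper is illustrative, not part of any API):

```python
# Collapse duplicate Rekognition labels to their highest confidence.
# The (label, confidence) pairs are copied from the Amazon tag list above.
raw_labels = [
    ("Person", 99.4), ("Human", 99.4), ("Art", 97.1), ("Painting", 97.1),
    ("Person", 96.4), ("Person", 96.4), ("Canine", 83.2), ("Mammal", 83.2),
    ("Pet", 83.2), ("Dog", 83.2), ("Animal", 83.2), ("Person", 73.6),
]

def max_confidence(labels):
    """Keep the highest confidence seen for each label name."""
    best = {}
    for name, conf in labels:
        best[name] = max(conf, best.get(name, 0.0))
    return best

deduped = max_confidence(raw_labels)
print(deduped["Person"])  # 99.4 — matches the Feature analysis entry
```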

Clarifai

people 100
group 99.6
adult 98.9
administration 96.2
man 95.4
woman 95.3
art 95.2
group together 95.1
two 94.9
furniture 94.9
child 94.5
wear 94
leader 93.7
print 93.6
outfit 93
many 92.5
several 91.6
home 89.3
three 89.3
sit 88.1

Imagga

fountain 36.5
statue 32.2
structure 25.4
architecture 25.3
building 23.4
sculpture 23.3
old 21.6
city 18.3
ancient 18.2
history 17.9
monument 17.7
art 16.4
stone 15.4
travel 14.1
famous 14
historic 13.8
groom 12.9
man 12.8
religion 12.5
tourism 12.4
world 12.3
antique 12.1
landmark 11.7
tourist 11.2
male 10.7
book jacket 10.3
person 10
wall 9.6
church 9.2
vintage 9.1
jacket 9
people 8.9
palace 8.8
water 8.7
facade 8.7
culture 8.5
temple 8.5
aged 8.1
column 8.1
covering 8.1
memorial 7.7
grunge 7.7
house 7.6
historical 7.5
park 7.4
symbol 7.4
black 7.3
decoration 7.2
portrait 7.1
sky 7

Microsoft

building 99.8
person 93.6
clothing 93
outdoor 88.4
window 86.4
painting 85.6
drawing 82.5
old 72.3
text 65.9
sketch 65.7
picture frame 8.8

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 54.3%
Fear 46.8%
Confused 45.2%
Calm 45.1%
Angry 52%
Surprised 45.3%
Sad 45.4%
Disgusted 45.1%
Happy 45.1%

AWS Rekognition

Age 18-30
Gender Female, 51.2%
Calm 45%
Happy 45%
Angry 45%
Confused 45%
Fear 45.9%
Disgusted 45%
Surprised 45%
Sad 54.1%

AWS Rekognition

Age 31-47
Gender Male, 50.7%
Fear 45.3%
Calm 50.8%
Confused 45.2%
Angry 45.1%
Happy 45.4%
Surprised 47.3%
Sad 45.9%
Disgusted 45%

AWS Rekognition

Age 26-40
Gender Male, 53.3%
Angry 45.6%
Sad 46.6%
Happy 45.2%
Surprised 45.7%
Confused 45.2%
Disgusted 45.2%
Calm 48.8%
Fear 47.7%
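Each AWS Rekognition block above is an emotion-score distribution for one detected face; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, with the score dictionaries copied verbatim from the four blocks above:

```python
# Pick the dominant emotion for each detected face.
# Score dictionaries are copied from the four AWS Rekognition blocks above.
faces = [
    {"Fear": 46.8, "Confused": 45.2, "Calm": 45.1, "Angry": 52.0,
     "Surprised": 45.3, "Sad": 45.4, "Disgusted": 45.1, "Happy": 45.1},
    {"Calm": 45.0, "Happy": 45.0, "Angry": 45.0, "Confused": 45.0,
     "Fear": 45.9, "Disgusted": 45.0, "Surprised": 45.0, "Sad": 54.1},
    {"Fear": 45.3, "Calm": 50.8, "Confused": 45.2, "Angry": 45.1,
     "Happy": 45.4, "Surprised": 47.3, "Sad": 45.9, "Disgusted": 45.0},
    {"Angry": 45.6, "Sad": 46.6, "Happy": 45.2, "Surprised": 45.7,
     "Confused": 45.2, "Disgusted": 45.2, "Calm": 48.8, "Fear": 47.7},
]

dominant = [max(face, key=face.get) for face in faces]
print(dominant)  # ['Angry', 'Sad', 'Calm', 'Calm']
```

Note that the scores within each face cluster near 45%, so the dominant emotion is a weak signal here rather than a confident classification.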

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 97.1%
Dog 83.2%

Captions

Microsoft

a group of people sitting and standing in front of a window 93.4%
a group of people sitting in front of a window 91.4%
a group of people standing in front of a window 91.3%