Human Generated Data

Title

Holy Family

Date

19th century

People

Artist: Unidentified Artist

Previous attribution: Andrea del Sarto, Italian, 1486–1530

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Professor and Mrs. John Tucker Murray, 1938.83

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 98.6
Human 98.6
Person 97.6
Painting 96.2
Art 96.2
Person 95.5

Clarifai
created on 2020-04-24

people 100
portrait 99.3
two 99.3
child 98.8
baby 98.6
adult 98.6
group 97.8
three 97.2
offspring 97.1
son 96.6
family 95.9
art 95.7
administration 95.7
interaction 95.6
sibling 95.5
man 94.8
affection 94.8
wear 92.4
print 87.6
retro 86.4

Imagga
created on 2020-04-24

brother 34.3
adult 30.4
child 30
mother 29.8
people 27.9
man 26.9
portrait 25.9
person 25.2
parent 23.7
happy 23.2
love 22.9
male 21.6
attractive 21
model 20.2
fashion 19.6
body 19.2
couple 19.2
family 18.7
sexy 17.7
lifestyle 16.6
mosquito net 16.5
face 16.3
smile 15.7
sitting 15.5
home 15.1
hair 15.1
one 14.9
sensuality 14.5
human 14.2
dark 14.2
boy 13.9
lady 13.8
smiling 13.7
father 13.7
trampoline 13.6
couch 13.5
happiness 13.3
together 13.1
sofa 13
cute 12.9
skin 12.7
husband 12.4
kid 11.5
dad 11.3
passion 11.3
fun 11.2
protective covering 11.1
women 11.1
gymnastic apparatus 10.9
elegance 10.9
black 10.9
posing 10.7
loving 10.5
wife 10.4
lying 10.3
pretty 9.8
looking 9.6
jeans 9.6
casual 9.3
kin 9.2
dress 9
life 8.7
brunette 8.7
desire 8.7
room 8.4
studio 8.4
house 8.4
leisure 8.3
sports equipment 8.2
children 8.2
cheerful 8.1
romance 8
interior 8
son 8
seduce 7.9
covering 7.8
boyfriend 7.7
girlfriend 7.7
expression 7.7
youth 7.7
erotic 7.6
rain 7.5
relaxation 7.5
style 7.4
light 7.3
indoor 7.3
fitness 7.2
childhood 7.2

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

human face 97.5
person 97.1
text 96.6
drawing 95.4
baby 95.3
sketch 93.4
painting 92.9
clothing 83.4
window 81.7
toddler 80.5
picture frame 55.4
image 42.8
posing 41.8
old 41.4

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 83.2%
Calm 0.4%
Confused 66%
Happy 0.2%
Fear 11.9%
Disgusted 7.2%
Angry 0.5%
Sad 0.4%
Surprised 13.3%

AWS Rekognition

Age 11-21
Gender Male, 96.2%
Sad 3.8%
Angry 3.5%
Happy 0.6%
Fear 0.2%
Disgusted 1.5%
Calm 88.5%
Confused 1.6%
Surprised 0.3%

AWS Rekognition

Age 42-60
Gender Male, 97.6%
Calm 13.3%
Disgusted 0.1%
Sad 84%
Surprised 0.3%
Confused 0.6%
Angry 0.1%
Fear 0.8%
Happy 0.8%

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 3
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Painting 96.2%

Categories

Imagga

people portraits 60.6%
pets animals 26.8%
paintings art 12.2%