Human Generated Data

Title

Magdalene

Date

19th century

People

Artist: Giuseppe Fusinati, Italian 1803 -

Artist after: Titian (Tiziano Vecellio), Italian c. 1488 - 1576

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G1589

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Person 99.6
Human 99.6
Art 96.4
Painting 69.2
Art Gallery 55.5
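
The label/confidence pairs above match the shape of output returned by image-tagging APIs. As a minimal illustrative sketch only (not the museum's actual pipeline), assuming the Amazon tags came from Rekognition's detect_labels call, with the image path and region as placeholder assumptions:

import boto3

# Sketch only: the region and image path are assumptions, not taken from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("magdalene.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=10)

# Each label carries a name and a confidence score, e.g. "Person 99.6".
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))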

Clarifai
created on 2019-11-07

people 99.4
portrait 98.9
art 98.3
adult 97.8
one 97.5
woman 96
painting 94.3
wear 91.6
picture frame 90.9
museum 89.6
exhibition 89.3
two 89
retro 86.9
music 86.3
man 83.2
furniture 82
group 80.8
facial expression 79.7
empty 74.3
print 73.9

Imagga
created on 2019-11-07

portrait 23.3
man 19.6
people 19
person 18.9
world 17.7
model 17.1
face 15.6
adult 15.5
attractive 15.4
male 15.1
sexy 14.5
hair 14.3
human 14.2
love 14.2
vintage 14.1
body 13.6
mother 13
book jacket 12.9
skin 12.2
fashion 12.1
one 11.9
black 11.5
adolescent 11.3
pretty 11.2
culture 11.1
art 11.1
jacket 11
happy 10.7
couple 10.5
home 10.4
expression 10.2
child 10.1
head 10.1
currency 9.9
lady 9.7
looking 9.6
juvenile 9.5
youth 9.4
money 9.4
cute 9.3
sensuality 9.1
antique 8.8
sofa 8.7
women 8.7
boy 8.7
lifestyle 8.7
old 8.4
museum 8.2
symbol 8.1
office 8
close 8
business 7.9
sitting 7.7
wrapping 7.6
serious 7.6
erotic 7.6
house 7.5
water 7.3
cash 7.3
smiling 7.2
happiness 7.1

Google
created on 2019-11-07

Photograph 95.9
Picture frame 93.7
Painting 79.5
Art 75.5
Portrait 69.2
Visual arts 68.6
Photography 67.8
Drawing 66.1
Room 65.7
Stock photography 59.4
Artwork 52.3

Microsoft
created on 2019-11-07

gallery 100
scene 100
room 100
painting 98.9
art 98.3
drawing 97.3
sketch 90.1
picture frame 85.6
woman 80.6
person 79.8
human face 79
posing 61.1
white 60.4
clothing 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 13-23
Gender Male, 91%
Angry 2.8%
Disgusted 1.1%
Surprised 26.5%
Calm 44.4%
Fear 11%
Happy 0.6%
Sad 4.5%
Confused 9.2%
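
As a hedged sketch of how per-face attributes like the age range, gender, and emotion scores above are typically retrieved from Amazon Rekognition (the image path is a placeholder; this is not the museum's own code):

import boto3

rekognition = boto3.client("rekognition")

with open("magdalene.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

# Attributes=["ALL"] returns age range, gender, and emotion confidences per detected face.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # e.g. {"Low": 13, "High": 23}
    gender = face["Gender"]      # e.g. {"Value": "Male", "Confidence": 91.0}
    print(f'Age {age["Low"]}-{age["High"]}, Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:   # e.g. CALM, SURPRISED, FEAR ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')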

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
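
A comparable sketch for the Google Vision likelihood strings above ("Very unlikely", etc.), assuming the google-cloud-vision client library and a placeholder image path:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("magdalene.jpg", "rb") as f:  # placeholder image path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each face annotation exposes categorical likelihoods rather than numeric scores.
for face in response.face_annotations:
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)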

Feature analysis

Amazon

Person 99.6%
Painting 69.2%

Categories

Imagga

paintings art 100%