Human Generated Data

Title

Ecce Homo

Date

19th century

People

Artist: Pierre François Bertonnier, French 1791 - ?

Artist after: Guido Reni, Italian 1575 - 1642

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R13039

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Art 97.3
Human 85.1
Drawing 85.1
Person 79
Sketch 76.9
Face 74.7
Photography 65.3
Photo 65.3
Portrait 63.7
Painting 60.4
Art Gallery 59.3

Clarifai
created on 2019-11-10

people 98.1
portrait 97.7
one 97.6
adult 95.2
painting 93.2
door 92.7
light 92.5
wear 91.5
indoors 90.9
room 90.4
no person 88.1
art 87.5
window 86.9
museum 86.9
girl 86.8
house 86.4
building 85.8
wall 83.6
architecture 83.5
woman 82.7

Imagga
created on 2019-11-10

wall clock 77.9
clock 62.4
timepiece 47.1
device 40.6
measuring instrument 34.4
old 34.1
wall 31.8
door 31.5
ventilator 25.5
vintage 23.1
fire alarm 21.3
architecture 21.1
antique 20.8
grunge 20.4
entrance 20.3
texture 18
instrument 17.9
wooden 17.6
alarm 16.9
house 16.7
aged 16.3
wood 15.8
window 15.4
ancient 14.7
retro 13.9
metal 12.9
frame 12.5
classic 12.1
detail 12.1
building 11.9
rusty 11.4
decoration 11.1
lock 10.9
design 10.7
weathered 10.4
paper 10.2
dirty 9.9
material 9.8
style 9.6
brown 9.6
yellow 9.3
traditional 9.1
paint 9
country 8.8
gate 8.7
worn 8.6
close 8.6
art 8.5
border 8.1
symbol 8.1
home 8
glass 7.8
rust 7.7
construction 7.7
stone 7.6
framework 7.5
sign 7.5
pattern 7.5
historic 7.3
object 7.3
knocker 7.3
structure 7.2
open 7.2
religion 7.2
history 7.1
surface 7

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

human face 97
drawing 96.6
art 92.7
sketch 84.3
painting 81.3
white 74.4
picture frame 71.6
person 65.8
jack 57.5
open 43.1

Face analysis

AWS Rekognition

Age 38-56
Gender Female, 59.3%
Fear 1%
Angry 0.5%
Disgusted 0.7%
Surprised 1.1%
Happy 0.2%
Sad 7.7%
Calm 1.3%
Confused 87.5%

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 79%

Captions

Microsoft

a close up of a door 59.8%
a close up of a white door 41.2%
a close up of a white wall 41.1%