Human Generated Data

Title

Jacob and Rachel

Date

19th century

People

Artist: Giovita Garavaglia, Italian, 1790–1835

Artist after: Andrea Appiani, Italian, 1754–1817

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G1603

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Human 99.6
Person 99.6
Person 99.4
Person 98.7
Person 98.6
Person 97.7
Painting 97
Art 97
Person 84.8
Person 76.1
Person 65.3
Angel 59.2
Archangel 59.2
Person 45.8

Clarifai
created on 2019-11-07

people 99.9
group 99.4
art 99.1
adult 98.6
woman 97.8
illustration 97.3
man 96.8
child 96
painting 95.4
portrait 95.3
two 94.7
baby 94
print 92.1
one 90.9
wear 90.6
family 90.4
religion 88.5
furniture 85.8
music 85
many 84.5

Imagga
created on 2019-11-07

kin 70.2
statue 44.1
sculpture 40.5
art 27.7
religion 25.1
ancient 25.1
old 24.4
culture 20.5
architecture 20.3
monument 19.6
stone 19.3
child 18.2
marble 17.4
history 17
religious 16.9
antique 16.5
decoration 15.7
cemetery 15.7
detail 15.3
historical 14.1
historic 13.8
face 13.5
travel 12.7
building 12.6
god 12.4
vintage 12.4
famous 12.1
figure 11.9
head 11.8
roman 11.7
man 11.4
city 10.8
catholic 10.7
closeup 10.1
carving 10.1
people 10
traditional 10
outdoor 9.9
landmark 9.9
tourism 9.9
sibling 9.9
carved 9.8
temple 9.7
church 9.3
portrait 9.1
world 8.9
symbol 8.8
sepia 8.7
decorative 8.4
sketch 8.1
body 8
love 7.9
male 7.9
spiritual 7.7
human 7.5
column 7.3
mother 7.1

Google
created on 2019-11-07

Microsoft
created on 2019-11-07

text 93.7
posing 92.6
person 88.2
clothing 83.9
old 77.7
painting 77.3
gallery 74.8
drawing 70.8
white 68.8
group 58.8
woman 58.8
human face 57.9
art 57.2
vintage 55.1
room 49.6
family 19.6
picture frame 13.9

Face analysis

Amazon

AWS Rekognition

Age 11-21
Gender Female, 54.9%
Confused 45.2%
Disgusted 45.3%
Angry 45.3%
Happy 46.5%
Surprised 45.5%
Sad 45%
Calm 52.1%
Fear 45.1%

AWS Rekognition

Age 19-31
Gender Male, 54.9%
Sad 45%
Disgusted 45%
Fear 45%
Happy 45%
Surprised 45%
Calm 54.9%
Confused 45%
Angry 45%

AWS Rekognition

Age 33-49
Gender Male, 54.6%
Disgusted 45.5%
Fear 45.2%
Surprised 47.2%
Happy 45.1%
Calm 50.4%
Angry 46.1%
Sad 45.2%
Confused 45.4%

AWS Rekognition

Age 5-15
Gender Female, 54.4%
Angry 45.2%
Happy 45%
Calm 54.7%
Sad 45%
Disgusted 45%
Confused 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 23-35
Gender Male, 52%
Angry 46.1%
Sad 45.2%
Happy 45.1%
Fear 45.2%
Disgusted 45.2%
Calm 50%
Surprised 48.1%
Confused 45.2%

AWS Rekognition

Age 13-23
Gender Male, 51.7%
Sad 45.1%
Disgusted 45%
Angry 45.4%
Happy 45.2%
Surprised 45%
Fear 45%
Confused 45%
Calm 54.2%

AWS Rekognition

Age 15-27
Gender Female, 54.8%
Surprised 45.5%
Sad 45%
Calm 48.5%
Disgusted 45%
Fear 45%
Angry 45%
Happy 50.9%
Confused 45%

AWS Rekognition

Age 39-57
Gender Male, 54.9%
Fear 45%
Confused 45%
Happy 45%
Calm 54.4%
Sad 45.2%
Angry 45.2%
Surprised 45%
Disgusted 45.1%

AWS Rekognition

Age 47-65
Gender Male, 54.7%
Fear 45%
Happy 45.1%
Calm 53.7%
Surprised 45.1%
Angry 45.3%
Disgusted 45.6%
Confused 45.1%
Sad 45.1%

Feature analysis

Amazon

Person 99.6%
Painting 97%

Text analysis

Amazon

RACHL
CUM
PATRIS
RCCE RACHL YENIEBAT CUM 0YI33VS PATRIS
RCCE
YENIEBAT
0YI33VS

Google

BIT TERRAM ORIENTALEM..ET EC RACHEL ENIRBAT CUM OY13173 PATR1S S
BIT
TERRAM
ORIENTALEM..ET
EC
RACHEL
ENIRBAT
CUM
OY13173
PATR1S
S