Human Generated Data

Title

Last Supper

Date

18th-19th century

People

Artist: Raphael Morghen, Italian, 1758 - 1833

Artist after: Leonardo da Vinci, Italian, 1452 - 1519

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2751

Machine Generated Data

Tags

Amazon
created on 2019-11-03

Person 99.3
Human 99.3
Person 99.2
Person 99.2
Person 99.1
Person 99.1
Person 98.7
Person 98.6
Person 98.4
Person 95.8
Person 92.3
Person 91.7
People 88.4
Art 79.4
Painting 79.4
Family 75.8
Person 68.9
Porch 66.4
Clinic 62.7
Hospital 57.4
Person 55.2
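
The Amazon list above repeats the same label at several confidences (one entry per detected instance). As a minimal sketch, not part of the original record, the pairs can be collapsed to one maximum score per label and filtered by a confidence threshold; the 75% cutoff below is an arbitrary choice for illustration.

```python
# Collapse the Amazon Rekognition labels listed above to one score per
# label, keeping the maximum, then drop low-confidence labels.
# The (label, score) pairs are copied verbatim from the tag list.
from collections import defaultdict

amazon_tags = [
    ("Person", 99.3), ("Human", 99.3), ("Person", 99.2), ("Person", 99.2),
    ("Person", 99.1), ("Person", 99.1), ("Person", 98.7), ("Person", 98.6),
    ("Person", 98.4), ("Person", 95.8), ("Person", 92.3), ("Person", 91.7),
    ("People", 88.4), ("Art", 79.4), ("Painting", 79.4), ("Family", 75.8),
    ("Person", 68.9), ("Porch", 66.4), ("Clinic", 62.7), ("Hospital", 57.4),
    ("Person", 55.2),
]

best = defaultdict(float)
for label, score in amazon_tags:
    best[label] = max(best[label], score)

# Keep only labels the service scored at 75% confidence or above
# (threshold chosen for illustration only).
confident = {label: score for label, score in best.items() if score >= 75.0}
print(confident)
```

This filtering discards the implausible low-confidence labels (Porch, Clinic, Hospital) while keeping the labels consistent with the subject of the print.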

Clarifai
created on 2019-11-03

people 100
group 99.7
adult 98.4
many 98.1
group together 97.9
child 95.5
man 93.5
administration 92.7
woman 92.2
furniture 92.1
military 90.4
several 89.4
leader 89.1
war 87.5
wear 85.4
five 85.1
soldier 84.2
sit 82.3
outfit 81.6
seat 77.9

Imagga
created on 2019-11-03

cadaver 24.9
statue 21
old 20.9
sculpture 20.7
stretcher 17.4
architecture 16.4
people 14.5
litter 14.1
art 14
person 12.9
travel 12.7
religion 12.5
room 12.5
man 12.1
building 12.1
conveyance 11.8
history 11.6
money 11
portrait 11
vintage 10.7
male 10.6
house 10
one 9.7
home 9.6
ancient 9.5
culture 9.4
stone 9.3
cash 9.1
close 9.1
retro 9
wealth 9
franklin 8.8
adult 8.5
two 8.5
religious 8.4
city 8.3
fashion 8.3
tourism 8.2
outdoors 8.2
style 8.2
sofa 8.1
lifestyle 7.9
paper 7.8
horizontal 7.5
monument 7.5
savings 7.4
dollar 7.4
closeup 7.4
church 7.4
banking 7.3
alone 7.3
bank 7.3
hair 7.1
face 7.1
antique 7.1

Google
created on 2019-11-03

Photograph 95.4
Stock photography 67.6
History 64.5
Photography 62.4
Table 58.6
Furniture 53.3
Art 50.2

Microsoft
created on 2019-11-03

wall 96.2
text 95.2
person 94.6
indoor 91.7
clothing 85.2
furniture 50.4
woman 50.3
old 48.1
several 15.2

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 54.6%
Calm 52.6%
Surprised 45.1%
Happy 45.1%
Sad 45.5%
Angry 46.2%
Fear 45.1%
Disgusted 45.2%
Confused 45.1%

AWS Rekognition

Age 29-45
Gender Male, 50.3%
Sad 49.7%
Happy 49.5%
Disgusted 49.5%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
Calm 50.2%

AWS Rekognition

Age 12-22
Gender Male, 50.4%
Fear 49.5%
Happy 49.5%
Sad 49.6%
Calm 50.3%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%

AWS Rekognition

Age 37-55
Gender Male, 53.9%
Calm 46.9%
Happy 45.1%
Surprised 45.1%
Fear 45.1%
Sad 50.6%
Disgusted 45.4%
Confused 45.4%
Angry 46.5%

AWS Rekognition

Age 41-59
Gender Male, 53.5%
Happy 45%
Surprised 45%
Fear 45%
Disgusted 45%
Angry 55%
Confused 45%
Sad 45%
Calm 45%

AWS Rekognition

Age 28-44
Gender Female, 53.5%
Sad 49.7%
Fear 45.1%
Happy 45.1%
Surprised 45.1%
Calm 49.1%
Disgusted 45.2%
Confused 45.1%
Angry 45.6%

AWS Rekognition

Age 30-46
Gender Male, 50.5%
Angry 50.3%
Confused 49.5%
Calm 49.6%
Happy 49.5%
Surprised 49.5%
Fear 49.5%
Disgusted 49.5%
Sad 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.3%
Confused 49.5%
Happy 49.5%
Sad 50.2%
Calm 49.8%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Fear 49.5%

AWS Rekognition

Age 22-34
Gender Female, 54.2%
Happy 45%
Angry 45%
Fear 45%
Calm 54.5%
Disgusted 45%
Surprised 45%
Sad 45.3%
Confused 45.1%

AWS Rekognition

Age 26-42
Gender Female, 50.2%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.6%
Fear 49.9%
Calm 49.5%
Angry 49.7%
Sad 49.8%

AWS Rekognition

Age 22-34
Gender Female, 50.4%
Angry 49.7%
Surprised 49.5%
Calm 49.9%
Disgusted 49.6%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Sad 49.7%

AWS Rekognition

Age 23-35
Gender Male, 53.8%
Surprised 45.4%
Angry 45.6%
Calm 49.9%
Sad 47.6%
Happy 45%
Disgusted 45.1%
Confused 45.2%
Fear 46.2%

AWS Rekognition

Age 33-49
Gender Male, 53.8%
Surprised 45.1%
Sad 49.9%
Calm 47.3%
Fear 45.2%
Angry 47.3%
Happy 45%
Confused 45.2%
Disgusted 45.1%
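
Each Rekognition face record above reports a score for every emotion; the near-uniform values (most clustered around 45-50%) suggest low certainty for the individual emotions. As a sketch, not part of the original record, each face can be reduced to its single highest-scoring emotion:

```python
# Reduce one face's emotion scores to a dominant label.
# Figures copied from the first face block above (Age 23-35).
face_emotions = {
    "Calm": 52.6, "Surprised": 45.1, "Happy": 45.1, "Sad": 45.5,
    "Angry": 46.2, "Fear": 45.1, "Disgusted": 45.2, "Confused": 45.1,
}

dominant = max(face_emotions, key=face_emotions.get)
print(dominant)  # → Calm
```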

Feature analysis

Amazon

Person 99.3%
Painting 79.4%

Captions

Microsoft
created on 2019-11-03

an old photo of a person 62.6%
a group of people in a room 62.5%
old photo of a person 58.6%

Text analysis

Amazon

DICO
VESTRUM
AMEN
TRADITURUS
UNUS
AMEN DICO VOBI QUIA UNUS VESTRUM ME TRADITURUS ES'T..
QUIA
VOBI
ME
okntinarmite
Lufirim
111
Himre
okntinarmite 111 Hage Himre Dner
ES'T..
Dner
Hage

Google

VESTRUM ME TRADITURUS EST. AMEN DICO VOBIS QUIA UNUS Cachulimande ce Map Helomoau Due
VESTRUM
ME
TRADITURUS
EST.
AMEN
DICO
VOBIS
QUIA
UNUS
Cachulimande
ce
Map
Helomoau
Due
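
Both OCR services recover the engraved Latin caption but garble the script lines beneath it. Reordered, the caption reads "Amen dico vobis quia unus vestrum me traditurus est" — "Truly I say to you that one of you will betray me" (Matthew 26:21). As a sketch, not part of the original record, the Google word detections can be split into caption words and residue by keeping only all-uppercase tokens:

```python
# Separate the engraved Latin caption from garbled script-line residue
# in the Google word-level detections listed above, by keeping tokens
# that are entirely uppercase (ignoring trailing punctuation).
words = ["VESTRUM", "ME", "TRADITURUS", "EST.", "AMEN", "DICO", "VOBIS",
         "QUIA", "UNUS", "Cachulimande", "ce", "Map", "Helomoau", "Due"]

caption = [w for w in words if w.rstrip(".").isupper()]
print(" ".join(caption))  # → VESTRUM ME TRADITURUS EST. AMEN DICO VOBIS QUIA UNUS
```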