Human Generated Data

Title

Presentation in the Temple, after Louis de Boulogne

Date

18th century

People

Artist: Jan Joost van Cossiau, German, 1660 - 1732

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, 1898.577

Machine Generated Data

Tags

Amazon
created on 2020-04-29

Painting 98.6
Art 98.6
Person 92.5
Human 92.5
Person 89.8
Person 79.3
Person 74.7
Person 69.8
Person 66.6

Clarifai
created on 2020-04-29

art 99.2
painting 98.9
people 98.4
woman 96.7
religion 96.2
adult 95
wear 93.1
man 92.7
Renaissance 92.4
veil 92.2
furniture 92.1
portrait 91
desktop 90.8
family 90.8
illustration 90.7
retro 89.9
picture frame 89.7
old 89.6
vintage 89.5
no person 89.5

Imagga
created on 2020-04-29

tray 34.7
carving 34
sculpture 29.4
art 26.1
container 24.5
receptacle 24.2
food 23.9
plastic art 20.6
old 20.2
brown 19.9
paper 18
texture 16.7
vintage 16.5
antique 16.3
decoration 15.2
grunge 14.5
crab 14.4
meal 13.9
gold 13.1
retro 13.1
ancient 12.1
culture 12
design 11.8
fresh 11.8
traditional 11.6
crustacean 11.6
delicious 11.6
ingredient 11.4
page 11.1
gourmet 11
organic 10.9
tasty 10.9
healthy 10.7
cuisine 10.6
yellow 10.6
color 10.6
cooking 10.5
seafood 10.4
object 10.3
frame 10.1
closeup 10.1
fish 10.1
style 9.6
pattern 9.6
restaurant 9.5
blank 9.4
snack 9.4
holiday 9.3
dry 9.3
close 9.1
market 8.9
parchment 8.6
leaf 8.6
china 8.5
eat 8.4
ribbon 8.3
herb 8.2
aged 8.1
border 8.1
figure 8.1
space 7.8
dinner 7.7
meat 7.3
diet 7.3
dirty 7.2
sweet 7.1
breakfast 7.1

Google
created on 2020-04-29

Microsoft
created on 2020-04-29

gallery 98
scene 97.4
room 97.1
cat 95.2
indoor 94.3
art 94.1
museum 90.8
drawing 87
person 77.7
picture frame 62.1
text 60.8
painting 19.7

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Female, 50.4%
Confused 49.5%
Disgusted 49.5%
Angry 49.8%
Sad 49.7%
Calm 49.5%
Fear 50%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 14-26
Gender Male, 50.3%
Calm 50.4%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 49.5%
Surprised 49.5%
Happy 49.6%
Angry 49.5%

AWS Rekognition

Age 36-54
Gender Male, 50.3%
Disgusted 49.5%
Sad 50.4%
Confused 49.5%
Calm 49.5%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Happy 49.5%

AWS Rekognition

Age 8-18
Gender Female, 50.3%
Fear 49.5%
Sad 49.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Calm 50.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 50-68
Gender Male, 50.5%
Fear 49.5%
Calm 49.9%
Sad 50.1%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 1-7
Gender Male, 50.5%
Angry 49.7%
Fear 49.5%
Disgusted 49.8%
Surprised 49.5%
Happy 49.5%
Calm 49.7%
Sad 49.8%
Confused 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50.4%
Calm 50.3%
Angry 49.5%
Disgusted 49.5%
Confused 49.5%
Sad 49.7%
Fear 49.5%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 12-22
Gender Female, 50%
Fear 49.5%
Angry 49.5%
Confused 49.5%
Sad 50.4%
Surprised 49.5%
Calm 49.5%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Surprised 49.5%
Angry 50.2%
Happy 49.5%
Sad 49.7%
Fear 49.6%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%

AWS Rekognition

Age 41-59
Gender Female, 50.1%
Fear 49.5%
Sad 50.4%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Calm 49.6%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.3%
Fear 49.5%
Happy 49.5%
Sad 49.6%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Calm 50.3%
Disgusted 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50.2%
Fear 49.5%
Angry 50.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.5%

Feature analysis

Amazon

Painting 98.6%
Person 92.5%

Captions

Microsoft

a painting of a cat 52.1%
a painting of a cat in a room 52%
a painting of a person 51.9%