Human Generated Data

Title

Interior with Dancing Couple

Date

17th century

People

Artist: Jonas Suyderhoef, Dutch c. 1613 - 1686

Artist after: Adriaen van Ostade, Dutch 1610 - 1685

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Melvin R. Seiden, S5.14

Machine Generated Data

Tags

Amazon
created on 2019-04-03

Art 99.3
Painting 99.3
Human 98.5
Person 98.5
Person 98.5
Person 95.8
Person 93.8
Person 91.8
Person 90.3
Person 89.3
Person 83.8
Person 73.7
Person 71.9

Clarifai
created on 2018-03-24

people 100
group 99.9
many 99.9
print 99.4
adult 99.3
child 98.6
military 97.6
engraving 97.6
man 97.4
art 96.2
soldier 95.8
furniture 95.6
sit 95.3
room 95
war 94.5
administration 93.7
group together 93.6
several 92
wear 91.3
home 89.1

Imagga
created on 2018-03-24

book jacket 35.8
old 29.3
jacket 27.9
grunge 27.2
vintage 25.6
antique 24
texture 22.9
architecture 22.4
covering 22.1
art 21.5
wrapping 21.2
aged 19
building 18.4
city 17.5
drawing 16.9
ancient 16.4
retro 15.6
pattern 15
sketch 14.2
design 14.1
decoration 13.7
skyline 13.3
structure 12.7
dirty 12.6
brown 12.5
screen 12.4
material 11.6
wallpaper 11.5
travel 11.3
paper 11
wall 10.8
landmark 10.8
black 10.8
dark 10
tower 9.8
history 9.8
graffito 9.7
fire screen 9.6
stone 9.6
urban 9.6
artistic 9.6
grungy 9.5
graphic 9.5
protective covering 9.3
temple 9.3
style 8.9
cityscape 8.5
business 8.5
modern 8.4
sky 8.3
letter 8.3
border 8.1
skyscraper 8.1
new 8.1
financial 8
jigsaw puzzle 7.8
empty 7.7
decay 7.7
mystery 7.7
house 7.6
damaged 7.6
finance 7.6
canvas 7.6
buildings 7.6
frame 7.5
cemetery 7.5
landscape 7.4
cell 7.4
natural 7.4
backgrounds 7.3
rough 7.3
paint 7.2

Google
created on 2018-03-24

Microsoft
created on 2018-03-24

text 91.7
old 86.9
vintage 37.8

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 38-59
Gender Male, 54.7%
Calm 46.5%
Disgusted 45.1%
Surprised 45.2%
Confused 45.1%
Sad 45.3%
Angry 45.2%
Happy 52.7%

AWS Rekognition

Age 48-68
Gender Male, 54.5%
Confused 45.9%
Surprised 45.8%
Calm 49.7%
Angry 46.8%
Happy 45.5%
Disgusted 45.6%
Sad 45.7%

AWS Rekognition

Age 38-57
Gender Male, 54.4%
Confused 45.2%
Angry 46.5%
Sad 52.2%
Calm 45.1%
Surprised 45.2%
Disgusted 45.5%
Happy 45.3%

AWS Rekognition

Age 26-44
Gender Male, 54.4%
Calm 45.6%
Angry 45.6%
Confused 45.3%
Sad 48.1%
Surprised 45.5%
Happy 49.4%
Disgusted 45.4%

AWS Rekognition

Age 27-44
Gender Male, 51%
Surprised 45.1%
Disgusted 45.1%
Calm 45.4%
Sad 52.6%
Happy 46.4%
Confused 45.2%
Angry 45.3%

AWS Rekognition

Age 60-90
Gender Male, 52.8%
Calm 45.5%
Angry 45.7%
Disgusted 45.3%
Sad 46.8%
Confused 45.6%
Surprised 45.6%
Happy 50.4%

AWS Rekognition

Age 20-38
Gender Male, 54.3%
Calm 48.7%
Happy 45.6%
Sad 47.6%
Disgusted 45.3%
Angry 46.3%
Confused 45.6%
Surprised 45.8%

AWS Rekognition

Age 45-63
Gender Male, 53.8%
Disgusted 45.1%
Angry 45.4%
Confused 45%
Sad 45.1%
Happy 45.2%
Calm 54.1%
Surprised 45%

Microsoft Cognitive Services

Age 62
Gender Male

Feature analysis

Amazon

Painting 99.3%
Person 98.5%

Text analysis

Amazon

aSrlunory