Human Generated Data

Title

Historical Scene

Date

17th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Marian H. Phinney Fund, 1978.472

Machine Generated Data

Tags

Amazon
created on 2020-05-01

Art 98.2%
Human 94.8%
Person 94.8%
Person 94.6%
Painting 93%
Person 89.6%
Person 80.6%
Person 76.6%
Drawing 75.2%
Animal 74.8%
Horse 74.8%
Mammal 74.8%
Horse 67.9%
Horse 67.7%
Person 58.4%
Person 43.6%

Clarifai
created on 2020-05-01

people 99.8%
group 99.4%
illustration 99.2%
print 99%
art 98.9%
adult 98%
cavalry 97.8%
engraving 96.3%
wear 95.7%
man 95.4%
painting 95%
war 90.4%
many 89.3%
vehicle 86.9%
vintage 85.6%
mammal 85.3%
military 85.3%
soldier 85%
skirmish 83.8%
administration 83.4%

Imagga
created on 2020-05-01

sketch 87.1%
drawing 69%
representation 54.3%
billboard 38.4%
signboard 32.1%
structure 31.3%
landscape 24.5%
snow 22.6%
old 21.6%
grunge 21.3%
vintage 20.7%
tree 19.2%
winter 17.9%
sky 17.9%
cold 17.2%
outdoors 17.2%
trees 16.9%
scenery 16.2%
travel 16.2%
forest 15.7%
antique 14.7%
black 14.4%
park 14.2%
texture 13.9%
ancient 13.8%
river 13.3%
mountain 13.3%
aged 12.7%
rural 12.3%
mountains 12%
paper 11.8%
history 11.6%
tourism 11.5%
outdoor 11.5%
scenic 11.4%
scene 11.3%
season 10.9%
national 10.9%
city 10.8%
snowy 10.7%
environment 10.7%
art 10.5%
grungy 10.4%
architecture 10.3%
wall 10.1%
landmark 9.9%
retro 9.8%
freeze 9.7%
woods 9.6%
old fashioned 9.5%
water 9.3%
building 9.1%
seasonal 8.8%
nobody 8.6%
ice 8.5%
wilderness 8.5%
summer 8.4%
frame 8.3%
exterior 8.3%
dirty 8.1%
brown 8.1%
urban 7.9%
fog 7.7%
outside 7.7%
culture 7.7%
frost 7.7%
clouds 7.6%
historical 7.5%
wood 7.5%
grain 7.4%
natural 7.4%
countryside 7.3%
business 7.3%
new 7.3%
paint 7.2%
country 7%

Google
created on 2020-05-01

Microsoft
created on 2020-05-01

drawing 99.8%
text 99.4%
sketch 99.4%
painting 94.3%
gallery 91.5%
art 88.6%
scene 80.3%
cartoon 80.3%
room 75.6%
person 65%
child art 52.2%
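Each tagging service above emits flat lines pairing a label with a confidence score. A minimal Python sketch of how such lines could be parsed into structured pairs and filtered by a confidence threshold; `parse_tags` and `confident_tags` are illustrative helpers, not part of any service's API, and the sample lines are taken from the Amazon block above.

```python
# Sample tag lines in the "label confidence" format used by the
# tagging services above (values from the Amazon block).
raw_tags = """\
Art 98.2%
Human 94.8%
Person 94.8%
Painting 93%
Drawing 75.2%
Horse 74.8%"""

def parse_tags(block: str) -> list[tuple[str, float]]:
    """Split each line into (label, confidence). The score is the final
    whitespace-separated token, so multi-word labels (e.g. "old
    fashioned 9.5") survive intact; a trailing "%" is tolerated."""
    pairs = []
    for line in block.splitlines():
        label, score = line.rsplit(" ", 1)
        pairs.append((label, float(score.rstrip("%"))))
    return pairs

def confident_tags(pairs: list[tuple[str, float]],
                   threshold: float = 90.0) -> list[str]:
    """Keep only labels at or above the given confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

tags = parse_tags(raw_tags)
print(confident_tags(tags))  # labels with >= 90% confidence
```

Splitting from the right (`rsplit`) rather than the left is what lets labels containing spaces, such as Imagga's "old fashioned", parse correctly.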

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Male, 50.4%
Sad 50.1%
Calm 49.5%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Confused 49.5%
Fear 49.8%

AWS Rekognition

Age 1-7
Gender Male, 50%
Disgusted 49.5%
Calm 49.6%
Surprised 49.5%
Fear 49.5%
Sad 49.9%
Angry 49.9%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 17-29
Gender Male, 50.3%
Disgusted 49.6%
Angry 49.6%
Happy 49.5%
Confused 49.9%
Calm 49.5%
Surprised 49.5%
Fear 49.6%
Sad 49.7%

AWS Rekognition

Age 54-72
Gender Male, 50.3%
Surprised 49.5%
Confused 49.5%
Disgusted 50.2%
Calm 49.5%
Angry 49.5%
Happy 49.5%
Fear 49.6%
Sad 49.6%

AWS Rekognition

Age 43-61
Gender Male, 50.4%
Happy 49.5%
Angry 49.5%
Surprised 49.7%
Confused 49.5%
Calm 50.3%
Sad 49.5%
Disgusted 49.5%
Fear 49.5%
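In each face block above, the eight emotion scores are nearly uniform (all close to 49.5%), and the predicted emotion is simply the highest-scoring one. A minimal sketch of that selection, using the scores from the last face above; `dominant_emotion` is an illustrative helper, not a Rekognition API call.

```python
# Emotion confidence scores for one detected face, copied from the
# final AWS Rekognition block above (note how close together they are).
emotions = {
    "Happy": 49.5, "Angry": 49.5, "Surprised": 49.7, "Confused": 49.5,
    "Calm": 50.3, "Sad": 49.5, "Disgusted": 49.5, "Fear": 49.5,
}

def dominant_emotion(scores: dict[str, float]) -> str:
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # prints "Calm"
```

Because the scores differ by well under a percentage point, the "dominant" emotion here carries little signal; a consumer of such data might prefer to report no emotion when the spread is this small.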

Feature analysis

Amazon

Person 94.8%
Painting 93%
Horse 74.8%

Captions

Text analysis

Amazon

MUSEUM.
FOGG
1978.47
ART
FOGG ART MUSEUM. HA
German
HA

Google

FOGG ART MUSEUM, HA 1978.47 ACCESSION an. German
FOGG
ART
MUSEUM,
HA
1978.47
ACCESSION
an.
German