Human Generated Data

Title

View of a Town with an Obelisk

Date

19th century

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Frances L. Hofer, 1979.8

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 94.9
Human 94.9
Art 93.4
Painting 89.9
Person 81.6
Person 74.4
Person 50.7
Person 48.9
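
The Amazon labels above are the kind of output returned by AWS Rekognition's label-detection operation, with confidence scores in percent; the repeated "Person" entries correspond to multiple detected instances of the same label. A minimal sketch of how such tags might be retrieved with boto3 is shown below (the image filename and thresholds are placeholders, not part of this record):

# Minimal sketch: object/scene labels for a local image via AWS Rekognition.
# Assumes AWS credentials are configured; "obelisk_town.jpg" is a placeholder filename.
import boto3

def detect_labels(image_path: str, min_confidence: float = 45.0) -> list[tuple[str, float]]:
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence score in percent,
    # matching entries such as "Person 94.9" and "Painting 89.9" above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("obelisk_town.jpg"):
        print(f"{name} {confidence:.1f}")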

Clarifai
created on 2020-04-24

people 99.6
print 96.9
art 96.4
monochrome 96.4
home 95.7
no person 94.4
adult 94.2
group 94
military 93.6
war 93.2
street 92.4
man 91.7
soldier 87.6
tree 87.3
cemetery 87
infrared 85.6
house 85.5
cavalry 85.5
painting 84.5
two 83.9

Imagga
created on 2020-04-24

landscape 28.3
wall 28.2
tree 24.8
old 24.4
forest 21.8
fence 19.3
tunnel 19
snow 19
structure 18.7
stone 17.1
stocks 15.8
architecture 15.6
trees 15.1
building 14.7
sky 14.7
passageway 14.5
road 14.5
device 14.3
travel 14.1
park 14
ancient 13.8
instrument of punishment 13.7
grunge 13.6
barrier 13.1
passage 12.8
mountain 12.5
vintage 12.4
outdoor 12.2
season 11.7
natural 11.4
instrument 10.9
way 10.8
light 10.7
environment 10.7
scene 10.4
winter 10.2
river 9.8
ruins 9.7
outdoors 9.7
rural 9.7
country 9.7
stile 9.6
black 9.6
frost 9.6
rock 9.6
cold 9.5
construction 9.4
wood 9.2
weather 9.2
countryside 9.1
industrial 9.1
fall 9.1
dirty 9
scenery 9
retro 9
sun 8.9
grass 8.7
fog 8.7
antique 8.7
art 8.5
texture 8.3
brick 8.3
tourism 8.3
obstruction 8.1
autumn 7.9
scenic 7.9
urban 7.9
upright 7.8
ruin 7.8
wilderness 7.6
frame 7.5
city 7.5
window 7.5
mountains 7.4
vacation 7.4
street 7.4
water 7.3
lake 7.3
peaceful 7.3
branch 7.3
aged 7.2
morning 7.2
cell 7.2
bridge 7.2
history 7.2

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 100
painting 99.5
drawing 99.2
tree 97.9
sketch 96.5
old 89.6
house 89.2
black and white 77.7
child art 67.6
vintage 33

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 14-26
Gender Female, 50.2%
Happy 49.5%
Sad 49.5%
Fear 49.5%
Confused 49.5%
Calm 50.3%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
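
The age range, gender estimate, and per-emotion percentages above follow the shape of AWS Rekognition's face-detection response. A hedged sketch of retrieving them is below; the filename is a placeholder and this is not the museum's actual pipeline:

# Minimal sketch: face attributes (age range, gender, emotions) via AWS Rekognition.
# Placeholder image path; assumes configured AWS credentials.
import boto3

def analyze_faces(image_path: str) -> None:
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back as type/confidence pairs (Happy, Sad, Calm, ...),
        # which is the list reproduced in the record above.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("obelisk_town.jpg")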

Feature analysis

Amazon

Person 94.9%
Painting 89.9%

Categories

Captions

Microsoft
created on 2020-04-24

a vintage photo of a train 46.2%
an old photo of a train 44.1%
a vintage photo of a person 44%
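
The ranked captions above are typical of the Azure Computer Vision "describe" operation, which returns several candidate sentences with confidences. A hedged sketch against the REST endpoint follows; the endpoint, key, and filename are placeholders, not values from this record:

# Minimal sketch: image captions from the Azure Computer Vision "describe" operation.
# ENDPOINT, KEY, and the image path are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

def describe_image(image_path: str, max_candidates: int = 3) -> list[tuple[str, float]]:
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v3.2/describe",
            params={"maxCandidates": max_candidates},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    # Each caption has a confidence in [0, 1]; the record above shows percentages.
    return [(c["text"], c["confidence"]) for c in captions]

if __name__ == "__main__":
    for text, confidence in describe_image("obelisk_town.jpg"):
        print(f"{text} {confidence * 100:.1f}%")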

Text analysis

Amazon

FOGG
SARRY
UNIVERSITY
ART
FOGG ART MUSEUM
DONAHUE
HARV UNIVERSITY SARRY DONAHUE
MUSEUM
CREDIT
PLEASE CREDIT
PLEASE
/9798
HARV
BY
PHOTO BY
PHOTO
TODOUUWWUDLLEEDDCCODCEDODODOTOY
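
The strings above, including the garbled final run, are raw OCR detections of the photo-credit stamp on the mount. A minimal sketch of extracting such text with AWS Rekognition's text detection (placeholder filename; not the original workflow):

# Minimal sketch: detect printed text in an image via AWS Rekognition.
# Placeholder image path; assumes configured AWS credentials.
import boto3

def detect_text(image_path: str) -> list[str]:
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # LINE detections give whole phrases ("FOGG ART MUSEUM"); WORD detections
    # give individual tokens, which is why both appear in the list above.
    return [d["DetectedText"] for d in response["TextDetections"]]

if __name__ == "__main__":
    for text in detect_text("obelisk_town.jpg"):
        print(text)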

Google

FOGG
ART
BY
UNIVERSITY
BARRY
1979.8 FOGG ART MUSEUM PHOTO BY HARVARD UNIVERSITY BARRY DONAHUE PLEASE CREDIT
1979.8
MUSEUM
PHOTO
HARVARD
DONAHUE
PLEASE
CREDIT