Human Generated Data

Title

Imaginary View of Florence

Date

late 16th century

People

Artist: Jan van der Straet, Netherlandish, 1523 - 1605

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Dan Paul, 2012.192

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Painting 99.7
Art 99.7
Person 99.2
Human 99.2
Person 98.3
Person 97.5
Person 95.3
Person 93.8
Person 93.6
Person 87.4
Person 83.9
Person 77.2
Person 73.4
Person 72.8
Person 68.7
Person 67.6
Person 62.9
Person 55.3
Person 45.2
Person 45.1
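
The Amazon tags above are consistent with output from AWS Rekognition label detection, which returns a name and a 0-100 confidence score per label (and, for labels such as "Person", per-instance detections). A minimal sketch using boto3 is shown below; the image file name and the confidence threshold are placeholders, not values taken from this record.

```python
# Minimal sketch: retrieving image labels with AWS Rekognition via boto3.
# The image file name and MinConfidence threshold are placeholders/assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("imaginary_view_of_florence.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # low threshold so weaker "Person" detections also appear
)

# Each label carries a name and a confidence score (0-100); labels such as
# "Person" may also include per-instance bounding boxes in label["Instances"].
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```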

Clarifai
created on 2018-03-16

people 99.5
painting 98.6
art 98.5
group 98.1
architecture 97.1
travel 96.3
adult 95.2
no person 94.9
building 94.4
religion 94.2
town 93.6
Renaissance 91.9
man 88.9
woman 88.8
daylight 87.1
street 86.5
church 85.7
city 85.6
home 83.8
many 83
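
The Clarifai concepts above resemble output from Clarifai's general prediction model. Below is a hedged sketch of one way such concepts could be requested over Clarifai's v2 REST predict endpoint; the API key, model identifier, and image URL are placeholders, and the exact endpoint path and response shape are assumptions to verify against Clarifai's documentation.

```python
# Hedged sketch: requesting concepts from Clarifai's v2 predict endpoint.
# API_KEY, GENERAL_MODEL_ID, and IMAGE_URL are placeholders; the endpoint and
# payload/response shape are assumptions to check against Clarifai's docs.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
GENERAL_MODEL_ID = "GENERAL_MODEL_ID"      # placeholder for the public general model
IMAGE_URL = "https://example.org/imaginary_view_of_florence.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```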

Imagga
created on 2018-03-16

cathedral 76.3
architecture 64.2
building 52.4
church 44.5
window 38.4
city 38.3
old 33.5
religion 33.2
tourism 31.4
travel 29.6
landmark 25.3
house 24.1
historic 23.9
ancient 23.4
tower 23.3
history 21.5
roof 21.4
stone 21.1
urban 21
structure 20.1
framework 18.8
famous 18.6
town 18.6
tourist 18.3
medieval 18.3
historical 17.9
university 17.8
monument 17.8
vault 17.2
sky 16
exterior 15.7
street 15.7
buildings 15.1
palace 14.8
brick 14.8
supporting structure 14.2
religious 14.1
culture 13.7
facade 13.6
wall 13.4
temple 13
arch 12.8
residence 12.5
england 12.4
windows 11.5
detail 11.3
scene 11.3
protective covering 11.2
catholic 10.9
night 10.7
attraction 10.5
door 10.5
place 10.3
river 9.8
boat 9.8
built 9.7
gondola 9.5
construction 9.4
water 9.4
st 8.7
god 8.6
castle 7.8
cities 7.8
monastery 7.7
capital 7.6
cityscape 7.6
covering 7.6
canal 7.6
outdoors 7.5
vacation 7.4
art 7.3
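
The Imagga tags above look like output from Imagga's auto-tagging API. Below is a hedged sketch against what is assumed to be the v2 /tags endpoint; the credentials and image URL are placeholders, and the endpoint path and response fields should be confirmed against Imagga's documentation.

```python
# Hedged sketch: auto-tagging an image with Imagga's v2 /tags endpoint.
# API key/secret and IMAGE_URL are placeholders; the endpoint path and the
# response fields ("result" -> "tags" -> "tag"/"confidence") are assumptions.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/imaginary_view_of_florence.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```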

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

building 99.9
outdoor 92.7
old 51.1
way 47.6
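
The Microsoft tags above are consistent with the Azure Computer Vision tagging operation. A hedged sketch against the REST "tag" route is shown below; the resource endpoint, key, API version in the path, and image URL are placeholders, and the exact route and response shape should be confirmed against Azure documentation.

```python
# Hedged sketch: image tagging with Azure Computer Vision's REST "tag" operation.
# ENDPOINT, KEY, the API version in the path, and IMAGE_URL are placeholders;
# confirm the exact route and response shape against Azure documentation.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"    # placeholder
KEY = "YOUR_AZURE_CV_KEY"                                         # placeholder
IMAGE_URL = "https://example.org/imaginary_view_of_florence.jpg"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Tags come back with a name and a 0-1 confidence value.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```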

Face analysis

Amazon

AWS Rekognition

Age 35-55
Gender Male, 53.8%
Disgusted 45.1%
Angry 45.3%
Surprised 45.1%
Sad 53.5%
Confused 45.1%
Calm 45.7%
Happy 45.2%

AWS Rekognition

Age 35-52
Gender Male, 51.9%
Disgusted 45.4%
Surprised 45.7%
Sad 45.6%
Confused 45.2%
Happy 45.5%
Angry 45.2%
Calm 52.3%

AWS Rekognition

Age 17-27
Gender Male, 51.7%
Angry 46.3%
Happy 45.5%
Calm 48.3%
Disgusted 46.1%
Surprised 45.5%
Sad 48%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Male, 53.8%
Angry 45.3%
Disgusted 45.4%
Confused 45.4%
Surprised 45.3%
Happy 45.4%
Calm 46.8%
Sad 51.4%

AWS Rekognition

Age 23-38
Gender Male, 54.6%
Happy 45.1%
Confused 45.2%
Angry 45.5%
Calm 52.1%
Disgusted 45.1%
Surprised 45.3%
Sad 46.7%

AWS Rekognition

Age 45-63
Gender Female, 50.4%
Disgusted 49.5%
Sad 50%
Confused 49.5%
Calm 49.7%
Surprised 49.7%
Angry 49.6%
Happy 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Calm 49.8%
Angry 49.5%
Surprised 49.5%
Sad 50%
Happy 49.6%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 35-55
Gender Female, 50.2%
Surprised 49.7%
Happy 49.5%
Sad 49.7%
Calm 50%
Confused 49.5%
Angry 49.6%
Disgusted 49.5%

AWS Rekognition

Age 38-59
Gender Female, 50%
Calm 50%
Sad 49.9%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 38-59
Gender Female, 50.2%
Happy 46.8%
Calm 47.6%
Sad 48.5%
Confused 45.2%
Disgusted 46%
Surprised 45.5%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Sad 50.2%
Disgusted 49.5%
Calm 49.7%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
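
The face entries above match the fields returned by AWS Rekognition face detection: an estimated age range, a gender value with confidence, and a confidence score per emotion. A minimal boto3 sketch for reading those fields is shown below; the image file name is a placeholder.

```python
# Minimal sketch: reading age range, gender, and emotions from AWS Rekognition
# face detection via boto3. The image file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("imaginary_view_of_florence.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```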

Feature analysis

Amazon

Painting 99.7%
Person 99.2%