Human Generated Data

Title

The Dam and Damrak, Amsterdam

Date

c. 1663

People

Artist: Jan van der Heyden, Dutch 1637 - 1712

Artist: Adriaen van de Velde, Dutch 1636 - 1672

Previous attribution: Gerrit Adriaensz Berckheyde, Dutch 1638 - 1698

Classification

Paintings

Machine Generated Data

Tags

Amazon

Painting 99.5
Art 99.5
Human 99.1
Person 99.1
Person 99
Person 96.4
Person 95.8
Person 94.8
Person 94.6
Person 92
Person 86.7
Road 82.2
Person 79.1
Person 78.3
Person 77.2
Person 74.1
Person 69.4
Mammal 66.3
Horse 66.3
Animal 66.3
Person 64.9
Cattle 63.5
Cow 63.5
Urban 61
Person 50.6
Person 43.1
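
The flat "label score" lines above pair each tag with a confidence percentage. A minimal sketch of how such lines could be parsed into structured pairs and filtered by a confidence threshold (the `parse_tags` helper and the threshold value are illustrative assumptions, not part of any vendor API):

```python
# Hypothetical helper: parse lines like "Painting 99.5" into
# (label, score) tuples and keep only high-confidence tags.
def parse_tags(lines, threshold=90.0):
    tags = []
    for line in lines:
        # The score is the last whitespace-separated token; the label
        # itself may contain spaces (e.g. "no person 96.8").
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return [(label, score) for label, score in tags if score >= threshold]

sample = ["Painting 99.5", "Art 99.5", "Horse 66.3"]
print(parse_tags(sample))  # → [('Painting', 99.5), ('Art', 99.5)]
```

With a threshold of 90, weak detections such as "Horse 66.3" and "Cow 63.5" above would be dropped while the confident "Painting" and "Person" tags survive.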

Clarifai

people 99.6
group 99.2
vehicle 98
many 97.3
no person 96.8
travel 95.8
architecture 95.5
home 95.1
daylight 95.1
outdoors 94.9
adult 94.3
city 94
building 93.9
military 93.6
religion 92.9
town 92.4
landscape 90.2
house 89.3
war 89
street 88

Imagga

palace 54.1
architecture 52.9
building 50.9
city 50.8
university 50.3
town 39
tower 34.1
old 33.5
travel 33.2
fortress 31.5
house 31.2
tourism 30.6
sky 28.6
landmark 28
buildings 26.5
urban 26.3
cathedral 25.1
castle 24.4
history 24.2
church 23.2
skyscraper 22
monument 21.5
residence 21.3
center 20.8
roof 20.5
structure 20.1
medieval 19.2
skyline 18.1
famous 17.7
historic 17.5
brick 17.1
capital 17.1
ancient 16.5
river 16
tourist 15.6
religion 15.3
stone 15.2
wall 14.7
cityscape 14.2
historical 13.2
square 12.7
landscape 12.7
traditional 12.5
panorama 12.4
bridge 12.3
fortification 12.1
construction 12
culture 12
houses 11.6
panoramic 11.5
dwelling 11.5
st 10.7
religious 10.3
destination 10.3
exterior 10.2
window 10.1
roofs 9.9
republic 9.8
sightseeing 9.8
day 9.4
water 9.4
vacation 9
high 8.7
downtown 8.7
tree 8.5
street 8.3
new 8.1
facade 8
towers 7.8
central 7.8
scene 7.8
aerial 7.8
england 7.6
place 7.5
sea 7
scenic 7

Google

sky 96.5
town 89.2
painting 86.9
evening 84.4
cloud 83.8
wall 80.1
morning 77
city 75.4
impressionist 67.9
visual arts 67.3
cityscape 67.2
tourist attraction 66.5
history 62.9
dusk 60
sunset 58.2
horizon 53.3

Microsoft

outdoor 94.4
old 43.2
day 14.1
crowd 0.6

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 50.5%
Sad 49.9%
Angry 49.6%
Happy 49.5%
Calm 49.6%
Surprised 49.6%
Disgusted 49.7%
Confused 49.6%

AWS Rekognition

Age 10-15
Gender Female, 50.2%
Happy 49.6%
Calm 49.6%
Surprised 49.5%
Angry 49.7%
Confused 49.5%
Disgusted 49.5%
Sad 50%

AWS Rekognition

Age 17-27
Gender Female, 50.1%
Angry 49.6%
Sad 50.1%
Calm 49.6%
Confused 49.5%
Surprised 49.5%
Happy 49.5%
Disgusted 49.7%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Confused 49.5%
Sad 49.8%
Calm 49.6%
Happy 49.6%
Surprised 49.5%
Angry 49.9%
Disgusted 49.6%

AWS Rekognition

Age 15-25
Gender Male, 50.5%
Sad 49.7%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 49.9%
Happy 49.7%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50%
Sad 50.2%
Confused 49.5%
Happy 49.6%
Disgusted 49.5%
Angry 49.6%
Surprised 49.5%
Calm 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Sad 49.7%
Angry 49.6%
Disgusted 49.6%
Surprised 49.6%
Calm 49.7%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Confused 49.6%
Angry 49.6%
Surprised 49.5%
Calm 49.7%
Happy 49.6%
Disgusted 49.6%
Sad 50%

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Disgusted 49.5%
Confused 49.6%
Angry 49.5%
Surprised 49.5%
Happy 49.7%
Sad 50%
Calm 49.6%
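
Each face block above lists eight emotion scores that cluster tightly around 49.5-50.2%, so the per-face emotion signal is weak. A minimal sketch of picking the marginally dominant emotion per face (the `dominant_emotion` helper is an illustrative assumption, not a Rekognition API call):

```python
# Hypothetical helper: given one face's emotion-score dictionary,
# return the emotion with the highest confidence.
def dominant_emotion(scores):
    return max(scores, key=scores.get)

# Scores from the first face block above (Age 48-68).
face = {"Sad": 49.9, "Angry": 49.6, "Happy": 49.5, "Calm": 49.6,
        "Surprised": 49.6, "Disgusted": 49.7, "Confused": 49.6}
print(dominant_emotion(face))  # → Sad
```

Given how close the scores are, the argmax here says little more than "slightly more Sad than anything else"; any downstream use should treat these labels as noisy.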

Feature analysis

Amazon

Painting 99.5%
Person 99.1%
Horse 66.3%
Cow 63.5%

Captions

Microsoft

a group of people walking in front of a building 94.8%
a group of people walking in front of a large building 93.1%
a group of people standing in front of a building 91%