Human Generated Data

Title

The Forge in the Countryside

Date

17th century

People

Artist: Johannes Visscher, Dutch 1636 - after 1692

Artist after: Philips Wouwerman, Dutch 1619 - 1668

Classification

Prints

Machine Generated Data

Tags

Amazon

Art 98.1
Animal 94.4
Mammal 94.4
Horse 94.4
Painting 93.4
Human 90.3
Person 90.3
Horse 89.1
Person 86.7
Person 85.3
Horse 84.3
Horse 82.8
Person 71.2
Horse 67.6
Drawing 67.1
Person 65.4
Person 65
Person 57.7
Person 54

Clarifai

people 100
print 99.9
cavalry 99.8
group 99.6
illustration 99.4
engraving 99.2
many 98.6
man 97.7
military 97.7
soldier 97.4
mammal 97
art 96.6
adult 95.8
seated 95.5
transportation system 95.5
wagon 94.3
war 93.7
carriage 93.3
vehicle 93.3
skirmish 92.6

Imagga

snow 88.7
weather 45.4
landscape 26.8
grunge 26.4
winter 26.4
sketch 25.4
drawing 25.4
tree 23.1
old 21.6
forest 20.9
vintage 20.7
cold 19.8
zebra 19.8
texture 18.8
scene 17.3
representation 16.2
equine 15.6
structure 15.2
decoration 15
sky 14.7
graffito 14.4
frost 14.4
pattern 14.4
antique 13.9
white 13.8
frozen 13.4
grungy 13.3
season 13.3
park 13.2
rough 12.8
dirty 12.7
wood 12.5
frame 12.5
trees 12.5
rural 12.3
retro 12.3
brown 11.8
aged 11.8
billboard 11.6
black 11.4
design 11.3
art 11.1
field 10.9
scenery 10.8
snowy 10.7
woods 10.5
outdoors 10.5
ice 10.2
color 10
paint 10
sand 9.7
country 9.7
old fashioned 9.5
paper 9.4
signboard 9.4
space 9.3
grain 9.2
wallpaper 9.2
outdoor 9.2
cool 8.9
grass 8.7
obsolete 8.6
empty 8.6
travel 8.5
silhouette 8.3
environment 8.2
branch 8.2
road 8.1
material 8
yellow 7.9
textured 7.9
day 7.8
frosty 7.8
ancient 7.8
fog 7.7
aging 7.7
damaged 7.6
city 7.5
natural 7.4
light 7.4
graphic 7.3
sun 7.2
tranquil 7.2
border 7.2
morning 7.2
painting 7.2
ungulate 7.2
autumn 7

Google

Microsoft

text 99.3
book 98.3
old 94.8
outdoor 86.8
people 58.6
posing 36.5
vintage 27.3

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Sad 49.9%
Happy 49.5%
Disgusted 49.6%
Surprised 49.6%
Angry 49.7%
Confused 49.6%
Calm 49.7%

AWS Rekognition

Age 26-43
Gender Female, 50.6%
Confused 45.5%
Surprised 45.4%
Calm 45.8%
Sad 46.5%
Disgusted 45.7%
Angry 45.7%
Happy 50.3%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Disgusted 49.5%
Surprised 49.5%
Angry 49.6%
Happy 49.7%
Calm 49.9%
Sad 49.7%
Confused 49.5%

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Angry 49.6%
Confused 49.5%
Disgusted 49.6%
Sad 50.2%
Happy 49.5%
Calm 49.6%
Surprised 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.3%
Happy 49.5%
Angry 50%
Calm 49.6%
Disgusted 49.6%
Sad 49.6%
Surprised 49.6%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Surprised 49.6%
Sad 49.8%
Confused 49.5%
Calm 49.9%
Disgusted 49.6%
Angry 49.6%
Happy 49.5%

Feature analysis

Amazon

Horse 94.4%
Painting 93.4%
Person 90.3%

Captions

Microsoft

a vintage photo of a person riding a horse 89.4%
a vintage photo of a group of people posing for the camera 85.2%
a vintage photo of a horse 85.1%

Text analysis

Amazon

9.oan slefrit