Human Generated Data

Title

[View from Riverside Drive of the North River]

Date

1939

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.502.30

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.4
Person 99.4
Person 99.3
Person 98.3
Person 98.1
Person 98
Clothing 90
Apparel 90
Banister 89.5
Handrail 89.5
Flooring 72.8
Dessert 64.2
Icing 64.2
Cake 64.2
Food 64.2
Cream 64.2
Creme 64.2
Military 58.4
Officer 58.4
Military Uniform 58.4
Coat 56.3
Overcoat 56.3

Clarifai
created on 2019-11-19

people 99.9
group together 99.7
adult 98.1
man 97.6
group 97.5
several 96.8
many 96.5
wear 96.2
watercraft 95.7
leader 94.2
vehicle 93.2
four 92.5
two 91.9
woman 91.9
military 90
outfit 89.6
administration 88.3
transportation system 88
five 86.8
recreation 86.7

Imagga
created on 2019-11-19

man 31.1
male 23.4
sky 21.7
sea 20.3
beach 19.4
people 19
sunset 18.9
travel 18.3
ocean 17.5
silhouette 16.5
water 16
person 15.7
building 15.4
tourism 14.8
architecture 14.8
tourist 14.6
outdoors 14.5
world 13.5
couple 13.1
coast 11.7
summer 11.6
sun 11.3
shore 11.1
love 11
romantic 10.7
businessman 10.6
spectator 10.5
landscape 10.4
adult 10.3
business 10.3
day 10.2
lifestyle 10.1
groom 10.1
sand 10
history 9.8
old 9.7
dusk 9.5
sitting 9.4
men 9.4
happiness 9.4
sunrise 9.4
landmark 9
lab coat 8.8
standing 8.7
smiling 8.7
sunny 8.6
horizon 8.1
suit 8.1
group 8.1
romance 8
scenic 7.9
together 7.9
bride 7.8
life 7.8
full length 7.8
clothing 7.7
palace 7.7
stone 7.7
evening 7.5
famous 7.4
boat 7.4
column 7.4
vacation 7.4
wedding 7.4
holiday 7.2
river 7.1
women 7.1
coat 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 92.4
clothing 88.8
text 75.5
man 66.5
black and white 63.4

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a black and white photo of a man 75.8%
an old photo of a man 74.9%
black and white photo of a man 69%