Human Generated Data

Title

[Man in rowboat with reflection]

Date

1931-1933

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.248.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 93.9
Person 93.9
Furniture 90.3
Nature 86
Outdoors 84.7
Musical Instrument 66.6
Leisure Activities 66.6
Piano 66.6
Building 66.2
Face 63.4
Clothing 59.7
Apparel 59.7
Crypt 55.9
Silhouette 55.1

Clarifai
created on 2019-11-19

people 99.1
light 96.4
landscape 96.3
monochrome 95.6
no person 95.1
adult 93.9
one 92.5
art 91.4
mammal 89.9
shadow 89.2
cavalry 87.8
wear 87.6
portrait 86.1
man 86
indoors 83.6
silhouette 83.5
reflection 83.3
cow 83
travel 82.7
transportation system 82.6

Imagga
created on 2019-11-19

upright 100
piano 100
stringed instrument 100
percussion instrument 100
keyboard instrument 100
musical instrument 82.2
grand piano 44.8
water 18
light 17.4
wood 15
landscape 14.9
sunset 14.4
dark 14.2
silhouette 14.1
sky 13.4
black 13.2
tree 13.1
sun 12.9
bench 12.8
old 12.5
sea 12.5
travel 12
lake 11.9
park 11.5
keyboard 11.3
music 10.8
wooden 10.5
sunrise 10.3
outdoor 9.9
dawn 9.7
scenic 9.7
dusk 9.5
evening 9.3
peaceful 9.2
ocean 9.1
summer 9
grass 8.7
antique 8.6
musical 8.6
instrument 8.4
color 8.3
outdoors 8.2
shadow 8.1
home 8
forest 7.8
cloud 7.7
classical 7.6
beach 7.6
relax 7.6
chair 7.6
key 7.5
boat 7.4
calm 7.3
coast 7.2
art 7.2
river 7.1
trees 7.1
night 7.1
interior 7.1
rural 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

grass 96.3
outdoor 88.4
black and white 85.4
monochrome 81.4
dark 77
text 63.2
white 60.6
weapon 55.5
gun 38.7

Face analysis

Amazon

AWS Rekognition

Age 44-62
Gender Male, 53.9%
Confused 45.2%
Happy 45.3%
Angry 45.9%
Fear 45.6%
Calm 49%
Surprised 45.4%
Sad 48.5%
Disgusted 45.2%

Feature analysis

Amazon

Person 93.9%
Piano 66.6%

Captions

Microsoft

a black and white photo of a fire 52.3%
a person in a dark room 52.2%
a person sitting in a dark room 37.9%