Human Generated Data

Title

[Passengers on ship deck]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.466.36

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.3
Person 99.3
Person 97.8
Person 96.5
Person 95.6
Lighting 95.2
Building 93.3
Boardwalk 91.5
Bridge 91.5
Stage 81.3
Person 70.5
Architecture 68.1
Person 66.3
Leisure Activities 64.5
Space 61.3
Astronomy 61.3
Universe 61.3
Outer Space 61.3
Flooring 58.5
Column 55.9
Pillar 55.9
Dance Pose 55.4
Handrail 55.2
Banister 55.2
Person 52.9

Clarifai
created on 2019-11-19

monochrome 99.8
people 99.4
street 98.1
adult 97.1
light 96.2
man 94.3
city 93.3
woman 93.3
girl 93
stage 91.8
group 91.5
shadow 91.1
theater 90.9
music 90.8
black and white 89.5
group together 88.9
concert 88.3
silhouette 84.6
landscape 84.5
opera 84.1

Imagga
created on 2019-11-19

stage 52.1
platform 41.4
silhouette 26.5
night 24
light 22
sky 21.1
sunset 18.9
sun 17.7
landscape 17.1
travel 16.9
people 16.7
water 16.7
person 16.5
sea 16.4
dusk 16.2
man 15.7
evening 14.9
male 14.9
architecture 14.2
city 14.1
pier 13.7
history 13.4
black 13.3
ocean 13.3
lights 13
beach 12.4
device 11.9
grand piano 11.4
structure 11.3
percussion instrument 11
dark 10.9
landmark 10.8
piano 10.7
tourism 10.7
lamp 10.5
scene 10.4
stone 10.1
tower 10
orange 10
coast 9.9
building 9.8
twilight 9.7
urban 9.6
sunrise 9.4
holiday 9.3
business 9.1
religion 9
romantic 8.9
symbol 8.7
support 8.7
fountain 8.5
adult 8.4
design 8.4
color 8.3
island 8.2
musical instrument 8.2
tourist 8.2
reflection 8.1
sexy 8
celebration 8
bright 7.9
men 7.7
summer 7.7
culture 7.7
keyboard instrument 7.6
stringed instrument 7.5
vacation 7.4
chair 7.3
horizon 7.2
river 7.1
vibrant 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Male, 50.4%
Sad 50.3%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.5%
Happy 49.5%
Angry 49.6%

AWS Rekognition

Age 25-39
Gender Female, 50.1%
Sad 50.4%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.5%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 49-67
Gender Male, 50.2%
Calm 49.6%
Surprised 49.5%
Happy 49.5%
Angry 49.7%
Disgusted 49.5%
Sad 50%
Fear 49.5%
Confused 49.7%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a person in a dark room 54.6%
a person sitting in a dark room 35%
a person standing in a dark room 34.9%