Human Generated Data

Title

[People on ship deck]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.169.16

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Human 99.1
Person 99.1
Person 86.8
Apparel 83.1
Clothing 83.1
Person 80.5
Urban 77.1
Building 72.8
Construction 67.3
Hardhat 66.9
Helmet 66.9
Town 65
City 65
Person 62.8
Face 61.7
High Rise 58.8
Architecture 57.7
Wood 55.5

Clarifai
created on 2021-04-04

people 99.9
group 98.3
group together 98.3
vehicle 98.2
watercraft 98
adult 97.8
many 97.4
monochrome 97.2
transportation system 96
man 95.9
woman 92.2
wear 92.2
street 90.9
several 89.7
two 89.2
one 87.6
ship 87
military 84.9
administration 84.2
three 83.3

Imagga
created on 2021-04-04

ferris wheel 68.3
rotating mechanism 54.8
ride 53.6
mechanical device 41.5
mechanism 31.8
building 25.8
device 25
industrial 21.8
water 20
ship 18.8
vessel 18.3
industry 17.9
urban 17.5
architecture 17.5
factory 17.4
night 16.9
city 16.6
steel 15.9
bridge 15.7
metal 15.3
engineering 15.2
boat 14.9
machine 13.9
structure 13.4
travel 13.4
old 13.2
business 12.7
sea 12.5
sky 12.1
power 11.8
heavy 11.4
equipment 11.3
modern 11.2
reflection 10.6
construction 10.3
ocean 10
transportation 9.9
manufacturing 9.8
boats 9.7
technical 9.7
port 9.6
work 9.5
craft 9.3
street 9.2
gear 9.2
rope 9
black 9
machinery 8.8
mechanical 8.7
light 8.7
pirate 8.6
rigging 8.5
lights 8.3
plant 8.2
landmark 8.1
man 8.1
tower 8.1
river 8
deck 8
loom 8
people 7.8
dock 7.8
waste 7.8
pollution 7.7
house 7.5
iron 7.5
tourism 7.4
inside 7.4
schooner 7.3

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

ship 99.3
text 96.8
indoor 93.6
black and white 93.5
person 92.3
vehicle 75.8
watercraft 73.3
boat 69
monochrome 59.8

Face analysis

Amazon

AWS Rekognition

Age 41-59
Gender Male, 80.1%
Calm 34.3%
Surprised 24.5%
Sad 17.9%
Fear 14.2%
Angry 5.3%
Happy 2.4%
Confused 1.2%
Disgusted 0.3%

AWS Rekognition

Age 26-42
Gender Male, 63.2%
Sad 59.3%
Calm 35.6%
Confused 2.4%
Angry 1%
Fear 0.7%
Surprised 0.5%
Happy 0.5%
Disgusted 0.1%

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people sitting on a bed 49.7%
a group of people in a room 49.6%
a group of people on a bed 49.5%