Human Generated Data

Title

[Figures on ship deck]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1007.119

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Water 99.9
Waterfront 99.9
Construction 99.7
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Person 99.3
Female 99.3
Woman 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.9
Person 98.6
Person 98.6
Person 98.5
Person 98.2
Person 97.8
Person 96.5
Person 95.6
Person 94.6
Person 93.2
Construction Crane 88.7
Outdoors 87.2
Port 78.3
Car 73.1
Transportation 73.1
Vehicle 73.1
Face 64.6
Head 64.6
Cruiser 57.3
Military 57.3
Navy 57.3
Ship 57.3
Boardwalk 56.5
Bridge 56.5
Yacht 56.2
Oilfield 56
Arch 55.2
Architecture 55.2
Pier 55.2
Back 55.1
Body Part 55.1

Clarifai
created on 2023-10-15

people 99.8
group 98.9
group together 98.9
many 98
aircraft 97.1
adult 96.9
man 96.7
vehicle 96.2
transportation system 94.9
airplane 94
military 93.5
watercraft 90.6
airport 89.6
woman 88.8
leader 87.9
war 87.8
wear 86.7
administration 78.7
child 78.3
crowd 74.7

Imagga
created on 2019-01-31

engineer 42.1
sky 30.7
industry 26.5
construction 22.2
industrial 19.1
building 18.3
tower 17.9
structure 17.4
power 16.8
crane 16.3
architecture 15.6
city 15
environment 14.8
steel 14.2
urban 14
metal 13.7
travel 13.4
equipment 13
business 12.8
water 12.7
clouds 12.7
landscape 12.6
energy 12.6
work 12.6
scene 12.1
tourism 11.5
ship 11.3
cloud 11.2
man 10.7
high 10.4
environmental 10.4
winter 10.2
snow 10
silhouette 9.9
old 9.8
technology 9.6
black 9.6
pollution 9.6
electricity 9.4
sea 9.4
history 8.9
stage 8.9
factory 8.7
port 8.7
platform 8.6
development 8.6
outdoor 8.4
house 8.4
global 8.2
danger 8.2
cable 8.1
destruction 7.8
cold 7.7
station 7.7
cannon 7.7
concrete 7.7
part 7.6
site 7.5
electric 7.5
outdoors 7.5
town 7.4
new 7.3
protection 7.3
people 7.2
landmark 7.2
river 7.1
vessel 7

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

person 94.6
window 90.8
people 66.9
group 63.1
old 49.4
posing 35.8
picture frame 6.8
train 3.3
black and white 3.2

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 56.3%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Happy 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Female, 98.5%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 2-8
Gender Female, 85.7%
Sad 100%
Calm 6.6%
Surprised 6.4%
Fear 6.2%
Happy 1.6%
Angry 1%
Disgusted 0.9%
Confused 0.7%

AWS Rekognition

Age 7-17
Gender Male, 85.7%
Calm 92%
Surprised 8.4%
Fear 5.9%
Sad 2.6%
Happy 0.9%
Disgusted 0.6%
Confused 0.6%
Angry 0.5%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Female 99.3%
Woman 99.3%
Car 73.1%
