Human Generated Data

Title

[Onboard the S.S. Pennsylvania, view of workers]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.158.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 99.6
Human 99.6
Person 89.3
Stage 85.2
Leisure Activities 78.1
Clothing 75.2
Apparel 75.2
Circus 71.6
Crowd 60
Interior Design 57.4
Indoors 57.4
Theater 55.9
Room 55.9
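
The label names and confidence scores above follow the format returned by Amazon Rekognition's label-detection service. As a minimal sketch only (not the museum's actual pipeline), and assuming a hypothetical local copy of the photograph named image.jpg, comparable labels could be requested with boto3:

import boto3

# Minimal sketch: request image labels similar to those listed above.
# "image.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")
with open("image.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score shown above is 55.9
    )
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))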

Clarifai
created on 2021-04-04

people 99
man 97.8
monochrome 96.7
adult 95.5
rope 94.8
woman 89.8
vehicle 89.6
street 86.6
two 85.7
group together 85.4
group 85.3
music 83.5
military 83.4
one 82.8
boy 81.2
transportation system 78.7
war 75.4
watercraft 74.5
construction worker 74.3
shadow 73.7

Imagga
created on 2021-04-04

stage 51.7
musical instrument 39.8
platform 36.3
percussion instrument 33.1
man 28.9
vibraphone 28.6
person 22.8
male 22.7
sport 21.4
device 20.3
silhouette 19.9
stringed instrument 19
people 17.3
bowed stringed instrument 16.4
sunset 13.5
black 13.2
adult 13
water 12.7
leisure 12.5
cello 12
outdoors 11.9
recreation 11.7
club 11.3
business 10.9
exercise 10.9
men 10.3
singer 10.3
sea 10.2
fisherman 10.1
music 9.9
rock 9.6
day 9.4
ball 9.2
equipment 8.9
musician 8.9
businessman 8.8
body 8.8
grass 8.7
fishing 8.7
relaxation 8.4
summer 8.4
hat 8.2
laptop 8.2
protection 8.2
fitness 8.1
sun 8.1
job 8
rod 7.9
boy 7.8
model 7.8
play 7.8
youth 7.7
sky 7.7
outdoor 7.6
golf 7.6
studio 7.6
dark 7.5
sports 7.4
guitar 7.3
lake 7.3
competition 7.3
group 7.3
sexy 7.2
worker 7.1
microphone 7
performer 7

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

text 97.7
person 92.5
black and white 89.4
ship 62.3
clothing 57.5

Color Analysis

Face analysis

AWS Rekognition

Age 36-54
Gender Male, 83.4%
Calm 85.4%
Sad 13%
Angry 0.6%
Fear 0.4%
Confused 0.3%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
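
The age range, gender, and emotion scores above correspond to the fields returned by Amazon Rekognition's face-detection service. A minimal sketch, again assuming a hypothetical local file image.jpg rather than the museum's actual workflow:

import boto3

# Minimal sketch: face attributes similar to those listed above.
# "image.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")
with open("image.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")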

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Captions

Microsoft
created on 2021-04-04

a man holding a gun 33.9%

Text analysis

Google

ADMITTANCE