Human Generated Data

Title

[Ferry between Hamburg and Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.125.21

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Stage 99.9
Person 99.8
Human 99.8
Person 99.5
Person 99.3
Person 98.4
Person 98.3
Person 93.9
Leisure Activities 84.5
Clothing 84.1
Apparel 84.1
Shorts 82.2
Room 66.5
Indoors 66.5
Dance Pose 61.4
Theater 57.4
Musical Instrument 56.8
People 56.5
Person 55.3
Person 43.2

Clarifai
created on 2021-04-03

people 99.9
music 98.5
group 98.1
adult 97.3
group together 97.3
man 97.1
many 96.7
concert 96.4
theater 95.7
stage 95.4
audience 94.6
musician 92.6
woman 92.5
performance 92
child 91.9
opera 91.7
auditorium 91
vehicle 89.7
singer 89.3
portrait 89.3

Imagga
created on 2021-04-03

percussion instrument 100
musical instrument 84.3
vibraphone 77.1
device 36.5
man 26.2
male 23.4
people 20.1
piano 19.6
person 18.5
business 18.2
marimba 17.4
grand piano 16.2
stringed instrument 16.1
office 15.4
keyboard instrument 15.2
chair 14.2
interior 14.1
men 13.7
night 12.4
black 12
adult 11.7
table 11.5
light 11.4
room 11.3
modern 11.2
lifestyle 10.8
music 10.8
steel drum 10.8
hand 10.6
businessman 10.6
work 10.2
computer 9.8
fun 9.7
group 9.7
technology 9.6
sitting 9.4
smiling 9.4
silhouette 9.1
indoors 8.8
happy 8.8
women 8.7
laptop 8.3
entertainment 8.3
holding 8.3
worker 8.1
love 7.9
happiness 7.8
education 7.8
glass 7.8
model 7.8
portrait 7.8
center 7.6
dance 7.6
relaxation 7.5
house 7.5
dark 7.5
electronic instrument 7.5
indoor 7.3
upright 7.2
suit 7.2
working 7.1

Google
created on 2021-04-03

Entertainment 82.3
Black-and-white 82.3
Performing arts 76.8
Suit 76.3
Event 72.8
Monochrome photography 71.3
Stage 69.6
Monochrome 69.5
Darkness 68.2
Vintage clothing 65.9
Crew 65.5
Midnight 64.5
Music 62.6
Night 59.4
Room 58.3
Performance 57.1
Chair 55.1
Curtain 54.1
Recreation 53.7
Hat 53

Microsoft
created on 2021-04-03

black and white 94.9
person 87.8
monochrome 81.6
man 72.5
clothing 68.7

Face analysis

AWS Rekognition

Age 33-49
Gender Male, 90.9%
Sad 67.4%
Fear 18.5%
Calm 9.9%
Angry 1.7%
Happy 1.6%
Surprised 0.4%
Confused 0.4%
Disgusted 0.1%

AWS Rekognition

Age 33-49
Gender Female, 69%
Sad 97.2%
Calm 0.8%
Fear 0.7%
Confused 0.4%
Angry 0.4%
Disgusted 0.2%
Happy 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.8%