Human Generated Data

Title

[Passengers on ship deck]

Date

unknown

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.149.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 99.7
Human 99.7
Person 99.6
Person 97.8
Apparel 97.4
Clothing 97.4
Wood 79.5
Person 77.2
Coat 73.2
Banister 70.3
Handrail 70.3
Pants 69.3
Overcoat 67.1
Tire 64.2
Wheel 58.9
Machine 58.9
Alloy Wheel 57.9
Spoke 57.9
Suit 57.9
Pedestrian 57.2
Hat 56.4

Clarifai
created on 2021-04-04

people 99.7
man 96.7
monochrome 96.1
music 95
adult 94.9
portrait 94.3
woman 93.5
light 92.8
street 92.1
two 91.1
art 90.9
dancer 90.5
theater 89.8
indoors 89.7
girl 88.2
shadow 86.3
one 84.2
stage 84
child 83.8
boy 83.4

Imagga
created on 2021-04-04

cadaver 35.5
dark 25.9
person 24.8
adult 24.6
body 22.4
people 21.2
dancer 20.4
sexy 20.1
model 19.5
erotic 18.2
attractive 16.8
performer 16
happy 15.7
sensual 15.5
fashion 15.1
hair 15.1
portrait 14.9
light 14
water 14
man 13.4
passion 13.2
lifestyle 13
lady 13
skin 12.8
one 12.7
style 12.6
pretty 12.6
posing 12.4
enjoy 12.2
studio 12.2
sitting 12
blond 12
love 11.8
happiness 11.8
world 11.4
fun 11.2
sensuality 10.9
silhouette 10.8
night 10.7
male 10.6
seductive 10.5
human 10.5
room 10.2
dance 10.1
smile 10
entertainer 9.9
pleasure 9.4
face 9.2
black 9.2
wet 9
together 8.8
rain 8.5
orange 8.4
stone 8.4
beach 8.4
floor 8.4
pose 8.2
dress 8.1
romantic 8
interior 8
seduce 7.9
couple 7.8
shower 7.8
scene 7.8
nude 7.8
motion 7.7
lying 7.5
house 7.5
leisure 7.5
evening 7.5
slim 7.4
sofa 7.3
exercise 7.3
gorgeous 7.2
dirty 7.2
fitness 7.2
teacher 7.2
sunset 7.2
home 7.2
women 7.1
sea 7
modern 7

Microsoft
created on 2021-04-04

clothing 96
black and white 94.3
person 93.1
text 86.7
man 86.5
indoor 85.9
monochrome 85
footwear 82.7

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 54.6%
Sad 73.9%
Calm 17.8%
Fear 3.8%
Confused 1.9%
Angry 1.2%
Happy 0.6%
Disgusted 0.3%
Surprised 0.3%

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man and a woman standing in a room 55.7%
a couple of people that are standing in a room 55.6%
a person standing in a room 55.5%