Human Generated Data

Title

[Man and woman on ferry]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.217.30

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 98.2
Person 98.2
Person 97.9
Transportation 88.5
Boat 88.5
Vehicle 88.5
Face 86.8
Finger 74.4
Sitting 74.3
Skin 66.7
Apparel 61
Clothing 61
Head 60.9
Portrait 60.2
Photo 60.2
Photography 60.2
Food 58.4
Meal 58.4
Musical Instrument 58.3
Musician 58.3
Neck 57.9
Display 57.4
Screen 57.4
Monitor 57.4
Electronics 57.4
LCD Screen 57.4
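
These Amazon tags read like the label/confidence pairs returned by AWS Rekognition's DetectLabels operation (the service named under Feature analysis below). A minimal sketch, assuming the photograph is available as a local JPEG; the file name is a placeholder, not the museum's actual asset:

import boto3

# Sketch only: fetch label/confidence pairs in the same form as the tag list above.
rekognition = boto3.client("rekognition")

with open("feininger_ferry.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the listed tags bottom out in the high 50s
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')

Each returned label can also carry bounding-box instances, which is the kind of output the Person and Boat percentages under Feature analysis below would correspond to.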

Clarifai
created on 2019-11-19

people 99.8
adult 99.3
one 98.9
man 98.9
portrait 98.6
monochrome 94.9
music 94.5
side view 94.2
light 93.5
profile 93.2
wear 92
musician 91
facial expression 87.9
furniture 86.2
recreation 85
two 84.7
street 84.2
concentration 84.1
leader 83
singer 82.8

Imagga
created on 2019-11-19

world 42.8
man 26.2
people 25.7
grandma 25.5
portrait 24.6
happy 24.4
person 24.4
adult 23.6
hair 23
male 22.2
smile 20
black 18.6
senior 17.8
couple 17.4
love 16.6
old 15.3
face 14.9
attractive 14.7
pretty 14
smiling 13.7
child 13.7
mother 13.5
lady 13
eyes 12.9
human 12.7
groom 12.7
elderly 12.4
grandfather 12.3
men 12
body 12
husband 11.6
one 11.2
mature 11.2
skin 11.1
happiness 11
head 10.9
gray 10.8
wife 10.4
hands 10.4
blond 10.4
looking 10.4
piano 10.4
parent 10.1
silhouette 9.9
close 9.7
retired 9.7
together 9.6
sexy 9.6
marriage 9.5
expression 9.4
dark 9.2
grand piano 9.1
health 9
romance 8.9
handsome 8.9
women 8.7
water 8.7
stringed instrument 8.6
fun 8.2
wet 8
lifestyle 7.9
grandmother 7.8
married 7.7
studio 7.6
hand 7.6
joy 7.5
care 7.4
hold 7.4
girls 7.3
holiday 7.2
family 7.1
look 7
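
Imagga exposes its tagger as a REST endpoint. A minimal sketch of pulling tag/confidence pairs like the list above, assuming the v2 tags endpoint and a publicly reachable image URL; the URL, key, and secret are all placeholders:

import requests

# Sketch only: Imagga v2 tagging via HTTP Basic auth. Credentials and image URL are placeholders.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/feininger_ferry.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')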

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 98.7
text 91.9
man 90.2
human face 85.9
dark 46

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Male, 84.3%
Calm 7.1%
Sad 49.8%
Angry 17.7%
Fear 12.3%
Disgusted 0.3%
Happy 1.8%
Surprised 10.3%
Confused 0.8%
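
The age range, gender, and emotion confidences above are the facial attributes AWS Rekognition returns when all attributes are requested. A minimal sketch, with the image path again a placeholder:

import boto3

# Sketch only: age, gender, and emotion estimates like those listed above.
rekognition = boto3.client("rekognition")

with open("feininger_ferry.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')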

Feature analysis

Amazon

Person 98.2%
Boat 88.5%

Captions

Microsoft

a person sitting in a dark room 71.6%
a man and a woman sitting in a dark room 55.2%
a man and a woman in a dark room 55.1%
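
The ranked captions above match the output of Azure Computer Vision's describe operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision client; the endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: request ranked caption candidates. Endpoint, key, and image path are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

with open("feininger_ferry.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")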

Text analysis

Google

ERKST
ERKST
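
The repeated "ERKST" is consistent with Google Cloud Vision text detection, whose text_annotations list returns the full detected string first and then each individual word, so a single detected word appears twice. A minimal sketch, assuming the google-cloud-vision client; the file name is a placeholder:

from google.cloud import vision

# Sketch only: OCR with the Cloud Vision API. The image path is a placeholder.
client = vision.ImageAnnotatorClient()

with open("feininger_ferry.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)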