Human Generated Data

Title

[Aboard ocean liner]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.161.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 97.2
Human 97.2
Person 96.8
Person 87.2
Person 85.6
Face 79.7
Person 78.1
Workshop 71.8
Crowd 68.8
People 66.7
Room 64.3
Indoors 64.3
Clinic 63.9
Sitting 55.7
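
The label/confidence pairs above have the shape of Amazon Rekognition's DetectLabels response. A minimal boto3 sketch of how such tags could be reproduced follows; the file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

```python
import boto3

# Minimal sketch: fetch object/scene labels for a local image file.
# "photo.jpg" is a placeholder path, not part of the museum record.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the record lists labels down to ~55% confidence
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "Person 97.2" style pairs shown above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```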

Clarifai
created on 2021-04-04

people 99.9
group 99.2
adult 98.5
administration 96.7
man 95.8
wear 95.5
group together 95.3
leader 94.6
music 91
military 91
three 89
several 88.5
war 85.7
many 85.6
two 84.1
vehicle 83.2
four 82.3
outfit 82.2
retro 80.5
interaction 79.2
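
A comparable list can be requested from Clarifai's v2 predict REST endpoint. The sketch below is an illustration under stated assumptions: the model ID alias, API key, and image URL are placeholders and may differ from whatever produced the tags above.

```python
import requests

# Minimal sketch against Clarifai's v2 REST predict endpoint; the model ID,
# API key, and image URL below are placeholders, not values from this record.
MODEL_ID = "general-image-recognition"  # assumed alias for the public general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
headers = {"Authorization": "Key YOUR_API_KEY"}

response = requests.post(url, json=payload, headers=headers).json()

# Concepts come back with a 0-1 "value"; scaling by 100 gives the
# "people 99.9" style scores listed above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```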

Imagga
created on 2021-04-04

painter 40.7
sketch 23.7
drawing 20.4
person 17.2
decoration 17
man 16.8
black 16.2
people 16.2
graffito 14.7
fashion 14.3
portrait 14.2
art 13.2
representation 12.9
vintage 11.7
retro 11.5
sexy 11.2
hair 11.1
adult 11
paper 10.9
old 10.4
grunge 10.2
male 9.9
style 9.6
hand 9.1
painting 9
design 8.9
happy 8.8
brunette 8.7
lifestyle 8.7
ancient 8.6
wall 8.5
face 8.5
head 8.4
pattern 8.2
paint 8.1
closeup 8.1
body 8
love 7.9
work 7.8
eyes 7.7
one 7.5
close 7.4
dress 7.2
handsome 7.1
smile 7.1
happiness 7
modern 7
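
Imagga exposes tagging through its REST /v2/tags endpoint with HTTP basic auth. A minimal requests-based sketch follows; the credentials and image URL are placeholders.

```python
import requests

# Minimal sketch of Imagga's /v2/tags endpoint; the API key/secret and
# image URL are placeholders, not values associated with this record.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
).json()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} pairs,
# matching the "painter 40.7" style list above.
for item in response["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```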

Google
created on 2021-04-04

Photograph 94.1
Suit 79.4
Snapshot 74.3
Monochrome 72.1
Event 71.6
Monochrome photography 70.5
Font 69.7
Vintage clothing 69.6
Hat 67.8
Team 67.6
Crew 67.3
White-collar worker 65.2
Stock photography 63.6
Blazer 61.8
History 60.6
Photo caption 57.9
Tie 57.7
Sitting 56.8
Room 56.6
Photography 51.7
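
These labels match the output shape of Google Cloud Vision's label detection. A minimal sketch with the google-cloud-vision client follows; the file path is a placeholder, and credentials are assumed to come from the GOOGLE_APPLICATION_CREDENTIALS environment variable.

```python
from google.cloud import vision

# Minimal sketch: request image labels from the Cloud Vision API.
# "photo.jpg" is a placeholder path, not part of the museum record.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; multiplying by 100 yields the
# "Photograph 94.1" style values listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```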

Microsoft
created on 2021-04-04

person 98.7
indoor 98.4
text 93.1
man 91.3
human face 81
old 80.4
weapon 74.3
clothing 72.4
musical instrument 59.4
black and white 57.9
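
Microsoft's tags correspond to the Azure Computer Vision tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and file path are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch using the Azure Computer Vision SDK; the endpoint and key
# are placeholders, not values associated with this record.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Each tag has a name and a 0-1 confidence, matching the
# "person 98.7" style pairs shown above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```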

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 59.4%
Angry 64.8%
Surprised 16.7%
Calm 13.9%
Happy 2.4%
Disgusted 0.8%
Sad 0.6%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 14-26
Gender Female, 81.1%
Calm 81.9%
Sad 8.4%
Happy 6.1%
Angry 1.6%
Surprised 0.9%
Confused 0.6%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 20-32
Gender Female, 55.8%
Fear 62%
Sad 14%
Calm 12.1%
Happy 3.4%
Surprised 3.1%
Angry 2.2%
Confused 1.9%
Disgusted 1.2%
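
The three blocks above, each with an age range, a gender estimate, and a descending list of emotion scores, follow the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal boto3 sketch follows; the file path is a placeholder.

```python
import boto3

# Minimal sketch: per-face age, gender, and emotion estimates.
# "photo.jpg" is a placeholder path, not part of the museum record.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Sort emotions to mirror the descending lists shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```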

Feature analysis

Amazon

Person 97.2%

Text analysis

Google

www
www
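
The repeated "www" is consistent with OCR output that returns the full detected text followed by its individual words. A minimal sketch of a text-detection call with the google-cloud-vision client follows; the file path is a placeholder.

```python
from google.cloud import vision

# Minimal sketch: run OCR (text detection) over a local image file.
# "photo.jpg" is a placeholder path, not part of the museum record.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; subsequent entries
# are individual words, which is why short strings can repeat.
for annotation in response.text_annotations:
    print(annotation.description)
```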