Human Generated Data

Title

[Man and woman looking at plane propeller]

Date

1931?

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.289.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.1
Human 99.1
Clothing 95.3
Apparel 95.3
Person 91
Face 70.8
Tripod 68.2
Photo 65.1
Photography 65.1
People 65
Electronics 57.4
Accessory 55.9
Accessories 55.9
Tie 55.9
Coat 55.7
Overcoat 55.7

Clarifai
created on 2019-11-19

people 99.8
adult 98
music 97.6
musician 96
group together 95.9
two 95.5
man 95.4
group 95.4
one 94.6
wear 93.7
singer 92.4
administration 92.4
vehicle 91.9
microphone 91.3
woman 90.7
three 90.3
several 89.1
leader 88
stringed instrument 86
instrument 85.3

Imagga
created on 2019-11-19

sax 45.1
man 36.9
male 31.9
wind instrument 26.9
person 26.7
adult 26.6
music 25.5
brass 23.9
musician 22.3
people 22.3
black 20.4
business 19.4
device 18.9
businessman 18.5
professional 17
musical instrument 17
portrait 16.2
concert 15.5
handsome 15.2
instrument 15.1
horn 14.7
men 14.6
musical 14.4
phone 13.8
playing 13.7
face 13.5
job 13.3
player 13.2
performer 13.1
lifestyle 13
play 12.9
entertainment 12.9
looking 12.8
worker 12.6
rock 12.2
occupation 11.9
businesspeople 11.4
guy 11.3
sexy 11.2
suit 11.1
stage 11.1
oboe 10.6
singer 10.3
work 10.2
happy 10
guitar 10
star 9.8
sound 9.8
trombone 9.7
microphone 9.6
bass 9.6
performance 9.6
serious 9.5
youth 9.4
executive 9.3
alone 9.1
hand 9.1
holding 9.1
style 8.9
office 8.8
employee 8.8
corporate 8.6
club 8.5
dark 8.3
emotion 8.3
indoor 8.2
confident 8.2
group 8.1
metal 8
saxophone 7.9
guitarist 7.9
jazz 7.8
couple 7.8
standing 7.8
cellphone 7.8
call 7.7
cornet 7.7
modern 7.7
attractive 7.7
industry 7.7
confidence 7.7
career 7.6
mobile 7.5
fun 7.5
mature 7.4
telephone 7.4
glasses 7.4
building 7.4
20s 7.3
protection 7.3
drum 7.1
indoors 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 99.8
clothing 97.6
human face 93.5
man 91.8
text 80.9
black and white 76.4
standing 75.1
posing 70.7
fashion accessory 64.9
hat 60.4
crowd 0.9

Face analysis

AWS Rekognition

Age 26-42
Gender Male, 56.7%
Angry 0.8%
Surprised 1.1%
Fear 0.5%
Happy 59.6%
Sad 1.9%
Confused 1.1%
Calm 34.5%
Disgusted 0.5%

AWS Rekognition

Age 8-18
Gender Female, 75.1%
Disgusted 0.8%
Confused 1.1%
Fear 0.5%
Happy 3.4%
Calm 82%
Angry 4.7%
Surprised 1.4%
Sad 6.1%

AWS Rekognition

Age 48-66
Gender Female, 53.8%
Happy 45.1%
Calm 46.3%
Angry 49.7%
Surprised 45.1%
Disgusted 45.3%
Confused 45.6%
Fear 45.2%
Sad 47.7%

Microsoft Cognitive Services

Age 37
Gender Male

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

people portraits 68.9%
events parties 27.1%
food drinks 1.7%