Human Generated Data

Title

[Julia and Lux Feininger in Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.190.26

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Apparel 100
Clothing 100
Human 99.4
Person 99.4
Person 98.9
Coat 97.9
Tarmac 88.8
Asphalt 88.8
Overcoat 83.2
Suit 78.3
Road 75.1
Raincoat 67.8
Path 64.4
Pedestrian 58.8
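
The label/score pairs above are the kind of output returned by Amazon Rekognition's label detection, where each label carries a confidence from 0 to 100. A minimal sketch of such a call using boto3 is below; the bucket, object key, and region are hypothetical placeholders, not details taken from this record.

```python
import boto3

# Minimal sketch: request image labels from Amazon Rekognition.
# Bucket name, object key, and region are hypothetical placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "brlf_190_26.jpg"}},
    MaxLabels=50,
    MinConfidence=50.0,
)

# Each label comes back with a name and a 0-100 confidence score,
# matching the "Label  score" pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```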

Clarifai
created on 2019-11-18

people 99.7
adult 98.6
man 97.3
one 96.5
monochrome 95.8
music 92.8
two 92.3
woman 92.3
street 92
wear 89.8
recreation 89.3
group together 87.6
musician 85.8
group 83.9
sit 83.6
guitar 83.1
bench 82.6
vehicle 76.6
transportation system 76.2
leader 74.5

Imagga
created on 2019-11-18

man 35
male 24.9
chair 23.7
adult 21.1
wheelchair 20.1
people 19.5
person 18.1
musical instrument 18
business 17.6
seat 15.2
businessman 15
black 14.6
city 14.1
men 13.7
outdoors 13
world 12.7
silhouette 12.4
building 12
sport 11.1
corporate 10.3
furniture 10.1
professional 10
office 9.9
travel 9.9
urban 9.6
thinking 9.5
women 9.5
keyboard instrument 9.5
work 9.4
youth 9.4
accordion 9.2
leisure 9.1
exercise 9.1
suit 9
worker 9
stringed instrument 9
fun 9
sitting 8.6
life 8.6
skateboard 8.5
street 8.3
holding 8.2
human 8.2
style 8.2
wind instrument 8.1
looking 8
wheeled vehicle 8
job 8
lifestyle 7.9
boy 7.8
loneliness 7.8
career 7.6
athlete 7.5
player 7.5
executive 7.5
board 7.3
water 7.3
alone 7.3
indoor 7.3
businesswoman 7.3
working 7.1

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

outdoor 96.6
black and white 96.6
street 96.1
text 95.4
monochrome 87.2
person 85.7
clothing 73.6
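
The Microsoft tags above follow the tag/confidence format returned by the Azure Computer Vision tag endpoint. A minimal sketch of one way to request such tags over the REST API is below; the resource endpoint, subscription key, local file name, and API version are hypothetical placeholders (these tags were generated in 2019, likely against an earlier API version).

```python
import requests

# Hypothetical Azure Computer Vision resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

# Hypothetical local copy of the image file.
with open("brlf_190_26.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Each tag carries a name and a 0-1 confidence; scale to match the
# percentage-style scores listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```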

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 55%
Happy 45%
Fear 45%
Angry 45.3%
Calm 54.4%
Sad 45.3%
Disgusted 45%
Confused 45%
Surprised 45%

AWS Rekognition

Age 22-34
Gender Male, 54.4%
Happy 45%
Calm 45.6%
Confused 45%
Sad 53%
Angry 45.1%
Fear 46.3%
Surprised 45%
Disgusted 45%
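
The two blocks above (age range, gender, and per-emotion confidences, one block per detected face) match the shape of Amazon Rekognition's face detection output. A minimal sketch of such a call with boto3 follows; the bucket, object key, and region are hypothetical placeholders.

```python
import boto3

# Minimal sketch: request face attributes from Amazon Rekognition.
# Bucket name, object key, and region are hypothetical placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "brlf_190_26.jpg"}},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

# One entry per detected face, mirroring the two blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```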

Feature analysis

Amazon

Person 99.4%