Human Generated Data

Title

[Faculty and students at Mills College, Oakland, California]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.131.25

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Scores are tagger confidence values on a 0–100 scale.

Amazon
created on 2019-05-23

Human 99.6
Person 99.6
Person 99.2
Person 98.5
Leisure Activities 98.4
Person 97.8
Person 97.2
Person 96.4
Musical Instrument 94.9
Person 94
Person 91.9
Guitar 88.2
Banjo 86.3
Person 84.1
Person 83.7
Musician 72
Lute 61.7
Guitarist 55.5
Performer 55.5
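
These labels match the output shape of Amazon Rekognition's DetectLabels API. Below is a minimal Python sketch of how such a tag list could be produced with the boto3 client; the filename, region, and thresholds are illustrative assumptions, not details of the museum's actual pipeline.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; the real ingestion path
# is not documented here.
with open("brlf_131_25.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# which is the "Human 99.6 / Person 99.6 / ..." format shown above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50,
)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")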

Clarifai
created on 2019-05-23

people 100
group 99.6
adult 99.2
furniture 97.9
leader 97.8
sit 97.7
group together 97.3
man 97.3
administration 92.7
many 92.4
woman 92.3
wear 91.5
several 90.7
two 90.2
seat 89.8
room 89.3
chair 88.6
child 87.9
outfit 84.8
three 83.4
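
The Clarifai tags are consistent with the general prediction model in Clarifai's v2 API. A minimal sketch using the clarifai.rest Python client of that era follows; the API key and filename are placeholders.

from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")  # placeholder credential
model = app.public_models.general_model

# Concepts come back with 0-1 "value" scores; scaling by 100
# matches the 0-100 figures listed above.
response = model.predict_by_filename("brlf_131_25.jpg")
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")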

Imagga
created on 2019-05-23

musical instrument 41.5
percussion instrument 32.3
man 26.2
steel drum 22.3
people 20.6
kin 19.7
male 17
person 17
old 16
drum 13.3
room 12.8
adult 12.7
vintage 12.4
black 12
portrait 11.6
wind instrument 11.3
couple 10.4
sitting 10.3
happiness 10.2
building 9.8
businessman 9.7
sepia 9.7
business 9.1
park 9
outdoors 8.9
family 8.9
stringed instrument 8.6
tree 8.5
summer 8.3
holding 8.2
classroom 8.1
history 8
lifestyle 7.9
women 7.9
love 7.9
day 7.8
accordion 7.7
grunge 7.7
outdoor 7.6
two 7.6
statue 7.6
happy 7.5
mother 7.5
silhouette 7.4
retro 7.4
girls 7.3
new 7.3
art 7.1
to 7.1
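
Imagga exposes its tagger as a REST endpoint. Below is a sketch against the v2 /tags endpoint using the requests library; the credentials and upload handling are assumptions.

import requests

IMAGGA_KEY = "YOUR_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"

# POSTing the image to /v2/tags returns tags with 0-100 confidence
# scores, as in the list above.
with open("brlf_131_25.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")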

Google
created on 2019-05-23

Microsoft
created on 2019-05-23

wall 95.5
person 92.2
clothing 89.4
black and white 82.8
funeral 81.1
black 74.2
white 68.3
man 64.7
old 40.5
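
The Microsoft tags correspond to the Tag operation of Azure's Computer Vision API (v2.0 was current in 2019). A minimal REST sketch follows; the endpoint region and key are placeholders. Note that the service reports confidence on a 0-1 scale, so the 0-100 values above imply a conversion step.

import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
KEY = "YOUR_SUBSCRIPTION_KEY"                            # placeholder credential

with open("brlf_131_25.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v2.0/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Each tag carries a 0-1 confidence; scale to 0-100 to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")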

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-27
Gender Female, 51.3%
Confused 45.9%
Surprised 45.5%
Calm 47.6%
Disgusted 45.4%
Sad 49.6%
Happy 45.2%
Angry 45.9%

AWS Rekognition

Age 6-13
Gender Male, 53.4%
Calm 45.9%
Sad 45.9%
Surprised 45.2%
Angry 50%
Happy 47.3%
Confused 45.5%
Disgusted 45.2%

AWS Rekognition

Age 23-38
Gender Female, 53.1%
Disgusted 49.7%
Angry 46.2%
Calm 46.3%
Surprised 45.9%
Confused 45.8%
Sad 45.4%
Happy 45.6%

AWS Rekognition

Age 45-63
Gender Female, 50.2%
Surprised 49.6%
Calm 49.6%
Happy 49.7%
Disgusted 49.6%
Confused 49.5%
Sad 49.7%
Angry 49.7%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Happy 49.5%
Angry 49.6%
Confused 49.6%
Calm 49.8%
Sad 49.6%
Surprised 49.6%
Disgusted 49.8%
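
Each block above corresponds to one entry in Rekognition's DetectFaces response when all facial attributes are requested. Below is a sketch of reading out the age range, gender, and per-emotion confidences; the filename and region are assumptions, as before.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("brlf_131_25.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 17, "High": 27}
    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 51.3}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence score per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")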

Feature analysis

Amazon

Person 99.6%
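
The "Person 99.6%" feature entry lines up with the per-instance detections that DetectLabels returns alongside scene-level labels. A self-contained sketch of reading those instances and their bounding boxes, under the same assumed filename and region as the earlier sketches:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("brlf_131_25.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels such as Person also carry Instances: one entry per detected
# person, each with its own confidence and a bounding box in relative
# (0-1) image coordinates.
for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(f"Person {instance['Confidence']:.1f}% at "
                  f"left={box['Left']:.2f}, top={box['Top']:.2f}, "
                  f"width={box['Width']:.2f}, height={box['Height']:.2f}")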