Human Generated Data

Title

[Faculty and students at Mills College, Oakland, California]

Date

1936

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.131.26

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-23

Person 99.5
Human 99.5
Apparel 99.2
Clothing 99.2
Person 99.2
Person 99.1
Person 98.7
Person 98.6
Person 98.5
Person 98
Person 96.5
Person 95
Face 88.9
People 86.5
Female 78.1
Coat 74.4
Overcoat 70.5
Girl 68.8
Photography 63.3
Photo 63.3
Suit 58.7
Family 57.2
Hat 56.6

Clarifai
created on 2019-05-23

people 100
group 99.8
adult 99.4
many 99.3
group together 99
leader 98.7
several 96.4
wear 95.9
administration 95
man 94.6
woman 90.9
child 89.4
veil 86.3
home 85.5
outfit 84.4
clergy 83.8
five 81.3
crowd 81.1
chair 80.2
offspring 80

Imagga
created on 2019-05-23

statue 31.5
sculpture 25.8
world 21.5
monument 19.6
tourism 19
old 18.8
art 18.7
travel 18.3
history 17.9
clothing 17.5
architecture 17.2
religion 17
people 16.2
historic 15.6
city 15
god 14.3
religious 14.1
uniform 13
man 12.8
landmark 12.6
brass 12.5
wind instrument 12.3
ancient 12.1
catholic 12
stone 11.9
musical instrument 11.8
building 11.7
marble 11.6
faith 11.5
church 11.1
military uniform 10.8
adult 10.6
famous 10.2
tourist 10.1
person 9.9
detail 9.7
culture 9.4
national 9.1
holy 8.7
men 8.6
bugle 8.4
black 8.4
tradition 8.3
traditional 8.3
symbol 8.1
bronze 7.8
male 7.8
saint 7.7
spiritual 7.7
historical 7.5
metropolitan 7.4
covering 7.3
peace 7.3
dress 7.2
consumer goods 7.1
memorial 7

Google
created on 2019-05-23

Microsoft
created on 2019-05-23

person 98.6
clothing 92.7
window 92
group 87.2
old 87
people 82.6
man 79.2
white 66.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 95.2%
Disgusted 0.7%
Sad 63.1%
Calm 14.9%
Confused 5%
Surprised 3.1%
Angry 3.5%
Happy 9.7%

AWS Rekognition

Age 15-25
Gender Female, 51.3%
Angry 45.6%
Happy 49.2%
Calm 47.6%
Surprised 46.3%
Disgusted 45.2%
Confused 45.3%
Sad 45.7%

AWS Rekognition

Age 45-63
Gender Female, 53.3%
Surprised 45%
Calm 45.1%
Angry 45.1%
Disgusted 45%
Sad 54.8%
Confused 45%
Happy 45%

AWS Rekognition

Age 20-38
Gender Female, 53.1%
Calm 51.9%
Disgusted 45.4%
Confused 45.3%
Happy 45.2%
Sad 45.8%
Angry 45.8%
Surprised 45.8%

AWS Rekognition

Age 35-52
Gender Female, 51.6%
Surprised 45.3%
Happy 45.2%
Disgusted 45.1%
Angry 45.8%
Sad 50.8%
Confused 45.3%
Calm 47.5%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Disgusted 45.4%
Angry 45.6%
Calm 46.7%
Surprised 45.8%
Confused 45.4%
Sad 49.2%
Happy 46.9%

Feature analysis

Amazon

Person 99.5%

Categories