Human Generated Data

Title

[Group of faculty and students at Mills College, Oakland, California]

Date

1936

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.204.19

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Person 99
Human 99
Person 98.9
Person 98.7
Person 97.5
Person 96.8
Person 96.8
Person 95.7
Person 95
Person 93
Person 91
Person 88.8
Apparel 87.2
Clothing 87.2
Performer 86.5
Person 86.1
Person 76.9
Accessories 75.9
Accessory 75.9
Tie 75.9
Costume 75.8
Person 73.4
Crowd 69.6
People 67.7
Face 66.5
Leisure Activities 61.4
Indoors 59
Room 59
Stage 58.3
Skin 56.6
Female 56.5
Furniture 56.2
Tie 52.1
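
The label/confidence pairs above are the kind of output AWS Rekognition's label detection returns. Below is a minimal sketch, assuming boto3 with credentials and a default region already configured and a local copy of the photograph under a hypothetical filename; it is not the pipeline that produced this record.

import boto3

# Minimal sketch: label detection with AWS Rekognition via boto3.
# Assumes credentials and a default region are configured, and that
# "feininger_mills_college.jpg" (hypothetical filename) is a local
# copy of the photograph.
client = boto3.client("rekognition")

with open("feininger_mills_college.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # upper bound on returned labels
    MinConfidence=50.0,  # drop labels below 50% confidence
)

# Print label/confidence pairs in the same shape as the list above,
# e.g. "Person 99.0", "Tie 75.9".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")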

Clarifai
created on 2019-11-18

people 100
group 99.7
group together 99.4
adult 98.7
many 98.3
leader 97
several 96.8
wear 96.7
administration 96.1
man 94.9
woman 93.4
outfit 92.7
five 92.1
four 90.7
music 87.1
musician 80.9
offspring 80.9
three 77.9
education 77.2
actor 76.3

Imagga
created on 2019-11-18

people 27.9
kin 24.4
person 23.7
man 22.8
male 21.4
musical instrument 17.5
adult 17.2
wind instrument 16.7
family 15.1
child 15.1
room 14.8
men 14.6
performer 13.9
portrait 13.6
happiness 13.3
happy 13.1
women 12.6
classroom 12.2
couple 12.2
teacher 12.2
brass 11.9
girls 11.8
dress 11.7
black 11.5
fashion 11.3
fun 11.2
youth 11.1
business 10.9
kid 10.6
interior 10.6
group 10.5
boy 10.4
two 10.2
world 10.2
holding 9.9
dancer 9.7
life 9.6
bride 9.6
entertainer 9.4
mother 9.3
human 9
style 8.9
decoration 8.8
home 8.8
celebration 8.8
together 8.8
musician 8.7
holiday 8.6
professional 8.4
singer 8.3
clothing 8.2
new 8.1
childhood 8.1
businessman 7.9
love 7.9
urban 7.9
gift 7.7
pretty 7.7
old 7.7
clothes 7.5
traditional 7.5
vintage 7.4
groom 7.4
wedding 7.4
children 7.3
smiling 7.2
looking 7.2
costume 7.1
indoors 7
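
The tag/confidence pairs above match what Imagga's tagging endpoint returns. A minimal sketch against the v2 REST API follows; the API key, secret, and image URL are placeholders, and the endpoint details should be checked against Imagga's current documentation.

import requests

# Minimal sketch: image tagging via Imagga's v2 REST endpoint.
# IMAGGA_KEY, IMAGGA_SECRET, and IMAGE_URL are placeholders.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/feininger_mills_college.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score, matching the
# "people 27.9", "kin 24.4" style shown above.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")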

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

wall 98.9
person 98.4
clothing 95.7
text 80.4
posing 79.3
standing 77.4
woman 66.5
dress 64.9
group 64.3
old 49.7
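
The tags above resemble the output of Azure Computer Vision's tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision package follows; the endpoint, key, and image URL are placeholders, not values from this record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# ENDPOINT, KEY, and IMAGE_URL are placeholders.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"
IMAGE_URL = "https://example.org/feininger_mills_college.jpg"

cv_client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns name/confidence pairs; the SDK reports confidence as a
# 0-1 float, while the list above shows percentages.
result = cv_client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")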

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Male, 53.8%
Angry 46.7%
Sad 47.8%
Fear 46%
Disgusted 45%
Happy 45%
Surprised 46%
Calm 48%
Confused 45.3%

AWS Rekognition

Age 46-64
Gender Male, 53.2%
Happy 45.2%
Sad 48%
Fear 45.1%
Angry 45.1%
Disgusted 45.1%
Surprised 45.4%
Calm 46.3%
Confused 49.9%

AWS Rekognition

Age 36-54
Gender Male, 50.4%
Fear 45.4%
Calm 52.6%
Disgusted 45%
Angry 45.1%
Happy 45%
Confused 45.1%
Sad 46.1%
Surprised 45.7%

AWS Rekognition

Age 8-18
Gender Female, 53.5%
Fear 45.3%
Angry 45.1%
Calm 46.7%
Surprised 51.3%
Confused 45.1%
Happy 46.3%
Disgusted 45.2%
Sad 45.1%

AWS Rekognition

Age 27-43
Gender Male, 54.5%
Disgusted 45.1%
Sad 47.4%
Angry 46%
Confused 45.1%
Fear 45.1%
Happy 45.1%
Surprised 45.7%
Calm 50.5%

AWS Rekognition

Age 41-59
Gender Male, 53.5%
Surprised 46.2%
Sad 46.5%
Confused 45.8%
Fear 48%
Calm 46.6%
Happy 45%
Disgusted 45.1%
Angry 46.9%
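
The per-face age range, gender, and emotion estimates above are the kind of output AWS Rekognition's face detection returns when all attributes are requested. A minimal sketch, again assuming configured boto3 credentials and a hypothetical local filename:

import boto3

# Minimal sketch: face analysis with AWS Rekognition via boto3.
# Assumes configured credentials/region and a local copy of the photograph
# under a hypothetical filename.
client = boto3.client("rekognition")

with open("feininger_mills_college.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates in
# addition to the default bounding-box data.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")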

Feature analysis

Amazon

Person 99%
Tie 75.9%

Categories

Imagga

events parties 72.3%
people portraits 27%