Human Generated Data

Title

[Shop window mannequins]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.611.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 99.4
Person 99.4
Person 98.6
Clothing 98.6
Apparel 98.6
Person 97.6
Person 94.6
Person 79.8
People 70.8
Female 68.4
Coat 62.7
Photo 60.8
Photography 60.8
Face 60.5
Portrait 60.5
Hat 60
Nurse 59.5
Overcoat 58.2
Performer 57.8
Woman 55.4
Sailor Suit 55.3
Shorts 55.1
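
The label/score pairs above are the kind of output AWS Rekognition's DetectLabels operation returns: each label with a confidence percentage. A minimal sketch of how such tags could be generated, assuming boto3, configured AWS credentials, and a hypothetical local copy of the photograph named mannequins.jpg:

```python
# Minimal sketch: DetectLabels-style tags via boto3 (AWS Rekognition).
# Assumes AWS credentials are configured; "mannequins.jpg" is a
# hypothetical local filename for the photograph.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("mannequins.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # the lowest score listed above is 55.1
    )

# Print "Label confidence" pairs, matching the format of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```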

Clarifai
created on 2019-05-30

people 100
group 99.4
adult 99
group together 98.6
administration 97.2
wear 96.6
leader 96
man 95.7
several 94.8
four 92.7
three 92.6
woman 91.3
many 91
outfit 90.8
five 90
watercraft 89.3
military 88.9
music 85.3
child 84.6
war 84.5

Imagga
created on 2019-05-30

man 28.9
person 26.7
people 25.1
adult 23.4
male 22.8
black 18.9
world 17.1
style 16.3
fashion 15.8
dress 14.5
performer 12.9
attractive 12.6
clothing 12.5
happy 12.5
portrait 12.3
couple 12.2
business 12.1
businessman 11.5
musical instrument 11.4
sexy 11.2
women 11.1
city 10.8
teacher 10.7
family 10.7
posing 10.7
urban 10.5
men 10.3
happiness 10.2
silhouette 9.9
group 9.7
body 9.6
boy 9.6
love 9.5
youth 9.4
casual 9.3
elegance 9.2
dark 9.2
modern 9.1
pose 9.1
one 9
dancer 8.9
brass 8.7
window 8.6
wall 8.5
clothes 8.4
shop 8.4
room 8.4
professional 8.3
human 8.2
danger 8.2
dance 8
lifestyle 7.9
model 7.8
corporate 7.7
old 7.7
grunge 7.7
wind instrument 7.6
fun 7.5
vintage 7.4
child 7.4
street 7.4
device 7.3
sensuality 7.3
looking 7.2
romance 7.1
cool 7.1
interior 7.1
cornet 7

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

clothing 96.7
person 93
outdoor 91.2
old 86
standing 83.9
human face 81
woman 79.1
people 68.9
smile 68
dress 64.4
white 62.8
black and white 57.8
posing 44.9
vintage 26.9

Color Analysis

Face analysis

AWS Rekognition

Age 14-23
Gender Male, 59.1%
Disgusted 3%
Sad 4.6%
Confused 40.2%
Happy 13.2%
Surprised 10.4%
Angry 24.6%
Calm 4%

AWS Rekognition

Age 23-38
Gender Female, 53.2%
Confused 45.3%
Happy 45.4%
Angry 45.2%
Surprised 45.3%
Calm 53%
Disgusted 45.1%
Sad 45.7%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Happy 45.5%
Confused 45.2%
Disgusted 45.1%
Calm 52.6%
Sad 45.7%
Angry 45.4%
Surprised 45.5%
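
The age range, gender, and per-emotion confidences above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, assuming boto3, configured credentials, and the same hypothetical mannequins.jpg file:

```python
# Minimal sketch: face attributes in the style of the AWS Rekognition
# results above. "mannequins.jpg" is a hypothetical local filename.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("mannequins.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```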

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
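
Unlike Rekognition's percentage scores, Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY), which is why the values above read "Very unlikely" and "Unlikely". A minimal sketch using the google-cloud-vision client, again with a hypothetical mannequins.jpg:

```python
# Minimal sketch: likelihood-bucket face attributes from the Google Cloud
# Vision API (google-cloud-vision). Assumes application default credentials
# are configured; "mannequins.jpg" is a hypothetical local filename.
from google.cloud import vision

# Likelihood enum values 0-5 map to these names.
LIKELIHOOD = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
              "POSSIBLE", "LIKELY", "VERY_LIKELY")

client = vision.ImageAnnotatorClient()

with open("mannequins.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])
```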

Feature analysis

Amazon

Person 99.4%

Categories