Human Generated Data

Title

[Group of unidentified people sitting outside]

Date

early 1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.581.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Apparel 99.4
Clothing 99.4
Person 99.2
Human 99.2
Leisure Activities 96.6
Dance Pose 96.6
Person 95.4
Footwear 88.9
Shoe 88.9
Female 88.9
Dress 88.4
Furniture 85.4
Chair 77
Shoe 74.4
Woman 71.8
Helmet 70.5
Shoe 69.5
Performer 67
People 66.6
Girl 63.3
Dance 61.9
Photography 57.5
Photo 57.5
Face 57.5
Portrait 57.5
Costume 55.7

Clarifai
created on 2019-11-20

people 99.7
adult 98.7
wear 97.3
woman 96
one 94.9
monochrome 94.8
two 93.6
group 91.4
man 90.3
child 88.6
portrait 87.4
dress 86.6
veil 84.7
music 84.6
chair 84.2
group together 83.3
recreation 82.9
leader 80.9
outfit 79.5
girl 79.4

Imagga
created on 2019-11-20

negative 52.3
film 41.2
newspaper 38.1
photographic paper 31.8
product 29.2
creation 24.8
man 24.2
people 24
person 23.1
photographic equipment 21.2
adult 20.4
male 16.3
portrait 15.5
fashion 13.6
sexy 12.8
attractive 12.6
human 12
happy 11.9
hair 11.9
dress 11.7
art 11.3
sitting 11.2
men 11.2
music 10.8
face 10.7
fun 10.5
black 10.3
youth 10.2
lady 9.7
business 9.7
couple 9.6
women 9.5
love 9.5
smile 9.3
one 9
worker 8.9
businessman 8.8
work 8.6
bride 8.6
model 8.6
casual 8.5
modern 8.4
brass 8.3
alone 8.2
child 8
handsome 8
lifestyle 7.9
wind instrument 7.9
happiness 7.8
grunge 7.7
clothing 7.6
studio 7.6
joy 7.5
holding 7.4
style 7.4
decoration 7.4
body 7.2
celebration 7.2
romantic 7.1
indoors 7
together 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

dance 84.1
clothing 83.5
text 82.6
person 73.9
drawing 72.3

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 52.1%
Happy 45.8%
Confused 45.1%
Fear 45.1%
Surprised 45.1%
Calm 50.5%
Disgusted 45.1%
Sad 46.6%
Angry 46.6%

Feature analysis

Amazon

Person 99.2%
Shoe 88.9%
Helmet 70.5%

Captions

Microsoft

a black and white photo of a person 58.7%
an old photo of a person 58.6%
old photo of a person 58.1%