Human Generated Data

Title

At the Cradle

Date

c. 1931

People

Artist: Jeanne Mammen, German, 1890–1976

Classification

Drawings

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Mr. and Mrs. Edward Ruppert, BR73.62

Copyright

© Mammen-Gesellschaft e. V. / Artists Rights Society (ARS), New York / VG Bild-Kunst, Germany

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Art 98.8
Painting 98.8
Person 98.4
Human 98.4
Person 97.6
Female 71.2
Sketch 65.1
Drawing 65.1
Photography 62.8
Face 62.8
Photo 62.8
Portrait 62.8
People 62
Apparel 59.7
Clothing 59.7
Girl 56.8

Clarifai
created on 2018-02-10

people 100
two 99.7
portrait 99.2
adult 99.2
man 97.4
affection 96.2
one 95.8
leader 95.2
offspring 94.6
facial expression 94.4
group 94.2
baby 93.1
son 91.2
wear 90.1
three 88.4
administration 87
interaction 86.5
love 85.8
woman 84
child 81.5

Imagga
created on 2018-02-10

groom 32.9
doll 29.7
plaything 24.1
statue 23.3
portrait 22.7
sculpture 19.5
sketch 19.2
face 19.2
religion 17.9
sexy 17.7
art 16.5
person 15.9
ancient 15.6
people 14.5
adult 14.2
architecture 14.1
drawing 13.9
old 13.2
love 12.6
religious 12.3
representation 12.2
monument 12.1
fashion 12.1
pretty 11.9
decoration 11.8
model 11.7
attractive 11.2
body 11.2
culture 11.1
skin 11
dress 10.8
history 10.7
temple 10.6
hair 10.3
stone 10.2
church 10.2
closeup 10.1
makeup 10.1
figure 10.1
erotic 10
currency 9.9
design 9.8
detail 9.7
historical 9.4
cute 9.3
historic 9.2
travel 9.2
human 9
man 9
lady 8.9
child 8.8
god 8.6
money 8.5
head 8.4
sensual 8.2
style 8.2
negative 8.1
mother 8.1
bride 8.1
romantic 8
carving 8
women 7.9
antique 7.9
brunette 7.8
holy 7.7
spiritual 7.7
kin 7.7
blond 7.5
vintage 7.4
tourism 7.4
banking 7.4
cash 7.3
male 7.3
costume 7.2
posing 7.1

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 98.8
posing 97.8
woman 95.8
old 86.4
black 73.1
white 73.1
vintage 30.8

Face analysis

AWS Rekognition

Age 16-27
Gender Male, 53.2%
Happy 1.4%
Angry 5.3%
Calm 64%
Confused 5.3%
Disgusted 14.4%
Sad 6.2%
Surprised 3.4%

AWS Rekognition

Age 26-44
Gender Female, 87.1%
Happy 11.2%
Angry 7.3%
Disgusted 10.6%
Sad 9.5%
Surprised 11.4%
Confused 14.1%
Calm 35.9%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 98.8%
Person 98.4%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 96.1%
a vintage photo of a group of people posing for a picture 96%
a group of people posing for a photo 95.9%

Text analysis

Amazon

Maunen