Human Generated Data

Title

Katherine Dunham

Date

1987-1988

People

Artist: Brian Lanker, American, 1947–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.334

Copyright

© Brian Lanker

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 93.8
Person 93.8
Clothing 91.3
Apparel 91.3
Painting 85.5
Art 85.5
Face 80
Person 77.2
Costume 75.3
Wood 72.3
Leisure Activities 65.6
Floor 59.3
Tribe 57.7
Cushion 57.4

Clarifai
created on 2018-02-10

people 99.4
man 96.9
art 96.1
adult 95.1
illustration 94
one 93.1
veil 91.9
portrait 90.2
wear 86.7
woman 86.5
sit 82.7
monochrome 82.6
group 82.5
print 79.9
engraving 79
vintage 78.6
old 78.1
religion 77.5
symbol 76.9
black and white 73.8

Imagga
created on 2018-02-10

boat 21.4
black 20.1
man 19.5
product 19.3
jacket 17.7
water 16.7
creation 16.1
silhouette 15.7
fountain 14.1
symbol 13.5
gondola 13.4
people 12.8
ocean 12.4
newspaper 12.1
male 12
wrapping 11.9
person 11.8
sea 10.9
sunset 10.8
vessel 10.6
travel 10.6
beach 10.1
religion 9.8
structure 9.8
crown 9.7
business 9.1
vintage 9.1
old 9
art 8.8
calm 8.2
one 8.2
light 8
night 8
couple 7.8
portrait 7.8
adult 7.8
grunge 7.7
dark 7.5
lake 7.4
love 7.1

Face analysis

AWS Rekognition

Age 48-68
Gender Female, 99.7%
Disgusted 46.3%
Sad 17.4%
Angry 1.7%
Calm 28.5%
Surprised 2.5%
Happy 2.3%
Confused 1.3%

AWS Rekognition

Age 57-77
Gender Male, 94.4%
Happy 0.7%
Sad 87.6%
Calm 2.8%
Confused 1.5%
Disgusted 2.1%
Angry 4.1%
Surprised 1.2%

Microsoft Cognitive Services

Age 54
Gender Female

Feature analysis

Amazon

Person 93.8%
Painting 85.5%