Human Generated Data

Title

Untitled

Date

1973

People

Artist: John Clem Clarke, American, born 1937

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Student Print Rental Collection, M15885

Machine Generated Data

Tags

Amazon
created on 2019-11-01

Plant 99.9
Food 98.6
Fruit 98.6
Pineapple 98.2
Human 98.1
Person 98.1
Painting 86
Art 86

Clarifai
created on 2019-11-01

people 98.4
flower 96.4
woman 96.2
fruit 94.9
adult 94.5
one 93.4
indoors 92.9
flora 92.6
food 92.6
man 92.2
nature 92
basket 91.6
health 89.8
wine 89.5
cooking 88.3
decoration 87.2
family 86.4
vegetable 85.6
portrait 85.1
pasture 84

Imagga
created on 2019-11-01

happy 31.3
person 27.1
smiling 24.6
people 23.4
smile 22.8
attractive 21
couple 20.9
portrait 20.7
groom 20.3
adult 20.3
love 19.7
man 18.8
home 18.3
cheerful 17.9
bouquet 16.9
pretty 16.1
lifestyle 15.9
happiness 15.7
bride 15.3
male 15.1
dress 14.5
holding 14
wedding 13.8
sexy 13.7
cute 13.6
celebration 13.6
lady 13
romance 12.5
flower 12.3
face 12.1
day 11.8
indoors 11.4
fruit 11.4
flowers 11.3
looking 11.2
sitting 11.2
joy 10.9
studio 10.6
married 10.5
summer 10.3
holiday 10
romantic 9.8
hair 9.5
marriage 9.5
women 9.5
healthy 9.4
decoration 9.4
child 9.3
camera 9.2
gorgeous 9.1
fashion 9
blond 9
together 8.8
world 8.6
gift 8.6
container 8.5
rose 8.4
food 8.4
evening 8.4
tree 8.3
alone 8.2
one 8.2
brunette 7.8
standing 7.8
bridal 7.8
eyes 7.7
health 7.6
two 7.6
hobby 7.6
elegance 7.6
togetherness 7.6
house 7.5
fun 7.5
tradition 7.4
teenager 7.3
sensual 7.3
kitchen 7.2
box 7.1
tray 7.1
interior 7.1

Google
created on 2019-11-01

Still life 85.1
Painting 73.3
Art 69.5
Still life photography 65.1
Photography 62.4
Grape 59.7
Plant 58.5
Vitis 50.1

Microsoft
created on 2019-11-01

fruit 99.1
food 95.3
vegetable 94.7
text 91
painting 77.8
natural foods 69.3
local food 64.1
apple 58.7
whole food 53.3
different 37
picture frame 10.3
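The five tag lists above share a common shape, a label plus a confidence score on a 0–100 scale, but each service reports its own vocabulary and casing. A minimal sketch of merging them into one cross-service view, assuming the tags have already been parsed into (label, score) pairs (the `merge_tags` helper and its input layout are illustrative, not part of any service's API):

```python
from collections import defaultdict

def merge_tags(services):
    """Merge per-service (label, confidence) lists into one view.

    `services` maps a service name to a list of (label, score) pairs,
    with scores assumed to be on a common 0-100 scale.
    Labels are lowercased so "Fruit" and "fruit" collapse together.
    """
    merged = defaultdict(dict)
    for service, tags in services.items():
        for label, score in tags:
            merged[label.lower()][service] = score
    return dict(merged)

# Sample values drawn from the tag lists above.
tags = merge_tags({
    "Amazon": [("Fruit", 98.6), ("Painting", 86.0)],
    "Clarifai": [("fruit", 94.9)],
    "Google": [("Painting", 73.3)],
})
# tags["fruit"] -> {"Amazon": 98.6, "Clarifai": 94.9}
```

Grouping by lowercased label makes it easy to see where the services agree (here, all three detect fruit or a painting) and where only one service fires.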

Face analysis


AWS Rekognition

Age 22-34
Gender Male, 99.5%
Disgusted 3.1%
Angry 8.3%
Sad 64.3%
Surprised 0.5%
Confused 2.8%
Fear 1%
Happy 0.6%
Calm 19.2%

AWS Rekognition

Age 45-63
Gender Female, 50.2%
Confused 49.5%
Fear 49.7%
Surprised 49.5%
Disgusted 49.5%
Sad 50%
Calm 49.5%
Angry 49.7%
Happy 49.5%

AWS Rekognition

Age 32-48
Gender Female, 50%
Angry 49.5%
Fear 50.4%
Calm 49.5%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Sad 49.6%
Happy 49.5%

AWS Rekognition

Age 30-46
Gender Male, 50%
Calm 49.6%
Fear 49.5%
Sad 50.2%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
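Each AWS Rekognition face above carries a full emotion distribution rather than a single label; the first face reads as predominantly Sad (64.3%), while the other three are near-uniform at roughly 49–50% per emotion, a sign of low-confidence detections. A small sketch of reducing such a distribution to its dominant emotion (the dict layout is an assumption, not Rekognition's exact response format):

```python
def dominant_emotion(scores):
    """Return the highest-confidence emotion from a dict of scores."""
    return max(scores, key=scores.get)

# First detected face from the analysis above.
face = {"Disgusted": 3.1, "Angry": 8.3, "Sad": 64.3, "Surprised": 0.5,
        "Confused": 2.8, "Fear": 1.0, "Happy": 0.6, "Calm": 19.2}
# dominant_emotion(face) -> "Sad"
```

For the near-uniform faces, the argmax is close to arbitrary, so in practice one might also check that the top score clears a margin over the runner-up before trusting the label.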

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%
Painting 86%

Captions

Microsoft

a person holding a flower 28.3%
a person holding a flower 28.2%
an old photo of a person 28.1%

Text analysis

Amazon

MC.Cllue