Human Generated Data

Title

Ingres and the Lovers

Date

1986

People

Artist: Stephen Curtis, American, 1946–1996

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Andrew Wyeth Fund for the purchase of American drawings, 1986.457

Machine Generated Data

Tags

Amazon
created on 2020-04-30

Human 98.8
Person 98.8
Person 97.4
Person 94.3
Person 92.1
Drawing 91.7
Art 91.7
Nature 85.9
Face 81.2
Tree 80.6
Plant 80.6
Outdoors 80.3
Vegetation 80.1
Sitting 79.2
Person 78.3
Person 76.5
Sketch 72.7
Female 72.2
Clothing 67.9
Apparel 67.9
Painting 65.2
Shorts 62
Photography 61.1
Photo 61.1
Portrait 61.1
Girl 60.3
Building 59.7
Shelter 59.7
Countryside 59.7
Rural 59.7
Woman 57.5
Blonde 57.5
Child 57.5
Kid 57.5
Teen 57.5
Land 57.2
Doodle 55
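
The Amazon tags above are label-detection output of the kind returned by the Rekognition DetectLabels API. Below is a minimal Python sketch of how such labels might be reproduced with boto3; the file name "image.jpg" and the MinConfidence threshold of 55 are assumptions for illustration, not values taken from this record.

    # Hedged sketch: list Rekognition labels as "name score" pairs,
    # mirroring the tag/confidence listing above.
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder file name
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the record lists tags down to ~55
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')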

Clarifai
created on 2020-04-30

people 99.8
monochrome 99.2
adult 98.2
tree 98.2
woman 97.6
art 97.4
black and white 97.4
portrait 96.4
nude 96.2
two 95.9
group 94.7
couple 93.6
girl 92.5
child 92.4
man 92
sepia 88.9
print 85.9
beautiful 84.3
bench 84.2
vintage 83.9

Imagga
created on 2020-04-30

sprinkler 65.6
mechanical device 53.9
mechanism 40.1
device 29.5
fountain 23.9
structure 20.8
old 17.4
person 16.3
black 16.2
grunge 16.2
people 16.2
outdoor 16
negative 15.9
man 15.5
adult 14.2
park 14
vintage 13.2
portrait 12.9
sexy 12
body 12
frame 11.8
face 11.4
male 11.3
art 11.3
film 11.2
human 10.5
antique 10.4
tree 10.1
dirty 9.9
sport 9.9
retro 9.8
one 9.7
hair 9.5
dark 9.2
silhouette 9.1
pretty 9.1
snow 9.1
summer 9
outdoors 9
lady 8.9
sepia 8.7
forest 8.7
space 8.5
wall 8.4
attractive 8.4
texture 8.3
fashion 8.3
fence 8.3
fun 8.2
wet 8
close 8
women 7.9
photographic paper 7.9
smile 7.8
sad 7.7
winter 7.7
statue 7.6
weathered 7.6
power 7.6
city 7.5
landscape 7.4
paint 7.2
aged 7.2
lifestyle 7.2
posing 7.1
textured 7
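
The Imagga tags above are auto-tagging output (tag plus confidence). A hedged sketch against Imagga's public v2 tagging endpoint is shown below; the endpoint path, response shape, credentials, and image URL are assumptions based on Imagga's documentation and should be verified before use.

    # Hedged sketch: request tags for an image URL from Imagga's v2 API.
    # API key, secret, and image URL are placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/image.jpg"},
        auth=("<api_key>", "<api_secret>"),
    )

    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))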

Google
created on 2020-04-30

Microsoft
created on 2020-04-30

text 98.8
person 90.5
outdoor 85.9
black and white 79
clothing 68.1
tree 62.4
old 50.9
posing 35.9
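
The Microsoft tags above correspond to image tagging from Azure's Computer Vision service. A minimal sketch follows, assuming the azure-cognitiveservices-vision-computervision Python SDK; the endpoint, subscription key, and file name are placeholders rather than values from this record.

    # Hedged sketch: Azure Computer Vision image tagging from a local file.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
    )

    with open("image.jpg", "rb") as f:  # placeholder file name
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        # Confidence is returned on a 0-1 scale; scale to match the percentages above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")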

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 75.5%
Surprised 0%
Sad 37.1%
Fear 0%
Disgusted 0%
Calm 62.2%
Angry 0.1%
Confused 0.1%
Happy 0.4%

AWS Rekognition

Age 20-32
Gender Female, 52.5%
Sad 46.4%
Fear 45.1%
Surprised 45.3%
Calm 51.4%
Happy 46.5%
Angry 45.1%
Confused 45.1%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 54.6%
Angry 45%
Sad 54.1%
Confused 45.1%
Fear 45.3%
Disgusted 45%
Calm 45.4%
Happy 45%
Surprised 45%

AWS Rekognition

Age 14-26
Gender Female, 54.4%
Calm 51.6%
Happy 45.1%
Sad 47.5%
Fear 45.1%
Disgusted 45.3%
Surprised 45%
Confused 45.1%
Angry 45.2%

AWS Rekognition

Age 13-23
Gender Female, 54.2%
Happy 46.8%
Calm 51.5%
Confused 45.3%
Angry 45.1%
Disgusted 45%
Fear 45.3%
Surprised 45.5%
Sad 45.4%
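
Each AWS Rekognition block above (age range, gender, emotion scores) matches the per-face output of the DetectFaces API when full attributes are requested. A minimal boto3 sketch follows; the file name is a placeholder, not taken from this record.

    # Hedged sketch: per-face age range, gender, and emotion confidences.
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder file name
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed to get AgeRange, Gender, Emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')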

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
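
The Google Vision blocks above report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blurred) rather than numeric scores. A minimal sketch with the google-cloud-vision client is shown below; the file name is a placeholder.

    # Hedged sketch: print face likelihood names such as VERY_UNLIKELY.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)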

Feature analysis

Amazon

Person 98.8%
Painting 65.2%

Categories

Imagga

paintings art 99.1%