Human Generated Data

Title

The Art Lover

Date

1937

People

Artist: Mervin Jules, American, 1912–1994

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Donated by Jack Sando '62 and Judith Sando, in appreciation of Professor Seymour Slive, 2005.114

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Painting 99.9
Art 99.9
Human 99.2
Person 99.2
Person 95.6
Person 85.9
Person 82.3
Person 65.1
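
These label-and-confidence pairs are the shape of output Amazon Rekognition's detect_labels call returns; the Clarifai, Imagga, Google, and Microsoft lists below come from their analogous tagging endpoints. A minimal sketch using boto3, assuming AWS credentials are configured and using a hypothetical local file name for the painting's image:

    import boto3

    # Hypothetical file name; the museum image itself is not bundled with this record.
    with open("the_art_lover.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")  # region and credentials come from your AWS config
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=60,  # drop low-confidence labels
    )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

Rekognition's models are updated over time, so rerunning this today may not reproduce the 2019 scores shown above.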

Clarifai
created on 2018-04-19

people 99.9
adult 99.8
painting 99.6
one 99.3
man 98.9
two 98.6
religion 97.2
furniture 96.7
wear 96.2
art 95.8
room 95.6
elderly 95.6
home 94.7
group 94.4
family 94.3
woman 93.5
indoors 93.2
portrait 93
battle 90.2
war 89.6

Imagga
created on 2018-04-19

fireplace 72.9
old 22.3
man 21.5
jigsaw puzzle 19.3
fire screen 17.9
puzzle 16.2
screen 15.1
chest 14
person 13.8
vintage 13.2
fire 13.1
portrait 12.3
texture 11.8
aged 11.8
people 11.7
male 11.3
protective covering 11.2
grunge 11.1
room 11
box 10.9
interior 10.6
covering 10.5
game 10.4
home 10.4
face 9.9
adult 9.7
black 9.6
brown 9.6
sitting 9.4
hot 9.2
attractive 9.1
indoors 8.8
container 8.7
antique 8.6
living 8.5
food 8.5
meal 8.1
structure 8
metal 8
close 8
paper 7.8
ancient 7.8
industry 7.7
grungy 7.6
dinner 7.6
relaxation 7.5
pattern 7.5
flame 7.5
design 7.4
closeup 7.4
heat 7.4
retro 7.4
color 7.2
religion 7.2
work 7.2
art 7.1
worker 7.1
job 7.1
working 7.1

Google
created on 2018-04-19

painting 91.4
art 86.1
picture frame 77.6
wood 61.7
artwork 57.7
portrait 57.4
stock photography 55.8
impressionist 52.5

Microsoft
created on 2018-04-19

indoor 85.6
old 47.9

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 76.9%
Angry 15.3%
Confused 4.5%
Calm 26.3%
Sad 46.5%
Happy 1.1%
Surprised 5.4%
Disgusted 0.8%

AWS Rekognition

Age 35-52
Gender Male, 57.6%
Calm 3.5%
Surprised 0.8%
Happy 10.8%
Confused 1.1%
Disgusted 2%
Angry 2.1%
Sad 79.7%

AWS Rekognition

Age 45-66
Gender Female, 98.3%
Surprised 2.4%
Disgusted 1.6%
Confused 1.8%
Angry 2.6%
Happy 1.1%
Calm 1.2%
Sad 89.3%

AWS Rekognition

Age 38-59
Gender Male, 73.9%
Angry 3.5%
Confused 5.9%
Disgusted 5.7%
Surprised 4.4%
Calm 41.5%
Sad 32.1%
Happy 6.9%

AWS Rekognition

Age 35-55
Gender Male, 52.4%
Disgusted 45.1%
Surprised 45.1%
Sad 48.2%
Calm 51%
Angry 45.5%
Confused 45.1%
Happy 45%

AWS Rekognition

Age 48-68
Gender Male, 54.2%
Happy 45%
Calm 54.2%
Angry 45.1%
Sad 45.1%
Confused 45.1%
Disgusted 45.3%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.7%
Disgusted 45.2%
Angry 45.3%
Sad 47%
Confused 45.2%
Calm 50%
Surprised 45.3%
Happy 47%
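
Each block above corresponds to one entry in the FaceDetails array returned by Rekognition's detect_faces call when full attributes are requested. A minimal sketch, again assuming boto3 and the same hypothetical image file:

    import boto3

    client = boto3.client("rekognition")
    with open("the_art_lover.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get age range, gender, and emotions
        )

    # One FaceDetail per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')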

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
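
The Google Vision results above come from its face-detection annotator, which reports bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch assuming the google-cloud-vision Python client, GCP credentials in the environment, and the same hypothetical image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("the_art_lover.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enum values render as names like VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)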

Feature analysis

Amazon

Painting 99.9%
Person 99.2%

Captions

Azure OpenAI

created on 2024-01-26

This image depicts an individual sitting in profile to the right, facing a mirror that reflects an interior scene different from the setting the individual is in. The reflected scene appears to be a bustling environment with various individuals and activity. The individual's attention is directed towards the mirror, and in their hand are two oval items, which they appear to be examining closely. The painting's perspective is designed to invite the viewer to look over the individual's shoulder, into the mirror and the scene depicted within.
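
Captions like the one above can be produced by sending the image to a vision-capable Azure OpenAI chat deployment. A minimal sketch using the openai Python package; the endpoint, key, API version, and deployment name are placeholder assumptions, not values from this record:

    import base64
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
        api_key="YOUR-KEY",                                       # placeholder
        api_version="2024-02-01",
    )

    with open("the_art_lover.jpg", "rb") as f:  # hypothetical file name
        b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o",  # name of your vision-capable deployment; an assumption here
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this painting."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)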