Human Generated Data

Title

Saint Peter

Date

c. 1615

People

Artist: Peter Paul Rubens, Flemish, 1577-1640

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Lore Heinemann in memory of her husband, Dr. Rudolf J. Heinemann, 1997.31

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Art 96.2
Human 88.2
Person 88.2
Painting 78
Face 72.4

Clarifai
created on 2020-04-24

people 99.1
art 99
portrait 98.1
man 98
painting 97.4
one 96.1
concert 96
music 95.8
adult 93.7
musician 92.5
light 86.2
guitar 85.2
performance 83.6
color 83.3
face 83
god 81.6
religion 81.3
smoke 80.6
instrument 80.5
monk 80.3

Imagga
created on 2020-04-24

person 35
man 34.3
male 31.9
stage 27.5
harmonica 27
portrait 26.5
black 25.8
face 24.1
senior 23.4
free-reed instrument 22.3
people 21.2
old 20.2
platform 19.9
wind instrument 19.7
looking 18.4
musical instrument 18.2
grandfather 17.7
adult 17.6
dark 17.5
hair 16.6
microphone 14.8
elderly 14.4
human 14.2
rustic 14.1
light 14
body 13.6
men 12.9
handsome 12.5
performer 10.7
retired 10.7
night 10.7
serious 10.5
expression 10.2
mature 10.2
head 10.1
hand 9.9
skin 9.6
hands 9.6
eyes 9.5
model 9.3
musician 9.3
power 9.2
music 9
eye 8.9
fan 8.9
singer 8.8
pain 8.6
art 8.6
attractive 8.4
smoke 8.4
fractal 8.3
happy 8.1
fantasy 8.1
look 7.9
sadness 7.8
thoughtful 7.8
hope 7.7
motion 7.7
mystery 7.7
one 7.5
close 7.4
style 7.4
generated 7.4
emotion 7.4
sexy 7.2
gray 7.2
follower 7.2
women 7.1
love 7.1
beard 7

Google
created on 2020-04-24

Portrait 82.9
Painting 81.8
Artist 78
Art 76.7
Modern art 68.5
Self-portrait 66.9
Musician 65.9
Flesh 57.7
Music 55.2
Visual arts 55

Microsoft
created on 2020-04-24

person 96.5
painting 95.1
human face 93
drawing 80.9
sketch 64
text 63
man 57.2
music 54.2

Face analysis

AWS Rekognition

Age 48-66
Gender Male, 97.7%
Disgusted 7%
Sad 20.9%
Happy 1%
Calm 11.6%
Confused 3.7%
Angry 41.8%
Surprised 1.4%
Fear 12.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 88.2%
Painting 78%

Captions

Microsoft
created on 2020-04-24

a man looking at the camera 46.1%
the face of a man 46%
a close up of a man 45.9%

Azure OpenAI

created on 2024-02-09

This image depicts a person with a bared, muscular back, draped in a flowing, orange-toned fabric that catches light and shadow dramatically. The subject is holding what appears to be a flute-like musical instrument close to their body with their right hand, fingers gripping the instrument's rods visibly. The artwork is rich with earthy tones and a stark contrast that brings out the curves and edges of the subject's musculature and the flowing fabric. The background is predominantly dark, creating a sharp contrast with the illuminated portions of the subject.