Human Generated Data

Title

Dives and Lazarus

Date

17th century

People

Artist: Luca Giordano, Italian 1634 - 1705

Previous attribution: Luca Giordano, Italian 1634 - 1705

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of David E. Rust, 1976.100

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Art 99.9
Painting 99.9
Person 99.5
Human 99.5
Person 98.9
Person 98.4
Person 98
Person 97.9
Person 97.4
Person 96.9
Person 96.8
Person 95.2
Person 93.8
Person 93.4
Person 87.5
Person 86.9
Person 65.1

Clarifai
created on 2018-02-10

people 99.9
group 99.8
adult 99.5
religion 98.4
woman 98.1
furniture 97.8
wear 97.5
man 96.8
many 96.3
seat 96
child 95.1
room 94.7
painting 94.1
reclining 93.4
recreation 93.2
art 93
position 92.6
education 88.5
sit 86.6
son 86.4

Imagga
created on 2018-02-10

brass 62.1
man 26.2
person 24.5
adult 22.4
people 18.9
teacher 18.1
male 17
religion 16.1
portrait 14.9
baritone 14.6
faith 14.3
sax 14.2
religious 14
old 13.9
art 13.2
antique 12.1
black 11.4
lady 11.4
church 11.1
dress 10.8
belief 10.7
room 10.6
interior 10.6
holy 10.6
attractive 10.5
couple 10.4
sexy 10.4
sitting 10.3
color 10
fashion 9.8
prayer 9.6
spiritual 9.6
god 9.6
statue 9.5
men 9.4
happiness 9.4
model 9.3
educator 9.3
music 9.1
vintage 9.1
posing 8.9
bass 8.8
pray 8.7
boy 8.7
love 8.7
lifestyle 8.7
spirituality 8.6
luxury 8.6
culture 8.5
expression 8.5
professional 8.5
historical 8.5
happy 8.1
looking 8
home 8
sculpture 7.9
catholic 7.9
together 7.9
cornet 7.9
bible 7.8
golden 7.7
elegant 7.7
saint 7.7
passion 7.5
traditional 7.5
style 7.4
gold 7.4
performer 7.4
historic 7.3
decoration 7.3
student 7.2
smile 7.1

Google
created on 2018-02-10

art 83.9
painting 82.1
disciple 60.2

Microsoft
created on 2018-02-10

person 99.4
people 64.8
group 64.4
crowd 0.6

Face analysis

AWS Rekognition

Age 4-7
Gender Male, 77.7%
Calm 3.7%
Happy 0.3%
Disgusted 0.4%
Confused 0.4%
Angry 0.8%
Sad 94.2%
Surprised 0.2%

AWS Rekognition

Age 35-52
Gender Male, 78.8%
Disgusted 1.1%
Sad 2.3%
Happy 1.1%
Surprised 1.7%
Calm 91.7%
Angry 1.1%
Confused 1%

AWS Rekognition

Age 11-18
Gender Female, 98.6%
Happy 14.1%
Sad 56.7%
Disgusted 12.2%
Surprised 4.3%
Calm 3.5%
Confused 2.8%
Angry 6.5%

AWS Rekognition

Age 23-38
Gender Female, 98.4%
Surprised 3.3%
Happy 3.7%
Sad 7.5%
Disgusted 6.4%
Confused 3.2%
Calm 46.6%
Angry 29.3%

AWS Rekognition

Age 17-27
Gender Female, 84.5%
Happy 1.3%
Surprised 5.2%
Sad 50.9%
Angry 4.1%
Disgusted 2.7%
Confused 2.8%
Calm 33%

AWS Rekognition

Age 4-9
Gender Female, 66.8%
Angry 21.6%
Calm 47.5%
Confused 2.4%
Surprised 3.2%
Sad 9.3%
Disgusted 12.1%
Happy 3.9%

AWS Rekognition

Age 19-36
Gender Female, 54%
Happy 45.2%
Sad 47.3%
Disgusted 45.3%
Surprised 45.1%
Calm 50.7%
Confused 45.2%
Angry 46.2%

AWS Rekognition

Age 2-5
Gender Female, 84.5%
Surprised 1.1%
Disgusted 89.5%
Confused 1%
Happy 2.2%
Calm 1.3%
Sad 2.5%
Angry 2.3%

AWS Rekognition

Age 26-43
Gender Female, 55.3%
Happy 20.5%
Angry 4.9%
Disgusted 3.9%
Sad 14.5%
Surprised 14.6%
Calm 28.1%
Confused 13.5%

AWS Rekognition

Age 35-52
Gender Female, 52.9%
Happy 45.2%
Calm 46.2%
Confused 45.4%
Angry 45.4%
Sad 51.9%
Disgusted 45.5%
Surprised 45.3%

AWS Rekognition

Age 1-5
Gender Female, 71.2%
Happy 1.6%
Surprised 3.7%
Angry 18.8%
Calm 49.9%
Disgusted 8.1%
Sad 14.3%
Confused 3.6%

AWS Rekognition

Age 27-44
Gender Female, 50.3%
Happy 49.6%
Surprised 49.6%
Angry 49.6%
Calm 49.9%
Disgusted 49.8%
Sad 49.6%
Confused 49.5%

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.9%
Person 99.5%

Captions

Meta Llama

Created on 2024-11-22

The image depicts a painting of a scene from ancient Greece, where a group of people are gathered around a table. The scene is set in a dimly lit room with a large table in the center, covered with a white cloth. Several people are seated around the table, while others stand nearby. In the foreground, a woman sits on a chair, wearing a white dress and holding a plate of food. To her right, a man stands, wearing a red tunic and holding a plate of food as well. On the opposite side of the table, another man sits, wearing a green tunic and holding a plate of food. In the background, several other people are visible, including a group of men playing musical instruments and a woman standing near the back of the room. The overall atmosphere of the painting appears to be one of celebration and festivity, with the people gathered around the table enjoying food and drink together. The painting is likely a depiction of an ancient Greek banquet or feast, where people would gather to socialize and celebrate special occasions. The use of muted colors and soft lighting creates a sense of warmth and intimacy, drawing the viewer into the scene and inviting them to participate in the celebration.