Human Generated Data

Title

The Concert

Date

17th century

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Francis H. Burr Memorial, Alpheus Hyatt Purchasing, Louise Haskell Daly and Richard Norton Funds and Gifts for Special Uses Fund, 1970.143

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 97.8
Person 97.8
Person 97.8
Painting 95.3
Art 95.3
Person 92.7
Person 88.1
Person 77.7
Person 74.7
Musician 70.1
Musical Instrument 70.1
Person 65.2
Leisure Activities 62.5
Photography 61.5
Photo 61.5
Face 60.5
Portrait 60.5
Lute 60
Drawing 59.3
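
Each entry in the Amazon list above pairs a tag with a confidence score. As a minimal sketch (the label/score pairs are copied from this record; the 90.0 cutoff is an arbitrary example, not part of the source), such a list can be filtered client-side by confidence:

```python
# Sketch: filter Rekognition-style labels by a confidence threshold.
# The label/score pairs are copied from the Amazon list above;
# the 90.0 cutoff is an arbitrary example value.
labels = [
    ("Human", 97.8), ("Person", 97.8), ("Painting", 95.3),
    ("Art", 95.3), ("Musician", 70.1), ("Lute", 60.0),
]

def confident_labels(pairs, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Human', 'Person', 'Painting', 'Art']
```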

Clarifai
created on 2020-04-24

people 100
adult 99.1
group 98.9
print 98.2
man 97.6
art 97.5
many 95.6
woman 95.5
engraving 92.6
painting 91.4
administration 91.2
war 89.6
military 89.3
soldier 88.4
illustration 86.9
leader 86
veil 83.5
music 82.8
portrait 82.3
wear 81.3

Imagga
created on 2020-04-24

people 21.7
man 21.6
person 20.4
statue 19.1
kin 17.7
old 16
art 15.2
adult 15
fan 14.4
male 14.2
portrait 14.2
sculpture 13.9
ancient 13.8
dark 13.4
love 12.6
face 12.1
spectator 11.7
religion 11.6
couple 11.3
fashion 11.3
one 11.2
follower 10.8
world 10.8
sexy 10.4
architecture 10.1
room 10.1
hair 9.5
culture 9.4
happy 9.4
vintage 9.2
dress 9
symbol 8.7
god 8.6
two 8.5
religious 8.4
fun 8.2
sensual 8.2
history 8
stone 8
romantic 8
together 7.9
happiness 7.8
black 7.8
antique 7.8
memorial 7.7
elegant 7.7
spiritual 7.7
groom 7.5
passion 7.5
human 7.5
historic 7.3
group 7.2
body 7.2
child 7.1
night 7.1

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 98.1
person 96
human face 92.2
painting 82
man 79.3
black and white 75.9
clothing 74.2
drawing 71.8

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 91.1%
Happy 35.7%
Calm 24%
Angry 10.6%
Disgusted 0.9%
Fear 3.1%
Sad 4.2%
Confused 4.3%
Surprised 17.1%

AWS Rekognition

Age 20-32
Gender Male, 98.8%
Fear 7.9%
Disgusted 1.9%
Confused 6.6%
Angry 3.3%
Happy 4.4%
Calm 21.7%
Surprised 19.6%
Sad 34.6%

AWS Rekognition

Age 33-49
Gender Male, 96.8%
Surprised 2.2%
Angry 11.2%
Disgusted 0.8%
Calm 70.9%
Fear 1%
Confused 1.1%
Happy 4.8%
Sad 8.1%

AWS Rekognition

Age 22-34
Gender Female, 83%
Fear 1%
Disgusted 1.2%
Sad 22%
Angry 1.2%
Calm 50.8%
Confused 1.1%
Happy 21%
Surprised 1.7%

AWS Rekognition

Age 37-55
Gender Female, 59.7%
Calm 0%
Confused 0.1%
Happy 0%
Fear 98.8%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Surprised 0.8%

AWS Rekognition

Age 14-26
Gender Female, 52.6%
Calm 17.8%
Confused 0.1%
Angry 1.4%
Happy 0%
Fear 0.2%
Disgusted 0%
Surprised 0%
Sad 80.5%

AWS Rekognition

Age 22-34
Gender Male, 63.1%
Surprised 21.5%
Confused 1.1%
Happy 0.2%
Angry 0.4%
Calm 1.2%
Fear 72.9%
Disgusted 0.2%
Sad 2.4%

AWS Rekognition

Age 18-30
Gender Male, 85.8%
Disgusted 0.1%
Fear 0.2%
Surprised 0.6%
Happy 17.6%
Angry 0.2%
Sad 4.8%
Calm 76.2%
Confused 0.3%
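
Each AWS Rekognition block above reports an age range, a gender estimate, and a percentage for every emotion. A hedged sketch of reducing one block to its dominant emotion (percentages copied from the first face above; the flat-dict shape is a simplification, not the raw API response format):

```python
# Sketch: pick the dominant emotion from one face-analysis block.
# Percentages are copied from the first AWS Rekognition face above;
# the flat-dict shape simplifies the service's actual response.
face = {
    "Happy": 35.7, "Calm": 24.0, "Angry": 10.6, "Disgusted": 0.9,
    "Fear": 3.1, "Sad": 4.2, "Confused": 4.3, "Surprised": 17.1,
}

def dominant_emotion(emotions):
    """Return (name, confidence) for the highest-scoring emotion."""
    name = max(emotions, key=emotions.get)
    return name, emotions[name]

print(dominant_emotion(face))  # ('Happy', 35.7)
```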

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
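
Unlike the percentage-based services, Google Vision reports face attributes as likelihood buckets. A sketch of mapping those buckets onto an ordinal scale for comparison (the five bucket names follow the Vision API's likelihood levels; the integer encoding is an arbitrary choice, and the attribute values are copied from the fourth Google Vision face above):

```python
# Sketch: encode Google Vision likelihood buckets as ordinals.
# Bucket names follow the Vision API's likelihood levels;
# the integer values are an arbitrary encoding for comparison.
LIKELIHOOD = {
    "Very unlikely": 0, "Unlikely": 1, "Possible": 2,
    "Likely": 3, "Very likely": 4,
}

# Attribute buckets copied from the fourth Google Vision face above.
face = {
    "Surprise": "Very unlikely", "Anger": "Very unlikely",
    "Sorrow": "Very unlikely", "Joy": "Very unlikely",
    "Headwear": "Unlikely", "Blurred": "Very unlikely",
}

ordinal = {attr: LIKELIHOOD[bucket] for attr, bucket in face.items()}
print(ordinal["Headwear"])  # 1
```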

Feature analysis

Amazon

Person 97.8%
Painting 95.3%

Categories

Captions

Microsoft
created on 2020-04-24

a group of people looking at a book 31.3%

Azure OpenAI

created on 2024-02-07

This image is a grayscale painting depicting a group of individuals engaged in a musical performance. A person in elaborate clothing, adorned with ruffles at the neck and wrist, is visible playing a stringed instrument, while another is seated at a keyboard instrument, likely a harpsichord, with sheet music before them. The scene is rendered with a focus on fine details and contrasts between light and dark, characteristic of chiaroscuro painting techniques. Additionally, there are several rectangular shapes obscuring parts of the composition. These shapes are uniformly filled with a solid color and appear to be added elements, not part of the original artwork.

Anthropic Claude

created on 2024-03-30

The image depicts a dramatic, black and white scene of a group of people in what appears to be a religious or academic setting. At the center is a bearded man, surrounded by others who seem to be engaged in deep discussion or debate. Some of the figures have pained or anguished expressions, while others appear focused on reading or writing. The overall atmosphere conveys a sense of intense intellectual and emotional engagement. The image has a strong chiaroscuro effect, with dramatic contrasts of light and shadow that add to the sense of drama and tension in the scene.

Text analysis

Google

itglay