Human Generated Data

Title

The Laundress

Date

c. 1761

People

Artist: Jean-Baptiste Greuze, French, 1725 - 1805

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Charles E. Dunlap, 1957.181

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Art 100
Painting 100
Adult 99.1
Female 99.1
Person 99.1
Woman 99.1
Clothing 96.2
Hat 96.2
Face 92.1
Head 92.1
Pottery 87
Footwear 70.2
Shoe 70.2
Shoe 67.4
Bonnet 65.2
Jug 56.3
Cookware 55.6
Pot 55.6
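
The Amazon tags above pair each label with a confidence score on a 0-100 scale. As an illustration only (not the museum's actual pipeline, and with placeholder bucket and file names), a minimal Python sketch of the kind of Amazon Rekognition call that produces such a label list:

import boto3

rekognition = boto3.client("rekognition")

# Placeholder image location; the museum's storage details are not part of this record.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "greuze-laundress.jpg"}},
    MaxLabels=20,
    MinConfidence=55,
)

# Each result carries a label name and a 0-100 confidence,
# matching the "tag  score" pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')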

Clarifai
created on 2023-11-01

art 99.5
people 99.2
woman 98.7
one 98.1
painting 97.5
gown (clothing) 96.3
adult 96.2
portrait 95
wear 94
Renaissance 93
seat 93
dress 91.8
religion 91.5
girl 91.3
veil 91.2
furniture 91.2
print 86.2
god 85.9
sit 85.6
baby 85

Imagga
created on 2018-12-18

fashion 25.6
costume 24.3
people 22.3
person 22.2
lady 21.9
adult 21.3
attractive 21
face 19.2
religion 18.8
portrait 18.8
interior 17.7
pretty 16.1
model 15.6
sexy 14.5
art 13
church 13
hair 12.7
dress 12.6
old 12.5
happy 12.5
traditional 12.5
blond 12.2
clothing 12.2
elegant 12
style 11.9
vintage 11.6
god 11.5
man 11.4
human 11.2
religious 11.2
culture 11.1
performer 10.6
sculpture 10.2
black 10.2
elegance 10.1
pray 9.7
spiritual 9.6
faith 9.6
male 9.5
cute 9.3
casual 9.3
child 9.2
sensuality 9.1
antique 8.8
hat 8.8
body 8.8
home 8.8
device 8.7
worship 8.7
holy 8.7
covering 8.6
figure 8.5
fun 8.2
gold 8.2
posing 8
lifestyle 7.9
love 7.9
holiday 7.9
look 7.9
couple 7.8
smile 7.8
happiness 7.8
color 7.8
sitting 7.7
couch 7.7
prayer 7.7
saint 7.7
gramophone 7.7
hairstyle 7.6
mask 7.6
cap 7.5
comedian 7.4
furniture 7.4
sensual 7.3

Google
created on 2018-12-18

Microsoft
created on 2018-12-18

person 13.5
art 12.5
painting 5.4
museum 5
vacation 3.7
clothing 3.4

Color Analysis

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 65.4%
Calm 98.9%
Surprised 6.4%
Fear 6%
Sad 2.2%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
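
The age range, gender, and emotion scores above are the kind of output Amazon Rekognition's face detection returns. A minimal sketch follows, assuming a placeholder image location and the standard boto3 client; it is illustrative only, not the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "greuze-laundress.jpg"}},
    Attributes=["ALL"],  # "ALL" is required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # CALM, SURPRISED, FEAR, SAD, etc.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')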

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
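
The Google Vision entries report likelihood categories rather than numeric scores. A minimal sketch using the google-cloud-vision Python client, with a placeholder file path and not necessarily the way this record was generated:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("greuze-laundress.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums such as VERY_UNLIKELY correspond to the
    # "Very unlikely" entries listed above.
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)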

Feature analysis

Amazon

Adult 99.1%
Female 99.1%
Person 99.1%
Woman 99.1%
Shoe 70.2%

Categories

Imagga

events parties 99.4%

Captions

Azure OpenAI

created on 2024-01-26

This image portrays a person in a historical setting, engaged in domestic work. The individual is seated, focusing on the task at hand with a massive basin placed in front of them. Clothing details include a garment with a striped pattern and muted colors. The background reveals a rustic interior with wooden furniture, including a cabinet and a shelf where kitchen utensils and textiles are stored or displayed, hinting at a kitchen or a space where household chores are performed. The lighting and fine details suggest this scene is from a painting, capturing a moment of everyday life from a bygone era.
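
The caption above is attributed to Azure OpenAI. As a hedged sketch only, a vision-capable chat deployment can be asked for such a description roughly as follows; the endpoint, deployment name, image URL, and prompt are placeholders, and the museum's actual configuration is not documented here.

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder endpoint
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder deployment name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this painting in a short paragraph."},
            {"type": "image_url", "image_url": {"url": "https://example.org/greuze-laundress.jpg"}},
        ],
    }],
)

print(response.choices[0].message.content)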