Human Generated Data

Title

Untitled (Madeleine Escudier Lerolle and Henry Lerolle)

Date

1895-1896

People

Artist: Hilaire-Germain-Edgar Degas, French, 1834 - 1917

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, P2004.60

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2022-05-28

Nature 97.2
Person 95
Human 95
Outdoors 93.5
Face 81.8
Crowd 78.5
Person 73.2
Text 69.3
People 61
Countryside 60.8
Building 58.8
Rural 58.8
Shack 58.8
Hut 58.8
Pedestrian 55.8
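
The label-and-confidence pairs above are the kind of output AWS Rekognition's DetectLabels API returns. Below is a minimal sketch of that call via boto3; the file name and region are placeholder assumptions, and Rekognition can also read images directly from S3.

    import boto3

    # Minimal DetectLabels sketch (hypothetical file name and region).
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("degas_lerolle.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55.0,  # the list above bottoms out near 55
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")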

Clarifai
created on 2023-10-30

people 99.8
portrait 98.6
adult 97.9
street 97
vintage 94.9
woman 94.8
child 93.4
art 92.4
wear 91.3
man 88.9
one 88.6
girl 88.2
vehicle 88
two 84.2
retro 83.5
old 83.2
boy 82.2
rain 79.3
analogue 77.4
musician 77
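
A comparable Clarifai request can be made over its v2 REST API (Imagga's tagging endpoint, used for the list below, follows a similar REST pattern). The API key, image URL, and the public general-model ID here are assumptions; Clarifai reports scores in 0-1, so they are scaled to match the 0-100 values shown on this page.

    import requests

    # Hedged sketch of a Clarifai v2 "outputs" call (placeholder key/URL).
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
        timeout=30,
    )
    resp.raise_for_status()
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")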

Imagga
created on 2022-05-28

dark 23.4
light 17.4
blackboard 15.4
dirty 15.4
grunge 15.3
old 15.3
man 14.8
fantasy 14.4
mystery 13.4
art 13.1
industrial 12.7
landscape 12.6
black 12.6
wall 12.4
person 11.8
horror 11.6
forest 11.3
scene 11.3
people 11.2
danger 10.9
vintage 10.8
night 10.7
grungy 10.4
texture 10.4
water 10
fog 9.7
factory 9.6
industry 9.4
smoke 9.3
protection 9.1
environment 9
cool 8.9
evil 8.8
symbol 8.8
season 8.6
adult 8.4
sky 8.3
sun 8
trees 8
male 8
country 7.9
mask 7.8
scary 7.7
death 7.7
tree 7.7
canvas 7.6
fashion 7.5
structure 7.5
park 7.5
car 7.5
motor vehicle 7.5
retro 7.4
safety 7.4
ecology 7.3
peaceful 7.3
graphic 7.3
aged 7.2
color 7.2

Google
created on 2022-05-28

Microsoft
created on 2022-05-28

human face 98.1
old 94.5
outdoor 93.3
text 92.9
person 90.7
black and white 84.6
clothing 70.6
vintage 30.8

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 100%
Calm 94.4%
Surprised 6.3%
Fear 5.9%
Sad 4%
Confused 0.5%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
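
The age range, gender, and emotion estimates above match the shape of AWS Rekognition's DetectFaces response when all attributes are requested. A hedged sketch, assuming a local copy of the image:

    import boto3

    # DetectFaces sketch; Attributes=["ALL"] adds age, gender, and emotions.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("degas_lerolle.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")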

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
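
The two Google Vision blocks above (one per detected face) report likelihood buckets rather than numeric scores. Below is a sketch of the underlying face-detection call with the google-cloud-vision client library; the image URL is a placeholder and credentials are read from the environment.

    from google.cloud import vision

    # Face detection sketch; prints the same likelihood fields shown above.
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.com/photo.jpg"))
    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)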

Feature analysis

Amazon

Person 95%

Captions

Microsoft
created on 2022-05-28

a vintage photo of a train 82.9%
an old photo of a train 82%
old photo of a train 78.7%
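
Captions of this form come from Azure Computer Vision's "describe" operation (the Microsoft tag list earlier on this page comes from the analogous "tag" operation). A hedged REST sketch; the endpoint, key, and image URL are placeholders, and the 0-1 confidences are scaled to percent:

    import requests

    # Azure Computer Vision v3.2 describe sketch (placeholder endpoint/key).
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        params={"maxCandidates": "3"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
        timeout=30,
    )
    resp.raise_for_status()
    for caption in resp.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")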

Azure OpenAI

created on 2024-01-26

This is a grayscale photograph featuring an interior setting with a vintage or antique appearance. The most prominent feature is an individual dressed in dark attire with their right hand resting on what appears to be a cane. This person seems to be seated on a settee or a bench with ornate patterns. Behind the individual, there is a wall decorated with a floral or foliate wallpaper design, adding to the antique atmosphere of the room. To the left, there is a source of bright light, possibly a window, and above, a framed piece of artwork or a mirror is mounted on the wall. The overall quality of the image is aged, with specks and marks that indicate it is an old photograph.
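
A description like the one above can be requested from an Azure OpenAI chat deployment that accepts image input. A minimal sketch; the endpoint, key, API version, deployment name, and image URL are all placeholder assumptions:

    from openai import AzureOpenAI

    # Azure OpenAI vision-chat sketch (all identifiers are placeholders).
    client = AzureOpenAI(
        azure_endpoint="https://YOUR_RESOURCE.openai.azure.com",
        api_key="YOUR_KEY",
        api_version="2024-02-01",
    )
    response = client.chat.completions.create(
        model="YOUR_VISION_DEPLOYMENT",  # hypothetical deployment name
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this photograph."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }],
    )
    print(response.choices[0].message.content)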

Anthropic Claude

created on 2024-03-29

The image appears to be a black and white photograph of two individuals seated in what looks like a theater or stage setting. The individuals are facing the camera, with one person wearing a patterned garment and the other sitting close by. The background includes ornate architectural details and fabrics, suggesting a formal or ornate setting. The image has a vintage, historical quality, indicating it was likely taken some time in the past.
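
The equivalent Anthropic Messages API call passes the image as a base64 content block. A sketch, assuming a local copy of the image and a Claude 3 model current as of the date above:

    import base64

    import anthropic

    # Messages API sketch; the model name and file name are assumptions.
    with open("degas_lerolle.jpg", "rb") as f:
        image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    message = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64", "media_type": "image/jpeg", "data": image_b64}},
                {"type": "text", "text": "Describe this photograph."},
            ],
        }],
    )
    print(message.content[0].text)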

Text analysis

Google

LKW K
LKW
K
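
Fragments like these are typical OCR output on a photograph with little legible text. A sketch of the Google Vision text-detection call that produces them; the image URL is a placeholder:

    from google.cloud import vision

    # Text detection sketch; the first annotation is the full detected
    # string, followed by the individual tokens.
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.com/photo.jpg"))
    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)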