Human Generated Data

Title

Untitled (seated woman)

Date

c. 1910

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.711

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 95.9
Human 95.9
Art 85.7
Painting 80.7
Dance 77.7
Photography 66.6
Face 66.6
Portrait 66.6
Photo 66.6
Clothing 62.4
Apparel 62.4
Female 55

Clarifai
created on 2023-10-25

people 99.4
portrait 99.3
sepia 98.2
art 98.2
one 97.8
woman 97.7
wedding 96.7
wear 96
bride 95.5
veil 95.4
dress 95.3
vintage 95.3
retro 94.2
adult 93.9
dancing 92.8
girl 92.5
sepia pigment 90.8
old 89.8
fashion 89.5
nostalgia 89.1

Imagga
created on 2022-01-08

statue 29.1
portrait 25.3
person 24.3
people 22.9
dress 22.6
groom 22.6
adult 19.2
love 18.9
couple 18.3
man 18.2
attractive 16.8
fashion 16.6
happiness 16.5
bride 16.3
sculpture 15.9
face 14.9
wedding 14.7
pretty 14.7
sexy 14.5
happy 13.8
married 13.4
male 13
lady 13
hair 12.7
old 12.5
model 11.7
art 11.5
hand 11.4
body 11.2
lifestyle 10.8
bridal 10.7
smile 10.7
looking 10.4
bouquet 10.4
expression 10.2
smiling 10.1
cheerful 9.8
human 9.8
antique 9.6
clothing 9.5
two 9.3
sensual 9.1
monk 9
detail 8.9
veil 8.8
skin 8.8
women 8.7
life 8.6
black 8.5
studio 8.4
emotion 8.3
holding 8.3
historic 8.3
dancer 8.2
gorgeous 8.2
romance 8
romantic 8
night 8
home 8
indoors 7.9
gown 7.9
cute 7.9
world 7.9
flowers 7.8
mother 7.8
ancient 7.8
luxury 7.7
elegant 7.7
youth 7.7
child 7.7
casual 7.6
marriage 7.6
stone 7.6
erotic 7.6
ring 7.6
performer 7.5
parent 7.5
city 7.5
one 7.5
harmonica 7.4
indoor 7.3
adolescent 7.3
architecture 7

Microsoft
created on 2022-01-08

wall 98.2
text 97.7
dress 93.4
clothing 89.6
person 84.3
woman 82.4
human face 81.5
wedding dress 75.8
painting 74.6
portrait 55.2

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 99.3%
Calm 99.8%
Sad 0.1%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.9%
Painting 80.7%

Categories

Imagga

paintings art 54.8%
pets animals 43.4%

Text analysis

Amazon

MARTIN
MARTIN SCHWER
SCHWER

Google

NARIN
SCHME
NARIN SCHME