Human Generated Data

Title

Untitled (old woman in black dress posed seated in fancy chair with book)

Date

1937

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9909

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.6
Clothing 99.6
Hat 95.9
Human 94.8
Person 93.1
Bonnet 82.2
Blossom 80.5
Plant 80.5
Flower 80.5
Flower Arrangement 71.7
Furniture 70.1
Chair 70.1
Face 69.3
Portrait 64.7
Photography 64.7
Photo 64.7
Flower Bouquet 64
Vase 61
Jar 61
Pottery 61
Cake 56.7
Cream 56.7
Dessert 56.7
Food 56.7
Icing 56.7
Creme 56.7
Female 56.5
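
The labels above match the output of Amazon Rekognition's DetectLabels operation. A minimal sketch of the call, assuming boto3 credentials are configured; the image filename is hypothetical:

```python
import boto3

# Label detection with Amazon Rekognition; prints name/confidence pairs
# in the same style as the list above (e.g. "Apparel 99.6").
client = boto3.client("rekognition")

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # lowest score in the list above is 56.5
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```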

Imagga
created on 2022-01-29

clothing 37
person 33.7
domestic 32.1
bride 30.7
portrait 28.5
wedding 27.6
people 26.8
adult 26.1
dress 21.7
love 20.5
shower cap 19.5
looking 19.2
face 19.2
smile 18.5
human 18
cap 17.9
veil 17.6
happy 16.9
clothes 16.9
attractive 16.8
gown 16.6
happiness 16.5
headdress 15.6
groom 15.3
fashion 15.1
man 14.8
male 14.4
one 14.2
nurse 14.1
smiling 13
pretty 12.6
couple 12.2
eyes 12.1
church 12
hair 11.9
robe 11.8
model 11.7
posing 11.6
women 11.1
life 10.8
cheerful 10.6
garment 10.6
married 10.5
indoors 10.5
health 10.4
bouquet 10.4
men 10.3
day 10.2
20s 10.1
cute 10
lady 9.7
professional 9.6
innocence 9.6
elegance 9.2
indoor 9.1
negative 9
blond 9
fun 9
medical 8.8
celebration 8.8
bridal 8.8
youth 8.5
business 8.5
doctor 8.5
bathrobe 8.4
mask 8.4
suit 8.1
home 8
lifestyle 8
wed 7.9
flowers 7.8
color 7.8
uniform 7.6
marriage 7.6
consumer goods 7.6
purity 7.4
patient 7.4
covering 7.4
inside 7.4
alone 7.3
pose 7.3
sexy 7.2
hospital 7.2
handsome 7.1
cool 7.1
film 7.1
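
The Imagga tags come from Imagga's REST tagging endpoint, which likewise returns tag/confidence pairs. A minimal sketch using the requests library; the credentials and filename are placeholders:

```python
import requests

# Placeholder credentials; Imagga authenticates with HTTP basic auth.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```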

Microsoft
created on 2022-01-29

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 91.7%
Calm 88.4%
Sad 4.9%
Happy 4.5%
Surprised 1%
Confused 0.6%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%
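
The age range, gender, and emotion percentages above correspond to Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch; the filename is hypothetical:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```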

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
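
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why the values above read "Very unlikely". A minimal sketch with the google-cloud-vision client; the filename is hypothetical:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```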

Feature analysis

Amazon

Hat 95.9%
Person 93.1%

Captions

Microsoft

a man standing in front of a window 73%
a man standing next to a window 64.7%
a man standing in a room 64.6%
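
The candidate captions with confidence scores above are the style of output from Azure Computer Vision's describe-image operation. A minimal sketch with the Azure SDK; the endpoint, key, and filename are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    result = client.describe_image_in_stream(f)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```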

Text analysis

Amazon

MJIR
MJIR A70A
A70A
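
These strings are OCR hits on markings in the image, matching Rekognition's DetectText operation; Rekognition returns both LINE and WORD detections, which is why fragments repeat in different groupings. A minimal sketch; the filename is hypothetical:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is either "LINE" or "WORD".
    print(detection["Type"], detection["DetectedText"])
```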

Google

A7DA
YT33AZ
MJ17
MJ17 YT33AZ A7DA
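
The Google results are the analogous output of Vision's text detection, which returns the full detected string as one annotation alongside the individual fragments. A minimal sketch; the filename is hypothetical:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename for the scanned photograph.
with open("schweig_1937.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```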