Human Generated Data

Title

Untitled (advertising, woman posed, hat with veil, watch, bouquet of flowers)

Date

1940

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22258

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clothing 100
Apparel 100
Veil 89.2
Fashion 76.3
Robe 76.3
Gown 76.2
Human 75.5
Person 72.6
Plant 71
Finger 63
Hat 61.4
Female 60.4
Blossom 60
Flower 60
Sleeve 57
Lace 56.4

Imagga
created on 2022-03-11

body armor 100
chain mail 100
armor 100
protective covering 68.8
covering 34.5
portrait 28.5
adult 28.4
person 23.1
bride 20.1
happy 19.4
face 19.2
hair 19
people 19
pretty 18.9
model 17.1
attractive 16.8
wedding 16.5
one 16.4
dress 16.3
man 15.4
looking 15.2
fashion 14.3
smile 14.2
women 14.2
lady 13.8
lifestyle 13.7
sexy 13.6
male 13.5
love 13.4
smiling 13
body 12.8
human 12.7
veil 12.7
skin 11.8
head 11.8
senior 11.2
bridal 10.7
cute 10
joy 10
holding 9.9
hat 9.9
gown 9.8
married 9.6
females 9.5
happiness 9.4
casual 9.3
old 9.1
black 9
outdoors 9
wet 8.9
cheerful 8.9
shower 8.7
elderly 8.6
eyes 8.6
life 8.6
men 8.6
relaxation 8.4
horizontal 8.4
studio 8.4
health 8.3
care 8.2
girls 8.2
healthy 8.2
protection 8.2
sensuality 8.2
look 7.9
outdoor 7.6
marriage 7.6
bath 7.6
blond 7.4
water 7.3
indoor 7.3
business 7.3
sensual 7.3
worker 7.1
indoors 7

Microsoft
created on 2022-03-11

human face 97.2
wedding dress 95.4
bride 94
fashion accessory 92.7
text 92.1
person 85.7
woman 85.7
clothing 79.9
smile 73.2
black and white 70.3
dress 53.8

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 86.2%
Calm 52.7%
Happy 41.4%
Sad 2.2%
Surprised 1.3%
Angry 1%
Fear 0.6%
Disgusted 0.5%
Confused 0.3%

Feature analysis

Amazon

Person 72.6%

Captions

Microsoft

an old photo of a person 41%
a person wearing a hat 29.3%
a person wearing a hat 29.2%

Text analysis

Amazon

via
was
TAH
not