Human Generated Data

Title

Untitled (woman holding baby)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16805

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 100
Clothing 100
Person 99.4
Human 99.4
Robe 96.4
Fashion 96.4
Gown 95.5
Bridegroom 94.9
Wedding 94.9
Dress 94.8
Female 94.3
Bride 84.3
Wedding Gown 84.3
Plant 84.1
Woman 83.7
Outdoors 83.6
Tree 81.7
Face 81
Sleeve 80.8
Nature 79.3
Chair 74.9
Furniture 74.9
Meal 74.3
Food 74.3
Overcoat 67.4
Coat 67.4
Suit 67.4
Photography 66.3
Photo 66.3
Portrait 66.3
Long Sleeve 64.2
Blossom 64
Flower 64
Girl 62.4
Evening Dress 57.8
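
Tag lists like the one above are typically produced by Amazon Rekognition's DetectLabels API, which returns label names with confidence scores. A minimal sketch of formatting such a response into the lines shown here, using an illustrative sample rather than the live API output for this photograph:

```python
# Sketch: formatting an Amazon Rekognition DetectLabels-style response
# into the "Label Confidence" lines shown above. The sample response is
# illustrative, not the actual API output for this photograph.

def format_labels(response):
    """Return 'Name Confidence' lines, one decimal place,
    sorted by descending confidence (as in the list above)."""
    labels = sorted(response["Labels"],
                    key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {round(l['Confidence'], 1):g}" for l in labels]

sample = {
    "Labels": [
        {"Name": "Apparel", "Confidence": 99.99},
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Bride", "Confidence": 84.32},
    ]
}

for line in format_labels(sample):
    print(line)
```

A live call would use `boto3`'s `rekognition` client (`detect_labels(Image={...}, MinConfidence=...)`) and feed its response to the same formatter.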

Imagga
created on 2022-02-26

adult 23.3
person 22.4
man 21.5
people 21.2
groom 19
dress 19
bride 17.8
love 17.4
fashion 17.3
snow 17.2
life 16.8
portrait 16.8
women 16.6
happiness 16.4
male 16.4
couple 15.7
happy 15
outdoor 14.5
winter 14.5
wedding 13.8
cold 13.8
outdoors 13.7
lifestyle 13
sitting 12.9
sexy 12.8
fun 12.7
clothing 12.5
black 12
model 11.7
smiling 11.6
cheerful 11.4
married 10.5
marriage 10.4
men 10.3
human 9.7
lady 9.7
one 9.7
negative 9.4
park 9.3
smile 9.3
elegance 9.2
pretty 9.1
attractive 9.1
active 9.1
professional 9
romance 8.9
cool 8.9
gown 8.8
hair 8.7
water 8.7
outside 8.6
wall 8.6
wife 8.5
youth 8.5
two 8.5
joy 8.3
bench 8.1
romantic 8
business 7.9
casual 7.6
chair 7.6
beach 7.6
passion 7.5
friends 7.5
teacher 7.5
style 7.4
girls 7.3
gorgeous 7.2
kin 7.2
looking 7.2
face 7.1
summer 7.1
working 7.1
work 7.1
businessman 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.4
person 92.2
clothing 88.3
human face 79
christmas tree 78.7
snow 78.2
black and white 64.5
tree 53.3

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Male, 77%
Calm 98.8%
Sad 0.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%
Confused 0.1%
Angry 0%
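
The age range, gender, and emotion lines above follow the shape of Amazon Rekognition's DetectFaces response (`AgeRange`, `Gender`, `Emotions` fields). A minimal sketch of summarizing one FaceDetail record into those lines, with an illustrative sample rather than the actual output for this photograph:

```python
# Sketch: summarizing an Amazon Rekognition DetectFaces-style FaceDetail
# into the age/gender/emotion lines shown above. The sample record is
# illustrative, not the actual API output for this photograph.

def summarize_face(face):
    """Return the report lines: age range, gender with rounded
    confidence, then emotions sorted by descending confidence."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, "
        f"{round(face['Gender']['Confidence']):d}%",
    ]
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    for e in emotions:
        lines.append(f"{e['Type'].capitalize()} "
                     f"{round(e['Confidence'], 1):g}%")
    return lines

sample_face = {
    "AgeRange": {"Low": 37, "High": 45},
    "Gender": {"Value": "Male", "Confidence": 77.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 98.8},
        {"Type": "SAD", "Confidence": 0.4},
    ],
}

for line in summarize_face(sample_face):
    print(line)
```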

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a man and a woman standing in front of a mirror 55.7%
a man and a woman standing in front of a window 55.6%
a person standing in front of a mirror posing for the camera 55.5%

Text analysis

Amazon

VAGON