Human Generated Data

Title

Untitled (two portraits of baby)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1968

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.9
Apparel 99.9
Dress 99.9
Human 97.9
Person 97.4
Person 96
Female 94.1
Face 92.3
Costume 91
Smile 78.9
Girl 76.4
Photography 74.5
Portrait 74.5
Photo 74.5
Fashion 71.7
Woman 71.3
Evening Dress 71.3
Gown 71.3
Robe 71.3
Kid 70.7
Child 70.7
Art 55.2
Head 55
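Label lists like the one above are typically post-processed by keeping only tags at or above a minimum confidence. A minimal sketch, using a subset of the Amazon Rekognition scores from this record (the 75-point cutoff is a hypothetical choice, not part of the record):

```python
# A subset of the machine-generated labels listed above,
# mapped to their confidence scores.
labels = {
    "Clothing": 99.9,
    "Human": 97.9,
    "Person": 97.4,
    "Smile": 78.9,
    "Art": 55.2,
    "Head": 55.0,
}

def filter_labels(labels, min_confidence=75.0):
    """Keep only labels at or above a confidence threshold (hypothetical cutoff)."""
    return {name: score for name, score in labels.items() if score >= min_confidence}

print(filter_labels(labels))
```

With the default cutoff, only the four labels scored at 78.9 or higher survive.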

Imagga
created on 2022-01-22

clothing 31
person 27.5
portrait 27.2
shower cap 27
dress 25.3
cap 23.7
face 22.7
people 21.7
happy 21.3
attractive 21
adult 19.7
bride 18.3
pretty 17.5
wedding 17.5
happiness 17.2
hair 16.6
love 16.6
headdress 16.2
cute 15.8
human 15.7
fashion 15.1
man 14.1
negative 14.1
model 14
posing 13.3
smiling 13
lady 13
looking 12.8
winter 12.8
holiday 12.2
child 12
fun 12
one 11.9
lifestyle 11.6
covering 11.4
water 11.3
film 11.2
veil 10.8
gown 10.7
care 10.7
smile 10.7
cheerful 10.6
celebration 10.4
pose 10
male 9.9
consumer goods 9.9
health 9.7
marriage 9.5
women 9.5
cold 9.5
wig 9.3
clean 9.2
summer 9
wet 8.9
family 8.9
body 8.8
washing 8.7
couple 8.7
photographic paper 8.6
season 8.6
expression 8.5
snow 8.5
skin 8.5
studio 8.4
joy 8.3
20s 8.2
garment 8.2
sensual 8.2
sexy 8
kid 8
skirt 7.9
together 7.9
modern 7.7
hairstyle 7.6
laughing 7.6
elegance 7.5
relaxation 7.5
mother 7.5
shower 7.4
hairpiece 7.4
costume 7.4
girls 7.3
sensuality 7.3
world 7.2
childhood 7.2
lovely 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

human face 98.8
clothing 98.4
person 98
text 94.1
smile 93.1
girl 82.1
black and white 80
black 74.2
white 73.8
posing 66.6
woman 62.2
dress 55.2

Face analysis

Amazon

Google

AWS Rekognition

Age 4-10
Gender Female, 88.1%
Calm 62.5%
Sad 20.7%
Happy 7.7%
Surprised 3%
Disgusted 2.5%
Angry 1.8%
Fear 1.3%
Confused 0.5%

AWS Rekognition

Age 6-14
Gender Male, 73.5%
Surprised 87.3%
Happy 9.2%
Calm 2.4%
Disgusted 0.3%
Fear 0.3%
Sad 0.2%
Angry 0.2%
Confused 0.1%
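The per-face emotion scores above form a distribution, and the reported mood is simply the highest-scoring entry. A minimal sketch using the second face's scores from this record:

```python
# Emotion scores for the second face, as reported by AWS Rekognition above.
emotions = {
    "Surprised": 87.3, "Happy": 9.2, "Calm": 2.4, "Disgusted": 0.3,
    "Fear": 0.3, "Sad": 0.2, "Angry": 0.2, "Confused": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Surprised', 87.3)
```

Applied to the first face's scores, the same function would pick Calm at 62.5%.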

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.4%

Captions

Microsoft

a vintage photo of a man 82.2%
a vintage photo of a man and woman posing for a picture 48.9%
a vintage photo of a man and a woman posing for a picture 48.8%

Text analysis

Google

ఉం
ఉం