Human Generated Data

Title

Untitled (close-up of girl at beach)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8667

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.3
Face 98.3
Person 97.9
Head 86.1
Female 80.4
Finger 76.6
Photography 66.6
Photo 66.6
Portrait 66.6
Text 66.6
Woman 63.9
Skin 63.2
Apparel 62.2
Clothing 62.2
Sleeve 56.1
Hair 56
Girl 55.7

Imagga
created on 2022-01-09

blond 61.9
hair 43.7
portrait 43.4
face 37
attractive 36.4
negative 35
adult 33.7
eyes 32.8
sexy 31.4
pretty 30.1
model 28
lips 27.8
film 27.7
lady 26.8
fashion 26.4
wig 25.2
person 25.1
black 22.8
people 22.3
sensual 21.9
women 21.4
photographic paper 21.4
cute 20.8
skin 20.8
smile 20.7
hairpiece 20.2
look 20.2
happy 19.5
make 19.1
makeup 17.7
long 17.5
hairdresser 15.9
attire 15.6
smiling 15.2
style 14.9
sensuality 14.6
clothing 14.3
hairstyle 14.3
human 14.3
photographic equipment 14.2
love 14.2
brunette 14
studio 13.7
closeup 13.5
elegance 13.5
erotic 12.8
natural 12.7
close 12.6
bride 12.5
lovely 12.5
posing 12.5
expression 12
youth 11.9
head 11.8
happiness 11
gorgeous 10.9
body 10.4
cosmetics 10.3
cheerful 9.8
one 9.7
wedding 9.2
dark 9.2
dress 9
eye 8.9
brown 8.8
looking 8.8
bust 8.7
lifestyle 8.7
glamor 8.6
casual 8.5
care 8.2
pose 8.2
elegant 7.7
facial 7.7
healthy 7.6
joy 7.5
feminine 7.5
toiletry 7.4
girls 7.3
hair spray 7.3

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.7
text 98.7
human face 95.6
wedding dress 94
bride 90.9
woman 90.1
indoor 89.7
black and white 87.8
girl 76.2
dress 75.6
fashion 74.2
smile 64.8
portrait 62.8
fashion accessory 59.8
clothing 55.8
necklace 51.5

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 96.1%
Surprised 75.5%
Calm 21.6%
Angry 0.8%
Disgusted 0.7%
Happy 0.7%
Confused 0.2%
Sad 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%

Captions

Microsoft

a woman standing in front of a mirror posing for the camera 66.3%

Text analysis

Amazon

22871.
KODYK-EYLELA

Google

2871.
2
2 2871.