Human Generated Data

Title

New Year's Eve, Times Square

Date

1951

People

Artist: Dan Weiner (American, 1919–1959)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1306

Copyright

© Dan Weiner © John Broderick

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Person 98.6
Make Out 97.4
Person 96.6
Kissing 96
Kiss 96
Person 95.5
Person 94.1
Person 89.1
Person 79
Face 69.9
Person 69.8
Head 68.8
People 67.4
Dating 59.6
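
These Amazon labels match the shape of AWS Rekognition label-detection output. A minimal sketch of producing such a list with boto3, assuming a local copy of the photograph (the filename and region are placeholders, not details from this record):

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("times_square_1951.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the list above bottoms out near 59.6
    )

# Emit "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```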

Imagga
created on 2022-01-08

man 35.7
adult 35.6
sexy 34.6
attractive 31.5
portrait 30.4
couple 29.6
love 28.4
male 27.7
black 26.5
people 25.7
face 25.6
hair 25.4
pretty 25.2
person 23.1
fashion 22.6
sensual 21.8
model 21.8
body 20.8
skin 20.5
passion 18.8
sensuality 18.2
two 17.8
lady 17
romance 17
happy 16.9
dark 16.7
studio 16
women 15.8
human 15
lips 14.8
lovers 14.5
happiness 14.1
brunette 13.9
eyes 13.8
cute 13.6
sex 13.6
desire 13.5
romantic 13.4
erotic 12.8
passionate 12.8
husband 12.8
smile 12.1
close 12
expression 11.9
elegance 11.8
handsome 11.6
lifestyle 11.6
sexual 11.6
looking 11.2
makeup 11.1
long 11
make 10.9
posing 10.7
boyfriend 10.6
married 10.6
youth 10.2
emotion 10.1
gorgeous 10
lovely 9.8
kiss 9.8
dating 9.7
girlfriend 9.6
beard 9.2
style 8.9
blond 8.8
embrace 8.8
together 8.8
sunglasses 8.8
world 8.6
wife 8.5
relationship 8.4
valentine 8.2
brother 8.2
look 7.9
hug 7.8
underwear 7.7
loving 7.6
adults 7.6
pair 7.6
togetherness 7.6
feminine 7.5
one 7.5
holding 7.4
spectator 7.3
girls 7.3
boy 7.1
lingerie 7
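
The Imagga tags correspond to Imagga's v2 REST tagging endpoint. A minimal sketch using requests (the API key, secret, and image URL are placeholders):

```python
# Minimal sketch: image tagging with the Imagga v2 REST API.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/times_square_1951.jpg"},  # placeholder URL
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
    timeout=30,
)
resp.raise_for_status()

# Each tag carries an English name and a confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```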

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

kiss 100
text 98
person 94.3
love 93.7
indoor 92.9
romance 88.9
black and white 81.5
hug 70.6
making out 69.2
interaction 67
dating 55.4
staring 19.5
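
The Microsoft tags resemble output from Azure Computer Vision's tag operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and image URL are placeholders; the SDK reports confidence on a 0-1 scale, scaled to percent here):

```python
# Minimal sketch: tagging with Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
analysis = client.tag_image("https://example.org/times_square_1951.jpg")  # placeholder URL

for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```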

Face analysis


AWS Rekognition

Age 18-24
Gender Male, 88.6%
Sad 45%
Calm 20%
Disgusted 16.5%
Happy 6.4%
Angry 4%
Confused 3.8%
Fear 2.7%
Surprised 1.6%

AWS Rekognition

Age 25-35
Gender Male, 99.6%
Calm 53.4%
Sad 30.4%
Confused 6.9%
Disgusted 3.7%
Angry 2.3%
Surprised 1.4%
Fear 1.3%
Happy 0.6%
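
The two blocks above (age range, gender with confidence, ranked emotions) are per-face attributes from Rekognition's DetectFaces operation. A minimal sketch with boto3 (the filename is a placeholder):

```python
# Minimal sketch: face attributes with AWS Rekognition DetectFaces via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("times_square_1951.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unordered; sort descending to mirror the ranked lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```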

Microsoft Cognitive Services

Age 39
Gender Male
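
The single age/gender estimate resembles output from the Azure Face API. A minimal sketch with the azure-cognitiveservices-vision-face SDK (endpoint, key, and URL are placeholders; note that Microsoft has since retired age and gender prediction from this service):

```python
# Minimal sketch: face detection with the Azure Face API.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_subscription_key"                                      # placeholder

face_client = FaceClient(endpoint, CognitiveServicesCredentials(key))
faces = face_client.face.detect_with_url(
    url="https://example.org/times_square_1951.jpg",  # placeholder URL
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {str(attrs.gender.value).title()}")
```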

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
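
The six blocks above are per-face likelihood buckets (Very unlikely through Very likely) from Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client library (the filename is a placeholder):

```python
# Minimal sketch: face detection with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("times_square_1951.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Google reports likelihood buckets rather than percentages.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```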

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a man standing in front of a mirror posing for the camera 35.8%
a man looking at the camera 35.7%
a man and a woman looking at the camera 35.6%
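
Ranked caption candidates like these come from Azure Computer Vision's describe operation. A minimal sketch reusing the Computer Vision client from the tagging example (endpoint, key, and image URL are placeholders; confidence is scaled from 0-1 to percent):

```python
# Minimal sketch: caption candidates with Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Ask for several candidates to get a ranked list like the one above.
analysis = client.describe_image(
    "https://example.org/times_square_1951.jpg",  # placeholder URL
    max_candidates=3,
)

for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```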