Human Generated Data

Title

Vertical Subway

Date

1977 (printed 1984)

People

Artist: Michael Spano, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 67.1984

Copyright

© Michael Spano

Machine Generated Data

Tags

Amazon
created on 2019-04-03

Human 97
Person 93.5
Skin 93
Person 85.4
Face 82.3
Person 79.8
Collage 73.2
Advertisement 73.2
Poster 73.2
Clothing 72.8
Apparel 72.8
People 67.7
Female 67.6
Person 67.6
Photography 65.1
Photo 65.1
Girl 62.4
Portrait 61
Sleeping 56.6
Asleep 56.6
Baby 56.6
Art 55.1

Clarifai
created on 2018-03-24

people 99.9
group 98.4
monochrome 98.4
adult 97.2
man 96.8
wear 95.6
child 95
two 94
war 92.6
woman 92.5
group together 91.5
portrait 90.1
uniform 89.6
military 88.2
one 87.4
outfit 86.3
four 85.5
three 84.3
many 83.9
administration 83.7

Imagga
created on 2018-03-24

person 23.6
love 22.9
man 22.8
people 22.3
sexy 20.9
adult 20.4
couple 20
portrait 19.4
attractive 18.2
male 18.1
black 17.5
fashion 15.8
groom 15.6
child 15.6
body 15.2
mother 14.9
lady 14.6
hair 14.3
family 14.2
pretty 14
father 13.3
parent 13.2
happy 13.2
face 12.8
kin 12.7
two 12.7
model 12.4
erotic 12.4
youth 11.9
dress 11.7
bride 11.7
interior 11.5
smile 11.4
skin 11.1
women 11.1
room 11.1
dad 10.8
human 10.5
style 10.4
passion 10.3
relationship 10.3
happiness 10.2
lifestyle 10.1
blond 10.1
sensual 10
military uniform 9.8
romantic 9.8
brunette 9.6
uniform 9.4
lying 9.4
world 9.4
clothing 9.3
decoration 9.3
wedding 9.2
one 9
romance 8.9
handsome 8.9
husband 8.9
looking 8.8
sepia 8.7
boy 8.7
married 8.6
sitting 8.6
marriage 8.5
vintage 8.3
holding 8.3
gorgeous 8.2
posing 8
kid 8
smiling 8
bridal 7.8
underwear 7.7
desire 7.7
wife 7.6
relax 7.6
guy 7.4
closeup 7.4
indoor 7.3
suit 7.3
girls 7.3
detail 7.2
home 7.2
cute 7.2
summer 7.1
day 7.1

Google
created on 2018-03-24

Microsoft
created on 2018-03-24

person 93.8
clothes 19.3
crowd 0.6
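Each tag list above pairs a label with a confidence score expressed as a percentage. A minimal Python sketch of parsing such lines into structured pairs (assuming every line is a label followed by a final numeric token; labels themselves may contain spaces, e.g. "group together 91.5"):

```python
def parse_tags(lines):
    """Parse 'label confidence' lines into (label, score) pairs.

    The score is taken from the final whitespace-separated token,
    so multi-word labels like 'group together' are preserved.
    Blank lines are skipped.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags


# Example with lines taken from the Clarifai list above:
pairs = parse_tags(["people 99.9", "group together 91.5", "portrait 90.1"])
# pairs → [("people", 99.9), ("group together", 91.5), ("portrait", 90.1)]
```

Splitting on the last space rather than the first keeps multi-word labels intact, which a naive `line.split()` would mishandle.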

Face analysis

Amazon

AWS Rekognition

Age 1-5
Gender Female, 91%
Disgusted 0.7%
Angry 0.4%
Confused 0.3%
Sad 0.8%
Happy 2.6%
Calm 94.3%
Surprised 0.8%

AWS Rekognition

Age 19-36
Gender Male, 54.2%
Disgusted 45.2%
Sad 45.7%
Calm 52%
Angry 45.5%
Surprised 45.6%
Confused 45.6%
Happy 45.4%

AWS Rekognition

Age 26-43
Gender Female, 54%
Sad 53.7%
Surprised 45.1%
Angry 45.1%
Happy 45.1%
Confused 45.1%
Calm 46.1%
Disgusted 45%

AWS Rekognition

Age 20-38
Gender Female, 50%
Angry 46.3%
Calm 47.8%
Confused 45.4%
Surprised 45.6%
Happy 45.5%
Sad 49%
Disgusted 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Confused 49.5%
Angry 49.6%
Calm 49.5%
Sad 49.7%
Disgusted 49.5%
Happy 50.1%
Surprised 49.5%

Feature analysis

Amazon

Person 93.5%

Captions

Microsoft

a group of people sitting on a bed 44.8%
a group of people sitting in a chair 44.7%
a group of people riding on the back of a motorcycle 22.4%