Human Generated Data

Title

Untitled (waiting room)

Date

1971

People

Artist: Ken Heyman, American, born 1930

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.547

Copyright

© Ken Heyman

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Furniture 99.9
Chair 99.9
Person 99.4
Human 99.4
Person 98.3
Indoors 97.7
Interior Design 97.7
Person 97.5
Clothing 95.7
Shoe 95.7
Footwear 95.7
Apparel 95.7
Sitting 94.3
Shoe 91.9
Shoe 91.4
Shorts 89.7
Couch 85.8
Face 72.4
People 60
Man 57.7
Room 56.1
Shoe 50.9

Clarifai
created on 2018-02-10

people 99.7
adult 97.7
two 97.1
group 96.8
woman 96.7
group together 95.7
vehicle 94.1
man 93.2
portrait 93
one 92.8
three 92.5
monochrome 92.1
transportation system 91.9
street 90.6
administration 90.3
sit 90
actor 89.1
several 88.1
seat 87.9
music 87.6

Imagga
created on 2018-02-10

sitting 30.1
person 29.1
attractive 28.7
adult 28.6
people 27.3
sexy 27.3
fashion 24.9
chair 21.8
portrait 20.7
pretty 20.3
model 20.2
hair 19.8
seat 18.8
sensual 18.2
happy 18.2
lifestyle 18.1
device 17.6
man 17.5
women 17.4
one 17.2
clothing 16.8
looking 16.8
style 16.3
cute 15.8
brunette 15.7
couch 15.5
dress 15.4
armchair 15
black 14.8
elegant 14.6
smiling 14.5
sofa 14.4
legs 14.2
lady 13.8
male 13.6
casual 13.6
elegance 13.4
support 13.1
couple 13.1
smile 12.8
blond 12.8
sensuality 12.7
posing 12.4
face 12.1
gorgeous 11.8
human 11.2
body 11.2
happiness 11
cheerful 10.6
jeans 10.5
expression 10.2
20s 10.1
together 9.6
passion 9.4
instrument 9.3
lips 9.3
outdoor 9.2
makeup 9.2
leisure 9.1
pose 9.1
stylish 9
luxury 8.6
sit 8.5
erotic 8.5
skin 8.5
lying 8.5
studio 8.4
suit 8.3
teen 8.3
fun 8.2
home 8
kin 7.9
love 7.9
urban 7.9
professional 7.8
hands 7.8
child 7.8
vogue 7.7
desire 7.7
two 7.6
outdoors 7.5
alone 7.3
business 7.3
car 7.3
success 7.2
handsome 7.1
job 7.1
interior 7.1

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 100
sitting 99.9
bench 98.9
seated 30.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 15-25
Gender Male, 98.3%
Happy 0.1%
Angry 0.7%
Disgusted 0.2%
Sad 3.6%
Surprised 0.3%
Calm 93.7%
Confused 1.4%

AWS Rekognition

Age 10-15
Gender Female, 90.9%
Surprised 0.9%
Angry 1.2%
Happy 0.3%
Sad 87%
Disgusted 0.5%
Confused 0.6%
Calm 9.6%

AWS Rekognition

Age 20-38
Gender Female, 99.5%
Calm 71.6%
Angry 7.9%
Disgusted 0.7%
Sad 14.7%
Surprised 1.9%
Confused 2.6%
Happy 0.6%

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 95.7%

Captions

Microsoft

a group of people sitting on a bench reading a book 84.6%
a group of people that are sitting on a bench reading a book 78.6%
a man and a woman sitting on a bench reading a book 67.4%

Text analysis

Amazon

TV:CHAIR

Google

CHAIR
TV CHAIR TV CHAIR
TV