Human Generated Data

Title

Untitled

Date

1994

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.276

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Person 99.1
Human 99.1
Person 99.1
Person 98.7
Person 95.6
Apparel 94.6
Clothing 94.6
Person 94.6
Person 83.7
Person 83.5
Person 76.2
Overcoat 75.1
Coat 75.1
Face 73.2
Pedestrian 69.5
Suit 68.7
People 65.5
Person 64.4
Door 58.9
Furniture 56.9
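
The list above is raw service output: the same label can appear several times (one entry per detected instance, e.g. each "Person"), and low-confidence labels are mixed in with high-confidence ones. A minimal sketch of post-processing such (label, confidence) pairs, keeping the best score per label and filtering by a threshold (the 90% cutoff and the plain-tuple data shape are illustrative assumptions, not part of any service's response format):

```python
# (label, confidence) pairs copied from the Amazon tag list above.
labels = [
    ("Person", 99.1), ("Human", 99.1), ("Person", 98.7), ("Person", 95.6),
    ("Apparel", 94.6), ("Clothing", 94.6), ("Person", 94.6),
    ("Person", 83.7), ("Person", 83.5), ("Person", 76.2),
    ("Overcoat", 75.1), ("Coat", 75.1), ("Face", 73.2),
    ("Pedestrian", 69.5), ("Suit", 68.7), ("People", 65.5),
    ("Person", 64.4), ("Door", 58.9), ("Furniture", 56.9),
]

def best_labels(pairs, threshold=90.0):
    # Keep the highest confidence seen for each label,
    # then drop anything below the threshold.
    best = {}
    for name, conf in pairs:
        if conf > best.get(name, 0.0):
            best[name] = conf
    return {n: c for n, c in best.items() if c >= threshold}

print(best_labels(labels))
# → {'Person': 99.1, 'Human': 99.1, 'Apparel': 94.6, 'Clothing': 94.6}
```

Counting the duplicate "Person" entries instead of collapsing them would give the number of detected figures, which is often the more useful signal for a group scene like this one.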

Clarifai
created on 2018-11-05

people 99.1
man 96.4
adult 95.8
indoors 95.7
group 95.4
woman 94.5
offense 93.8
music 91.9
wear 91.5
room 91.2
one 91.1
exhibition 90.7
business 89.5
vehicle 88.3
furniture 87.4
portrait 87.1
technology 85.6
blur 83.9
travel 83.5
landscape 82.4

Imagga
created on 2018-11-05

guitar 34.7
stringed instrument 34.1
electric guitar 32.2
man 30.3
musical instrument 27.9
people 27.3
person 23.9
adult 18.9
lifestyle 18.8
male 18.5
sitting 15.5
attractive 15.4
music 15.3
violin 15.1
portrait 14.9
black 14.9
women 14.2
bowed stringed instrument 13.9
musician 13.7
sexy 13.6
casual 13.6
indoors 13.2
business 12.8
handsome 12.5
pretty 11.9
equipment 11.8
work 11.8
looking 11.2
smile 10.7
group 10.5
style 10.4
life 10.3
men 10.3
smiling 10.1
power 10.1
happy 10
device 10
leisure 10
night 9.8
interior 9.7
working 9.7
together 9.6
party 9.5
room 9.4
club 9.4
face 9.2
hot 9.2
singer 9.2
dark 9.2
alone 9.1
one 9
cheerful 8.9
job 8.8
disco 8.8
nightlife 8.8
worker 8.6
modern 8.4
hand 8.4
fashion 8.3
entertainment 8.3
single 8.2
technology 8.2
clothing 8.2
chair 8.2
guitarist 7.9
urban 7.9
happiness 7.8
nightclub 7.8
concert 7.8
performance 7.7
reading 7.6
guy 7.6
friends 7.5
sound 7.5
city 7.5
restaurant 7.5
outdoors 7.5
instrument 7.4
street 7.4
indoor 7.3
autumn 7

Google
created on 2018-11-05

snapshot 81.8

Microsoft
created on 2018-11-05

electronics 97.8
monitor 96
indoor 94
screen 68.3
display 25.2

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 93.7%
Disgusted 11.9%
Calm 12.7%
Sad 35.1%
Angry 20.1%
Confused 9.2%
Happy 4.8%
Surprised 6.3%

AWS Rekognition

Age 15-25
Gender Female, 95.1%
Happy 7.8%
Calm 58.4%
Disgusted 1.8%
Confused 3.4%
Sad 15.4%
Angry 10.6%
Surprised 2.6%
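
Rekognition reports a score for every emotion rather than a single verdict, and the scores need not sum to 100%. A small sketch (data copied from the two face records above; the dictionary layout is an assumption for illustration) that reduces each face's scores to its dominant emotion:

```python
# Emotion scores per detected face, copied from the
# AWS Rekognition face-analysis records above.
faces = {
    "face_1": {"Disgusted": 11.9, "Calm": 12.7, "Sad": 35.1,
               "Angry": 20.1, "Confused": 9.2, "Happy": 4.8,
               "Surprised": 6.3},
    "face_2": {"Happy": 7.8, "Calm": 58.4, "Disgusted": 1.8,
               "Confused": 3.4, "Sad": 15.4, "Angry": 10.6,
               "Surprised": 2.6},
}

def dominant_emotion(scores):
    # Return the (emotion, score) pair with the highest score.
    return max(scores.items(), key=lambda kv: kv[1])

for face, scores in faces.items():
    emotion, conf = dominant_emotion(scores)
    print(f"{face}: {emotion} ({conf}%)")
# → face_1: Sad (35.1%)
# → face_2: Calm (58.4%)
```

Note how different the two faces are: the second face's "Calm" clearly dominates, while the first face's "Sad" wins with barely a third of the weight, so treating it as a confident classification would overstate the result.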

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a flat screen tv sitting in front of a television 55.9%
a flat screen tv sitting on top of a television 52.6%
a flat screen tv 52.5%

Text analysis

Amazon

4/5
WQ 14 4/5
14
WQ

Google

TA 475
475
TA