Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Machine Generated Data

Tags

Amazon

Tripod 99.1
Person 98.7
Human 98.7
Photo 74.9
Photography 74.9
Photographer 67.3

Clarifai

people 99.4
one 98.4
monochrome 98.3
portrait 96.5
adult 96.2
street 94.6
man 91.6
woman 87.4
chair 87
music 86.7
art 84.5
girl 82.1
profile 80.1
analogue 79.4
child 78.5
wall 77.8
black and white 77.2
model 76.7
retro 76.2
indoors 75.5

Imagga

tripod 100
rack 100
support 100
man 27.5
person 26
male 18.4
people 18.4
silhouette 17.4
outdoor 16.8
sport 16.5
adult 16.2
camera 15.7
outdoors 14.2
professional 13.5
sky 12.8
golf 12.4
black 12
happy 11.9
attractive 11.9
playing 11.8
one 11.2
equipment 11.1
sports 11.1
photographer 10.8
active 10.8
studio 10.6
microphone 10.6
photograph 10.6
ball 10.5
hobby 10.4
landscape 10.4
club 10.4
portrait 10.3
grass 10.3
outside 10.3
music 9.9
sunset 9.9
golfer 9.8
men 9.4
player 9.4
exercise 9.1
style 8.9
shoot 8.7
lens 8.7
standing 8.7
rock 8.7
work 8.6
course 8.6
business 8.5
technology 8.2
suit 8.1
activity 8.1
success 8
mountain 8
looking 8
lifestyle 7.9
play 7.8
old 7.7
adventure 7.6
leisure 7.5
instrument 7.4
entertainment 7.4
safety 7.4
competition 7.3
industrial 7.3
dirty 7.2
recreation 7.2
game 7.1
job 7.1
travel 7

Google

Microsoft

person 96.2

Face analysis

Amazon

AWS Rekognition

Age 19-36
Gender Male, 63.8%
Surprised 1.9%
Confused 2.4%
Disgusted 21.3%
Calm 1.7%
Angry 3.5%
Sad 66.5%
Happy 2.7%

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a person sitting in front of a window 72%
a person sitting in front of a window 67.2%
a person sitting in front of a window 55.6%