Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Machine Generated Data

Tags

Amazon

Tripod 98.8
Human 98.6
Person 98.6
Photo 78.5
Photography 78.5
Photographer 60.4

Clarifai

people 99.5
one 98.9
monochrome 97.7
adult 96.7
portrait 96.6
man 92
street 91.4
retro 88.2
chair 87.6
art 86.2
movie 85.2
tripod 83
music 82.2
woman 82.2
old 81.8
wall 80.7
vintage 79.9
analogue 79.1
position 79.1
studio 77.9

Imagga

tripod 100
rack 100
support 100
man 28.2
person 24.8
male 19.9
sport 18.1
people 17.3
silhouette 15.7
outdoors 15.7
outdoor 15.3
adult 14.9
camera 14.8
professional 14.3
golf 14.3
playing 12.8
sky 12.1
happy 11.9
equipment 11.8
golfer 11.7
black 11.4
hobby 11.4
club 11.3
one 11.2
outside 11.1
sports 11.1
photographer 10.8
active 10.8
studio 10.6
photograph 10.6
microphone 10.5
attractive 10.5
ball 10.5
grass 10.3
exercise 10
music 9.9
landscape 9.7
play 9.5
player 9.4
work 9.4
leisure 9.1
old 9.1
sunset 9
technology 8.9
shoot 8.7
lens 8.7
rock 8.7
lifestyle 8.7
course 8.6
men 8.6
business 8.5
portrait 8.4
style 8.2
recreation 8.1
activity 8.1
game 8
standing 7.8
golfing 7.8
life 7.8
travel 7.7
adventure 7.6
instrument 7.4
entertainment 7.4
safety 7.4
alone 7.3
industrial 7.3
sexy 7.2
suit 7.2
looking 7.2
mountain 7.1
job 7.1

Microsoft

person 97.4
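
The four tag lists above report per-service confidence scores on a 0–100 scale. As a minimal sketch (not part of the catalog record), one way to combine them is to keep only tags that at least two services report above a threshold; the tag names and scores below are taken from the lists above (lowercased for comparison), and the `consensus` helper and its 80-point threshold are illustrative assumptions.

```python
from collections import Counter

# Subsets of the scores listed above, lowercased for cross-service comparison.
amazon = {"tripod": 98.8, "human": 98.6, "person": 98.6, "photo": 78.5}
clarifai = {"people": 99.5, "one": 98.9, "monochrome": 97.7, "tripod": 83.0}
imagga = {"tripod": 100.0, "rack": 100.0, "support": 100.0, "man": 28.2}
microsoft = {"person": 97.4}

def consensus(*sources, threshold=80.0):
    """Return tags that at least two services report above the threshold."""
    hits = Counter()
    for tags in sources:
        for tag, score in tags.items():
            if score >= threshold:
                hits[tag] += 1
    return sorted(tag for tag, n in hits.items() if n >= 2)

print(consensus(amazon, clarifai, imagga, microsoft))  # → ['person', 'tripod']
```

With these scores, only "person" and "tripod" clear the bar in two or more services, which matches the subject of the photograph.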

Face analysis

Amazon

AWS Rekognition

Age 19-36
Gender Male, 82.7%
Happy 3.4%
Calm 2.3%
Angry 3.9%
Disgusted 19.2%
Sad 65.2%
Surprised 2.3%
Confused 3.7%
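
The Rekognition emotion percentages above sum to roughly 100, so the face is usually summarized by the highest-scoring entry. A minimal sketch (not part of the record), using exactly the values listed:

```python
# Emotion percentages from the AWS Rekognition face analysis above.
emotions = {
    "Happy": 3.4, "Calm": 2.3, "Angry": 3.9, "Disgusted": 19.2,
    "Sad": 65.2, "Surprised": 2.3, "Confused": 3.7,
}

# The dominant emotion is simply the key with the largest score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Sad
```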

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a person sitting in front of a window 64%
a person sitting in front of a window 51%
a person standing in front of a window 50.9%