Human-Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Photographs

Machine-Generated Data

Tags (model confidence, %)

Amazon

Tripod 99.8
Human 99.1
Person 99.1
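
Labels like these typically come from Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming a local copy of the image (the filename is a placeholder; the record does not document the museum's actual pipeline):

    import boto3

    # Sketch: request Rekognition labels comparable to the Amazon tags above.
    client = boto3.client("rekognition")

    with open("wilmarth_untitled.jpg", "rb") as f:  # placeholder filename
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=70,
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')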

Clarifai

people 99.7
monochrome 99.1
adult 98.8
one 96.9
woman 96
portrait 95.4
man 94.9
girl 94.6
room 92.3
street 92.2
two 90.6
model 83.9
indoors 83.3
music 82.7
chair 81
studio 76.7
wear 76.6
actor 75.5
light 73.1
ladder 72.9
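
The Clarifai list has the shape of its general image-recognition model output. A minimal sketch against the v2 REST predict endpoint, assuming a hosted image URL; the API key, model ID, and URL are placeholders, and Clarifai reports concept values on a 0-1 scale:

    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_KEY"},  # placeholder key
        json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
    )
    resp.raise_for_status()

    # Scale the 0-1 concept values to match the percentages listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')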

Imagga

tripod 100
rack 100
support 88.4
man 28.2
sport 27.2
male 24.1
silhouette 23.2
people 21.2
golf 21
person 17.7
crutch 17.1
cleaner 16.6
golfer 15.6
ball 14.9
outdoor 14.5
club 14.1
golfing 13.7
adult 13.6
outdoors 13.4
leisure 13.3
walking 13.3
player 13.2
play 12.9
staff 12.5
course 12.4
grass 11.9
exercise 11.8
game 11.6
stick 11.3
men 11.2
playing 10.9
sunset 10.8
active 10.8
hole 10.5
one 10.5
sky 10.2
sports 10.2
putt 9.8
hobby 9.5
water 9.3
professional 9.3
travel 9.2
vacation 9
recreation 9
mountain 8.9
putter 8.9
putting 8.8
swing 8.8
standing 8.7
senior 8.4
black 8.4
landscape 8.2
activity 8.1
lifestyle 8
business 7.9
tee 7.8
hit 7.8
happy 7.5
sunrise 7.5
sun 7.2
women 7.1
posing 7.1
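
Imagga exposes comparable tagging through its /v2/tags endpoint, authenticated with HTTP Basic credentials. A minimal sketch; the credentials and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/image.jpg"},  # placeholder URL
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')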

Google

(no tags returned)

Microsoft

floor 96
black 72.6

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 66.3%
Surprised 6.9%
Angry 17.2%
Disgusted 27.2%
Confused 10.8%
Happy 6.6%
Calm 19.9%
Sad 11.3%
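
These attributes match the shape of Rekognition's DetectFaces response when all facial attributes are requested: an age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch (the filename is a placeholder):

    import boto3

    client = boto3.client("rekognition")

    with open("wilmarth_untitled.jpg", "rb") as f:  # placeholder filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')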

Microsoft Cognitive Services

Age 46
Gender Male
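
The single age and gender estimate matches the Azure Face API's detect call with returnFaceAttributes=age,gender (attributes Microsoft has since retired). A minimal sketch over REST; the endpoint, key, and image URL are placeholders:

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    resp = requests.post(
        f"{endpoint}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_FACE_KEY"},  # placeholder key
        json={"url": "https://example.com/image.jpg"},  # placeholder URL
    )
    resp.raise_for_status()

    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f'Age {attrs["age"]:.0f}')
        print(f'Gender {attrs["gender"].capitalize()}')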

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
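
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block carries no numeric scores. A minimal sketch with the google-cloud-vision client; the filename is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("wilmarth_untitled.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)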

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a black and white photo of a person 73.5%
a person standing in a room 73.4%
a person in a white room 73.3%
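
The Microsoft tags and captions above match the shape of Azure Computer Vision's analyze operation with visualFeatures=Tags,Description, which returns 0-1 confidences. A minimal sketch over REST; the endpoint, key, and image URL are placeholders:

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_VISION_KEY"},  # placeholder key
        json={"url": "https://example.com/image.jpg"},  # placeholder URL
    )
    resp.raise_for_status()
    analysis = resp.json()

    # Scale 0-1 confidences to match the percentages in this record.
    for tag in analysis["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
    for caption in analysis["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')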