Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Machine Generated Data

Tags

Amazon

Tripod 99.7
Human 99.6
Person 99.6
Person 96.5
Photo 71.9
Photography 71.9
Canvas 67.6
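
The Amazon tags above are per-label confidence scores (in percent) of the kind returned by AWS Rekognition's DetectLabels API. Below is a minimal sketch of how such tags can be retrieved with boto3; the file name is hypothetical and AWS credentials are assumed to be configured in the environment.

import boto3

# Minimal sketch: label detection with AWS Rekognition via boto3.
# "photo.jpg" is a hypothetical file; credentials come from the environment.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=60.0,  # roughly the cutoff seen in the list above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")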

Clarifai

people 99.7
monochrome 97.7
adult 97.4
man 96.6
one 96.6
chair 96.5
two 95.2
woman 93.5
room 93
indoors 92.3
furniture 92.3
step 90.9
portrait 87.6
seat 85.1
wear 83.7
street 83.2
group 82
model 78.1
girl 76.5
family 76.1
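
The Clarifai tags are concept predictions from Clarifai's general image-recognition model, with confidences scaled to percent. A sketch against Clarifai's v2 REST predict endpoint follows; the personal access token and image URL are placeholders, and the model path follows Clarifai's documented public general model, which may need adjusting for a given account.

import requests

# Minimal sketch: Clarifai v2 predict call via the REST API.
# CLARIFAI_PAT and IMAGE_URL are placeholders.
CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main"
    "/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a 0-1 confidence; scale to percent to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")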

Imagga

handcart 65.7
container 37
chair 22.5
man 21.5
people 17.8
conveyance 16.6
male 16.3
business 15.8
tripod 14.7
person 14.5
building 13.5
silhouette 13.2
window 12.8
adult 12.3
seat 11.7
basket 11.7
men 11.2
worker 10.7
work 10.3
architecture 10.1
outdoor 9.9
businessman 9.7
office 9.6
wall 9.5
rack 9.4
one 9
metal 8.8
urban 8.7
support 8.7
light 8.7
water 8.7
lifestyle 8.7
furniture 8.7
sitting 8.6
store 8.5
professional 8.4
black 8.4
attractive 8.4
city 8.3
alone 8.2
outdoors 8.2
crutch 8.2
happy 8.1
sun 8
interior 8
working 8
women 7.9
day 7.8
portrait 7.8
empty 7.7
summer 7.7
pretty 7.7
walk 7.6
painter 7.5
suit 7.4
street 7.4
shopping 7.3
smiling 7.2
sexy 7.2
life 7.2
smile 7.1
cleaner 7.1
job 7.1
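
The Imagga tags come from Imagga's auto-tagging service, which reports a 0-100 confidence per tag. A minimal sketch of its v2 tagging endpoint is below; the API key/secret pair and image URL are placeholders.

import requests

# Minimal sketch: Imagga v2 tagging endpoint, authenticated with basic auth.
# IMAGGA_KEY, IMAGGA_SECRET, and IMAGE_URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Imagga reports a 0-100 confidence per tag, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")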

Google

Microsoft

floor 95.8

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 87%
Calm 63.9%
Angry 4.4%
Disgusted 6.2%
Sad 9.2%
Surprised 1.9%
Confused 1.3%
Happy 13%

AWS Rekognition

Age 11-18
Gender Female, 94.1%
Surprised 0.6%
Happy 0.7%
Disgusted 0.7%
Confused 0.3%
Calm 92.7%
Angry 0.7%
Sad 4.3%
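
The two face readings above (age range, gender, and per-emotion confidences) match the shape of AWS Rekognition's DetectFaces output. A minimal boto3 sketch follows; the file name is again hypothetical.

import boto3

# Minimal sketch: face analysis with AWS Rekognition's DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotion estimates.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical file
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")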

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a person standing in a room 88.6%
a black and white photo of a person 68%
a person standing in a room 67.9%
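
The Microsoft captions are candidate image descriptions of the kind produced by Azure Computer Vision's image description feature, each with a confidence score. A sketch against the REST Describe Image endpoint is below; the endpoint, key, and image URL are placeholders, and the API version may differ from the one used to generate this page.

import requests

# Minimal sketch: Azure Computer Vision's Describe Image endpoint.
# AZURE_ENDPOINT, AZURE_KEY, and IMAGE_URL are placeholders.
AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each candidate caption carries a 0-1 confidence; scale to percent as above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")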