Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 99.8
Person 99.8
Furniture 99.8
Chair 99.8
Tripod 99.7
Photo 67.4
Photography 67.4
Floor 58.5

Clarifai

people 99.6
monochrome 99.3
adult 97.6
one 97.3
room 97.1
man 94.9
woman 93.1
two 92.4
indoors 91.1
portrait 91
window 88
street 87.6
girl 87.5
family 86.8
chair 85.2
group 83.2
furniture 82.3
wall 80.7
music 80.7
wear 80.6

Imagga

rack 100
tripod 100
support 89.3
man 26.2
male 22.7
interior 19.5
business 19.4
silhouette 19
people 19
person 17.2
room 16.4
modern 16.1
equipment 15.8
chair 14.3
indoors 13.2
men 12.9
black 12
office 11.2
window 11
studio 10.6
businessman 10.6
camera 10.4
technology 10.4
table 10.4
building 10.3
floor 10.2
work 10.2
occupation 10.1
light 10
urban 9.6
life 9.4
architecture 9.4
glass 9.3
professional 9.3
painting 9
style 8.9
working 8.8
home 8.8
lifestyle 8.7
adult 8.4
attractive 8.4
house 8.4
digital 8.1
standing 7.8
microphone 7.8
wall 7.7
industry 7.7
outdoor 7.6
bag 7.5
alone 7.3
success 7.2
domestic 7.2
portrait 7.1
worker 7.1
job 7.1
travel 7

Google

Microsoft

floor 94.1
indoor 89

Face analysis

Amazon

AWS Rekognition

Age 38-59
Gender Male, 97%
Sad 1.4%
Calm 94%
Confused 0.5%
Disgusted 0.6%
Surprised 1%
Happy 1.1%
Angry 1.4%

Feature analysis

Amazon

Person 99.8%
Chair 99.8%

Captions

Microsoft

a person standing in a room 95.7%
a person standing in front of a computer 78.1%
a man and a woman standing in a room 78%