Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Photographs

Machine Generated Data

Tags

Amazon

Chair 99.9%
Furniture 99.9%
Human 99.6%
Person 99.6%
Tripod 99.2%
Photography 71.5%
Photo 71.5%
Photographer 57.3%
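The Amazon tags above are the shape of output returned by AWS Rekognition's DetectLabels operation, with confidences expressed as percentages. A minimal sketch of how such tag lines can be produced from a Rekognition-style response; the response dict below is illustrative (values copied from the tags above), not a live API call for this image:

```python
# Illustrative Rekognition-style DetectLabels response. A real call
# would use boto3's rekognition client (detect_labels); here we only
# show how label/confidence pairs become the tag lines above.
response = {
    "Labels": [
        {"Name": "Chair", "Confidence": 99.9},
        {"Name": "Furniture", "Confidence": 99.9},
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Tripod", "Confidence": 99.2},
    ]
}

def format_tags(resp, min_confidence=50.0):
    """Render label/confidence pairs as 'Name NN.N%' lines,
    sorted by descending confidence."""
    labels = [l for l in resp["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:.1f}%" for l in labels]

for line in format_tags(response):
    print(line)
```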

Clarifai

people 99.6%
monochrome 99.5%
one 97.6%
two 96.8%
adult 96.7%
man 96.3%
room 94.4%
indoors 90.7%
woman 90.5%
window 87.6%
street 85.4%
family 84.1%
group 83.6%
chair 81.7%
portrait 81.6%
child 80.7%
shadow 78.4%
three 77.9%
home 76.6%
wall 76.2%

Imagga

vacuum 32.5%
man 25.5%
male 22.7%
device 19.8%
equipment 19.4%
business 17.6%
people 17.3%
person 17.3%
tripod 16.4%
worker 16%
working 15.9%
appliance 15.9%
businessman 15%
support 14.2%
men 13.7%
adult 12.9%
work 12.6%
rack 12.4%
telephone 12.3%
one 11.9%
technology 11.9%
modern 11.2%
radio 11.1%
interior 10.6%
indoors 10.5%
attractive 10.5%
cleaner 10.3%
occupation 10.1%
house 10%
office 9.9%
suit 9.9%
building 9.5%
lifestyle 9.4%
computer 9.1%
room 9.1%
silhouette 9.1%
fashion 9%
microphone 9%
style 8.9%
job 8.8%
home 8.8%
chair 8.7%
repair 8.6%
professional 8.4%
portrait 8.4%
phone 8.3%
holding 8.3%
window 8.2%
alone 8.2%
loudspeaker 8.1%
pay-phone 8.1%
tool 8.1%
durables 8%
standing 7.8%
black 7.8%
studio 7.6%
bag 7.5%
clean 7.5%
laptop 7.5%
camera 7.4%
digital 7.3%
television 7.1%
to 7.1%

Google

Microsoft

floor 96.6%
indoor 94.4%
person 88.8%

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 53%
Sad 49%
Disgusted 45.3%
Confused 45.4%
Surprised 45.4%
Happy 46.7%
Angry 46%
Calm 47.1%
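The face-analysis block follows the shape of Rekognition's DetectFaces output (an age range, a gender estimate, and a list of emotion confidences). A hedged sketch of summarizing such a response into the lines above; the face dict below is illustrative (values copied from the analysis above), not a live service call:

```python
# Illustrative Rekognition-style DetectFaces face detail. A real call
# would use boto3's detect_faces with Attributes=["ALL"]; here we only
# show how one face detail becomes the summary lines above.
face = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Female", "Confidence": 53.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 47.1},
        {"Type": "HAPPY", "Confidence": 46.7},
        {"Type": "ANGRY", "Confidence": 46.0},
        {"Type": "SAD", "Confidence": 49.0},
    ],
}

def summarize_face(f):
    """Summarize one face detail as an age range, a gender estimate,
    and the highest-confidence emotion."""
    age = f"{f['AgeRange']['Low']}-{f['AgeRange']['High']}"
    gender = f"{f['Gender']['Value']}, {f['Gender']['Confidence']:.0f}%"
    top = max(f["Emotions"], key=lambda e: e["Confidence"])
    emotion = f"{top['Type'].title()} {top['Confidence']:.0f}%"
    return age, gender, emotion

age, gender, emotion = summarize_face(face)
print("Age", age)
print("Gender", gender)
print("Top emotion", emotion)
```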

Feature analysis

Amazon

Chair 99.9%
Person 99.6%

Captions

Microsoft

a person standing in a room 91.9%
a person in a black suitcase 63.2%
a person standing in front of a laptop 62%
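The captions above resemble output from the Microsoft Azure Computer Vision "describe image" operation, which returns candidate captions with confidences on a 0-1 scale. A hedged sketch of extracting caption lines from such a response; the response dict below is illustrative (values copied from the captions above), not a live REST call:

```python
# Illustrative Azure Computer Vision "describe" response. A real call
# would hit the service's Describe/Analyze REST endpoint with an API
# key; here we only show how captions become the lines above.
response = {
    "description": {
        "captions": [
            {"text": "a person standing in a room", "confidence": 0.919},
            {"text": "a person in a black suitcase", "confidence": 0.632},
            {"text": "a person standing in front of a laptop", "confidence": 0.62},
        ]
    }
}

def format_captions(resp):
    """Render each caption as 'text NN.N%', scaling the 0-1
    confidence to a percentage."""
    return [
        f"{c['text']} {c['confidence'] * 100:g}%"
        for c in resp["description"]["captions"]
    ]

for line in format_captions(response):
    print(line)
```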