Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Machine Generated Data

Tags (confidence scores)

Amazon

Human 99.7
Person 99.7
Tripod 99.3
Photo 69.9
Photography 69.9
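The numbers beside each tag are confidence scores on a 0–100 scale. As an illustration only (not part of the original record), label lists like the Amazon one above are commonly filtered by a minimum-confidence threshold before display; a minimal sketch in Python, using the scores listed above:

```python
# Illustrative sketch: filter (label, confidence) pairs by a minimum
# confidence threshold. The data below is the Amazon tag list above.
AMAZON_TAGS = [
    ("Human", 99.7),
    ("Person", 99.7),
    ("Tripod", 99.3),
    ("Photo", 69.9),
    ("Photography", 69.9),
]

def filter_labels(tags, min_confidence=90.0):
    """Keep only labels whose score meets the confidence threshold."""
    return [label for label, score in tags if score >= min_confidence]

print(filter_labels(AMAZON_TAGS))  # ['Human', 'Person', 'Tripod']
```

At the default 90.0 threshold only the three high-confidence labels survive; lowering the threshold admits the weaker "Photo" and "Photography" guesses.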

Clarifai

monochrome 99.2
people 99.2
one 95.8
adult 95.5
man 93.7
step 92.8
wall 90.8
room 87.3
music 84.7
wood 83.7
furniture 82.6
portrait 82.6
inside 82.4
two 81.4
house 81
woman 80.8
indoors 80.8
door 80.1
family 79.9
shadow 78.9

Imagga

interior 34.5
wall 32
room 31.9
barrier 28.7
structure 27.1
modern 25.2
window 24.2
architecture 23.2
building 22.3
light 21.4
obstruction 21.4
house 20
inside 19.3
chair 19
floor 18.6
indoors 18.4
home 16.7
urban 16.6
furniture 16.6
city 15.8
office 15.2
lamp 14.9
table 14.7
business 14.6
construction 14.5
device 13.8
design 13.5
glass 13.2
apartment 12.4
step 11.5
support 11.5
door 11
nobody 10.9
decor 10.6
living 10.4
empty 10.3
luxury 10.3
space 10.1
indoor 10
wood 10
equipment 9.7
wooden 9.7
metal 9.6
style 9.6
man 9.4
seat 8.7
hall 8.7
black 8.4
elegance 8.4
old 8.4
technology 8.2
transportation 8.1
stucco 8
passage 8
corridor 7.9
chairs 7.8
travel 7.7
steel 7.7
silhouette 7.4
domestic 7.2

Google

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.4%
Happy 22.1%
Disgusted 10.5%
Surprised 9.8%
Confused 5%
Calm 17.5%
Sad 19.6%
Angry 15.5%

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a black and white photo of a man 68.8%
a man standing in a room 68.7%
black and white photo of a man 61.9%