Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score, given as a percentage.

Amazon

Tripod 99.8
Person 96.7
Human 96.7
Person 95
Photography 78
Photo 78
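
A minimal sketch of how labels like these can be requested from AWS Rekognition with boto3; the file name, label cap, and confidence threshold are illustrative assumptions, not part of this record.

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local copy of the photograph.
    with open("wilmarth_untitled.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,        # cap on the number of labels returned
            MinConfidence=75.0,  # drop labels below this confidence
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")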

Clarifai

people 99.7
monochrome 99.2
one 97.1
man 96.8
adult 96.5
two 96
chair 92
woman 91.5
room 91.5
furniture 90.3
music 87.2
street 85.3
indoors 83.3
wear 82
portrait 80.8
group 79
three 77.6
child 77.5
sit 75.4
black and white 75.2
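
For comparison, a hedged sketch of a prediction call against Clarifai's v2 REST API using the requests library; the model identifier, API key placeholder, and image URL are all assumptions.

    import requests

    API_KEY = "<clarifai-api-key>"           # placeholder credential
    MODEL_ID = "general-image-recognition"   # assumed general visual model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {
            "url": "https://example.com/wilmarth_untitled.jpg"}}}]},
    )
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai reports a 0-1 value; the record above shows it as a percentage.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")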

Imagga

pay-phone 37.8
telephone 33.9
device 28.2
man 26.2
treadmill 23.2
silhouette 23.2
male 22.7
equipment 21.4
harp 18.5
people 18.4
person 15.7
building 13.7
microphone 13.5
black 12.6
support 12.4
adult 11.6
business 10.9
sky 10.8
urban 10.5
window 10.1
alone 10
city 10
outdoor 9.9
call 9.8
attractive 9.8
outdoors 9.7
sun 9.7
modern 9.1
portrait 9.1
tripod 9
sunset 9
men 8.6
chair 8.5
travel 8.4
light 8
interior 8
businessman 7.9
life 7.8
youth 7.7
fashion 7.5
one 7.5
park 7.4
exercise 7.3
office 7.2
worker 7.1
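
Imagga exposes its tagger as a REST endpoint secured with HTTP basic auth; a sketch along those lines, with placeholder credentials and a hypothetical image URL.

    import requests

    # Placeholders: Imagga issues an API key/secret pair for basic auth.
    auth = ("<imagga-api-key>", "<imagga-api-secret>")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/wilmarth_untitled.jpg"},
        auth=auth,
    )
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")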

Google

white 95.8
black 95.8
photograph 95.4
black and white 92.2
standing 88.1
furniture 87
photography 84.2
monochrome photography 82.7
snapshot 81.8
table 68.5
monochrome 67.4
chair 60.7
angle 56.9
window 55.7
film noir 52.8
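
Labels of this kind come from Google Cloud Vision's label detection; a minimal sketch with the google-cloud-vision client, again using a hypothetical file name.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application default credentials

    with open("wilmarth_untitled.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # score is 0-1; the record above shows it as a percentage
        print(f"{label.description} {label.score * 100:.1f}")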

Microsoft

indoor 89.5
white 63.4
tripod 34.3

Face analysis

AWS Rekognition returned results for two detected faces (one block per face); Microsoft Cognitive Services reported one.

AWS Rekognition

Age 35-52
Gender Female, 71.5%
Disgusted 9.9%
Sad 26%
Calm 22.1%
Surprised 9.7%
Happy 3.7%
Confused 7.7%
Angry 21%

AWS Rekognition

Age 15-25
Gender Female, 96.7%
Confused 1.2%
Surprised 1%
Angry 0.6%
Happy 93.2%
Calm 2.7%
Disgusted 0.5%
Sad 0.8%
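
Age ranges, gender, and emotion scores like the two blocks above map onto AWS Rekognition's DetectFaces response; a sketch assuming the same hypothetical image file.

    import boto3

    client = boto3.client("rekognition")

    with open("wilmarth_untitled.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")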

Microsoft Cognitive Services

Age 19
Gender Female

Feature analysis

Amazon

Person 96.7%

Captions

Microsoft

a chair sitting in front of a mirror 40.5%
a tripod sitting in front of a mirror 38.3%
a tripod in a room 38.2%
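
Ranked captions with confidences match the shape of Azure Computer Vision's describe operation; a sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and image URL.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholders: endpoint and key come from an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://<resource-name>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )

    analysis = client.describe_image(
        "https://example.com/wilmarth_untitled.jpg",
        max_candidates=3,  # return several candidate captions, as above
    )
    for caption in analysis.captions:
        # confidence is 0-1; the record above shows it as a percentage
        print(f"{caption.text} {caption.confidence * 100:.1f}")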