Human Generated Data

Title

[Lyonel Feininger]

Date

1934

People

Artist: Unidentified Artist

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 95.7
Human 95.7
Face 89.3
Apparel 87.7
Clothing 87.7
Art 85.9
Portrait 67.6
Photo 67.6
Photography 67.6
Sculpture 62.2
Statue 60
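Each machine tag above pairs a label with a confidence score given as a percentage. A minimal sketch of filtering such label/score pairs by a confidence threshold, assuming the tags are available as plain `label score` text lines (the threshold value is an arbitrary choice, not part of the record):

```python
def parse_tags(text, threshold=0.0):
    """Parse 'label score' lines into (label, score) pairs,
    keeping only labels at or above the confidence threshold."""
    tags = []
    for line in text.strip().splitlines():
        # Split on the last space so multi-word labels survive intact.
        label, score = line.rsplit(" ", 1)
        score = float(score)
        if score >= threshold:
            tags.append((label, score))
    return tags

# Amazon tags copied from the record above.
amazon_tags = """\
Person 95.7
Human 95.7
Face 89.3
Apparel 87.7
Clothing 87.7
Art 85.9
Portrait 67.6
Photo 67.6
Photography 67.6
Sculpture 62.2
Statue 60"""

# Keep only the high-confidence labels.
high_confidence = parse_tags(amazon_tags, threshold=85.0)
```

Splitting on the last space (`rsplit`) also handles multi-word labels such as "facial expression 87.7" in the Clarifai list.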

Clarifai

people 99.6
portrait 98
one 97.9
adult 97.9
man 97.5
wear 91.8
leader 90.2
monochrome 88.9
facial expression 87.7
veil 84.1
outfit 77.1
music 76.8
administration 75.3
musician 74.2
chair 74.2
sit 74.2
woman 69.8
religion 69.5
actor 69.3
art 67.3

Imagga

negative 26.3
portrait 25.9
sculpture 24.3
statue 23.8
black 23.6
man 22.8
person 22.6
male 21.3
film 20.8
newspaper 20.3
face 19.2
hair 17.4
old 17.4
product 16.9
people 16.7
photographic paper 16.1
senior 15.9
one 15.7
human 15
creation 14.5
adult 14.3
close 13.7
looking 13.6
art 13.5
men 12.9
eye 12.5
attractive 11.9
head 11.8
model 11.7
photographic equipment 10.7
bust 10.6
hand 9.9
culture 9.4
book jacket 9.1
pretty 9.1
sexy 8.8
body 8.8
lifestyle 8.7
serious 8.6
expression 8.5
sensuality 8.2
jacket 8.1
history 8.1
cadaver 8
covering 7.8
ancient 7.8
closeup 7.4
historic 7.3
sensual 7.3
mask 7.2

Microsoft

human face 98.6
person 96.5
clothing 93.9
smile 93.6
man 91.6
indoor 88.3
ceremony 75.5
portrait 55.4
bowed instrument 14.9

Face analysis

Amazon

AWS Rekognition

Age 19-36
Gender Male, 98.3%
Sad 10.2%
Disgusted 3.0%
Angry 8.2%
Happy 38.3%
Surprised 4.2%
Confused 4.9%
Calm 31.1%
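The emotion scores above sum to roughly 100%, and the most likely emotion is simply the highest-scoring one. A minimal sketch, assuming the scores are held in a plain dict (names and values taken from the record above):

```python
# Emotion confidence scores from the AWS Rekognition face analysis.
emotions = {
    "Sad": 10.2, "Disgusted": 3.0, "Angry": 8.2, "Happy": 38.3,
    "Surprised": 4.2, "Confused": 4.9, "Calm": 31.1,
}

# Dominant emotion: the label with the highest confidence score.
dominant = max(emotions, key=emotions.get)
```

Here the scores for "Happy" and "Calm" are close, so the dominant label alone understates the model's uncertainty.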

Feature analysis

Amazon

Person 95.7%

Captions

Microsoft

a black and white photo of a man 71.1%
a man looking at the camera 70.3%
an old photo of a man 70.2%