Human Generated Data

Title

[Lyonel Feininger in Deep]

Date

1932

People

Artist: Unidentified Artist

Classification

Photographs

Machine Generated Data

Tags

Amazon

Furniture 98.8
Person 97.1
Human 97.1
Apparel 92.1
Clothing 92.1
Chair 86.9
Face 81.6
Sitting 77.3
Couch 75.6
Photo 63.4
Portrait 63.4
Photography 63.4
Shoe 55.4
Footwear 55.4

Clarifai

people 99.9
one 99.3
adult 98.9
man 98.4
wear 97.1
two 96.9
war 92.8
military 92.7
administration 89.5
reclining 87.2
soldier 86.2
woman 86.2
portrait 85.9
blanket 84.2
group 84.2
vehicle 83.8
transportation system 83.2
furniture 80.7
child 80.2
basket 79

Imagga

person 27.9
man 26.2
adult 25.5
sitting 24
people 24
male 23.9
lifestyle 23.1
child 19.9
happy 18.2
attractive 17.5
sofa 15.9
holding 14.9
smiling 14.5
cute 14.3
portrait 14.2
outdoors 14.2
wicker 13.8
fashion 13.6
home 13.6
couch 13.5
women 13.4
love 13.4
couple 13.1
lady 13
human 12.7
room 12.3
face 12.1
clothing 12
pretty 11.9
chair 11.5
one 11.2
casual 11
work 10.9
armchair 10.8
cheerful 10.6
old 10.4
looking 10.4
furniture 10.2
happiness 10.2
model 10.1
relax 10.1
relaxation 10
relaxing 10
park 9.9
family 9.8
kid 9.7
together 9.6
sexy 9.6
expression 9.4
senior 9.4
product 9.3
guy 9.3
summer 9
seat 8.8
indoors 8.8
hair 8.7
boy 8.7
youth 8.5
two 8.5
black 8.4
elegance 8.4
joy 8.4
grandfather 8.3
fun 8.2
bench 8.2
dress 8.1
body 8
look 7.9
husband 7.8
wall 7.7
trainer 7.6
loving 7.6
laptop 7.6
leisure 7.5
20s 7.3
rest 7.3
sensuality 7.3
musical instrument 7

Microsoft

clothing 95
person 91.1
black and white 86.6
man 80.4
human face 60.1

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 97.7%
Sad 6%
Surprised 1%
Confused 6.2%
Calm 81%
Happy 1.6%
Angry 3.6%
Disgusted 0.6%

AWS Rekognition

Age 26-43
Gender Male, 74.3%
Angry 4.1%
Calm 80%
Sad 3.4%
Disgusted 6.3%
Surprised 2.7%
Happy 1.6%
Confused 1.9%

Feature analysis

Amazon

Person 97.1%

Captions

Microsoft

a person sitting on a bed 57.6%
a person sitting in a bag 57.5%
a person sitting on a suitcase 57.4%