Human Generated Data

Title

[Lyonel Feininger with ship model]

Date

1930s

People

Artist: Unidentified Artist

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 97.9
Person 97.9
Finger 81.8
Photography 64.2
Photo 64.2
Face 63.7
Portrait 63.7

Clarifai

people 99.1
monochrome 98.5
portrait 97.5
one 97.3
adult 95.6
man 95.1
music 92.8
light 91.9
black and white 91.6
musician 90.2
profile 90.1
woman 89.5
girl 88.4
dark 88.3
art 88.3
instrument 86.2
concert 85.8
desktop 85.7
studio 84.7
nude 81.7

Imagga

stringed instrument 59.3
musical instrument 42.3
bowed stringed instrument 40.6
piano 34.1
grand piano 33.2
cello 31.0
keyboard instrument 28.4
percussion instrument 23.9
black 22.8
man 18.8
light 18.7
person 18.1
computer 17.8
people 15.6
adult 15.1
smoke 13.9
male 13.5
happy 13.2
hand 12.9
business 12.8
style 12.6
futuristic 12.6
dark 12.5
digital 12.2
violin 11.6
hair 11.1
work 11.1
art 11.1
portrait 11
energy 10.9
worker 10.7
working 10.6
laptop 10.6
modern 10.5
flame 10.3
equipment 10.2
design 10.1
fractal 10.1
holding 9.9
technology 9.6
artistic 9.6
space 9.3
attractive 9.1
job 8.8
graphic 8.8
wallpaper 8.4
power 8.4
pretty 8.4
color 8.3
one 8.2
effect 8.2
music 8.1
office 8
body 8
night 8
instrument 7.9
smile 7.8
render 7.8
model 7.8
motion 7.7
harpsichord 7.6
device 7.4
viol 7.3
lighting 7.2

Microsoft

Feature analysis

Amazon

Person 97.9%

Captions

Microsoft

a man and a woman looking at the camera 33.7%