Human Generated Data

Title

[Lyonel Feininger with ship model]

Date

1930s

People

Artist: Unidentified Artist

Classification

Photographs

Machine Generated Data

Tags

Amazon

Vehicle 89.7
Transportation 89.7
Boat 89.7
Human 87.6
Person 70.1
Face 68.9
Bowl 58.1
Nature 56
Food 55.8
Meal 55.8

Clarifai

monochrome 99
people 98
nude 96
light 95.3
portrait 94.1
studio 94.1
black and white 93.5
girl 93.3
art 93
abstract 92.5
dark 92.2
shadow 90.4
stage 90.1
desktop 88.9
concert 88.5
model 88.4
man 88.3
street 88
adult 87.6
music 86.3

Imagga

grand piano 100
piano 83.1
stringed instrument 63.4
percussion instrument 62.4
keyboard instrument 62.3
musical instrument 42.6
light 27.4
black 24.8
dark 23.4
flame 22.5
fire 22.5
hot 20.1
smoke 15.9
heat 15.7
vessel 15.3
cooking utensil 13.1
art 13.1
digital 13
wok 12.6
color 12.2
pan 12
water 12
fireplace 11.7
night 11.5
curve 11.4
motion 11.1
pattern 10.9
energy 10.9
burning 10.6
burn 10.6
design 10.1
fractal 10.1
3d 10.1
danger 10
wallpaper 10
flames 9.8
texture 9.7
technology 9.6
people 9.5
evening 9.3
space 9.3
orange 9.2
warm 9.2
effect 9.1
kitchen utensil 8.7
man 8.7
bright 8.6
silhouette 8.3
computer 8.1
sunset 8.1
metal 8
artistic 7.8
render 7.8
equipment 7.6
dynamic 7.5
boat 7.4
style 7.4
flow 7.4
swirl 7.4
fantasy 7.2
futuristic 7.2
colorful 7.2

Microsoft

ship 95.9
boat 89.8
indoor 87.7
watercraft 84.3
black and white 68.6
dark 30.8

Feature analysis

Amazon

Boat 89.7%
Person 70.1%

Captions

Microsoft

a person sitting in a dark room 62.4%
a cat sitting in a dark room 37.5%
a person sitting in a dark room 37.4%