Human Generated Data

Title

[Lyonel Feininger with ship model]

Date

1930s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.454.11

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Transportation 99.6%
Boat 99.6%
Vehicle 99.6%
Human 99.3%
Person 99.3%
Finger 84.7%
Clothing 77.5%
Apparel 77.5%
Portrait 65.7%
Photography 65.7%
Photo 65.7%
Face 65.7%
Screen 62.8%
Display 62.8%
Monitor 62.8%
LCD Screen 62.8%
Electronics 62.8%
Silhouette 58.4%
People 56.8%

Clarifai
created on 2019-05-30

people 97.7%
monochrome 95%
man 93.7%
art 91.9%
nude 90.9%
dark 89.3%
light 86.5%
portrait 85.5%
shadow 84.8%
adult 84.8%
woman 83.7%
one 83.5%
fashion 81.7%
child 81.3%
studio 78.9%
black and white 78.6%
artistic 77.7%
girl 77.4%
abstract 75.9%
model 73.9%

Imagga
created on 2019-05-30

vessel 79.9%
bucket 64.1%
container 41.9%
tub 22.5%
black 20%
light 16.7%
design 14.1%
digital 12.1%
anvil 11.9%
space 11.6%
hat 11.4%
render 11.2%
paper 11.2%
technology 11.1%
color 10.6%
art 10.5%
utensil 10.4%
flame 10.3%
funnel 10.2%
block 10.1%
3d 9.3%
fractal 9.2%
dark 9.2%
modern 9.1%
futuristic 9%
person 8.7%
motion 8.6%
cap 8.5%
business 8.5%
adult 8.4%
style 8.2%
symbol 8.1%
object 8.1%
success 8%
computer 8%
holiday 7.9%
education 7.8%
health 7.6%
reflection 7.5%
silhouette 7.4%
smoke 7.4%
effect 7.3%
graphic 7.3%
yellow 7.3%
office 7.2%
bathtub 7%

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

ship 94.3%
indoor 93.1%
black and white 89.5%
monochrome 80.7%
boat 76.1%
watercraft 68.7%
dark 48%

Color Analysis

Feature analysis

Amazon

Boat 99.6%
Person 99.3%

Categories

Captions

Microsoft
created on 2019-05-30

a person in a dark room 54.5%
a person standing in a dark room 50.1%