Human Generated Data

Title

[Looking down on to patio]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.439.20

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (label and confidence score, 0–100)

Amazon
created on 2023-10-24

Outdoors 99.3
Nature 94.2
Weather 88.8
Person 76.5
Credit Card 67.7
Text 67.7
Head 62.5
Photographic Film 59.7
Snow 57.6
Winter 56
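
These labels have the shape of output from Amazon Rekognition's DetectLabels API: a label name paired with a 0–100 confidence score. A minimal sketch of such a call with boto3; the region, bucket, and object key are illustrative placeholders, and the MinConfidence cutoff is an assumption chosen to roughly match the lowest score listed above:

    import boto3

    # Placeholders: region, bucket, and key are illustrative, not the museum's.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "brlf-439-20.jpg"}},
        MaxLabels=10,
        MinConfidence=55,  # assumption: roughly the floor of the scores above
    )

    # Each label pairs a name with a confidence score, e.g. "Outdoors 99.3".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

DetectLabels can also return bounding-box Instances for object labels such as Person, which is the kind of information the feature analysis section below reflects.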

Clarifai
created on 2023-10-15

filmstrip 99.8
negative 99.6
movie 99.6
slide 99
dirty 98.7
exposed 98.7
cinematography 98.5
old 98.2
desktop 97.8
antique 97.6
collage 97.4
retro 96.7
picture frame 95.8
blank 95.7
monochrome 95.6
vintage 95.5
photograph 95.4
margin 95.1
rough 94.9
illustration 94.5
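
These concepts have the shape of predictions from Clarifai's general image-recognition model, which its v2 REST API reports as 0–1 values (so "filmstrip 99.8" above corresponds to a value of 0.998). A minimal sketch using the requests library; the API key, model identifier, and image URL are assumptions:

    import requests

    # Placeholders: the key and image URL are illustrative only.
    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )
    resp.raise_for_status()

    # Concepts arrive as (name, value) pairs with value in [0, 1];
    # multiplying by 100 reproduces scores like those listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')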

Imagga
created on 2019-02-03

negative 74.5
film 66.6
sewage system 48.9
facility 41.6
photographic paper 34.4
photographic equipment 22.9
frame 22.5
vintage 22.3
old 21.6
grunge 17.9
strip 17.5
black 17.4
retro 17.2
border 16.3
camera 15.7
movie 15.5
graphic 15.3
texture 15.3
art 15.1
antique 14.7
pattern 13
blank 12.9
photographic 12.7
cinema 12.7
design 11.8
aged 11.8
space 11.6
screen 11.6
business 11.5
ancient 11.2
sky 10.8
silhouette 10.8
slide 10.7
photograph 10.6
damaged 10.5
line 10.3
paper 10.2
35mm 9.8
reel 9.8
landscape 9.7
text 9.6
roll 9.5
canvas 9.5
symbol 9.4
edge 8.7
empty 8.6
grungy 8.5
number 8.4
clip 8.4
entertainment 8.3
digital 8.1
material 8
office 8
clock 8
filmstrip 7.9
cloud 7.8
instrument 7.4
message 7.3
rough 7.3
paint 7.2
dirty 7.2
road 7.2
sunset 7.2
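
These tags have the shape of output from Imagga's v2 /tags endpoint, which authenticates with an API key and secret over HTTP Basic auth and reports 0–100 confidences. A minimal sketch, assuming requests and placeholder credentials:

    import requests

    # Placeholders: credentials and image URL are illustrative only.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence, e.g. "negative 74.5".
    for entry in resp.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')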

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

Feature analysis

Amazon

Person 76.5%
Credit Card 67.7%

Categories

Imagga

interior objects 99.5%
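
The category above ("interior objects", 99.5%) matches the output of Imagga's v2 categorization endpoint; "personal_photos" is the name of Imagga's public demo categorizer and is an assumption here, as are the credentials and image URL. A minimal sketch:

    import requests

    # Placeholders: credentials, categorizer, and image URL are illustrative.
    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    # Categories pair a name with a 0-100 confidence, e.g. "interior objects 99.5".
    for category in resp.json()["result"]["categories"]:
        print(f'{category["name"]["en"]} {category["confidence"]:.1f}')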

Text analysis

Amazon

19
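
The single detected string ("19") is the kind of result Amazon Rekognition's DetectText operation produces. A minimal sketch, again with boto3 and placeholder object names:

    import boto3

    # Placeholders: bucket and key are illustrative, not the museum's.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "brlf-439-20.jpg"}}
    )

    # LINE-level detections are whole strings such as "19";
    # WORD-level detections are the individual tokens inside them.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')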