Human Generated Data

Title

[View from above of two parked cars]

Date

1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.468.31

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Nature 99.8
Outdoors 95
Fog 75.6
Human 75.2
Person 75.2
Vehicle 72.5
Train 72.5
Transportation 72.5
Weather 70.8
Smoke 68.2
Snow 61.4
Building 57.1
City 57.1
Town 57.1
Urban 57.1
Metropolis 57.1
Person 45.3

Clarifai
created on 2019-11-19

people 98.7
abandoned 97.5
no person 97.2
room 96.8
wall 95.4
light 92.8
window 92.3
adult 92.3
one 91.8
decay 90.5
offense 90.5
monochrome 90.5
abstract 90
indoors 89.4
empty 88.8
man 88.8
inside 88.8
house 87.2
art 86.8
texture 86.5

Imagga
created on 2019-11-19

cell 50
old 25.1
building 20.1
architecture 20
wall 18.3
travel 18.3
grunge 17.9
stone 16.9
ancient 16.4
landscape 16.4
ship 16.3
texture 16
vessel 15.9
dark 15.9
aged 14.5
vintage 14.1
city 13.3
tourism 13.2
urban 13.1
rock 13
antique 13
water 12.7
sky 12.1
light 12
black 12
sand 11.8
sunset 11.7
weathered 11.4
grungy 11.4
natural 11.4
construction 11.1
dirty 10.8
history 10.7
brick 10.7
design 10.7
pattern 10.3
industrial 10
structure 9.8
retro 9.8
river 9.8
metal 9.7
shipwreck 9.5
sea 9.4
box 9.1
park 9.1
vehicle 8.9
brown 8.8
textured 8.8
container 8.7
historical 8.5
famous 8.4
wood 8.3
room 8.3
craft 8.3
historic 8.2
scenery 8.1
horizon 8.1
wooden 7.9
scenic 7.9
scene 7.8
rust 7.7
industry 7.7
concrete 7.7
damaged 7.6
house 7.5
monument 7.5
ocean 7.5
outdoors 7.5
part 7.4
town 7.4
exterior 7.4
street 7.4
road 7.2
car 7.2
transportation 7.2
tower 7.2
material 7.1
mountain 7.1
surface 7.1

Google
created on 2019-11-19

Black 94.8
Photography 62.4
Rectangle 50.5

Microsoft
created on 2019-11-19

text 73.6
house 71.5
fog 70.6
white 66.4
black and white 66.2
military vehicle 64.8
curb 6.5

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50%
Happy 49.5%
Surprised 49.5%
Sad 49.6%
Calm 49.5%
Confused 49.5%
Fear 49.5%
Disgusted 49.5%
Angry 50.3%

Feature analysis

Amazon

Person 75.2%
Train 72.5%

Captions

Microsoft

a black and white photo of a vehicle 26.6%

Text analysis

Google

www
www