Human Generated Data

Title

[California]

Date

1936-1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.141.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2021-04-04

Person 99.7
Human 99.7
Person 98.2
Shelter 98
Outdoors 98
Building 98
Countryside 98
Rural 98
Nature 98
Food 81
Meal 81
Vehicle 80.2
Transportation 80.2
Housing 74.4
Machine 68.5
Wheel 68.5
Truck 66.8
People 63
Land 62.9
Automobile 61.9
Car 61.3
Vegetation 59.7
Plant 59.7
Caravan 58.8
Van 58.8
Face 56.9
Hut 55.3
Shack 55.3
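
A minimal sketch of how label/confidence pairs like those above can be requested from Amazon Rekognition's DetectLabels API via boto3. The file name, region, and thresholds are placeholders, not part of this record.

```python
# Sketch: retrieving image labels with Amazon Rekognition's DetectLabels API via boto3.
# The file name and region are placeholders, not taken from the record above.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("feininger_california.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=55.0,  # drop labels below this confidence (percent)
)

# Print each label with its confidence, mirroring the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```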

Clarifai
created on 2021-04-04

abandoned 98.9
vehicle 98.8
broken 96.8
people 95.9
vintage 95.6
decay 95.3
no person 94
retro 93.5
wreck 93.5
disrepair 93
campsite 92.9
war 91.8
old 91.5
wreckage 90.7
transportation system 90.6
adult 88.4
rusty 88
military 87
home 87
wagon 85
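
A hedged sketch of a request to Clarifai's v2 predict endpoint, which returns concept/value pairs like the list above. The API key, model identifier, and image URL are placeholders, and Clarifai's current API may use a different auth scheme or model name.

```python
# Sketch: requesting concept predictions from Clarifai's v2 predict endpoint (REST).
# The model ID, API key, and image URL are placeholders; the current Clarifai API
# may require a different auth scheme or model identifier.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # assumption: key-based auth
MODEL_ID = "general-image-recognition"       # assumption: Clarifai's general model
IMAGE_URL = "https://example.org/feininger_california.jpg"  # placeholder URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concept values come back in 0-1; scale to percent to match the listing above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```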

Imagga
created on 2021-04-04

gravestone 37.6
old 32.1
structure 31.1
freight car 30.7
memorial 29.8
car 27.5
picket fence 25.6
stone 25.2
fence 24.2
grunge 23.9
tree 23.9
wheeled vehicle 23.6
vehicle 23.4
graffito 23.2
antique 22.5
vintage 21.5
landscape 19.4
texture 18.8
decoration 17.9
barrier 17.3
aged 17.2
forest 16.6
grungy 16.1
bench 15.9
winter 15.3
damaged 15.3
retro 14.8
ancient 14.7
snow 14.7
frame 13.3
building 12.9
park bench 12.7
trees 12.5
old fashioned 12.4
scenic 12.3
empty 12
blank 12
grain 12
dirty 11.8
space 11.7
outdoor 11.5
rural 11.5
rusty 11.4
black 11.4
country 11.4
brown 11.1
paper 11
obstruction 11
rough 10.9
material 10.7
torn 10.7
decay 10.6
art 10.4
scene 10.4
cold 10.3
container 10.2
season 10.1
wall 10
conveyance 10
paint 10
border 10
scenery 9.9
abandoned 9.8
burnt 9.7
rustic 9.6
pattern 9.6
sky 9.6
worn 9.6
field 9.2
wallpaper 9.2
wood 9.2
countryside 9.1
graphic 8.8
text 8.7
weathered 8.6
seat 8.5
travel 8.5
city 8.3
park 8.2
water 8
autumn 7.9
grass 7.9
textured 7.9
artistic 7.8
grime 7.8
snowy 7.8
crumpled 7.8
rust 7.7
stain 7.7
clouds 7.6
dark 7.5
mailbox 7.5
historic 7.3
design 7.3
road 7.2
history 7.2
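
A sketch of the corresponding call to Imagga's /v2/tags endpoint, which produces tag/confidence pairs like those above. Credentials and the image URL are placeholders; field names follow Imagga's documented v2 response shape.

```python
# Sketch: tagging an image with Imagga's /v2/tags endpoint over HTTP basic auth.
# Credentials and the image URL are placeholders; field names assume Imagga's
# documented v2 response shape (result.tags[].tag.en / .confidence).
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/feininger_california.jpg"  # placeholder URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```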

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

grass 99.7
outdoor 99.4
vehicle 95.8
land vehicle 95
old 95
black and white 92.9
car 90.1
abandoned 90.1
wheel 86.4
monochrome 67.9
tire 57
decay 54
dirty 13.5
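
A sketch of the equivalent request to Azure Computer Vision's Tag endpoint, which yields tag/confidence pairs like those above. The resource endpoint, key, API version, and image URL are placeholders.

```python
# Sketch: tagging an image with Azure Computer Vision's Tag endpoint (REST).
# Endpoint, key, API version, and image URL are placeholders; confidences come
# back in 0-1 and are scaled to percent to match the listing above.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/feininger_california.jpg"      # placeholder URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```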

Color Analysis

Face analysis

AWS Rekognition

Age 47-65
Gender Male, 82%
Happy 42.4%
Calm 37%
Surprised 11.1%
Sad 2.8%
Angry 2.1%
Disgusted 2.1%
Fear 2%
Confused 0.5%

AWS Rekognition

Age 38-56
Gender Male, 55.2%
Surprised 45.3%
Fear 40.7%
Calm 6.8%
Happy 2.3%
Angry 2.2%
Sad 1.7%
Confused 0.6%
Disgusted 0.4%
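
A minimal sketch of how per-face age, gender, and emotion estimates like the two blocks above can be obtained from AWS Rekognition's DetectFaces API via boto3. The file name and region are placeholders.

```python
# Sketch: face attribute estimates (age range, gender, emotions) with AWS
# Rekognition's DetectFaces API via boto3. File name and region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("feininger_california.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request the full attribute set, including emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```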

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
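
A sketch of how the per-face likelihood buckets above (Very unlikely, Likely, and so on) are reported by the Google Cloud Vision client library. It assumes google-cloud-vision 2.x with application-default credentials; the file name is a placeholder.

```python
# Sketch: per-face likelihood ratings with the Google Cloud Vision client library.
# Assumes google-cloud-vision >= 2.0 and application-default credentials;
# the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("feininger_california.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is reported as a likelihood bucket (VERY_UNLIKELY .. VERY_LIKELY),
    # which corresponds to the "Very unlikely" / "Likely" values above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```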

Feature analysis

Amazon

Person 99.7%
Wheel 68.5%
Truck 66.8%
Car 61.3%

Categories