Human Generated Data

Title

[View from ocean liner]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.162.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Handrail 98.4
Banister 98.4
Railing 93.6
Clothing 91.2
Apparel 91.2
Nature 86.2
Human 86.1
Person 83.9
Face 82.4
Outdoors 81.3
Snowman 81.3
Snow 81.3
Winter 81.3
Person 79.4
Chair 77.3
Furniture 77.3
People 69.9
Person 63.5
Porch 61.8
Female 56.1
Life Buoy 55.4

Clarifai
created on 2021-04-04

people 99.9
adult 98.7
man 97.6
group together 97.2
two 96.1
administration 95
woman 94.3
one 92.6
group 91.2
wear 90.9
vehicle 89.7
military 89.2
watercraft 88.7
war 83.2
three 83.2
monochrome 83.1
street 82.6
leader 81
outfit 78.9
portrait 76.7

Imagga
created on 2021-04-04

snow 90.9
weather 42
picket fence 41.2
winter 36.6
fence 34.3
barrier 25.9
cold 25.8
ice 23.9
sky 19.1
obstruction 16.7
landscape 16.4
travel 16.2
house 15
frost 14.4
structure 13.9
snowy 13.6
covered 13.6
mountain 13.3
architecture 13.3
season 12.5
negative 12.4
tree 12.3
mailbox 12.3
wall 12
tourist 11.8
building 11.6
tourism 11.5
vacation 11.5
old 11.1
road 10.8
freeze 10.7
scene 10.4
vehicle 10.1
water 10
box 10
cool 9.8
forest 9.6
frozen 9.6
city 9.1
film 9
outdoors 9
scenic 8.8
light 8.7
town 8.4
mountains 8.3
trees 8
home 8
holiday 7.9
urban 7.9
work 7.8
car 7.8
outdoor 7.6
device 7.5
wood 7.5
hill 7.5
container 7.3
island 7.3
industrial 7.3
rural 7
machine 7
seasonal 7

Microsoft
created on 2021-04-04

person 89.9
outdoor 86.4
text 84.6
ship 84.1
black and white 73.4
old 54.7

Face analysis

AWS Rekognition

Age 27-43
Gender Female, 60%
Fear 73.2%
Surprised 11.6%
Calm 4%
Confused 3.1%
Sad 2.8%
Angry 2.6%
Happy 1.6%
Disgusted 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 83.9%
Snowman 81.3%
