Human Generated Data

Title

[Group kneeling on deck to watch sights]

Date

1930-1936

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.314.15

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 98
Person 98
Person 97.2
Person 95.5
Person 93.9
Person 91.8
Person 89.6
Military 89.3
Vehicle 87
Transportation 87
Vessel 87
Watercraft 87
People 81.1
Person 81
Person 77
Person 75.3
Weapon 70.1
Weaponry 70.1
Military Uniform 69
Soldier 69
Clothing 66
Apparel 66
Navy 61.2
Cruiser 59.3
Ship 59.3
Boat 58.5
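
The Amazon entries above pair a label name with a confidence score in percent. A minimal sketch of how such tags could be produced with the Rekognition DetectLabels API via boto3 follows; the local file name and the confidence threshold are illustrative assumptions, not part of the record.

```python
# Illustrative sketch: generate label tags with Amazon Rekognition.
# The file name "deck_photo.jpg" is a placeholder, not part of the record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("deck_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the lowest score listed above is 58.5
)

for label in response["Labels"]:
    # Each label has a name and a confidence in percent, e.g. "Person 98".
    print(f"{label['Name']} {label['Confidence']:.1f}")
```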

Clarifai
created on 2019-05-29

people 99.8
watercraft 99.7
vehicle 99.2
adult 98.8
group together 98.8
group 98.5
transportation system 96.2
man 95.7
many 95.5
water 94.4
river 91.1
sea 90.5
beach 90.5
woman 90.1
recreation 89.2
rowboat 88.1
military 86.3
boatman 86.3
seashore 85.8
athlete 85.3

Imagga
created on 2019-05-29

breakwater 63
shoreline 55.1
barrier 50.6
sea 50.1
ocean 48.9
water 44.1
beach 43.4
sky 38.9
obstruction 37.4
coast 33.2
shore 30.2
cannon 28.7
travel 27.5
sand 26.9
landscape 26
boat 25.5
clouds 24.5
waves 24.2
structure 23.6
summer 22.5
gun 22.4
coastline 21.6
rock 20
vacation 19.7
sunset 18.9
weapon 16.6
ship 16.6
tourism 16.5
island 16.5
lake 15.6
wave 15.6
stone 15.2
bay 15.1
silhouette 14.9
outdoor 14.5
sun 14.5
rocks 14.1
scenic 14.1
sunny 13.8
relax 12.6
horizon 12.6
river 12.5
leisure 12.5
evening 12.1
pier 12
day 11.8
people 11.7
city 11.6
sunrise 11.3
cloud 11.2
tropical 11.1
calm 11
holiday 10.8
seaside 10.7
port 10.6
outdoors 10.6
fishing 10.6
cloudy 10.3
scenery 9.9
harbor 9.6
seascape 9.6
dusk 9.5
paradise 9.4
man 9.4
peaceful 9.2
peace 9.1
recreation 9
romantic 8.9
pacific 8.7
buildings 8.5
fisherman 8.5
relaxation 8.4
tourist 8.3
tranquil 8.2
landmark 8.1
natural 8
scene 7.8
dock 7.8
sandy 7.8
rocky 7.7
outside 7.7
stones 7.6
sunshine 7.5
resort 7.5
reflection 7.3
activity 7.2
season 7
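
The Imagga entries are tag/confidence pairs of the kind returned by Imagga's v2 tagging endpoint. A minimal sketch of such a request, assuming placeholder credentials and an illustrative image URL:

```python
# Illustrative sketch: request tags from Imagga's v2 tagging endpoint.
# Credentials and the image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/deck_photo.jpg"  # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    # Each entry carries a confidence score and an English tag name,
    # matching rows such as "breakwater 63".
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```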

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

sky 99.8
water 99.3
outdoor 99.1
boat 96
black and white 95
watercraft 79.4
person 74.9
ship 72.4
man 66
people 64.2
river 63.3
monochrome 62.1
lake 60.1
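
The Microsoft entries resemble the output of the Azure Computer Vision image-tagging operation. A hedged sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders:

```python
# Illustrative sketch: tag an image with the Azure Computer Vision client
# (azure-cognitiveservices-vision-computervision). Endpoint, key, and
# image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/deck_photo.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

for tag in result.tags:
    # Confidence is returned in the 0-1 range; scale to percent to match
    # rows such as "sky 99.8".
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```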

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 54.4%
Happy 45.1%
Confused 45.1%
Disgusted 45.1%
Surprised 45.3%
Sad 45.4%
Angry 45.1%
Calm 53.9%

AWS Rekognition

Age 9-14
Gender Male, 54.6%
Angry 46.4%
Confused 46%
Happy 45.3%
Calm 48%
Surprised 45.7%
Sad 48.1%
Disgusted 45.5%

AWS Rekognition

Age 12-22
Gender Male, 52.3%
Sad 45.5%
Happy 45.2%
Disgusted 45.2%
Surprised 45.2%
Calm 53.5%
Angry 45.2%
Confused 45.1%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.6%
Happy 49.6%
Surprised 49.5%
Calm 50.1%
Sad 49.7%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 29-45
Gender Male, 50.1%
Angry 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Calm 49.5%
Sad 50.4%
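
The per-face age, gender, and emotion estimates above are of the kind returned by Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch, again with an assumed local image file:

```python
# Illustrative sketch: per-face age, gender, and emotion estimates with
# Amazon Rekognition DetectFaces. The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("deck_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back upper-case, e.g. "HAPPY"; reformat to
        # match rows such as "Happy 45.1%".
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```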

Feature analysis

Amazon

Person 98%
Boat 58.5%