Human Generated Data

Title

[Wood bridge]

Date

early 1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.297.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 99.5
Person 99.5
Person 99.5
Person 99.2
Person 99.2
Person 99
Water 98.8
Waterfront 98.5
Port 97.6
Pier 97.6
Dock 97.6
Transportation 97.5
Vehicle 97.5
Boat 97.5
Outdoors 63.8
People 62.6
Watercraft 58.4
Vessel 58.4
Apparel 55.7
Clothing 55.7
Steamer 55.4
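
These label-and-confidence pairs follow the shape of output returned by Amazon Rekognition's DetectLabels operation. The sketch below illustrates, in broad strokes, how tags like the ones above could be generated with boto3; the bucket name, object key, and thresholds are hypothetical placeholders, not a description of the museum's actual pipeline.

```python
import boto3

# Minimal sketch, assuming the photograph is stored in S3; the bucket and key
# ("example-bucket", "feininger-wood-bridge.jpg") are hypothetical placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "feininger-wood-bridge.jpg"}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Print each detected label with its confidence, mirroring the "Tag 99.5" lines above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

Labels such as Person and Boat can also carry per-instance bounding boxes in the response's Instances field, which is likely where instance-level scores like those under Feature analysis further down come from.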

Clarifai
created on 2019-05-29

watercraft 99.9
people 99.9
vehicle 99.8
group together 99.5
transportation system 99.1
adult 98.4
group 95.7
many 95.7
water 95.5
military 94.6
man 94.5
warship 94.1
one 92.7
ship 91.2
suspension bridge 90.2
war 89.9
cargo ship 89.8
two 89.6
pier 87.5
sea 86.8

Imagga
created on 2019-05-29

boat 46.6
pier 44.3
water 44.1
sea 42.2
ocean 38.5
fisherman 32.5
travel 31.7
ship 30.5
device 29.3
sky 28.1
marina 27.8
yacht 26.6
sunset 26.1
harbor 26
river 22.3
port 22.2
catapult 21.8
support 21.7
vacation 20.5
end 19.7
transportation 18.8
summer 18.7
dock 18.5
coast 18
engine 17.9
vessel 17.9
sail 17.5
transport 17.4
fishing 17.3
silhouette 16.6
boats 16.5
lake 16.5
bridge 16.5
beach 16.4
nautical 15.5
reflection 15.4
shore 15
tourism 14.9
holiday 14.3
bay 14.2
landscape 14.1
evening 14
sun 13.9
mast 13.8
sailboat 13.3
leisure 13.3
instrument 12.6
architecture 12.5
gear 11.8
sailing 11.7
city 11.6
island 11
ships 10.8
scenic 10.5
outdoors 10.5
scene 10.4
cloud 10.3
tropical 10.2
equipment 10.2
clouds 10.1
outdoor 9.9
landmark 9.9
yachts 9.9
tower 9.9
cruise 9.7
people 9.5
trip 9.4
serene 9.4
outside 9.4
man 9.4
wealth 9
navigation 8.7
coastline 8.5
wind 8.4
sunrise 8.4
steel 8.1
sand 8.1
recreation 8.1
light 8
building 8
docked 7.9
urban 7.9
quay 7.9
clear 7.9
luxury 7.7
old 7.7
seascape 7.7
fishing gear 7.6
sport 7.6
marine 7.6
destination 7.5
pole 7.5
craft 7.4
seaside 7.3
calm 7.3
industrial 7.3
line 7.2
night 7.1
waterfront 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

outdoor 99.5
boat 98.8
water 95.8
man 91.5
ship 91.1
person 85.1
black 79.7
black and white 76.5
white 74.6
lake 70.1
bridge 69.6
old 67.7
clothing 65.6
sky 65.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 52.3%
Sad 45.4%
Happy 45.2%
Angry 45.5%
Confused 45.2%
Surprised 45.7%
Disgusted 45.4%
Calm 52.7%

AWS Rekognition

Age 23-38
Gender Male, 50.1%
Angry 49.6%
Surprised 49.5%
Confused 49.5%
Calm 49.6%
Happy 49.5%
Disgusted 49.9%
Sad 49.8%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Angry 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Calm 49.9%
Sad 50%

AWS Rekognition

Age 14-23
Gender Female, 50.3%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Happy 49.7%
Sad 49.6%
Calm 50.1%
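
The age, gender, and emotion estimates above match the structure of Amazon Rekognition's DetectFaces response. Below is a minimal sketch, under the same hypothetical S3 assumptions as the label example earlier, of how such per-face attributes could be read out.

```python
import boto3

# Minimal sketch: request full facial attributes for each detected face.
# The S3 bucket and object key are hypothetical placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "feininger-wood-bridge.jpg"}},
    Attributes=["ALL"],
)

# Report age range, gender, and emotion confidences in the format used above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```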

Feature analysis

Amazon

Person 99.5%
Boat 97.5%