Human Generated Data

Title

[Harbor scene]

Date

1931–1933

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.243.17

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.6
Person 99.6
Person 99.6
Person 99.3
Person 97.2
Furniture 96.9
Transportation 82.2
Boat 82.2
Vehicle 82.2
Apparel 57.8
Clothing 57.8
Light 55.8
Flare 55.8
Sphere 55.5

Clarifai
created on 2019-11-19

people 99.8
man 97.5
adult 97.3
group 96.1
woman 95.8
group together 95.8
two 93.9
one 93.1
three 91.5
child 90.8
street 87.3
recreation 85
transportation system 84.4
vehicle 84.1
wear 84
four 83
water 80.8
several 80.6
music 80.6
boy 80.5

Imagga
created on 2019-11-19

bridge 41.2
suspension bridge 37.8
boat 29.3
structure 25.3
water 24.7
sea 23.4
canvas tent 23.2
catapult 22.1
travel 21.8
sky 18.5
ship 17.6
transport 17.3
engine 17.2
boats 16.5
tourism 16.5
transportation 15.2
instrument 14.7
fisherman 14.5
city 14.1
landscape 14.1
ocean 14.1
river 13.3
old 13.2
architecture 12.9
sunset 12.6
fishing 12.5
vessel 12.4
building 12
nautical 11.6
sun 11.3
island 11
beach 11
wood 10.8
yacht 10.7
port 10.6
reflection 10.6
summer 10.3
device 10.3
town 10.2
vacation 9.8
urban 9.6
people 9.5
person 9.1
tourist 9.1
wooden 8.8
sailing 8.8
sail 8.7
scene 8.7
holiday 8.6
tropical 8.5
outdoor 8.4
outdoors 8.3
canal 8.3
lake 8.2
seller 7.9
happy 7.5
gondola 7.4
silhouette 7.4
shore 7.4
wedding 7.4
sand 7.3
umbrella 7.3
landmark 7.2
portrait 7.1
day 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 98.9
ship 98
black and white 96.6
outdoor 88.4
monochrome 87.3
boat 85.8
watercraft 81.5
clothing 81.4
person 63.1

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Male, 53.9%
Confused 45%
Surprised 45.1%
Disgusted 45.1%
Sad 45.4%
Fear 45.1%
Happy 46%
Angry 45.2%
Calm 53.1%

Feature analysis

Amazon

Person 99.6%
Boat 82.2%

Captions

Microsoft

a group of people standing in front of a building 76.2%
a person standing in front of a building 72.8%
a man and a woman standing in front of a building 53.7%

Text analysis

Amazon

90936

Google

0 6
0
6