Human Generated Data

Title

[Panama Canal, viewed from above]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.153.21

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 97.1
Human 97.1
Transportation 93.6
Vehicle 93.5
Person 90.2
Train 80.7
Ship 58.8
Railway 56.2
Train Track 56.2
Rail 56.2

Clarifai
created on 2021-04-04

train 98.4
people 98.3
locomotive 97.3
transportation system 97
railway 95.5
monochrome 94.6
vehicle 94.5
light 93.4
no person 92
adult 90
one 89.2
car 87.7
street 86.8
grinder 85.1
subway system 84.4
building 84.1
abandoned 83.9
portrait 81.6
city 80.9
industry 80.8

Imagga
created on 2021-04-04

boat 66.3
gondola 65.4
vessel 43
craft 29.4
piano 24.2
grand piano 22.1
water 22
travel 20.4
stringed instrument 18
keyboard instrument 17.8
percussion instrument 17.7
vehicle 16.2
night 16
sea 15.6
sunset 15.3
light 14.7
black 14.6
river 13.3
ocean 13.3
architecture 13
landscape 12.6
musical instrument 12.6
dark 12.5
silhouette 12.4
sky 12.1
lake 11.1
beach 11
city 10.8
tourism 10.7
reflection 10.5
evening 10.3
transport 10
outdoor 9.9
dusk 9.5
building 9.5
sunrise 9.4
bridge 9.2
old 9.1
transportation 9
palace 8.8
boats 8.7
shore 8.4
structure 8.1
landmark 8.1
sun 8
urban 7.9
device 7.8
summer 7.7
fishing 7.7
lights 7.4
calm 7.3
coast 7.2
shadow 7.2
upright 7

Face analysis

AWS Rekognition

Age 48-66
Gender Female, 66.7%
Calm 40.2%
Sad 31.1%
Happy 8.6%
Confused 5.9%
Angry 5.8%
Fear 5%
Surprised 2.2%
Disgusted 1.3%

AWS Rekognition

Age 17-29
Gender Male, 67.8%
Sad 41.4%
Calm 28.2%
Fear 14.3%
Angry 6.3%
Confused 3.7%
Happy 3.3%
Surprised 1.6%
Disgusted 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.1%
Train 80.7%

Captions

Microsoft
created on 2021-04-04

an old photo of a person 48.5%
old photo of a person 45.8%
a person in a dark room 39.5%

Text analysis

Amazon

1111111IT

Google

032
032