Human Generated Data

Title

[View of people from underpass]

Date

1935–1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.309.10

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Person 99.8
Human 99.8
Person 99.6
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 99.4
Person 98.7
Nature 98.5
Person 98.3
Pedestrian 96.5
Person 92.2
Airplane 91.8
Transportation 91.8
Vehicle 91.8
Aircraft 91.8
Outdoors 82.9
Smoke 77.5
Fog 76.3
People 71.2
Crowd 64.5
Waterfront 64.2
Water 64.2
Building 61
Pier 58.4
Port 58.4
Dock 58.4
Silhouette 56.8
Smog 56.6
Walking 56.5
Weather 56.5
Person 50.6
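
Tag lists like the Amazon block above are essentially (label, confidence) pairs, and a common way to consume them is to filter on a confidence threshold. A minimal Python sketch, using a few values copied from the list above (the 90% cutoff is an arbitrary choice for illustration):

```python
# A few (label, confidence) pairs copied from the Amazon list above.
labels = [
    ("Person", 99.8), ("Nature", 98.5), ("Pedestrian", 96.5),
    ("Airplane", 91.8), ("Smoke", 77.5), ("Crowd", 64.5),
    ("Silhouette", 56.8), ("Smog", 56.6), ("Person", 50.6),
]

def confident_labels(pairs, min_confidence=90.0):
    """Keep unique labels at or above the threshold, highest score first."""
    best = {}
    for name, score in pairs:
        best[name] = max(score, best.get(name, 0.0))  # dedupe, keep best score
    return sorted(
        ((n, s) for n, s in best.items() if s >= min_confidence),
        key=lambda pair: -pair[1],
    )

print(confident_labels(labels))
# → [('Person', 99.8), ('Nature', 98.5), ('Pedestrian', 96.5), ('Airplane', 91.8)]
```

Note that duplicate labels (here, two "Person" entries at different scores) collapse to the single highest-confidence occurrence.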

Clarifai
created on 2021-04-05

people 99.9
street 99
group together 98.8
man 97.9
monochrome 97
adult 96.1
group 95.8
vehicle 94.6
two 94.6
transportation system 94.3
many 93.2
aircraft 88.9
child 88.8
woman 86.8
airport 85.1
leader 85
road 84.8
several 84.4
spectator 83.9
airplane 82.7

Imagga
created on 2021-04-05

tourist 36.5
sky 28.7
travel 24.6
traveler 24
landscape 23.1
water 22.7
clouds 21.1
sunset 20.7
city 19.1
cloud 17.2
person 17.1
sea 16.5
beach 16.1
sun 16.1
ocean 15.9
summer 15.4
warship 14.6
ship 13.6
scenery 13.5
military vehicle 13.5
scenic 13.2
architecture 13.1
vehicle 13
building 12.4
urban 12.2
outdoor 12.2
light 12.2
sunrise 12.2
evening 12.1
people 11.7
horizon 11.7
coast 11.7
outdoors 11.6
river 11.6
tourism 11.5
man 11.4
airship 11.4
aircraft 10.3
vacation 9.8
mountain 9.8
vessel 9.6
bay 9.6
stone 9.3
device 9.3
tree 9.2
aircraft carrier 9.2
peaceful 9.2
silhouette 9.1
craft 9.1
color 8.9
scene 8.7
day 8.6
cityscape 8.5
structure 8.3
lake 8.2
calm 8.2
road 8.1
natural 8
grass 7.9
rock 7.8
bridge 7.8
storm 7.7
black 7.6
skyline 7.6
wilderness 7.5
coastline 7.5
hill 7.5
shore 7.4
tranquil 7.2
landmark 7.2
sand 7.2
country 7

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

outdoor 99.5
road 98.9
person 96.5
black and white 95.1
clothing 94.9
man 92.8
people 77.6
white 75.6
street 75.5
black 72.2
transport 66.1
sky 62.7
old 59.7
monochrome 59.2
airplane 54
aircraft 47.5

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 82.3%
Calm 60.3%
Happy 29%
Sad 7%
Confused 1.8%
Angry 0.8%
Surprised 0.6%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 20-32
Gender Male, 73%
Calm 55.9%
Happy 35.5%
Sad 6.3%
Confused 1%
Angry 0.7%
Surprised 0.3%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 25-39
Gender Male, 50%
Sad 60.4%
Calm 28.1%
Happy 3.5%
Confused 3.3%
Angry 3.2%
Fear 0.5%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 33-49
Gender Male, 60.8%
Calm 66.2%
Happy 19.5%
Sad 7.9%
Confused 3.5%
Angry 1.9%
Fear 0.3%
Surprised 0.3%
Disgusted 0.3%
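
Each AWS Rekognition block above reports a per-emotion confidence distribution for one detected face. A minimal Python sketch of reducing such a distribution to its dominant emotion, using the scores from the first face block:

```python
# Emotion scores (percent) for one face, copied from the first
# AWS Rekognition block above.
face = {
    "Calm": 60.3, "Happy": 29.0, "Sad": 7.0, "Confused": 1.8,
    "Angry": 0.8, "Surprised": 0.6, "Fear": 0.3, "Disgusted": 0.2,
}

def dominant_emotion(scores):
    """Return the highest-scoring emotion and its confidence."""
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant_emotion(face))  # → ('Calm', 60.3)
```

For borderline faces (e.g. the third block, where Sad at 60.4% narrowly leads Calm at 28.1%), the margin between the top two scores may matter more than the winner alone.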

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.8%
Airplane 91.8%