Human Generated Data

Title

[Figures on shore, viewed from above]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1007.118

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Person 98.7
Person 97.9
Person 97.9
Person 97.8
Person 97.4
Person 97.1
Person 96.6
Outdoors 96.5
Person 96.5
Person 96.3
Person 95.1
City 93.1
Person 92.7
Person 92.2
People 91.7
Person 90.5
Person 89.7
Person 88.2
Person 88.1
Person 71.9
Person 63.8
Motorcycle 62.7
Transportation 62.7
Vehicle 62.7
Play Area 62.3
Road 57.6
Walking 57.5
Terminal 57.1
Outdoor Play Area 56.9
Path 56.5
Urban 55.9
Person 55.9
Bicycle 55.7
Sidewalk 55.6
Airport 55.6
Face 55.1
Head 55.1
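
The "Person", "Outdoors", and related tags above have the shape of an AWS Rekognition DetectLabels response. A minimal sketch of how such tags could be regenerated with boto3; the local filename and the MinConfidence threshold are assumptions, not part of this record:

    import boto3

    # Credentials and region come from the local AWS configuration.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph (filename assumed).
    with open("BRLF.1007.118.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=55 is an assumption chosen to match the lowest scores listed above.
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

    # Print labels in the same "Name Confidence" form used in this record.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")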

Clarifai
created on 2023-10-15

people 99.9
group together 99.8
group 98.6
many 98.4
transportation system 97.8
adult 97.7
watercraft 97.3
man 97.2
one 96.9
two 96.2
vehicle 96.1
three 95.8
recreation 95.1
several 92.9
competition 91
outfit 90.2
military 89.8
four 88.1
sports equipment 88.1
war 86.7

Imagga
created on 2019-01-31

city 18.3
device 17.2
building 16.9
equipment 16.6
silhouette 16.5
structure 15.8
musical instrument 15.4
sky 15.3
urban 14.8
water 14
accordion 13.9
architecture 12.8
black 12.6
keyboard instrument 11.6
outdoor 11.5
radio 11.4
fence 11.3
travel 11.3
construction 11.1
industrial 10.9
metal 10.5
technology 10.4
business 10.3
industry 10.2
winter 10.2
man 10.2
barrier 9.7
wheeled vehicle 9.6
light 9.3
transport 9.1
landmark 9
transportation 9
ride 8.9
tower 8.9
park 8.9
wind instrument 8.6
window 8.5
modern 8.4
people 8.4
broadcasting 8.4
car mirror 8
steel 7.9
sea 7.8
ferris wheel 7.7
outside 7.7
ship 7.7
old 7.7
cityscape 7.6
power 7.5
iron 7.5
landscape 7.4
street 7.4
shopping cart 7.2
passenger 7.2
team 7.2
mirror 7.1
river 7.1
bridge 7.1
wall 7
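
Imagga's tags can be retrieved through its v2 REST tagging endpoint. A minimal sketch using the requests library; the API key, secret, and image URL below are placeholders, not values from this record:

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credential
    API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder credential
    IMAGE_URL = "https://example.com/BRLF.1007.118.jpg"  # hypothetical image URL

    # The /v2/tags endpoint uses HTTP Basic auth with the key/secret pair.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    # Each entry carries a confidence score and a language-keyed tag name.
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")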

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

window 99.3
indoor 91.1
black and white 91.1
street 61.6
monochrome 26.9
shadow 25.4
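
The Microsoft tags correspond to Azure's Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the SDK reports confidence on a 0-1 scale, so it is rescaled here to match the percentages above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"  # placeholder credential
    IMAGE_URL = "https://example.com/BRLF.1007.118.jpg"  # hypothetical image URL

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns a TagResult whose .tags entries carry name and confidence.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")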

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 100%
Disgusted 61.5%
Happy 29.9%
Surprised 6.6%
Fear 6.1%
Sad 3.2%
Confused 2%
Calm 1.4%
Angry 1.3%

AWS Rekognition

Age 25-35
Gender Male, 94.5%
Happy 43.9%
Calm 32.7%
Surprised 8.2%
Fear 7%
Confused 6.6%
Disgusted 4.9%
Sad 3.5%
Angry 2.4%

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Calm 85.5%
Happy 10.7%
Surprised 6.4%
Fear 6%
Sad 2.4%
Confused 1.4%
Angry 0.5%
Disgusted 0.5%

AWS Rekognition

Age 23-33
Gender Male, 98%
Calm 92.5%
Surprised 7.1%
Fear 6%
Sad 3.4%
Angry 1.5%
Disgusted 0.3%
Happy 0.3%
Confused 0.2%

AWS Rekognition

Age 6-14
Gender Female, 62.3%
Happy 81.3%
Calm 10.4%
Surprised 8%
Fear 6%
Sad 2.4%
Disgusted 2.3%
Angry 1.2%
Confused 0.5%

AWS Rekognition

Age 31-41
Gender Male, 99.3%
Happy 51.5%
Calm 30.6%
Surprised 6.9%
Fear 6.3%
Sad 5.5%
Disgusted 5.4%
Angry 2.3%
Confused 1.2%

AWS Rekognition

Age 16-22
Gender Male, 98.4%
Happy 58.1%
Sad 18.7%
Surprised 13.1%
Calm 9.7%
Fear 6.5%
Angry 2.3%
Disgusted 1.9%
Confused 0.4%

AWS Rekognition

Age 6-16
Gender Male, 81%
Calm 95.8%
Surprised 6.3%
Fear 5.9%
Happy 2.4%
Sad 2.2%
Angry 0.5%
Disgusted 0.3%
Confused 0.3%

AWS Rekognition

Age 18-26
Gender Male, 61.9%
Sad 96.1%
Calm 27.2%
Surprised 8.7%
Fear 8.1%
Happy 5.7%
Angry 4.5%
Confused 2.3%
Disgusted 1.9%
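
The age, gender, and emotion blocks above match AWS Rekognition's DetectFaces output with the full attribute set requested. A minimal sketch, assuming the same hypothetical local image file as in the label sketch above:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("BRLF.1007.118.jpg", "rb") as f:  # hypothetical local file, as above
        image_bytes = f.read()

    # Attributes=["ALL"] returns age range, gender, and emotion scores per face.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; order high-to-low to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")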

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

paintings art 93.5%
food drinks 4.4%
interior objects 1.5%