Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1878

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-07

Water 98.9
Waterfront 98.9
Person 98.5
Construction 98.3
Person 97.4
Person 96.8
Person 96.4
Person 96.3
Person 96.3
Machine 95.4
Wheel 95.4
Person 95.3
Person 94.7
Wheel 92.7
Outdoors 92.6
Wood 91.5
Person 88.5
Oilfield 82
Wheel 76.9
Architecture 66.7
Building 66.7
Factory 66.7
Bicycle 65
Transportation 65
Vehicle 65
Worker 57.3
Carpenter 56.5
Manufacturing 56.4
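
The labels above are Amazon Rekognition object-and-scene detections, each paired with a confidence score. A minimal sketch of how such labels can be requested with boto3 (the local filename and region are hypothetical, and AWS credentials are assumed to be configured):

```python
import boto3

# Hypothetical client and image file.
client = boto3.client("rekognition", region_name="us-east-1")
with open("greenbelt_maryland.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels scored below 50%
    )

# Print each label with its confidence, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```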

Clarifai
created on 2018-05-11

people 99.9
group together 99.6
group 99.3
many 98.6
adult 98.6
vehicle 98.6
war 97.2
military 96.8
man 96.5
soldier 96
watercraft 95.7
leader 93.7
administration 93.6
transportation system 89.6
several 87.9
skirmish 84.3
railway 84
home 83.5
gun 83.2
weapon 83.2
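
Clarifai concepts like those above can be fetched through its v2 predict endpoint. The sketch below uses plain HTTP via the requests library; the API key, image URL, and model ID are placeholders (the public general model's ID varies by API version), and concept values come back on a 0-1 scale:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general-model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Scale the 0-1 concept values to percentages to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```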

Imagga
created on 2023-10-07

musical instrument 33.6
marimba 32.8
percussion instrument 32.4
water 26
sky 25.6
architecture 22.8
travel 21.1
city 20
building 19.2
pier 18.9
river 18.7
ocean 17.9
device 17.1
landscape 17.1
sea 16.4
tourism 14.8
brass 14.7
support 14.6
urban 14
bridge 13.9
outdoor 13.8
house 13.5
old 13.2
ship 13.1
man 12.8
cornet 12.5
wind instrument 12.4
boat 12.3
beach 12
fisherman 11.5
cityscape 11.4
structure 11.3
cloud 11.2
historic 11
clouds 11
power 10.9
coast 10.8
vacation 10.6
outdoors 10.6
life 10.3
summer 10.3
construction 10.3
tourist 10.2
lake 10.1
holiday 10
silhouette 9.9
vessel 9.9
rural 9.7
pollution 9.6
bay 9.4
island 9.2
landmark 9
people 8.9
scenic 8.8
steel 8.8
fishing 8.7
industry 8.5
shore 8.5
horizon 8.1
sunset 8.1
history 8.1
skyscraper 7.7
dusk 7.6
skyline 7.6
buildings 7.6
craft 7.4
reflection 7.3
factory 7.3
work 7.2
black 7.2
transportation 7.2
tree 7
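
Imagga's tagging endpoint returns a similar tag/confidence list. A hedged sketch using HTTP Basic auth, assuming the documented v2 response shape (the key, secret, and image URL are placeholders):

```python
import requests

# Placeholder credentials; Imagga's v2 API uses HTTP Basic auth.
auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

# Each entry pairs a confidence with a localized tag name.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```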

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

black 76.3
white 64.1
people 57.2
old 46.1
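
The Microsoft tags come from Azure's Computer Vision service, which reports confidences on a 0-1 scale (shown above as percentages). A minimal sketch against the tag endpoint; the endpoint region, key, and API version here are assumptions:

```python
import requests

# Placeholder endpoint and key for Azure Computer Vision.
ENDPOINT = "https://westus.api.cognitive.microsoft.com"
KEY = "YOUR_AZURE_VISION_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)

# Scale the 0-1 confidences to percentages to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```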

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 70.8%
Calm 90.5%
Surprised 7.2%
Fear 6.1%
Happy 4%
Sad 2.8%
Disgusted 0.6%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 10-18
Gender Female, 99.7%
Calm 78.4%
Surprised 13.2%
Fear 9%
Sad 2.9%
Angry 1.2%
Happy 0.8%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 16-22
Gender Male, 89.7%
Sad 100%
Calm 12.1%
Surprised 6.4%
Fear 6.1%
Angry 0.7%
Disgusted 0.7%
Happy 0.6%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Female, 74.7%
Sad 99.8%
Fear 16.6%
Surprised 6.9%
Calm 3.8%
Angry 3.1%
Disgusted 1.7%
Happy 0.5%
Confused 0.5%
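
The four face readings above follow the shape of Rekognition's DetectFaces response with all attributes requested. A minimal sketch using the same hypothetical image as earlier:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("greenbelt_maryland.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently, which is why the percentages
    # in the blocks above need not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```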

Feature analysis

Amazon

Person 98.5%
Wheel 95.4%
Bicycle 65%
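
The feature rows above correspond to labels for which Rekognition also returns per-instance bounding boxes. Continuing the detect_labels sketch from the Tags section, those instances can be read like this:

```python
# Continuing from the detect_labels response above: some labels
# (e.g. Person, Wheel, Bicycle) carry per-instance bounding boxes.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(f"{label['Name']} {instance['Confidence']:.1f}% at {box}")
```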