Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1928

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 99
Person 98.8
Outdoors 87.7
Architecture 83.9
Building 83.9
Person 83.1
Nature 80.3
Person 79.2
Person 78.6
Soil 71.9
Field 66.1
Device 56.6
Construction 56
Oilfield 55.9
Person 55.7
Worker 55.7
Utility Pole 55.1
Ground 55

Clarifai
created on 2018-05-11

people 100
group 99.3
adult 99.3
group together 98.3
vehicle 97.3
man 96.4
many 96.3
war 96.3
home 96
administration 95.8
military 94.8
soldier 94.6
two 92.8
campsite 92
watercraft 89.5
one 89.5
wear 88.1
transportation system 86.7
cavalry 86.1
three 85.2

Imagga
created on 2023-10-06

maze 100
landscape 36.5
grass 32.5
tree 28.9
field 26.8
sky 21.8
trees 21.4
summer 21.3
countryside 21
farm 19.7
rural 18.5
forest 18.3
park 16.5
fence 16.1
autumn 15.8
country 15.8
plow 15.3
scenic 14.9
outdoor 14.5
spring 14.1
bench 13.7
fall 13.6
outdoors 12.8
clouds 12.7
meadow 12.6
park bench 12.4
tool 12.3
day 11.8
agriculture 11.4
path 11.4
land 11.3
sun 11.3
old 11.2
road 10.9
horizon 10.8
recreation 10.8
environment 10.7
travel 10.6
scene 10.4
seat 9.5
sunny 9.5
cloud 9.5
season 9.4
scenery 9
water 8.7
pasture 8.6
golf 8.6
walking 8.5
sport 8.5
cloudy 8.5
hill 8.4
leisure 8.3
putting 7.9
fields 7.7
hole 7.7
course 7.6
house 7.5
wood 7.5
sunrise 7.5
tourism 7.4
area 7.1
weather 7.1
worm fence 7
leaf 7
leaves 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.6
tree 98.8
ground 96.3
field 76.5

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 64.8%
Sad 43.4%
Angry 39%
Calm 29.8%
Surprised 6.8%
Fear 6.2%
Disgusted 2.6%
Happy 1.1%
Confused 0.7%

Feature analysis

Amazon

Person 99%
Building 83.9%
