Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1848

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Water 99.6
Waterfront 99.6
Person 98.6
Wood 98.3
Person 98.3
Person 97.1
Adult 97.1
Male 97.1
Man 97.1
Person 96.9
Adult 96.9
Male 96.9
Man 96.9
Person 96.8
Adult 96.8
Male 96.8
Man 96.8
Person 96.8
Outdoors 92
Person 91.7
Adult 91.7
Female 91.7
Woman 91.7
Architecture 88.2
Building 88.2
Shelter 88.2
Construction 84
Machine 80.7
Wheel 80.7
Oilfield 72.1
Person 69.9
Head 67.8
Nature 61
Lumber 57.6
Countryside 57.2
Rural 57.2
Mining 57
Face 56.4
Pier 55.7
Factory 55

Clarifai
created on 2018-05-11

people 100
group 99.4
group together 99.3
adult 99.3
many 98.3
vehicle 98.2
man 96.1
military 94.2
transportation system 91.9
war 90.9
one 89.8
soldier 89.7
wear 89.2
cavalry 88.1
administration 86.9
two 86
five 85.8
leader 85.8
campsite 84
several 84

Imagga
created on 2023-10-06

mailbox 26.6
chair 25.9
landscape 24.6
vehicle 24.2
machine 24
rural 23.8
snow 22.5
grass 20.6
wheelchair 20.4
seat 19.5
box 18.7
wheeled vehicle 18.5
container 18
tree 17.7
cart 17.6
road 17.2
farm 17
sky 15.9
field 15.9
device 15
old 14.6
transportation 14.3
wagon 14.1
outdoors 13.4
truck 13.3
tractor 13.3
country 12.3
industry 12
winter 11.9
countryside 11.9
outdoor 11.5
park 10.7
carriage 10.5
agriculture 10.5
furniture 10
machinery 9.7
work 9.4
transport 9.1
horse cart 9.1
thresher 9
forest 8.7
architecture 8.6
equipment 8.6
fence 8.5
travel 8.5
power 8.4
summer 8.4
house 8.4
car 8.3
street 8.3
jinrikisha 8.1
farmer 8
season 7.8
cold 7.8
trailer 7.7
heavy 7.6
farming 7.6
wood 7.5
building 7.5
man 7.4
land 7.4
weather 7.3
farm machine 7.3
industrial 7.3
handcart 7.2
trees 7.1
spring 7.1
day 7.1
hay 7
scenic 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
old 41.5

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 60.4%
Happy 80.4%
Fear 9.4%
Surprised 6.6%
Sad 3.9%
Angry 3.7%
Calm 1.9%
Disgusted 0.9%
Confused 0.5%

AWS Rekognition

Age 34-42
Gender Male, 88.7%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 30-40
Gender Male, 94.8%
Calm 80.4%
Happy 8.9%
Surprised 6.8%
Fear 6.6%
Sad 3.6%
Confused 1.6%
Disgusted 1.2%
Angry 1.1%

Feature analysis

Amazon

Person 98.6%
Adult 97.1%
Male 97.1%
Man 97.1%
Female 91.7%
Woman 91.7%
Wheel 80.7%
