Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1860

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Each service below lists its labels with a confidence score (0-100).

Amazon
created on 2023-10-06

Worker 100
War 99.9
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 98.8
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 97.9
Person 95.9
Person 92.7
Person 92.1
People 83.1
Wood 82.4
Construction 80.4
Helmet 76.1
Clothing 69.8
Hat 69.8
Footwear 67.3
Shoe 67.3
Photography 57.6
Water 57.2
Waterfront 57.2
Hardhat 56.9
Carpenter 56.7
Oilfield 56.1
Outdoors 56.1
Cap 55.8
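
For reference, a label list like the one above can be reproduced with AWS Rekognition's detect_labels API. A minimal boto3 sketch, assuming configured AWS credentials; the filename and thresholds are illustrative, and actual scores vary by model version:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Illustrative filename; this record does not include the image file itself.
with open("greenbelt.jpg", "rb") as f:
    resp = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # roughly the cutoff of the list above
    )

# Print "Label Confidence" pairs, highest confidence first.
for label in sorted(resp["Labels"], key=lambda l: -l["Confidence"]):
    print(f"{label['Name']} {label['Confidence']:.1f}")
```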

Clarifai
created on 2018-05-11

people 99.9
group 98.8
adult 98.7
group together 98.1
man 97.4
vehicle 96.5
many 94.3
administration 92.2
transportation system 91.5
military 90.7
woman 87.4
war 87.1
leader 87
two 85.8
wear 85.4
watercraft 84.9
soldier 84.1
one 82.3
recreation 79.1
three 78.7
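
Clarifai's concepts can be fetched in much the same way. A sketch against Clarifai's public v2 REST API, where the model id, key, and image URL are all placeholders (the exact model used for the 2018 run is not recorded here):

```python
import requests

# Placeholders: supply your own Clarifai key; the general model id is an
# assumption based on Clarifai's public v2 REST API.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/greenbelt.jpg"}}}]},
)
resp.raise_for_status()

# Concept values are 0-1 floats; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```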

Imagga
created on 2023-10-06

man 28.2
people 22.8
musical instrument 21.5
male 19.2
person 16.6
device 15.8
building 15.7
sky 15.3
sitting 14.6
silhouette 13.2
percussion instrument 12.4
outdoor 12.2
summer 12.2
business 12.1
outdoors 12
work 11.8
chair 11.6
seller 10.7
construction 10.3
lifestyle 10.1
water 10
protection 10
city 10
worker 9.8
industry 9.4
sea 9.4
laptop 9.3
adult 9.3
travel 9.1
vacation 9
life 9
group 8.9
drum 8.7
men 8.6
military uniform 8.5
black 8.4
uniform 8.4
old 8.4
clothing 8.3
leisure 8.3
ocean 8.3
tourism 8.2
engineer 8.1
washboard 8
sun 8
urban 7.9
day 7.8
architecture 7.8
disaster 7.8
pollution 7.7
wind instrument 7.6
relax 7.6
landscape 7.4
industrial 7.3
equipment 7.2
sunset 7.2
fisherman 7.2
holiday 7.2
statue 7.1
portrait 7.1
working 7.1
businessman 7.1
happiness 7
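
Imagga exposes a comparable v2 tagging endpoint using HTTP basic auth. A sketch with placeholder credentials and image URL; the endpoint shape is an assumption from Imagga's public API documentation, not from this record:

```python
import requests

# Placeholders for Imagga API key and secret (sent as basic auth).
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/greenbelt.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
)
resp.raise_for_status()

# Imagga confidences are already on a 0-100 scale, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```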

Google
created on 2018-05-11
(no tags recorded)

Microsoft
created on 2018-05-11

person 95.9
outdoor 94.8
man 93.4
standing 87.2
black 67.7
posing 58.6
old 41.5
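
Microsoft's tags come from the Azure Computer Vision service. A sketch against its REST tag endpoint, where the region, API version, and key are placeholders; the 2018 tags above likely came from an earlier API version:

```python
import requests

# Placeholders: region, API version, and subscription key are assumptions.
resp = requests.post(
    "https://YOUR_REGION.api.cognitive.microsoft.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.com/greenbelt.jpg"},
)
resp.raise_for_status()

# Confidences are 0-1 floats; scale to match the 0-100 scores above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```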

Color analysis

(no data recorded)

Face analysis

Amazon

AWS Rekognition (face 1)

Age 23-33
Gender Male, 68.6%
Surprised 81.9%
Confused 13.6%
Fear 13.2%
Sad 7.9%
Calm 5.8%
Angry 5.2%
Disgusted 2.6%
Happy 1.1%

AWS Rekognition (face 2)

Age 20-28
Gender Female, 59.5%
Confused 31.9%
Disgusted 26.9%
Happy 21.5%
Surprised 7.6%
Fear 6.2%
Calm 5.2%
Sad 5.2%
Angry 4.7%

AWS Rekognition (face 3)

Age 19-27
Gender Male, 98.6%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0%
Happy 0%
Disgusted 0%
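
The per-face blocks above map directly onto AWS Rekognition's detect_faces response. A minimal boto3 sketch that prints equivalent fields (age range, gender, strongest emotion), again with an illustrative filename:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("greenbelt.jpg", "rb") as f:  # illustrative filename
    resp = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

# One FaceDetails entry per detected face, as in the three blocks above.
for i, face in enumerate(resp["FaceDetails"], 1):
    age = face["AgeRange"]
    gender = face["Gender"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Face {i}: age {age['Low']}-{age['High']}, "
          f"{gender['Value']} {gender['Confidence']:.1f}%, "
          f"{top_emotion['Type'].title()} {top_emotion['Confidence']:.1f}%")
```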

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Shoe 67.3%
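
The feature list above matches the detect_labels output for labels that also carry localized instances (bounding boxes). A sketch of reading those instances from the same response:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("greenbelt.jpg", "rb") as f:  # illustrative filename
    resp = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Labels with a non-empty "Instances" list are localized objects; each
# instance carries its own confidence and a relative bounding box.
for label in resp["Labels"]:
    for inst in label["Instances"]:
        print(label["Name"], f"{inst['Confidence']:.1f}%", inst["BoundingBox"])
```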

Categories