Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1876

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Worker 99.9
Person 99
Adult 99
Male 99
Man 99
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.9
Person 96.1
Machine 95.6
Wheel 95.6
Wheel 91.7
Person 89.8
Clothing 82.3
Hat 82.3
Weapon 80.7
Outdoors 78.6
Wheel 70.3
Architecture 69.2
Building 69.2
Factory 69.2
Wheel 68.4
Footwear 61.3
Shoe 61.3
Manufacturing 57.4
Cannon 57.3
Spoke 56.5
Transportation 56.2
Vehicle 56.2
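
The Amazon tags above pair each detected concept with a confidence score (0-100). As a point of reference only, the sketch below shows how labels in this form can be produced with AWS Rekognition's DetectLabels API via boto3; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not settings recorded with this object.

```python
# Minimal sketch: image labeling with AWS Rekognition (boto3).
# The file name and thresholds are illustrative assumptions only.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("greenbelt_maryland_1936.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumption: cap on returned labels
    MinConfidence=55.0,  # assumption: roughly the lowest score shown above
)

# Print each label with its confidence, similar to the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```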

Clarifai
created on 2018-05-11

people 100
adult 99.8
group together 99.4
one 99.4
vehicle 99.3
watercraft 99.2
group 99.1
two 98.7
man 98.4
transportation system 97
many 96.4
military 95.8
three 95.4
woman 95.3
war 94.7
wear 93.8
aircraft 93.2
four 91.9
several 91.6
administration 90.1

Imagga
created on 2023-10-07

musical instrument 33
man 27.5
wind instrument 22.3
bass 19.7
male 17.7
brass 16.9
trombone 16.4
people 15.1
accordion 14.4
keyboard instrument 12.1
black 12
device 11.9
person 11.5
working 11.5
water 11.3
weapon 11.3
adult 11.1
protection 10.9
equipment 10.9
industrial 10.9
silhouette 10.8
sky 10.2
sport 10
outdoors 9.7
work 9.1
danger 9.1
sunset 9
destruction 8.8
men 8.6
business 8.5
club 8.5
outdoor 8.4
cleaner 8.4
dark 8.3
building 8
job 8
urban 7.9
disaster 7.8
construction 7.7
industry 7.7
holding 7.4
transportation 7.2
portrait 7.1
stringed instrument 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.9
man 96
person 89.4
black 68.9

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 83.4%
Sad 100%
Calm 7%
Surprised 6.3%
Fear 6%
Angry 0.7%
Confused 0.3%
Disgusted 0.2%
Happy 0.2%
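
The age range, gender, and emotion estimates above come from AWS Rekognition face analysis; the emotion confidences are reported independently per emotion, so they need not sum to 100%. A minimal sketch of retrieving such attributes with boto3's detect_faces call follows; the file name is a placeholder, not the actual source image path.

```python
# Minimal sketch: face attribute analysis with AWS Rekognition (boto3).
# Attributes=["ALL"] requests the full set, including age range, gender, and emotions.
import boto3

rekognition = boto3.client("rekognition")

with open("greenbelt_maryland_1936.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are independent, so they do not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```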

Feature analysis

Amazon

Person 99%
Adult 99%
Male 99%
Man 99%
Wheel 95.6%
Shoe 61.3%
