Human Generated Data

Title

Untitled (Morgantown, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1272

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Architecture 99.8
Building 99.8
Outdoors 99.8
Shelter 99.8
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Male 98.9
Person 98.9
Boy 98.9
Child 98.9
Person 97.6
Machine 97.6
Wheel 97.6
People 94.1
Wheel 90.1
Face 87.6
Head 87.6
Nature 67.2
Wheel 61.8
Transportation 61.8
Vehicle 61.8
Clothing 57.4
Pants 57.4
Wood 57.1
Slum 57.1
Worker 56.4
Shorts 56.3
Railway 56.3
City 56
Road 56
Street 56
Urban 56
Countryside 55.5
Hut 55.3
Rural 55.3
Construction 55.1
Mining 55.1
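
The Amazon list above is a set of label/confidence pairs in the shape returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated with boto3 (the filename, region, and thresholds are illustrative assumptions, not the museum's actual pipeline):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("shahn_morgantown.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out around 55
    )

# Print "Name Confidence" pairs, matching the format of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```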

Clarifai
created on 2018-05-11

people 100
military 99.9
war 99.8
group together 99.8
soldier 99.7
adult 99.7
group 99.7
vehicle 99
skirmish 99
man 98.4
weapon 98.3
two 97.3
gun 97.2
several 96.9
three 96.5
four 96
administration 95.2
rifle 94.5
child 94.5
one 94.3
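
Clarifai scores concepts from 0 to 1, shown here scaled to 0–100. A hedged sketch against Clarifai's v2 predict REST endpoint (the API key, model id, and image URL are placeholders; the response shape follows Clarifai's documented outputs/concepts structure):

```python
import requests

CLARIFAI_KEY = "YOUR_API_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed id of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)

# Each concept carries a name and a 0-1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```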

Imagga
created on 2023-10-05

factory 38.2
machine 31.7
vehicle 22.7
shovel 22.2
plant 21.5
industry 20.5
industrial 20
old 16.7
power shovel 16.6
device 16.6
military 16.4
construction 16.3
building complex 15.6
structure 15.4
dirty 15.4
sky 15.3
heavy 15.3
gun 14.9
soldier 14.7
artillery 14.5
building 14.3
power 14.3
war 13.8
steam shovel 13.8
machinery 13.6
transportation 13.4
dirt 13.4
tool 13.3
equipment 13.1
iron 13.1
man 12.8
danger 12.7
uniform 12.3
bulldozer 12.2
camouflage 11.8
steel 11.7
backhoe 11.6
weapon 11.6
outdoor 11.5
tractor 10.9
armament 10.8
track 10.8
destruction 10.8
field artillery 10.5
clothing 10.5
hand tool 10.4
machinist 10.4
work 10.2
architecture 10.2
tank 10.1
metal 9.7
male 9.2
protection 9.1
environment 9
cannon 9
history 8.9
engineer 8.9
hydraulic 8.9
excavator 8.9
disaster 8.8
bucket 8.8
urban 8.7
sand 8.7
yellow 8.6
tree 8.5
travel 8.4
city 8.3
vintage 8.3
wheeled vehicle 8.2
sport 8.2
outdoors 8.2
landscape 8.2
high-angle gun 8.1
digging 7.9
loader 7.9
steam 7.8
rifle 7.7
weaponry 7.6
thresher 7.5
site 7.5
person 7.3
transport 7.3
activity 7.2
to 7.1
working 7.1
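
Imagga reports tags with 0–100 confidences, matching the list above. A minimal sketch against Imagga's /v2/tags endpoint (credentials and image URL are placeholders):

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("API_KEY", "API_SECRET"),  # placeholder basic-auth credentials
)

# Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```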

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
man 94
old 93.8
standing 76.5
black 65
white 60.7
posing 53
vintage 35.7
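
Microsoft's tags correspond to the Tags feature of Azure Computer Vision's analyze endpoint; the v3.2 path below postdates the 2018 run recorded here, so treat this as a sketch rather than the original call (endpoint and key are placeholders):

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)

# Confidences are 0-1; scale to 0-100 to match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```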

Color Analysis

Face analysis

AWS Rekognition

Age 6-12
Gender Female, 76.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.3%
Confused 0.2%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 9-17
Gender Male, 97.3%
Sad 100%
Surprised 6.3%
Fear 5.9%
Angry 0.8%
Confused 0.8%
Calm 0.5%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 13-21
Gender Male, 85.4%
Calm 97.3%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Confused 0.4%
Angry 0.3%
Happy 0.2%
Disgusted 0%
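
The three blocks above have the shape of AWS Rekognition's DetectFaces output with Attributes=["ALL"]: an age range, a gender estimate with confidence, and an emotion distribution per detected face. A minimal boto3 sketch (filename and region are illustrative assumptions):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("shahn_morgantown.jpg", "rb") as f:  # hypothetical local file
    faces = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in faces["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions, highest confidence first, matching the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```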

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 13
Gender Female

Microsoft Cognitive Services

Age 33
Gender Male
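
The age/gender estimates above match the historical Azure Face API detect call with returnFaceAttributes=age,gender. Microsoft has since retired these attributes, so this is a sketch of the API roughly as it stood when these records were generated (endpoint and key are placeholders):

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)

# One entry per detected face, e.g. {"faceAttributes": {"age": 25.0, "gender": "male"}}.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```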

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
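
The likelihood strings above ("Very unlikely", "Unlikely", ...) are Google Cloud Vision's Likelihood enum, one block per detected face. A minimal sketch with the google-cloud-vision client (the image URI is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/image.jpg"  # placeholder URI

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```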

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Boy 98.9%
Child 98.9%
Wheel 97.6%