Human Generated Data

Title

Untitled (Morgantown, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1275

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 99.4
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Machine 87.4
Wheel 87.4
Worker 87.3
Wheel 84
Face 83.4
Head 83.4
Outdoors 81.4
Wheel 71.6
Construction 66.5
City 57.4
Road 57.4
Street 57.4
Urban 57.4
Architecture 57
Building 57
Shelter 57
Wood 56.5
Oilfield 56.3
People 55.8
Clothing 55.7
Pants 55.7
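
The entries above resemble output from Amazon Rekognition's label detection, reported as confidence percentages. As a minimal sketch (not the museum's actual pipeline), comparable tags could be reproduced with boto3, assuming configured AWS credentials and an illustrative file name:

import boto3

# Hypothetical local copy of the photograph; any JPEG/PNG bytes work.
with open("shahn_morgantown_1935.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns label names with 0-100 confidence scores,
# matching the "Person 99.4", "Wheel 87.4" style of the list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")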

Clarifai
created on 2018-05-11

people 100
group together 99.7
adult 99.6
group 99.4
military 99.1
soldier 99.1
war 98.7
vehicle 98.3
two 97.5
man 97.5
one 97.4
three 96.3
weapon 96.1
several 95.7
uniform 95
transportation system 95
skirmish 94.4
wear 93.9
gun 93.3
many 92.4

Imagga
created on 2023-10-06

stretcher 31.3
litter 25
conveyance 23.1
vehicle 20.7
old 18.1
kin 18.1
barrow 15.4
wheeled vehicle 14.3
man 14.1
machine 12.9
building 12.9
machinist 12.8
industry 12.8
industrial 12.7
outdoors 12.7
dirty 12.6
person 12.1
shovel 12
handcart 11.8
people 11.7
male 11.3
crutch 11.2
construction 11.1
adult 11
work 11
architecture 10.9
power 10.9
danger 10.9
wood 10.8
equipment 10.8
machinery 10.7
dirt 10.5
snow 9.9
transportation 9.9
tool 9.8
military 9.7
urban 9.6
sitting 9.4
factory 9.2
tree 9.2
working 8.8
destruction 8.8
country 8.8
staff 8.7
day 8.6
heavy 8.6
outside 8.6
structure 8.5
winter 8.5
stone 8.4
city 8.3
sky 8.3
vintage 8.3
track 8.1
metal 8
steel 8
wooden 7.9
soldier 7.8
accident 7.8
disaster 7.8
travel 7.7
war 7.7
outdoor 7.6
build 7.6
sand 7.5
iron 7.5
environment 7.4
safety 7.4
street 7.4
road 7.2
stick 7
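
The Imagga tags above carry similar confidence scores. A hedged sketch against what is assumed to be Imagga's v2 tagging endpoint (the credentials and image URL are placeholders, not real values):

import requests

# Placeholder credentials; Imagga issues an API key/secret pair per account.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

# Publicly reachable URL of the image to tag (illustrative).
image_url = "https://example.org/shahn_morgantown_1935.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score, as listed above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")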

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97.9
ground 96.8
old 89.2
vintage 44.4
cart 32.6

Color Analysis

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 88.9%
Sad 98.7%
Confused 15.8%
Angry 11.6%
Fear 7.4%
Surprised 7%
Calm 6.8%
Disgusted 2.1%
Happy 1.6%
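
The age range, gender, and per-emotion confidences above have the shape of Amazon Rekognition's DetectFaces response. A sketch of retrieving them with boto3, under the same assumptions as the label example (credentials configured, illustrative file name):

import boto3

rekognition = boto3.client("rekognition")

with open("shahn_morgantown_1935.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion scores.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence}; print highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")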

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
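
Google Vision reports likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client, assuming that is how these rows were produced and using an illustrative file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_morgantown_1935.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# face_detection returns likelihood enums for joy, sorrow, anger,
# surprise, headwear, and blur on each detected face.
response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)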

Feature analysis

Amazon

Person 99.4%
Adult 99.2%
Male 99.2%
Man 99.2%
Wheel 87.4%

Categories