Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1870

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (each label is followed by the service's confidence score on a 0-100 scale)

Amazon
created on 2023-10-06

Person 98.4
Transportation 98.2
Truck 98.2
Vehicle 98.2
Person 98
Person 97.8
Person 97.6
Adult 97.6
Male 97.6
Man 97.6
Person 97.2
Machine 97
Wheel 97
Wheel 96.3
Wheel 95.2
Person 94.5
Wheel 94.1
Person 94.1
Wheel 93.6
Person 93.4
Person 92.9
Person 86.7
Outdoors 85.8
Wheel 84.4
Person 84
Person 83.2
Person 82.4
Person 80.6
Truck 79
Person 78.3
Wheel 78
War 74.4
Person 74.3
Nature 73.5
Wheel 73.2
Wheel 72.6
Person 70.4
Wheel 57.9
Smoke 57.8
Armored 56.7
Half Track 56.7
Military 56.7
Wagon 56.4
Spoke 55.5
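
The label/score pairs above have the shape of Amazon Rekognition's DetectLabels output, where each label carries a confidence between 0 and 100. A minimal sketch of how tags like these could be produced with boto3 follows; the bucket name, object key, and threshold are illustrative assumptions, not values taken from this record.

# Minimal sketch (assumed bucket/key): list Rekognition labels with confidences,
# in the same "Label Confidence" form as the Amazon tags above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_branchville_1936.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # assumed cutoff; the lowest tag above is Spoke 55.5
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')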

Clarifai
created on 2018-05-11

people 99.9
group together 99.7
vehicle 99.5
many 98.2
group 98.2
transportation system 97.4
military 97.3
adult 96.1
war 95.1
soldier 94
man 92.2
military vehicle 91.6
administration 89.6
driver 87.7
uniform 86.7
outfit 85.6
leader 84.9
wagon 83.8
several 83.1
skirmish 81.8

Imagga
created on 2023-10-06

vehicle 86.4
truck 69.6
half track 42.9
machine 41
motor vehicle 40.6
wheeled vehicle 38.2
tow truck 38.2
military vehicle 35.4
tracked vehicle 34.9
tractor 29.1
machinery 24.4
transportation 24.2
farm 24.1
transport 23.7
work 22
wheel 21.7
old 20.9
rural 19.4
car 19
equipment 18.2
grass 18.2
industrial 18.2
road 17.2
agriculture 16.7
cart 16.6
industry 16.2
driving 15.5
steamroller 15.5
tire 14.9
heavy 14.3
field 14.2
fire engine 14.2
landscape 14.1
conveyance 14
outdoor 13.8
dirt 13.4
wheels 12.7
wagon 12.6
working 12.4
device 12.2
hay 12
construction 12
sky 11.5
farming 11.4
outdoors 11.2
land 11.1
power 10.9
farmer 10.8
automobile 10.5
auto 10.5
horse cart 10.4
vintage 9.9
artillery 9.8
cargo 9.7
motor 9.7
engine 9.6
antique 9.5
drive 9.5
man 9.4
action 9.3
male 9.2
field artillery 9.2
job 8.8
lorry 8.7
outside 8.6
ground 8.5
site 8.4
danger 8.2
steel 8
tires 7.9
gun 7.9
riding 7.8
model t 7.8
military 7.7
track 7.7
horse 7.6
speed 7.3
trailer 7.3
countryside 7.3
snow 7.3
yellow 7.3
metal 7.2
building 7.1
animal 7.1
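
Imagga's tag list above comes from its auto-tagging service. A hedged sketch using its public v2 REST tagging endpoint is below; the endpoint path, parameters, credentials, and image URL are assumptions from memory and should be verified against current Imagga documentation.

# Hedged sketch: request Imagga-style tags for an image URL (placeholder
# credentials and URL; endpoint details assumed, verify against Imagga docs).
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential
image_url = "https://example.org/shahn_branchville_1936.jpg"  # placeholder URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')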

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

tree 99.6
outdoor 99.2
sky 98.4
people 85.6
transport 79.8
military vehicle 54.6
drawn 33.1

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 62.9%
Angry 23.5%
Surprised 8.8%
Fear 6%
Confused 5.9%
Sad 2.3%
Disgusted 2.3%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 97.5%
Calm 90.1%
Surprised 7.2%
Fear 6%
Confused 3.4%
Sad 2.4%
Happy 2.3%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 6-16
Gender Female, 93.5%
Sad 91.6%
Happy 26.8%
Fear 15.2%
Surprised 6.9%
Calm 4.7%
Confused 4.4%
Disgusted 2.1%
Angry 1.6%

AWS Rekognition

Age 7-17
Gender Female, 56.1%
Calm 75.6%
Sad 10.4%
Surprised 7.8%
Fear 6.2%
Confused 3.8%
Angry 3.1%
Disgusted 1.2%
Happy 1.1%

AWS Rekognition

Age 18-24
Gender Male, 98%
Calm 80.6%
Happy 14.4%
Surprised 6.3%
Fear 5.9%
Sad 3.5%
Confused 0.7%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 21-29
Gender Female, 50.1%
Sad 99.7%
Calm 28.4%
Surprised 6.5%
Fear 6.1%
Confused 2.4%
Happy 0.4%
Disgusted 0.3%
Angry 0.1%
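
The age-range, gender, and per-emotion confidence blocks above have the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, reusing the same assumed bucket and key as the label example:

# Minimal sketch (assumed bucket/key): print age range, gender, and emotion
# confidences for each detected face, mirroring the blocks above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_branchville_1936.jpg"}},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')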

Feature analysis

Amazon

Person 98.4%
Truck 98.2%
Adult 97.6%
Male 97.6%
Man 97.6%
Wheel 97%

Categories

Imagga

cars vehicles 98.5%
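
Because the same concepts (person, truck, wheel, military vehicle) recur across providers with different scores, a small merging step makes the record easier to compare. A hypothetical sketch, using abbreviated excerpts of the lists above and keeping the highest confidence seen per normalized tag:

# Hypothetical aggregation sketch: merge per-provider (tag, confidence) pairs,
# keeping the best confidence for each lower-cased tag name.
from collections import defaultdict

provider_tags = {
    "amazon": [("Person", 98.4), ("Truck", 98.2), ("Wheel", 97.0), ("Military", 56.7)],
    "clarifai": [("people", 99.9), ("vehicle", 99.5), ("military", 97.3)],
    "microsoft": [("people", 85.6), ("military vehicle", 54.6)],
}  # abbreviated excerpts of the tag lists above

best = defaultdict(float)
for provider, tags in provider_tags.items():
    for name, confidence in tags:
        key = name.strip().lower()
        best[key] = max(best[key], confidence)

for name, confidence in sorted(best.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {confidence:.1f}")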