Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1900

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-07

War 99.9
People 99.8
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Female 99
Woman 99
Person 97.8
Person 97.7
Person 97.6
Adult 97.6
Male 97.6
Man 97.6
Person 97.5
Person 97.3
Person 95.8
Person 95.6
Person 94.9
Person 94.5
Person 94.3
Person 94.2
Clothing 94.1
Coat 94.1
Person 87.8
Machine 85.5
Wheel 85.5
Wheel 84.2
Person 83.2
Car 81.5
Transportation 81.5
Vehicle 81.5
Person 80.1
Person 79.6
Truck 78.8
Person 76.2
Outdoors 72
Person 71.1
Person 71
Person 70.1
Person 68.6
Person 68.2
Person 65.8
Person 65.2
Bus 60.6
Person 60.2
Car 58
Hat 57.8
Hat 56
Car 55.7
Jeans 55.4
Pants 55.4
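
The Amazon tags above are label-and-confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how comparable output could be produced with boto3, assuming credentials are configured and the photograph is available as a local file (the file name is hypothetical); it is an illustration, not the museum's actual pipeline.

import boto3

# Hypothetical file name; any local JPEG/PNG copy of the photograph would do.
IMAGE_PATH = "shahn_branchville_1936.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the tags listed above bottom out around 55%
    )

# Print label name and confidence in the same "War 99.9" style as the record.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")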

Clarifai
created on 2018-05-11

people 99.8
vehicle 99.5
group together 99.3
police 96.6
many 96.6
military 96.3
group 96.3
transportation system 96.3
war 95.3
soldier 94.7
adult 94
uniform 91.7
road 91.3
man 89.8
street 88.4
driver 87.3
car 86.3
military vehicle 85.9
administration 85.9
outfit 82.2

Imagga
created on 2023-10-07

car 99.4
motor vehicle 88.8
jeep 80
truck 72
vehicle 49.5
wheeled vehicle 33.2
road 29.8
transportation 28.7
fire engine 24.6
transport 21.9
tow truck 20.4
auto 20.1
machine 19.9
automobile 18.2
drive 16.1
sky 16
machinery 15.6
danger 15.5
wheel 15.1
industry 14.5
tractor 13.8
bus 13.7
industrial 13.6
travel 13.4
field 12.6
old 12.5
rural 12.3
fire 12.2
street 12
tire 12
landscape 11.9
farm 11.6
engine 11.6
dirt 11.5
heavy 11.5
equipment 11.2
work 11
accident 10.7
agriculture 10.5
grass 9.5
adventure 9.5
power 9.2
land 9.2
garbage truck 9
driving 8.7
help 8.4
safety 8.3
dirty 8.1
minibus 8
working 8
4x4 7.9
rescue 7.8
wheels 7.8
motor 7.8
emergency 7.7
outdoor 7.6
traffic 7.6
outdoors 7.5
mountains 7.4
yellow 7.3
lorry 7.3
metal 7.2
steel 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

sky 99.9
outdoor 99.3
group 76
people 72.4
old 61.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-22
Gender Male, 99.9%
Sad 64.8%
Calm 44.1%
Surprised 10.9%
Happy 9.8%
Fear 6.1%
Angry 5.8%
Disgusted 1.1%
Confused 0.6%

AWS Rekognition

Age 13-21
Gender Female, 74%
Happy 64.1%
Calm 27.1%
Surprised 6.5%
Fear 6.4%
Sad 3.1%
Angry 2.9%
Confused 0.7%
Disgusted 0.6%

AWS Rekognition

Age 21-29
Gender Male, 90.8%
Calm 93.1%
Surprised 6.5%
Fear 6.1%
Angry 3.8%
Sad 2.5%
Confused 0.4%
Happy 0.4%
Disgusted 0.3%

AWS Rekognition

Age 21-29
Gender Male, 97.3%
Calm 95.6%
Surprised 6.4%
Fear 6%
Sad 2.4%
Angry 1.6%
Happy 0.8%
Disgusted 0.4%
Confused 0.2%

AWS Rekognition

Age 18-26
Gender Female, 92.1%
Angry 35.5%
Fear 30.2%
Calm 27.5%
Surprised 7%
Sad 3.3%
Happy 1.6%
Disgusted 1.6%
Confused 0.4%
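
The face analysis entries match the shape of AWS Rekognition's DetectFaces response, which reports an age range, a gender estimate, and emotion confidences for each detected face. A hedged sketch under the same assumptions as above (boto3 configured, hypothetical local file name):

import boto3

# Hypothetical file name for the photograph.
IMAGE_PATH = "shahn_branchville_1936.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

# Summarize each detected face in the same shape as the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")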

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Male 99.1%
Man 99.1%
Female 99%
Woman 99%
Coat 94.1%
Wheel 85.5%
Car 81.5%
Truck 78.8%
Bus 60.6%
Hat 57.8%
Jeans 55.4%
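
The feature analysis list differs from the plain tag list in that these are labels for which Rekognition also returned bounding-box instances (individual people, wheels, vehicles, garments). A sketch, again assuming boto3 and a hypothetical local copy of the photograph, that keeps only instance-level detections:

import boto3

# Hypothetical file name for the photograph.
IMAGE_PATH = "shahn_branchville_1936.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

# Labels that carry bounding-box instances are the ones reported here.
for label in labels:
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}%")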

Categories

Imagga

cars vehicles 99.5%

Text analysis

Amazon

3240
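
The text analysis result ("3240") is the kind of string AWS Rekognition's DetectText operation returns for lettering or numbers visible in the image. A sketch under the same assumptions as the earlier examples (boto3 configured, hypothetical local file name):

import boto3

# Hypothetical file name for the photograph.
IMAGE_PATH = "shahn_branchville_1936.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Keep only whole detected lines, skipping the word-level duplicates.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])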