Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1832

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Person 98.6
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 98
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Person 97.3
Person 96.9
Person 95.6
Car 95.5
Transportation 95.5
Vehicle 95.5
Person 95.1
Machine 94.8
Wheel 94.8
Person 93.8
Person 89.8
Car 83.7
Outdoors 77.9
War 73.3
Face 73.2
Head 73.2
Helmet 72.2
Clothing 58.7
Hat 58.7
Hat 57.7
Funeral 57.5
Pickup Truck 57.4
Truck 57.4
Spoke 56.4
Person 55.4
License Plate 55
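The Amazon list above repeats a label once per detected instance (e.g. "Person" appears for each person found), each with its own confidence score. A minimal sketch of how such raw output might be collapsed into a per-label summary — this is an illustrative helper, not the museum's or Amazon's actual pipeline, and the `summarize_labels` name and threshold are assumptions:

```python
# Illustrative only: collapse repeated (label, confidence) rows to the
# highest confidence seen per label, then keep labels above a threshold.
def summarize_labels(raw, threshold=90.0):
    best = {}
    for name, conf in raw:
        # keep the strongest detection for each label
        best[name] = max(conf, best.get(name, 0.0))
    # filter by threshold and sort highest-confidence first
    return sorted(
        ((n, c) for n, c in best.items() if c >= threshold),
        key=lambda pair: -pair[1],
    )

# a few rows transcribed from the Amazon tags above
tags = [
    ("Adult", 98.8), ("Male", 98.8), ("Person", 98.8),
    ("Adult", 98.7), ("Person", 95.1), ("Car", 95.5),
    ("Car", 83.7), ("Wheel", 94.8), ("War", 73.3),
]
print(summarize_labels(tags))
```

With the sample rows, only labels whose best detection is at or above 90% survive, so the low-confidence "War" tag drops out.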

Clarifai
created on 2018-05-11

people 99.1
vehicle 98.3
group together 98.2
group 97.3
many 96.9
military 92.1
transportation system 91.6
police 90.5
war 90.3
man 90.2
administration 89.7
soldier 88.9
adult 85.6
crowd 84.4
driver 83.4
leader 82.5
car 79.8
military vehicle 76.7
several 74.1
woman 72.5

Imagga
created on 2023-10-06

vehicle 98.7
truck 59.6
military vehicle 52.4
half track 48.5
tracked vehicle 40.7
tow truck 40.4
motor vehicle 38.1
wheeled vehicle 36.7
tractor 32.4
car 31.1
machine 28.9
wheel 26.5
conveyance 23.6
transportation 23.3
farm 21.4
rural 21.2
transport 21
machinery 20.5
tire 19.4
work 18.8
industry 17.9
field 17.6
grass 17.4
agriculture 16.7
equipment 16.5
auto 16.3
sky 15.3
landscape 14.9
old 14.6
steamroller 14.5
road 14.5
dirt 14.3
heavy 14.3
industrial 13.6
driving 13.5
drive 13.2
construction 12.8
bulldozer 12
power 11.8
working 11.5
farmer 11.1
land 11.1
danger 10.9
outdoor 10.7
yellow 10.6
automobile 10.5
farming 10.4
summer 10.3
speed 10.1
hay 9.9
engine 9.6
dangerous 9.5
snow 9
outdoors 9
wheels 8.8
harvest 8.5
crop 8.5
site 8.4
action 8.3
earth 8.2
tank 8.2
lorry 8.1
man 8.1
tires 7.9
bucket 7.8
agricultural 7.8
track 7.7
ground 7.6
sport 7.4
building 7.2
wagon 7.1

Microsoft
created on 2018-05-11

outdoor 99.2
sky 99.1
truck 96.7
transport 76.7
people 74.4
car 13.4

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.7%
Angry 54.6%
Surprised 20.3%
Calm 15.3%
Confused 11.3%
Fear 6%
Sad 2.8%
Disgusted 0.7%
Happy 0.4%

Microsoft Cognitive Services

Age 28
Gender Male

Feature analysis

Amazon

Adult 98.8%
Male 98.8%
Man 98.8%
Person 98.8%
Car 95.5%
Wheel 94.8%
Helmet 72.2%
Hat 58.7%

Text analysis

Amazon

109
RA-84

Google

RA-8 109
RA-8
109