Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1908

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Car 98.7
Transportation 98.7
Vehicle 98.7
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Machine 96
Wheel 96
Wheel 93.7
Person 93.4
Person 85.5
Person 83.1
War 82.4
Head 81.8
Person 81.3
Face 79.5
Outdoors 77.5
Person 77.3
Car 75.7
Person 74.5
Person 72.2
Truck 71.2
Person 65.3
Person 64
Person 63
Nature 61.6
Pickup Truck 57.4
Person 57.2
Person 55.3
Fire Truck 55.1
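
The label tags above are the kind of output returned by AWS Rekognition's DetectLabels API. As a minimal sketch only, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph named branchville.jpg (neither is part of this record):

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local file; not part of the museum record.
with open("branchville.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # roughly the number of tags listed above
    MinConfidence=55,    # the lowest confidence listed above is about 55
)

# Print "Label Confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')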

Clarifai
created on 2018-05-11

vehicle 100
car 99.9
transportation system 99.8
truck 99.5
driver 98.4
group together 98.2
people 97.8
engine 93.2
convertible 92.4
adult 88
drive 88
military vehicle 87.7
group 87.3
road 86.8
wagon 86.5
bus 84.9
retro 84.6
war 83.8
police 83.8
man 82

Imagga
created on 2023-10-06

tow truck 100
truck 100
motor vehicle 97.5
car 80.6
vehicle 76.4
transportation 44
wheeled vehicle 37.8
auto 36.4
transport 32
road 29.8
wheel 28.5
automobile 27.8
drive 25.6
pickup 24.2
jeep 22.8
military vehicle 21.9
old 21.6
tire 20.5
driving 18.4
engine 16.4
travel 14.8
motor 14.5
adventure 14.2
4x4 13.8
wheels 13.7
half track 13.2
tires 12.8
sky 12.8
machine 12.5
vintage 12.4
retro 12.3
classic 12.1
speed 11.9
tractor 11.8
land 11.1
work 11
tracked vehicle 10.8
equipment 10.6
rural 10.6
power 10.1
danger 10
machinery 9.8
cargo 9.7
landscape 9.7
dirt 9.6
heavy 9.5
antique 9.5
grass 9.5
fire 9.4
field 9.2
sport 9.1
lorry 9.1
farm 8.9
off road 8.9
emergency 8.7
broken 8.7
luxury 8.6
street 8.3
industrial 8.2
rescue 7.8
cars 7.8
delivery 7.8
sunny 7.8
highway 7.7
four 7.7
traffic 7.6
fun 7.5
style 7.4
sports 7.4
metal 7.2
bumper 7.1
summer 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.3
truck 99
sky 98.7
transport 82.8
old 59.6
car 10.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 99.3%
Calm 70.2%
Sad 31.8%
Surprised 6.4%
Fear 6.1%
Happy 5.9%
Confused 0.8%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 27-37
Gender Male, 75.1%
Disgusted 83.3%
Calm 11.1%
Surprised 6.6%
Fear 6.2%
Sad 2.7%
Angry 1.6%
Happy 0.5%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 66.2%
Happy 64.3%
Sad 22.7%
Surprised 7.3%
Fear 6.2%
Angry 6.1%
Calm 5.7%
Disgusted 1.7%
Confused 1.4%

AWS Rekognition

Age 23-31
Gender Male, 77.9%
Calm 90.5%
Surprised 7.3%
Fear 6.1%
Sad 2.8%
Happy 1.6%
Disgusted 1.3%
Angry 1.2%
Confused 0.8%

AWS Rekognition

Age 30-40
Gender Male, 86%
Calm 92.4%
Surprised 6.4%
Fear 5.9%
Sad 4.1%
Angry 0.8%
Happy 0.7%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 56.4%
Happy 41%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 1.6%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 26-36
Gender Male, 98.5%
Sad 75.1%
Calm 26.8%
Confused 17.7%
Happy 15.8%
Surprised 6.9%
Fear 6.1%
Angry 2%
Disgusted 1.5%
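
The per-face age ranges, gender estimates, and emotion confidences above match the shape of AWS Rekognition's DetectFaces output when all attributes are requested. A minimal sketch, assuming boto3 with configured AWS credentials and the same hypothetical branchville.jpg file:

import boto3

rekognition = boto3.client("rekognition")

with open("branchville.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],    # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are scored independently, which is why a block
    # above can sum to more than 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')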

Feature analysis

Amazon

Car 98.7%
Adult 97.7%
Male 97.7%
Man 97.7%
Person 97.7%
Wheel 96%

Categories

Text analysis

Amazon

MUTURE
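
The detected text "MUTURE" is the kind of result AWS Rekognition's DetectText API returns for lettering visible in a photograph. A minimal sketch, under the same assumptions as the examples above (boto3, AWS credentials, hypothetical branchville.jpg):

import boto3

rekognition = boto3.client("rekognition")

with open("branchville.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only line-level detections; WORD entries repeat the same text piecewise.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])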