Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1874

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 97.4
Car 97.1
Transportation 97.1
Vehicle 97.1
Car 96.9
Person 96.1
Person 95.5
Person 95.5
Person 94.9
Person 94.8
Machine 94.4
Wheel 94.4
Wheel 92.2
Car 89.8
Person 89.4
Person 88.5
Wheel 86.3
Person 85.2
Car 84.9
Person 84.6
Wheel 84.4
Wheel 83.5
Person 81.8
Person 80.5
Person 79.2
Person 78.8
Person 78
Outdoors 77.8
Car 75.8
Person 74.3
Road 74
Person 72.2
Car 71.6
Person 71.1
Person 68.8
Person 63.3
Person 62.2
Person 61.9
Wheel 61.9
Head 60.4
Wheel 60.1
Smoke 57.6
Clothing 56.9
Footwear 56.9
Shoe 56.9
Architecture 56.5
Building 56.5
Factory 56.5
Antique Car 56.2
Model T 56.2
Car 56
License Plate 55.4
People 55.2
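
The label/confidence pairs above have the shape of results from AWS Rekognition's DetectLabels operation. A minimal sketch of how such a list could be produced with boto3 follows; the file name is a placeholder, not part of the museum record:

import boto3

client = boto3.client("rekognition")

# Read the photograph and request labels down to the ~55% confidence
# floor seen in the listing above.
with open("branchville_1936.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")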

Clarifai
created on 2018-05-11

vehicle 99.9
transportation system 99.5
group together 99.1
car 99
people 98.6
truck 97.2
driver 97
group 94.6
many 91.9
street 88.6
administration 87
adult 86.5
police 86.4
engine 85.9
wagon 84.9
monochrome 84.9
bus 84.7
road 84.3
man 78
traffic 77.7
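
The Clarifai tags match the concept list returned by Clarifai's v2 predict endpoint against its general model, which fits the 2018 date on this record. A hedged sketch, assuming the photograph is reachable at a public URL; the API key, model ID, and image URL are all placeholders:

import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/GENERAL_MODEL_ID/outputs",  # placeholder model ID
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={
        "inputs": [
            {"data": {"image": {"url": "https://example.org/branchville_1936.jpg"}}}
        ]
    },
)

# Clarifai reports concept values on a 0-1 scale; the listing above uses 0-100.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")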

Imagga
created on 2023-10-06

truck 100
motor vehicle 87.9
vehicle 68.5
fire engine 50.8
car 40.5
wheeled vehicle 38.6
road 33.4
tow truck 31
transportation 26.9
machine 25
transport 24.7
golf equipment 23.9
equipment 22.9
tractor 19.8
sky 18
sports equipment 17.9
grass 17.4
automobile 17.2
farm 17
landscape 16.4
industry 16.2
rural 15.9
tire 15.2
wheel 15.1
machinery 14.6
auto 14.4
field 14.2
drive 14.2
agriculture 14
street 13.8
outdoor 13.8
industrial 13.6
driving 13.5
dirt 13.4
work 13.4
travel 12.7
engine 12.5
military vehicle 12.5
heavy 12.4
danger 11.8
land 11.1
trailer 10.9
highway 10.6
lorry 10.3
delivery 9.7
cargo 9.7
scene 9.5
jeep 9.2
outdoors 9
farmer 8.8
wheels 8.8
driver 8.7
motor 8.7
traffic 8.6
steamroller 8.1
container 8
pickup 8
yellow 8
cars 7.8
sunny 7.8
summer 7.7
move 7.7
adventure 7.6
farming 7.6
ground 7.6
fire 7.5
city 7.5
speed 7.3
business 7.3
job 7.1
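
The Imagga tags correspond to Imagga's v2 image-tagging endpoint. A sketch under the assumption that the image is available at a public URL; the credentials and URL are placeholders, and the response shape follows Imagga's v2 API as documented:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/branchville_1936.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

# Each entry pairs an English tag with a 0-100 confidence score.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")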

Microsoft
created on 2018-05-11

outdoor 99.6
sky 99.1
truck 98.6
transport 69.6
old 48.2
golfcart 29.5
car 11.1
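
The Microsoft tags are consistent with the Azure Computer Vision Analyze endpoint; the v2.0 path below matches the 2018 date on this record and has since been superseded. The region, subscription key, and file name are placeholders:

import requests

with open("branchville_1936.jpg", "rb") as f:  # placeholder file name
    response = requests.post(
        "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Azure reports confidence on a 0-1 scale; the listing above uses 0-100.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")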

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Female, 90%
Calm 81.3%
Surprised 6.5%
Fear 6%
Disgusted 5.8%
Confused 5.2%
Sad 3.4%
Angry 2.7%
Happy 0.9%

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Calm 87.7%
Surprised 6.6%
Fear 6.1%
Angry 3.8%
Happy 3.5%
Sad 3%
Disgusted 1%
Confused 0.4%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Surprised 67.8%
Happy 18.6%
Sad 16.2%
Calm 10.2%
Disgusted 8.3%
Fear 6.2%
Angry 4%
Confused 2.7%

AWS Rekognition

Age 21-29
Gender Male, 98%
Angry 42.3%
Happy 18%
Calm 16%
Fear 9.8%
Surprised 8.4%
Sad 4.7%
Confused 3.1%
Disgusted 2.9%

AWS Rekognition

Age 22-30
Gender Male, 93.1%
Calm 68.8%
Happy 16.2%
Fear 7.5%
Surprised 6.7%
Sad 4.4%
Angry 2.8%
Disgusted 1.4%
Confused 0.9%
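
Each block above (age range, gender, ranked emotion scores) mirrors a FaceDetail record from Rekognition's DetectFaces operation with all attributes enabled. A minimal sketch with boto3, assuming a local copy of the photograph (the file name is a placeholder):

import boto3

client = boto3.client("rekognition")

with open("branchville_1936.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort to match the ranked listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")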

Feature analysis

Amazon

Person 97.4%
Car 97.1%
Wheel 94.4%
Shoe 56.9%

Text analysis

Amazon

RR
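
The detected string "RR" is the kind of line-level result Rekognition's DetectText returns. A sketch, again with a placeholder file name:

import boto3

client = boto3.client("rekognition")

with open("branchville_1936.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Print line-level detections only; Rekognition also returns per-word hits.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])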