Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1920

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.2
Male 98.2
Man 98.2
Person 98.2
People 97.9
Person 96.6
Person 96.3
Smoke 96
Person 95.9
Person 93.8
Person 93.2
Car 93
Transportation 93
Vehicle 93
Car 92.8
Person 92.5
Person 91.8
Machine 89.7
Wheel 89.7
Person 89
Person 84.8
Person 81.3
Person 81
Outdoors 81
Person 80.8
Wheel 80.3
Wheel 80
Wheel 78.5
Car 77.7
Wheel 76.6
Wheel 73.6
Fog 73.5
Nature 73.5
Smog 73.5
Weather 73.5
Person 71.2
Person 69.5
War 66.1
Person 61.6
Pollution 57.4
Architecture 56
Building 56
Factory 56
Wheel 55.6
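
These label-and-confidence pairs (scores are percentages) are the kind of output AWS Rekognition's DetectLabels operation returns. Below is a minimal, hypothetical sketch of how such tags could be generated with boto3; the file name and request parameters are illustrative assumptions, not part of this record.

# Hypothetical sketch: image labels via AWS Rekognition DetectLabels.
# The file name, MaxLabels, and MinConfidence are illustrative only.
import boto3

client = boto3.client("rekognition")

with open("shahn_branchville_1936.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=55.0,  # the lowest score in the list above is 55.6
)

# Print "Label confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")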

Clarifai
created on 2018-05-11

people 99.6
vehicle 99.3
group together 98.4
transportation system 98
adult 96.1
group 94.5
many 93
man 92.6
military 90.4
police 90.2
uniform 89.4
car 86
administration 85.6
accident 83.8
street 82.9
road 82.1
war 81.2
driver 80.5
calamity 79.8
flame 75.5
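
Concept-and-score lists like this are what Clarifai's general image recognition model produces (values shown here as percentages). Below is a hedged sketch assuming Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL are placeholders, and the exact model behind these 2018 tags is not documented in this record.

# Hedged sketch: image concepts via Clarifai's v2 REST API.
# API key, model ID, and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # Clarifai's public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concept values are 0-1; scale to percentages as in the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")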

Imagga
created on 2023-10-06

truck 36.3
vehicle 34.2
car 29.5
road 28
motor vehicle 25.2
transportation 25.1
wheeled vehicle 23.2
container 23.1
street 23
mailbox 21.5
sky 20.5
locomotive 17.4
industry 17.1
city 16.6
urban 16.6
box 16.5
transport 16.4
steam locomotive 16.4
automobile 15.3
traffic 15.2
landscape 14.1
travel 14.1
highway 13.5
building 13.4
drive 13.2
driving 11.6
fire engine 11.6
auto 11.5
snow 11.4
smoke 11.2
pollution 10.6
structure 10.6
grass 10.3
architecture 10.1
power 10.1
industrial 10
night 9.8
steam 9.7
rural 9.7
heavy 9.5
cloud 9.5
house 9.2
danger 9.1
outdoors 9
trees 8.9
factory 8.8
shopping cart 8.5
equipment 8.5
tree 8.5
steamroller 8.4
field 8.4
tourism 8.2
environment 8.2
machine 8.2
station 8.2
light 8
water 8
cars 7.8
slow 7.8
clouds 7.6
ashcan 7.6
weather 7.5
speed 7.3
business 7.3
yellow 7.3
summer 7.1
bin 7
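
Imagga reports tag confidences on a 0-100 scale, as above. Below is a hedged sketch assuming Imagga's /v2/tags endpoint with HTTP Basic authentication; the credentials, image URL, and response layout are assumptions based on Imagga's public API documentation.

# Hedged sketch: image tags via Imagga's v2 REST API.
# Credentials and image URL are placeholder assumptions.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
resp.raise_for_status()

# Imagga confidences are already 0-100; no rescaling needed.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")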

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.7
black 85
old 82.9
white 74.5
steam 59.9
vintage 27.4
engine 26.7
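
Tags in this style correspond to Microsoft's Computer Vision Tag operation (confidences shown as percentages). Below is a hedged sketch against the current Azure endpoint; the resource URL, key, image URL, and API version are assumptions, and the 2018 tags above would have come from an earlier API version.

# Hedged sketch: image tags via Azure Computer Vision's Tag operation.
# Endpoint, subscription key, and image URL are placeholder assumptions.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Azure confidences are 0-1; scale to percentages as in the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")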

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 98.5%
Disgusted 96.9%
Surprised 6.4%
Fear 5.9%
Sad 2.4%
Calm 0.8%
Angry 0.7%
Confused 0.4%
Happy 0.1%

AWS Rekognition

Age 35-43
Gender Male, 99%
Calm 64.3%
Disgusted 15.7%
Surprised 6.8%
Sad 6.2%
Fear 6.2%
Angry 5.2%
Happy 4.5%
Confused 0.8%

AWS Rekognition

Age 26-36
Gender Male, 100%
Calm 89.4%
Surprised 8.4%
Fear 6%
Sad 4.1%
Disgusted 0.8%
Confused 0.6%
Angry 0.3%
Happy 0.3%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 86.8%
Surprised 6.7%
Fear 6%
Sad 4.8%
Happy 3%
Angry 2.1%
Confused 0.6%
Disgusted 0.6%

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 84.3%
Angry 10.2%
Surprised 6.7%
Fear 6.6%
Sad 2.6%
Happy 1.1%
Confused 0.3%
Disgusted 0.1%

AWS Rekognition

Age 7-17
Gender Male, 99.9%
Calm 95%
Surprised 6.5%
Fear 5.9%
Sad 2.3%
Angry 2%
Confused 0.9%
Disgusted 0.6%
Happy 0.3%

AWS Rekognition

Age 13-21
Gender Male, 100%
Angry 43.4%
Calm 31.3%
Happy 12.1%
Fear 7.7%
Surprised 6.9%
Sad 4.9%
Confused 1%
Disgusted 0.7%

AWS Rekognition

Age 19-27
Gender Male, 100%
Surprised 97.8%
Happy 11.9%
Angry 7.1%
Fear 6%
Sad 3%
Calm 1.6%
Disgusted 0.7%
Confused 0.3%
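
Each block above mirrors one FaceDetail from AWS Rekognition's DetectFaces operation. Emotion scores are estimated independently per emotion, which is why the percentages within a block need not sum to 100. Below is a minimal, hypothetical sketch with boto3; the file name is an illustrative assumption.

# Hypothetical sketch: per-face age, gender, and emotion estimates
# via AWS Rekognition DetectFaces. The file name is illustrative only.
import boto3

client = boto3.client("rekognition")

with open("shahn_branchville_1936.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently and need not sum to 100%.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emo['Type'].title()} {emo['Confidence']:.1f}%")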

Feature analysis

Amazon

Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Car 93%
Wheel 89.7%

Text analysis

Amazon

да. (Cyrillic for "yes"; likely a stray reading of lettering in the photograph)
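
Strings like this come from optical text detection, where short fragments of lettering in a photograph can surface as stray readings. Below is a minimal, hypothetical sketch with AWS Rekognition's DetectText via boto3; the file name is an illustrative assumption.

# Hypothetical sketch: text extraction via AWS Rekognition DetectText.
# The file name is illustrative only.
import boto3

client = boto3.client("rekognition")

with open("shahn_branchville_1936.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections aggregate WORD detections; print lines only.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])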