Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1847

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Person 94.9
Clothing 92.2
Formal Wear 92.2
Suit 92.2
Coat 90.4
Person 86.6
Outdoors 86.5
Terminal 84.8
Face 79.5
Head 79.5
Person 68.2
Nature 65.2
Transportation 63.4
Vehicle 63.4
Person 61.9
Railway 57.7
Electrical Device 56.6
Architecture 56.3
Building 56.3
Monastery 56.3
Utility Pole 56.2
Train 56.1
Train Station 56.1
Shelter 55.9
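
The Amazon tags above pair each label with a percentage confidence score. Below is a minimal sketch of how labels in this form can be generated with AWS Rekognition's detect_labels call via boto3; the local file name photo.jpg and the thresholds are illustrative assumptions, not part of the museum record.

import boto3

# Hypothetical local copy of the photograph; not part of the record.
IMAGE_PATH = "photo.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Request up to 50 labels with at least 55% confidence, roughly matching
# the lowest scores listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')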

Clarifai
created on 2018-05-11

people 99.9
group together 98.2
military 97.8
war 97.4
group 96.5
man 96.3
adult 95.8
many 94.8
soldier 94.8
vehicle 93.8
transportation system 92.3
administration 92.3
uniform 89.9
railway 89.2
train 87.7
home 82
cavalry 80.6
wear 79.7
skirmish 78.7
police 77.7
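
The Clarifai concepts above were produced by a general-purpose image-recognition model. The sketch below shows one way such predictions might be requested from Clarifai's v2 REST predict endpoint; the API key placeholder, the model ID, and the image URL are all assumptions, and community models may additionally require the owning user and app identifiers.

import requests

# Placeholder values; assumptions for illustration only.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"   # assumed general model ID
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Each predicted concept carries a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')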

Imagga
created on 2023-10-07

structure 43.5
wheeled vehicle 38
mobile home 30.3
vehicle 26.1
housing 25.8
trailer 25.4
house 25.1
sky 24.2
grass 22.2
building 21.5
billboard 20.5
architecture 20.3
old 20.2
landscape 20.1
car 18.7
field 17.6
conveyance 17.1
signboard 15.9
freight car 15.5
city 14.1
home 13.6
rural 13.2
tree 13.1
travel 12.7
scenic 11.4
outdoor 10.7
trees 10.7
urban 10.5
outdoors 10.5
street 10.1
transport 10
wood 10
farm 9.8
river 9.8
dairy 9.7
roof 9.5
scene 9.5
wall 9.4
water 9.3
clouds 9.3
countryside 9.1
tourism 9.1
summer 9
transportation 9
country 8.8
houses 8.7
winter 8.5
fence 8.5
stone 8.4
town 8.4
mountains 8.3
exterior 8.3
container 8.2
barn 8.2
scenery 8.1
sunset 8.1
history 8.1
sea 7.8
construction 7.7
buildings 7.6
bridge 7.5
weather 7.5
environment 7.4
shopping cart 7.4
industrial 7.3
sun 7.2
wooden 7
agriculture 7
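
Imagga's tags above come from its automatic tagging service. A brief sketch against the Imagga v2 /tags endpoint with HTTP basic authentication follows; the credential placeholders and image URL are assumptions.

import requests

# Placeholder credentials and image URL; assumptions for illustration only.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag carries a confidence score and a language-keyed label.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')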

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

sky 99.9
outdoor 99.2
old 83.9
standing 82.3
black 74.3
group 55.9

Color analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Calm 92.7%
Surprised 6.4%
Fear 6%
Sad 3.1%
Angry 2.7%
Disgusted 0.8%
Confused 0.5%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 67.9%
Sad 37.7%
Surprised 7%
Fear 6%
Angry 3.3%
Confused 3%
Happy 0.4%
Disgusted 0.4%

AWS Rekognition

Age 24-34
Gender Male, 53.6%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 4.4%
Confused 2.3%
Happy 0.5%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 20-28
Gender Male, 81.4%
Calm 91.9%
Surprised 6.4%
Fear 6%
Sad 3.5%
Confused 2%
Happy 0.9%
Angry 0.6%
Disgusted 0.4%

AWS Rekognition

Age 18-24
Gender Male, 77.4%
Calm 91.7%
Surprised 6.6%
Fear 6.1%
Happy 3%
Sad 2.6%
Confused 1.2%
Angry 1%
Disgusted 0.4%

AWS Rekognition

Age 23-33
Gender Female, 53.9%
Calm 96%
Surprised 6.4%
Fear 5.9%
Sad 3.1%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%
Happy 0.2%

AWS Rekognition

Age 37-45
Gender Male, 99.4%
Calm 79.6%
Surprised 7%
Fear 6.2%
Disgusted 4.6%
Confused 4.5%
Sad 4.3%
Angry 2.2%
Happy 1.8%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
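
The age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above correspond to face detection with all facial attributes enabled. A minimal boto3 sketch is shown below; the local file name is again an assumption.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] returns age range, gender, and emotion estimates
# for every detected face, as listed in the record above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')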

Feature analysis

Amazon

Adult 98.3%
Male 98.3%
Man 98.3%
Person 98.3%

Text analysis

Amazon

CROSSING
ROAD
4290
RAIL
BÁLTIMOREX-OHO

Google

ROSSING 10
ROSSING
10
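
The strings under Text analysis above (CROSSING, ROAD, 4290, and so on) are OCR detections. A minimal sketch of the corresponding AWS Rekognition detect_text call via boto3 follows; the local file name is an assumption.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print each detected word (skipping whole-line detections) with its confidence.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')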