Human Generated Data

Title

Untitled (Nashville, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1201

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (scores are confidence percentages)

Amazon
created on 2023-10-05

City 100
Road 100
Street 100
Urban 100
Path 99.9
Sidewalk 99.9
Clothing 99.8
Coat 99.8
Person 98.5
Person 98.1
Person 96.3
Adult 96.3
Male 96.3
Man 96.3
Person 96.1
Person 95.9
Person 95.9
Car 95.3
Transportation 95.3
Vehicle 95.3
Person 95.2
Person 95.1
Person 91.1
Person 89.7
Person 88.8
Machine 86.3
Wheel 86.3
Person 80.9
Person 76.5
Face 75.8
Head 75.8
Outdoors 71
Person 68.3
Hat 65.6
Pedestrian 64.9
Person 64
Bus Stop 63.7
Walking 63.4
Footwear 62.5
Shoe 62.5
Formal Wear 61.9
Suit 61.9
Person 61.5
Overcoat 57.8
Accessories 57.1
Bag 57.1
Handbag 57.1
License Plate 55.1
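
The flat list above matches the shape of output from Amazon Rekognition's DetectLabels operation. The sketch below is one plausible way to regenerate such tags with boto3, assuming AWS credentials are configured; the file name nashville_1935.jpg is a hypothetical stand-in for the digitized print, and because Rekognition's models have changed since these tags were created, rerunning it would not reproduce these exact scores.

    import boto3

    # Rekognition client; assumes AWS credentials are configured locally.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the digitized photograph.
    with open("nashville_1935.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=55 matches the lowest score in the list above
    # (License Plate 55.1); it is also the API's default cutoff.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,
    )

    # Flatten the response to "Name Confidence" pairs, as listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")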

Clarifai
created on 2018-05-11

people 99.9
group together 98.9
many 97.3
military 96.6
adult 96.1
war 96
group 94.7
man 92.2
administration 92.1
soldier 91.5
transportation system 84.9
street 83.2
uniform 83
wear 82.9
vehicle 80.5
leader 78.8
child 75.3
police 74.4
crowd 72.6
weapon 72.3

Imagga
created on 2023-10-05

sidewalk 34.7
city 25.8
street 23
people 19.5
turnstile 17.7
urban 17.5
road 17.2
gate 15.9
men 13.7
concrete 13.4
travel 13.4
person 13.2
industry 12.8
building 12.1
barrier 11.6
man 11.4
movable barrier 10.9
old 10.4
walking 10.4
construction 10.3
architecture 10.2
alone 10
industrial 10
transportation 9.9
steel 9.7
working 9.7
adult 9.7
outdoor 9.2
outdoors 9.1
track 8.8
water 8.7
male 8.6
machine 8.5
site 8.4
town 8.3
device 8.3
sign 8.3
wall 8.3
vacation 8.2
worker 8
parking meter 8
business 7.9
day 7.8
sea 7.8
asphalt 7.8
outside 7.7
train 7.7
sky 7.6
traffic 7.6
equipment 7.6
ocean 7.5
tourism 7.4
action 7.4
life 7.3
hat 7.3
sport 7.2
coast 7.2
staff 7.1
station 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.3
standing 76.3
black 76.1
white 67.4
old 57.6
way 44
sidewalk 31.6

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Sad 100%
Calm 7.6%
Surprised 6.5%
Fear 6.1%
Angry 0.6%
Confused 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 34-42
Gender Male, 99.1%
Calm 78.6%
Happy 7.8%
Surprised 7.2%
Fear 6.7%
Angry 4.3%
Sad 3.2%
Disgusted 1.4%
Confused 1%

AWS Rekognition

Age 29-39
Gender Male, 83.4%
Calm 88.3%
Surprised 6.9%
Fear 6%
Happy 5.3%
Sad 2.5%
Confused 1.7%
Angry 1.1%
Disgusted 1%

AWS Rekognition

Age 16-24
Gender Female, 65.7%
Calm 52.5%
Happy 12.4%
Fear 11.5%
Confused 11.3%
Surprised 7.4%
Sad 3.9%
Disgusted 3.7%
Angry 3%

AWS Rekognition

Age 20-28
Gender Male, 93.8%
Calm 96.5%
Surprised 6.4%
Fear 6%
Sad 2.3%
Happy 0.7%
Angry 0.7%
Disgusted 0.6%
Confused 0.4%

Microsoft Cognitive Services

Age 40
Gender Male
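
Each AWS Rekognition block above (age range, gender, ranked emotion confidences) corresponds to one entry in the FaceDetails array that the DetectFaces operation returns when all facial attributes are requested. A minimal sketch under the same assumptions as the label example (hypothetical file name, configured credentials):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("nashville_1935.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, emotions, etc.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    # One FaceDetails entry per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; rank by confidence, highest first.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")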

Feature analysis

Amazon

Person 98.5%
Adult 96.3%
Male 96.3%
Man 96.3%
Car 95.3%
Wheel 86.3%
Hat 65.6%
Shoe 62.5%
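
The Feature analysis percentages repeat confidences for labels that Rekognition localizes with bounding boxes (Person, Car, Wheel, and so on), which would also explain the many repeated Person entries in the tag list: one entry per detected instance. In a DetectLabels response these sit under each label's Instances field; a sketch, again with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("nashville_1935.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_labels(Image={"Bytes": image_bytes})

    # Labels such as Person, Car, or Wheel carry per-instance bounding
    # boxes; the Feature analysis percentages above plausibly reflect
    # these per-instance confidences.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # ratios of image width/height
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")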