Human Generated Data

Title

Untitled (Natchez, Mississippi)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1476

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

City 99.9
Road 99.9
Street 99.9
Urban 99.9
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Person 98.8
Bus Stop 97.6
Outdoors 97.6
Clothing 97.3
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Person 96.9
Person 95.9
Adult 93
Male 93
Man 93
Person 93
Person 90.7
People 88.7
Machine 88.4
Wheel 88.4
Coat 81.9
Footwear 78.4
Shoe 78.4
Motorcycle 75.6
Transportation 75.6
Vehicle 75.6
Person 73.4
Person 69.7
Face 69.2
Head 69.2
Hat 67.4
Shoe 67.1
Hat 61.5
Shoe 61
Spoke 57.5
Door 57.1
Shorts 56.3
Alloy Wheel 55.8
Car 55.8
Car Wheel 55.8
Tire 55.8
Path 55.8
Sidewalk 55.8
Scooter 55.6
Person 55.6
Art 55.6
Painting 55.6
Stick 55.5
Furniture 55.4
Hairdresser 55.3
Helmet 55.1

Clarifai
created on 2018-05-11

people 100
group together 99.6
child 98.8
group 97.6
adult 97.2
many 96.7
man 95.1
uniform 93.3
several 90.5
wear 89.2
boy 88.7
administration 88.5
two 87.5
woman 86.9
military 85.3
street 84.1
three 82.9
recreation 82.6
outfit 82.2
four 81.7

Imagga
created on 2023-10-05

man 26.2
people 21.2
street 20.2
sidewalk 18.4
sport 18.2
wheeled vehicle 17.3
city 16.6
adult 16.2
person 14.9
tricycle 14.4
outdoors 14.2
crutch 13.9
leisure 12.5
child 12.3
outdoor 12.2
stick 11.6
black 11.4
walking 11.4
swing 11.2
staff 10.9
lifestyle 10.8
active 10.8
pedestrian 10.8
world 10.5
portrait 10.4
men 10.3
vehicle 10.3
summer 10.3
action 10.2
athlete 10.2
road 9.9
equipment 9.9
fun 9.7
male 9.7
building 9.6
ball 9.5
stone 9.3
dad 9.2
dark 9.2
alone 9.1
player 9.1
activity 9
cricket bat 8.9
conveyance 8.8
urban 8.7
sports equipment 8.7
athletic 8.6
wall 8.6
youth 8.5
mechanical device 8.4
kin 8.4
plaything 8.2
exercise 8.2
parent 8.2
life 8
cricket equipment 8
looking 8
father 7.8
model 7.8
travel 7.7
outside 7.7
walk 7.6
legs 7.5
human 7.5
competition 7.3
teenager 7.3
danger 7.3
dirty 7.2

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

building 99.5
outdoor 99.1
black 69.6
old 62.9
vintage 30.7

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 98.6%
Calm 90%
Surprised 6.8%
Fear 6.1%
Happy 3.8%
Sad 2.8%
Confused 0.9%
Disgusted 0.8%
Angry 0.7%

AWS Rekognition

Age 39-47
Gender Male, 100%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 30-40
Gender Male, 100%
Surprised 67.3%
Fear 35.4%
Calm 14.8%
Sad 5.5%
Confused 2.5%
Disgusted 1.9%
Angry 1.2%
Happy 0.7%

AWS Rekognition

Age 23-31
Gender Male, 99.7%
Surprised 99.2%
Fear 9.5%
Angry 2.5%
Sad 2.2%
Confused 1.2%
Calm 0.8%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Female, 91.3%
Fear 97%
Surprised 6.7%
Calm 3.8%
Sad 2.4%
Confused 0.7%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 94.7%
Sad 99.3%
Fear 12.5%
Disgusted 11.3%
Surprised 7.7%
Confused 6.1%
Angry 4.2%
Happy 1.7%
Calm 0.5%

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Wheel 88.4%
Coat 81.9%
Shoe 78.4%
Motorcycle 75.6%
Hat 67.4%

Text analysis

Amazon

Cafe