Human Generated Data

Title

Untitled (Natchez, Mississippi)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1450

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2023-10-06

Path 100
Sidewalk 100
City 100
Road 100
Street 100
Urban 100
Clothing 99.9
Adult 99.6
Female 99.6
Person 99.6
Woman 99.6
Adult 99.4
Female 99.4
Person 99.4
Woman 99.4
Person 95.2
Car 93.6
Transportation 93.6
Vehicle 93.6
Coat 91.5
Walking 88.6
Person 86.3
Accessories 84.4
Bag 84.4
Handbag 84.4
Machine 80.4
Wheel 80.4
Car 78.4
Overcoat 75.4
Footwear 74.8
Shoe 74.8
Lady 74
Formal Wear 72.3
Face 71.3
Head 71.3
Bus Stop 70.2
Outdoors 70.2
Shoe 69.1
Person 63.1
Hat 59.8
Suit 57.9
Advertisement 57.5
Poster 57.5
Shoe 57.1
Neighborhood 57
Reading 57
Alley 56.8
Pedestrian 55.8
Dress 55.8
Art 55.6
Painting 55.6
Sun Hat 55.3
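
The Amazon tags above are label names paired with confidence scores. A minimal sketch of how such labels can be retrieved with the AWS Rekognition DetectLabels API via boto3 follows; the file name, region, and confidence threshold are illustrative assumptions, not details taken from this record.

```python
# Minimal sketch: image labels with AWS Rekognition DetectLabels (boto3).
# The file name, region, and MinConfidence value are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("natchez_1935.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence shown in the list above
)

# Print each label with its confidence, mirroring the format of the tag list.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```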

Clarifai
created on 2018-05-11

people 99.9
group together 98
adult 97.5
group 97.2
administration 97.1
man 94.7
two 94.3
woman 94.2
leader 93.1
street 92.9
military 90.8
one 90.5
three 90.3
wear 89.8
many 89.2
war 86.7
vehicle 86.6
outfit 86.2
transportation system 85.4
several 85.1

Imagga
created on 2023-10-06

city 29.1
sidewalk 25.7
urban 24.5
street 23.9
people 22.3
man 22.2
business 18.8
male 18.8
swing 18.7
adult 18.1
men 18
building 17.5
person 17
mechanical device 15.8
architecture 15.6
plaything 15.1
wall 12.8
walking 12.3
black 12.2
outdoors 11.9
mechanism 11.7
transportation 11.7
travel 11.3
window 11.2
outside 11.1
construction 11.1
industry 11.1
portrait 11
megaphone 10.7
standing 10.4
women 10.3
suit 10
industrial 10
call 9.8
device 9.8
job 9.7
walk 9.5
door 9.5
corporate 9.5
work 9.4
stone 9.3
office 9.2
world 9.2
outdoor 9.2
worker 9.1
happy 8.8
station 8.7
acoustic device 8.6
attractive 8.4
fashion 8.3
road 8.1
working 8
day 7.8
telephone 7.8
scene 7.8
modern 7.7
buildings 7.6
bag 7.5
house 7.5
one 7.5
prison 7.4
passenger 7.4
seller 7.4
exterior 7.4
safety 7.4
businessman 7.1
life 7
glass 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
ground 96.3
way 92.5
sidewalk 91.5
street 91.3
scene 87
standing 77.2
curb 11.9
bus stop 7.9

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 95.1%
Angry 78.7%
Sad 16.5%
Surprised 6.5%
Fear 6.1%
Disgusted 1.9%
Calm 1.5%
Confused 0.9%
Happy 0.5%

AWS Rekognition

Age 50-58
Gender Male, 99.7%
Calm 61.7%
Sad 29.8%
Angry 10.8%
Surprised 6.8%
Fear 6.3%
Confused 3.5%
Disgusted 0.9%
Happy 0.3%

Microsoft Cognitive Services

Age 23
Gender Male
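
The age ranges, gender estimates, and emotion percentages listed above are the kind of output the AWS Rekognition DetectFaces API returns when all facial attributes are requested. A hedged sketch of such a call is below; the file name and region are assumptions, and the print formatting is only meant to echo the listing above.

```python
# Minimal sketch: face attributes with AWS Rekognition DetectFaces (boto3).
# The image file name and region are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("natchez_1935.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```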

Feature analysis

Amazon

Adult 99.6%
Female 99.6%
Person 99.6%
Woman 99.6%
Car 93.6%
Coat 91.5%
Wheel 80.4%
Shoe 74.8%
Hat 59.8%

Categories

Text analysis

Amazon

STUD*
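
The fragment above is a text string detected in the image. A minimal sketch of retrieving detected strings with the AWS Rekognition DetectText API is shown below; as before, the file name and region are assumptions rather than details from this record.

```python
# Minimal sketch: detected text with AWS Rekognition DetectText (boto3).
# The image file name and region are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("natchez_1935.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to whole strings such as the one listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```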