Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3663

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Coat 100
City 99.9
People 99.8
Road 99.5
Street 99.5
Urban 99.5
Person 97.5
Adult 97.5
Male 97.5
Man 97.5
Person 96.8
Adult 96.8
Male 96.8
Man 96.8
Hat 95.5
Person 95.2
Person 94.8
Male 94.8
Boy 94.8
Child 94.8
Person 94.8
Person 93.5
Person 92
Person 91.9
Baby 91.9
Person 91.1
Adult 91.1
Bride 91.1
Female 91.1
Wedding 91.1
Woman 91.1
Person 91.1
Face 89.9
Head 89.9
Person 86.6
Adult 86.6
Bride 86.6
Female 86.6
Woman 86.6
Person 85.2
Person 82.1
Person 78.4
Person 75.7
Overcoat 73.9
Person 71.9
Metropolis 64.6
Transportation 62.3
Vehicle 62.3
Person 62.3
Outdoors 60.9
Crowd 57.7
Neighborhood 56.8
Walking 56.7
Cap 56.6
Baseball Cap 55.8

Clarifai
created on 2018-05-10

people 99.8
group 97.9
child 96.8
many 95.7
adult 95.2
monochrome 95
man 93.3
woman 90.4
group together 89.2
wear 83.4
boy 82.1
war 80.9
education 80.7
uniform 77.2
outfit 76
leader 74.1
several 73
facial expression 72.5
administration 72
music 70

Imagga
created on 2023-10-07

negative 22.2
business 21.2
currency 19.7
money 19.5
film 19.1
dollar 18.5
cash 16.5
bank 16.2
finance 16
newspaper 15.6
old 15.3
grunge 15.3
art 15.1
daily 13.6
bill 13.3
photographic paper 13.2
economy 13
banking 12.9
paper 12.6
hundred 12.6
product 12.1
drawing 12.1
sign 12
vintage 11.7
architecture 10.9
sax 10.8
retro 10.6
symbol 10.1
history 9.8
creation 9.7
brass 9.7
dollars 9.6
pattern 9.6
exchange 9.5
wind instrument 9.5
man 9.4
city 9.1
design 9
wealth 9
people 8.9
photographic equipment 8.8
us 8.7
pay 8.6
historical 8.5
savings 8.4
technology 8.2
building 8.1
group 8.1
financial 8
franklin 7.9
work 7.8
ancient 7.8
modern 7.7
loan 7.7
texture 7.6
rich 7.4
sketch 7.4
close 7.4
world 7.3
black 7.2
antique 7.1
market 7.1

Microsoft
created on 2018-05-10

text 85.5
old 75

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Female, 51.2%
Calm 87.2%
Happy 9.9%
Surprised 6.3%
Fear 5.9%
Sad 2.6%
Disgusted 0.6%
Angry 0.5%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Female, 82.6%
Calm 53.8%
Confused 28%
Surprised 8%
Fear 6.2%
Sad 5.5%
Angry 3.7%
Disgusted 2.7%
Happy 1%

AWS Rekognition

Age 35-43
Gender Female, 88.6%
Calm 93.9%
Surprised 6.7%
Fear 5.9%
Sad 3.3%
Disgusted 0.6%
Happy 0.6%
Confused 0.4%
Angry 0.4%

Feature analysis

Amazon

Person 97.5%
Adult 97.5%
Male 97.5%
Man 97.5%
Boy 94.8%
Child 94.8%
Baby 91.9%
Bride 91.1%
Female 91.1%
Woman 91.1%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

M.BOSCOACO