Human Generated Data

Title

Untitled (Bethune Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2930

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Boy 99.4
Child 99.4
Male 99.4
Person 99.4
Boy 99.2
Child 99.2
Male 99.2
Person 99.2
Terminal 98.8
Male 98.4
Person 98.4
Adult 98.4
Man 98.4
Art 90.2
Painting 90.2
Railway 88.1
Train 88.1
Train Station 88.1
Transportation 88.1
Vehicle 88.1
Clothing 86.3
Footwear 86.3
Shoe 86.3
Person 85.3
Shoe 84.5
Shoe 78.9
Head 70.7
Face 65.1
Shoe 60.9
Shoe 57.3
Reading 57.2
Shoe 55.0
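The raw label list above repeats some labels at different confidences (for example, several `Shoe` entries). A minimal Python sketch, assuming the tags are available as (label, confidence) pairs, of deduplicating such a list to the highest confidence per label; the sample data is a subset of the Amazon tags listed above:

```python
# Deduplicate machine-generated tags, keeping the highest
# confidence seen for each label. The sample data is a subset
# of the Amazon tags listed above.
tags = [
    ("Boy", 99.4), ("Child", 99.4), ("Male", 99.4), ("Person", 99.4),
    ("Shoe", 86.3), ("Shoe", 84.5), ("Shoe", 78.9), ("Shoe", 55.0),
    ("Reading", 57.2),
]

def dedupe(pairs):
    """Return (label, confidence) pairs, one per label, best score kept."""
    best = {}
    for label, conf in pairs:
        if conf > best.get(label, 0.0):
            best[label] = conf
    # Sort by confidence, highest first (stable sort keeps tie order)
    return sorted(best.items(), key=lambda kv: -kv[1])

for label, conf in dedupe(tags):
    print(f"{label} {conf}")
```

This is only an illustration of post-processing the displayed scores, not the tagging service's own behavior.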

Clarifai
created on 2018-05-10

people 99.9
adult 99.2
two 98.5
one 98.3
man 97.7
group 97.3
administration 96.1
group together 95.7
three 95.5
wear 94.3
woman 94.0
four 93.4
veil 90.1
outfit 87.9
several 87.9
leader 87.4
furniture 86.9
vehicle 85.5
five 84.9
home 84.9

Imagga
created on 2023-10-06

musical instrument 23.4
man 22.3
adult 20.0
male 16.4
stringed instrument 14.1
men 13.7
sitting 12.9
person 12.5
barbershop 12.4
shop 12.3
people 12.3
banjo 12.1
lifestyle 11.6
happy 11.3
building 10.8
smile 10.7
window 10.4
architecture 10.2
percussion instrument 9.8
old 9.7
worker 9.2
chair 9.2
pretty 9.1
portrait 9.0
black 9.0
mercantile establishment 9.0
instrument 8.9
statue 8.8
business 8.5
modern 8.4
city 8.3
fashion 8.3
work 8.2
indoor 8.2
one 8.2
religion 8.1
kitchen 8.0
interior 8.0
indoors 7.9
couple 7.8
travel 7.7
outside 7.7
scale 7.7
industry 7.7
sculpture 7.6
equipment 7.5
house 7.5
device 7.5
traditional 7.5
vacation 7.4
alone 7.3
sexy 7.2
family 7.1
face 7.1
job 7.1
happiness 7.0

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 91.7
person 87.3

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 96%
Calm 81.8%
Disgusted 11.4%
Surprised 6.5%
Fear 6.2%
Sad 3.2%
Angry 1.3%
Happy 1.0%
Confused 0.3%
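The emotion scores above are per-emotion confidences (note they need not sum to 100%) and can be reduced to a single dominant emotion. A small sketch, assuming the Rekognition-style scores are held in a plain dict; the names and values are taken from the listing above:

```python
# Pick the dominant emotion from Rekognition-style per-emotion
# confidence scores. Values are the percentages listed above.
emotions = {
    "Calm": 81.8, "Disgusted": 11.4, "Surprised": 6.5,
    "Fear": 6.2, "Sad": 3.2, "Angry": 1.3,
    "Happy": 1.0, "Confused": 0.3,
}

def dominant(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

name, score = dominant(emotions)
print(f"{name} {score}%")  # Calm 81.8%
```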

Feature analysis

Amazon

Boy 99.4%
Child 99.4%
Male 99.4%
Person 99.4%
Adult 98.4%
Man 98.4%
Shoe 86.3%
