Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3660

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Person 98.9
Adult 98.9
Female 98.9
Woman 98.9
Clothing 98.9
Dress 98.9
Person 98.8
Door 95.2
Dancing 95.1
Leisure Activities 95.1
Formal Wear 93.8
Lady 68.2
Cross 65.8
Symbol 65.8
Dance Pose 57.7
Blouse 57.4
Archaeology 57.3
Furniture 57.3
Crypt 57.2
Window 57.1
Altar 57
Architecture 57
Building 57
Church 57
Prayer 57
Outdoors 56.9
Tango 56.7
Sitting 56.1
Art 55.6
Painting 55.6
Fashion 55.4
Gown 55.4

Clarifai
created on 2018-05-10

people 99.9
portrait 98.5
adult 98.5
one 97.4
woman 96.6
music 96.3
man 96.2
wear 92.4
monochrome 92
sit 91.7
two 91
indoors 89
musician 88.4
street 86.4
actress 84.9
actor 84
facial expression 83.8
singer 81.7
administration 81.6
leader 80.7

Imagga
created on 2023-10-05

man 22.8
person 20.6
newspaper 17.9
male 16.5
architecture 15
ancient 14.7
people 14.5
sculpture 13.7
statue 13.4
art 13.1
old 12.5
room 12.3
building 12.2
product 12
adult 11.7
portrait 11.6
religion 10.8
city 10
groom 9.9
history 9.8
black 9.8
one 9.7
sitting 9.4
historical 9.4
creation 9.3
travel 9.2
holding 9.1
human 9
grandma 9
musical instrument 8.9
home 8.8
hair 8.7
light 8.7
world 8.7
window 8.6
stone 8.5
business 8.5
monument 8.4
silhouette 8.3
historic 8.2
alone 8.2
grandfather 8
antique 7.8
chair 7.6
famous 7.4
dress 7.2
wind instrument 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.6
window 92.7
posing 74.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Male, 95.7%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%

Feature analysis

Amazon

Person 98.9%
Adult 98.9%
Female 98.9%
Woman 98.9%