Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4235

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 96.5
Human 96.5
Person 90.9
Building 78.6
Porch 72.3
Vehicle 68.7
Transportation 68.7
Steamer 60

Clarifai
created on 2023-10-25

negative 99.8
movie 99.5
filmstrip 99.3
slide 99.2
cinematography 98.8
picture frame 98.3
vintage 97
collage 96.8
wear 96.7
retro 96
old 95
noisy 95
bobbin 94.4
woman 93.8
people 93.1
desktop 93.1
photograph 92.9
margin 92.7
exposed 92.6
blank 92.1

Imagga
created on 2022-01-08

musical instrument 35.8
architecture 30
building 25.5
hole 23
old 21.6
device 21.2
wind instrument 21.1
stringed instrument 21
psaltery 17.5
landmark 16.2
travel 16.2
historic 15.6
sky 14.7
city 14.1
tourism 14
ancient 13.8
rule 13.7
stone 13.5
pipe 12.5
history 12.5
wall 12.3
church 12
culture 12
structure 11.8
clothespin 11.6
palace 11.2
texture 11.1
business 10.9
wood 10.8
vintage 10.8
tower 10.7
retro 10.7
antique 10.4
construction 10.3
measuring stick 10.2
paper 10.2
house 9.8
fastener 9.8
work 9.6
office 9.6
brown 9.6
grunge 9.4
place 9.3
famous 9.3
square 9
brick 8.9
instrument 8.9
wooden 8.8
paperwork 8.8
folder 8.8
urban 8.7
art 8.5
vacation 8.2
tourist 8.2
religion 8.1
design 7.9
measuring instrument 7.8
finance 7.6
word 7.5
window 7.4
style 7.4
restraint 7.4
detail 7.2
fortress 7.1
ocarina 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 91.1
clothing 76.6
text 72.3
person 71

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 97.8%
Calm 79.8%
Sad 6.7%
Happy 5.4%
Confused 3.6%
Angry 1.5%
Surprised 1.2%
Fear 1.1%
Disgusted 0.8%

Feature analysis

Amazon

Person 96.5%

Categories

Imagga

interior objects 78%
paintings art 21.9%

Captions

Microsoft
created on 2022-01-08

a display in a store 46.7%

Text analysis

Amazon

ee
Tart ee
Tart