Human Generated Data

Title

Untitled (Washington Square, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4236

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 97.4
Human 97.4
Building 95.9
Person 94.9
Person 84.5
Architecture 69.9
Clinic 69.6
Person 68.2
Person 65.6
Factory 63.9
People 63.2
Person 54.2

Clarifai
created on 2023-10-25

filmstrip 99.8
negative 99.8
slide 99.7
movie 99.4
noisy 99.3
cinematography 99
retro 98.6
collage 98.4
vintage 98
picture frame 97.4
emulsion 96.6
wear 96.4
dirty 96
old 96
sepia 95.9
exposed 95.9
photograph 94.6
margin 94.6
blank 94.6
art 94

Imagga
created on 2022-01-08

old 45.3
grunge 33.2
wall 32
vintage 29
texture 27.8
ancient 27.7
antique 26.5
architecture 26.2
brown 24.3
aged 20.8
history 19.7
retro 18.8
historic 18.3
art 18.2
material 18
fortress 17
frame 16.9
paper 16.6
building 16.6
culture 16.2
brick 16.1
stone 16.1
wood 15.8
textured 15.8
stucco 15.6
dirty 15.4
design 15.3
landmark 14.4
pattern 14.4
travel 14.1
tourism 14
grain 13.8
structure 13.3
wooden 13.2
detail 12.9
city 12.5
backdrop 12.4
weathered 12.3
monument 12.1
empty 12
construction 12
horizontal 11.7
surface 11.5
grungy 11.4
famous 11.2
style 11.1
hole 11.1
rough 10.9
border 10.9
wallpaper 10.7
ruins 10.7
damaged 10.5
board 10.3
plank 9.8
crumpled 9.7
decay 9.6
worn 9.5
historical 9.4
space 9.3
exterior 9.2
tourist 9.1
paint 9.1
text 8.7
artistic 8.7
page 8.4
sky 8.3
map 8.3
film 8.3
religion 8.1
temple 7.8
timber 7.8
rust 7.7
roman 7.7
old fashioned 7.6
dark 7.5
window 7.5
place 7.4
closeup 7.4
natural 7.4
note 7.4
graphic 7.3
black 7.2
castle 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 94
person 88.3
clothing 84.2
text 79.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Male, 87.3%
Calm 98.9%
Sad 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 18-26
Gender Male, 83.6%
Calm 97.8%
Happy 0.9%
Sad 0.9%
Angry 0.2%
Confused 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 42-50
Gender Male, 96.9%
Calm 48.4%
Confused 14.5%
Sad 11.3%
Disgusted 8%
Fear 5%
Happy 4.7%
Angry 4.1%
Surprised 4.1%

AWS Rekognition

Age 25-35
Gender Female, 94.9%
Calm 87.9%
Sad 3.4%
Happy 3.4%
Angry 1.9%
Confused 1.2%
Disgusted 1.1%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 23-31
Gender Female, 50.5%
Calm 62%
Sad 35%
Fear 0.7%
Confused 0.7%
Happy 0.6%
Disgusted 0.5%
Angry 0.3%
Surprised 0.3%

Feature analysis

Amazon

Person 97.4%

Categories

Captions

Microsoft
created on 2022-01-08

a display in a store 41.8%
a store inside of a building 41.7%
a display in a building 41.6%

Text analysis

Amazon

ЭІТАМОЯНЗИАЯ

Google

31TAMOAHONA9
31TAMOAHONA9