Human Generated Data

Title

Untitled (New York City)

Date

1932–1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4215

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.9
Human 98.9
Person 98.3
Person 98.2
Person 98.0
Person 97.9
Clothing 97.4
Apparel 97.4
Person 91.6
People 87.3
Person 87.1
Sailor Suit 75.6
Overcoat 69.4
Coat 69.4
Person 69.4
Suit 59.5
Pedestrian 56.9
Shorts 56.8
Duel 55.8
Archaeology 55.6
Standing 55.3
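
The label list above is typical output from Amazon Rekognition's object-and-scene detection. The following is a minimal sketch of how such labels can be generated with the boto3 SDK; the file name shahn_untitled_nyc.jpg and the confidence cutoff are illustrative assumptions, not details of the museum's actual pipeline.

import boto3

# Hypothetical local copy of the photograph; any JPEG/PNG bytes work.
with open("shahn_untitled_nyc.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# detect_labels returns object/scene labels with confidence scores,
# comparable to the "Person 98.9", "Clothing 97.4" entries above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55.0,  # assumed cutoff; the lowest tag above is 55.3
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')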

Clarifai
created on 2023-10-25

movie 99.8
negative 99.8
filmstrip 99.8
retro 99.6
cinematography 99.5
vintage 99.2
sepia 98.8
old 98.7
collage 98.6
slide 98.3
wear 98.3
exposed 97.7
dirty 97.5
antique 97.4
sepia pigment 97.4
emulsion 97.3
art 97.1
photograph 96.8
picture frame 96.4
noisy 94.3
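
Clarifai's concept tags above come from its general image-recognition model. Below is a hedged sketch using Clarifai's v2 REST predict endpoint; the model name, API-key placeholder, and image URL are all assumptions, and the exact request shape should be checked against current Clarifai documentation.

import requests

# Placeholder credential and image location; replace with real values.
CLARIFAI_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"

# v2 predict call against the general image-recognition model (assumed name).
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts carry a name and a 0-1 confidence, printed as percentages
# to match the "movie 99.8", "negative 99.8" style above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')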

Imagga
created on 2021-12-15

hole 100
old 40.4
architecture 27.4
ancient 26.8
stone 22.8
building 22.2
wall 21.8
grunge 19.6
texture 19.4
antique 19
vintage 19
history 17
brown 16.9
tourism 16.5
retro 15.6
pattern 15
travel 14.8
textured 14
dirty 13.6
landmark 13.5
wood 13.3
wooden 13.2
brick 13
construction 12.8
culture 12.8
material 12.5
famous 12.1
historic 11.9
aged 11.8
sky 11.5
surface 11.5
weathered 11.4
monument 11.2
empty 11.2
art 11.1
frame 10.8
city 10.8
castle 10.6
structure 10.5
metal 10.5
iron 10.3
design 10.1
rough 10
tower 9.8
detail 9.7
obsolete 9.6
damaged 9.5
door 9.5
decay 8.7
rust 8.7
paper 8.6
industry 8.5
historical 8.5
dark 8.3
church 8.3
backdrop 8.2
industrial 8.2
film 7.9
rusty 7.6
old fashioned 7.6
place 7.4
style 7.4
grain 7.4
tourist 7.2
religion 7.2
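
Imagga's weighted tags are served by its REST tagging endpoint. The sketch below assumes HTTP basic auth with an API key/secret pair and a hosted image URL; the credentials and URL are placeholders.

import requests

# Placeholder credentials and image location.
IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"

# The /v2/tags endpoint returns weighted tags like "hole 100", "old 40.4".
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')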

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 81.3
clothing 81.3
hat 78.3
man 78.2
old 68.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 64.9%
Calm 95.5%
Happy 3.2%
Sad 0.8%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 36-54
Gender Male, 88.6%
Calm 88.3%
Surprised 3.0%
Sad 2.6%
Happy 2.2%
Confused 1.8%
Angry 1.5%
Fear 0.4%
Disgusted 0.1%

AWS Rekognition

Age 25-39
Gender Male, 60.4%
Calm 97.2%
Surprised 1.4%
Confused 1.2%
Sad 0.1%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 20-32
Gender Male, 61.0%
Calm 99.6%
Surprised 0.2%
Sad 0.1%
Happy 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 15-27
Gender Male, 84.1%
Calm 99.3%
Happy 0.4%
Surprised 0.2%
Angry 0.1%
Sad 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-35
Gender Male, 52.8%
Calm 93.7%
Happy 4.6%
Surprised 0.7%
Sad 0.5%
Confused 0.4%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 26-40
Gender Female, 64.9%
Calm 71.4%
Happy 26.7%
Sad 0.8%
Angry 0.5%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 31-47
Gender Male, 57.8%
Calm 96.4%
Sad 1.5%
Confused 0.9%
Happy 0.6%
Surprised 0.5%
Angry 0.1%
Fear 0.1%
Disgusted 0%
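
Each AWS Rekognition face block above (age range, gender estimate, emotion scores) matches the per-face output of the detect_faces operation with full attributes requested. A minimal sketch, again assuming the same hypothetical local file:

import boto3

with open("shahn_untitled_nyc.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# in addition to the default bounding-box data.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are sorted from most to least confident, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')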

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2021-12-15

an old photo of a person 40.8%
old photo of a person 38.6%
an old photo of a person 33.1%
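
The Microsoft captions come from Azure Computer Vision's describe operation, which can return several ranked caption candidates. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the call shape should be verified against current Azure documentation.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint/key and image URL.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# describe_image returns ranked caption candidates with confidences,
# like the three "an old photo of a person" entries above.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")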

Text analysis

Amazon

PUBLIC
TAXICA
3 TAXICA
3
HACK
HACK STAN
STAN
delose
VIEBURS
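
The Amazon text detections, including OCR misreads such as "TAXICA" for what the sign presumably spells out in full, correspond to Rekognition's detect_text operation. A minimal sketch, once more assuming a hypothetical local file:

import boto3

with open("shahn_untitled_nyc.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# detect_text returns both LINE and WORD detections, which is why the
# list above mixes phrases ("HACK STAN") with single tokens ("HACK").
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])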

Google

PUBLIC HACK STAN 3 TAXICA
PUBLIC
STAN
HACK
3
TAXICA
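
Google's text results, which similarly combine a full-block reading ("PUBLIC HACK STAN 3 TAXICA") with individual tokens, match the Cloud Vision text_detection feature. A sketch with the google-cloud-vision client library; the file path is again a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_untitled_nyc.jpg", "rb") as f:
    content = f.read()

# text_detection returns one annotation for the whole recognized block,
# followed by one annotation per detected word.
response = client.text_detection(image=vision.Image(content=content))

for annotation in response.text_annotations:
    print(annotation.description)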