Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4233

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 96.2
Apparel 96.2
Person 88.4
Human 88.4
Person 86.2
Person 83.1
Art 67.7
People 66.5
Hat 63.4
Person 60.7
Painting 59.3
Sailor Suit 55.2
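
The labels above are the kind of output returned by the AWS Rekognition label-detection call. A minimal sketch of how such a list could be requested, assuming boto3 is installed with AWS credentials configured; the filename is a hypothetical local copy of this photograph:

```python
# Minimal sketch of fetching labels like the ones above with AWS Rekognition.
# Assumes boto3 and configured AWS credentials; "shahn_untitled_nyc.jpg" is a
# hypothetical local copy of this photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_untitled_nyc.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=55,  # the lowest score shown above is 55.2
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```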

Clarifai
created on 2023-10-25

wear 98.2
retro 96.4
people 96.2
sepia 96.1
vintage 95.9
slide 94.7
collage 94.6
old 94.4
art 94
picture frame 93.4
negative 93.2
movie 93
dirty 91.8
paper 91
adult 90.5
man 89.1
cardboard 88.9
sepia pigment 88.2
antique 88
filmstrip 86.4
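
The concepts above come from Clarifai's general image-recognition model. A rough sketch of requesting them through Clarifai's v2 REST API; the model name, endpoint path, and image URL are assumptions to be checked against Clarifai's current documentation:

```python
# Rough sketch of requesting concepts like the Clarifai tags above via the
# v2 REST API. The model name, endpoint path, and image URL are assumptions;
# the credential is a placeholder.
import requests

CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"               # placeholder credential
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"  # hypothetical URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Clarifai reports concept values in the 0-1 range; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```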

Imagga
created on 2022-01-08

architecture 32.3
old 30.7
plane 24.1
stucco 23
wood 20.9
texture 20.8
wall 20.8
brown 20.6
building 20.4
hand tool 19.6
stone 19.4
edge tool 18.9
tool 18.7
structure 18.6
board 18.2
tourism 18.2
ancient 17.3
history 17
travel 16.9
landmark 16.3
grunge 15.3
wooden 14.9
art 14.9
culture 14.5
detail 14.5
material 14.5
house 14.4
cutter 14.1
historical 14.1
historic 13.8
construction 13.7
panel 13.4
retro 13.1
support 12.8
home 12.8
temple 12.6
vintage 12.4
monument 12.1
aged 11.8
box 11.8
timber 11.7
religion 11.7
pattern 11.6
sculpture 11.4
antique 11.3
famous 11.2
paper 11.2
tourist 10.9
device 10.7
surface 10.6
design 10.1
city 10
ruin 9.7
cutting implement 9.5
container 9.4
church 9.3
exterior 9.2
frame 9.2
sky 8.9
column 8.8
interior 8.8
textured 8.8
marble 8.8
damaged 8.6
weathered 8.6
table 8.3
plank 7.8
classical 7.6
worn 7.6
grungy 7.6
traditional 7.5
floor 7.4
rough 7.3
tower 7.2
modern 7
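
The tags above match the format of Imagga's tagging endpoint, which returns English tag names with confidence scores. A rough sketch, assuming placeholder API credentials, a hypothetical image URL, and the response layout described in Imagga's documentation:

```python
# Rough sketch of requesting tags like the Imagga list above from the /v2/tags
# endpoint. Credentials and image URL are placeholders; the response layout is
# an assumption based on Imagga's documented JSON format.
import requests

API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"           # placeholder credentials
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```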

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

clothing 97.1
indoor 91.5
person 85.4
text 84.2
man 69.7
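
The Microsoft tags above follow the shape of the Azure Computer Vision "Analyze Image" call with the Tags visual feature, which reports confidences in the 0-1 range. A rough sketch with placeholder endpoint, key, and image URL:

```python
# Rough sketch of requesting tags like the Microsoft list above from the Azure
# Computer Vision "Analyze Image" REST call. Endpoint host, key, and image URL
# are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/shahn_untitled_nyc.jpg"        # hypothetical URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidences come back in the 0-1 range; scale to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```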

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Female, 95.2%
Calm 82.2%
Happy 7.1%
Confused 5.6%
Sad 1.9%
Surprised 1.6%
Angry 0.8%
Disgusted 0.5%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Male, 94.8%
Calm 97.5%
Sad 1%
Angry 0.5%
Disgusted 0.4%
Happy 0.2%
Surprised 0.2%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 43.8%
Fear 24%
Disgusted 8.6%
Angry 8.4%
Sad 5.1%
Surprised 5%
Happy 3.8%
Confused 1.2%

AWS Rekognition

Age 27-37
Gender Female, 80.5%
Calm 98.1%
Sad 1.4%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
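
The four blocks above correspond to faces detected by AWS Rekognition, each with an estimated age range, gender, and per-emotion confidences. A minimal sketch of requesting the same attributes, assuming boto3 with AWS credentials and a hypothetical local copy of the image:

```python
# Minimal sketch of producing face estimates like the four blocks above with
# AWS Rekognition face detection. Assumes boto3 and AWS credentials; the
# filename is a hypothetical local copy of this photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_untitled_nyc.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```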

Feature analysis

Amazon

Person 88.4%

Categories

Imagga

interior objects 97.8%
paintings art 2.1%

Text analysis

Amazon

5
FRANKFURTERS
Hot FRANKFURTERS
Hot
BADE

Google

Hot FRANMFURTERS
Hot
FRANMFURTERS
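
The Amazon lines above are the kind of result returned by AWS Rekognition text detection, which reports both whole lines and the individual words within them, which is why some strings repeat. A minimal sketch, assuming boto3 with AWS credentials and a hypothetical local copy of the image:

```python
# Minimal sketch of extracting text like the lines above with AWS Rekognition
# text detection. Assumes boto3 and AWS credentials; the filename is a
# hypothetical local copy of this photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_untitled_nyc.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries give whole lines (e.g. "Hot FRANKFURTERS"); WORD entries give
# the individual words. Print only the lines here.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```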