Human Generated Data

Title

Untitled (Williamson, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1689

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clarifai
created on 2018-05-11

people 97.4
no person 97
adult 96.2
monochrome 95.9
illustration 94.3
picture frame 91.4
retro 91
horizontal 87.1
transportation system 86.5
horizontal plane 83.9
education 83.5
chalkboard 81.5
text 81.4
outdoors 80.8
one 80.7
dirty 80.4
chalk 79.7
old 79.2
vertical 78.6
vector 77.8

Imagga
created on 2023-10-06

brass 72.6
memorial 60.6
structure 51.5
blackboard 50.7
grunge 34.1
texture 32.7
vintage 32.3
pattern 32.2
old 30.7
frame 30
material 27.7
design 27.6
border 25.3
dirty 23.5
antique 22.5
black 22.3
damaged 22
retro 21.3
rough 21
art 20.3
chalk 19.5
space 18.6
graphic 18.3
textured 17.5
backdrop 17.3
chalkboard 16.7
empty 16.3
aged 16.3
board 15.2
grungy 15.2
wall 14.5
weathered 14.3
paper 14.1
element 14.1
web site 14
ancient 13.8
rust 13.5
collage 13.5
equipment 13.3
text 13.1
education 13
blank 12.9
messy 12.6
rusty 12.4
wallpaper 12.3
horizontal 11.7
color 11.7
your 11.6
radio receiver 11.6
electronic equipment 11.5
surface 11.5
digital 11.4
communication 10.9
computer 10.9
paint 10.9
decoration 10.9
close 10.9
school 10.8
dark 10
faded 9.7
business 9.7
classroom 9.7
class 9.7
edge 9.6
film 9.6
brown 9.6
old fashioned 9.5
college 9.5
closeup 9.4
fabric 9.4
receiver 9.2
decorative 9.2
message 9.1
backgrounds 8.9
designed 8.9
photographic 8.8
frames 8.8
noise 8.8
scratch 8.8
negative 8.8
lesson 8.8
movie 8.7
spotted 8.7
screen 8.7
stain 8.7
mask 8.6
canvas 8.5
template 8.3
copy 8
noisy 7.9
highly 7.9
layered 7.9
mess 7.9
slide 7.8
nobody 7.8
strip 7.8
layer 7.7
detailed 7.7
dirt 7.6
learn 7.6
window 7.5
grain 7.4
science 7.1
wooden 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

text 95.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 98.9%
Disgusted 31.4%
Calm 26.6%
Confused 19.2%
Surprised 15.1%
Fear 6.1%
Angry 4%
Sad 3.7%
Happy 2.9%

AWS Rekognition

Age 7-17
Gender Female, 72%
Sad 99.3%
Calm 20%
Disgusted 10.1%
Surprised 6.5%
Fear 6.4%
Happy 3.1%
Angry 2.6%
Confused 1.1%

AWS Rekognition

Age 23-31
Gender Male, 99%
Calm 46%
Happy 33.2%
Sad 12%
Surprised 6.6%
Fear 6.2%
Disgusted 4.7%
Confused 1.3%
Angry 0.6%

AWS Rekognition

Age 26-36
Gender Male, 98.3%
Calm 73.7%
Angry 14.3%
Surprised 7.5%
Fear 6.6%
Disgusted 3.5%
Sad 2.8%
Happy 1.7%
Confused 1%

AWS Rekognition

Age 16-22
Gender Male, 93.1%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%

Feature analysis

Amazon

Person 78.5%
Train 66%

Categories

Imagga

text visuals 99.6%

Captions

Text analysis

Amazon

CAPY
86936
AND
NOR
NORFOLK
41100
CAPY 115000
&
N&W
WESTERN
N
N & W
W
НА
115000
R
TERN
POLK
FOL
1035
ITWT 41100
PO 1035
PO
ITWT
I
NON
CU FT
CU FT ISSO
ISSO
I ...
RFO
96363
...
NET 3 30
LD1NT127900
NAS
KU

Google

NORFOLK
AND
N
W
86936
CAPY
TWT
41100
FoI NORFOLK AND WESTERN N & W 86936 CAPY 115 TWT 41100
FoI
WESTERN
&
115