Read this if you're having trouble unpacking Tcl

Kent Paul Dolan xanthian at zorch.SF-Bay.ORG
Fri Dec 28 18:51:23 AEST 1990


Many of your points are good ones; however, I cannot let you dismiss a
couple of them so lightly.

Uuencoded files have nice, regular, short lines, free of control
characters, that transit gateways and news software well. I don't want
to tell someone with a 132 character wide screen who's trying to decide
whether it's worth the pain and torment to publish their code for the
benefit of the net that s/he can only write code in the left 3/5ths or
so of the screen because the USENet news software is braindead.

Allowing programmers to transport the code in a manner that will survive
the real world net without a prior hand reformat is a must.
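To make the line format concrete, here is a sketch using Python's standard binascii module as a latter-day stand-in for the uuencode utility itself (the input bytes are arbitrary; this illustrates the shape of the output, not any particular tool's invocation):

```python
import binascii

# uuencode packs input 45 bytes at a time; each output line carries a
# length character followed by 60 encoded characters drawn from a
# 64-character printable alphabet -- short, regular, and control-free.
chunk = bytes(range(45))              # 45 arbitrary input bytes
line = binascii.b2a_uu(chunk)         # one full uuencoded line

print(len(line))                      # 62: length char + 60 chars + newline
assert line[0:1] == b'M'              # chr(45 + 32) encodes the byte count
assert all(0x20 <= c <= 0x60 for c in line[:-1])  # printable subset only
```

Sixty-two characters per line, every time, with nothing a gateway is tempted to strip or wrap.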

Moreover, uuencoded files of the more modern kind carry a line-by-line
validity check, much more robust than shar's character count.  I've
unpacked many damaged source files from the net that had correct
character counts but damaged bytes in the files.  This leads to subtle
and time-consuming debugging, since trashing just a byte or two can
easily produce errors the compiler never flags, especially if you get
lucky and hit an operator and convert it to a different operator.
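The failure mode is easy to demonstrate. Below, a single trashed byte survives a shar-style character count but is caught by a per-line checksum (the checksum algorithms in the uuencode variants differ; a simple byte sum is used here purely as an illustration):

```python
# A shar-style integrity check: total character count only.
original = "if (a >= b) return a - b;\n"
damaged  = "if (a <= b) return a - b;\n"   # one byte trashed: '>' became '<'

assert len(original) == len(damaged)       # character counts agree...
assert original != damaged                 # ...but the file is corrupt

# A per-line checksum, as the more modern uuencode variants append,
# catches the same damage at unpack time (simple byte sum modulo 64
# here; real variants use their own schemes).
def line_checksums(text):
    return [sum(line.encode()) % 64 for line in text.splitlines()]

assert line_checksums(original) != line_checksums(damaged)
```

And note the damaged version still compiles cleanly; only the checksum tells you anything went wrong.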

The transit from ASCII to EBCDIC and back irreversibly destroys some of
the bracket characters, I forget which ones.  This is not a trivial
problem to fix in the source code.  Sending the code with a uuencode
variant that avoids characters that don't exist in both character sets
avoids that damage.
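My recollection of which characters are at risk: ISO 646 reserves a dozen code points for national use, and those are the positions a gateway's translation tables may mangle. A sketch of the idea, treating both that repertoire and the xxencode-style alphabet below as my assumptions rather than gospel:

```python
# Characters at the ISO 646 "national use" code points are the ones an
# ASCII<->EBCDIC gateway may translate irreversibly (my reading of the
# standard -- treat the set below as an assumption).
NATIONAL_USE = set('#$@[\\]^`{|}~')

def unsafe_chars(alphabet):
    """Return the characters of an encoding alphabet that sit at
    national-use positions and so may not survive a gateway."""
    return sorted(set(alphabet) & NATIONAL_USE)

# Classic uuencode draws its 64 characters from 0x20 (space) to 0x5F.
uu_alphabet = ''.join(chr(c) for c in range(0x20, 0x60))
print(unsafe_chars(uu_alphabet))       # includes '[' and ']', among others

# An xxencode-style alphabet (letters, digits, '+', '-') avoids them all.
xx_alphabet = '+-0123456789' + \
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'
assert unsafe_chars(xx_alphabet) == []
```

So a variant restricted to letters, digits, and a couple of safe punctuation marks sails through where classic uuencode's brackets get eaten.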

The savings of 600 Kbytes of spool storage space for tcl as sent means
about 300 news articles can escape the expire cleaver until that
distribution expires.  On a small system like the home hobbyist system
on which I have an account, that is a great benefit.  With most traffic
volume now passing through the Internet, communications is no longer the
overwhelming, all-other-consideration-dismissing bottleneck to USENet
that it was four years ago; disk space, however, is in even shorter
supply.  A posting in another newsgroup mentions that net news volume
doubles every eighteen months, which is faster than spinning storage
halves in price.
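The arithmetic behind those figures, for the curious (the average article size is an inference from the numbers above, not a measured value):

```python
# 600 Kbytes saved vs. roughly 300 articles spared from expiry implies
# an average article of about 2 Kbytes (inferred, not measured).
saved_bytes = 600 * 1024
articles_spared = 300
print(saved_bytes // articles_spared)   # 2048 bytes per article

# Volume doubling every 18 months compounds fast: after 3 years the
# spool must hold four times the traffic for the same expire window.
months = 36
doublings = months / 18
print(2 ** doublings)                   # 4.0 -- fourfold in three years
```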

Attention to efficient storage methods in news spools is thus a valid
and ongoing issue (in fact, I wish news were stored compressed and
accessed with a pipe through zcat or by a "copy and uncompress to /temp,
read, and discard the copy" strategy; I'd willingly pay the wait time to
have a longer expire time).  Receiving _and_ _storing_ a more
space-efficient format such as the compressed, uuencoded tcl
distribution until its expiration time helps every site's expire times
and helps avoid news spool overflow.
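That "uncompress to a scratch file, read, discard the copy" strategy can be sketched as follows; zlib stands in for compress(1)'s LZW format, which Python has no stdlib decoder for, so take this as an illustration of the access pattern rather than how any news reader actually works:

```python
import os, tempfile, zlib

def read_compressed_article(path_z):
    """Decompress a stored article to a scratch file, hand back its
    text, and discard the copy -- trading read latency for spool space.
    (zlib stands in for compress(1)'s LZW format here.)"""
    with open(path_z, 'rb') as f:
        data = zlib.decompress(f.read())
    fd, tmp = tempfile.mkstemp()
    try:
        os.write(fd, data)
        os.close(fd)
        with open(tmp, 'rb') as f:
            return f.read().decode()
    finally:
        os.remove(tmp)

# Round trip: store an article compressed, read it back through the
# scratch-file path.
article = "Newsgroups: alt.sources.d\n\nbody text\n"
fd, spool = tempfile.mkstemp(suffix='.Z')
os.write(fd, zlib.compress(article.encode()))
os.close(fd)
assert read_compressed_article(spool) == article
os.remove(spool)
```

The reader pays the decompression wait on every access; the spool pays nothing beyond the compressed bytes.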

Your expressed concern is that the files do not meet the "USENet way" of
distributing source code. This is probably not a surprise to you, but
we're not just USENet any more; we have subscribers on BITNET, EUnet,
FidoNet, and many other networks, even CompuServe. Getting source
material intact through all the possible protocols is a non-trivial
challenge, but the regularity and limited character set of uuencoded
data sure helps.  Paying a penalty (around 10%) in communication time is
at least arguably worth it to be able to tie so much bigger a world
together.
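The raw expansion of the encoding itself is easy to work out; the end-to-end cost on the wire is smaller than that raw figure suggests, because the payload is compressed before encoding (the compression ratio below is illustrative, not measured from the tcl distribution):

```python
# uuencode turns every 45 input bytes into a 62-character line
# (length char + 60 encoded chars + newline).
expansion = 62 / 45
print(round((expansion - 1) * 100))     # ~38 percent raw overhead

# But the payload is compressed first, so compress-then-uuencode can
# still beat sending plain text outright; with text compressing to
# ~40% of its size (an illustrative figure, not a measurement):
compressed_fraction = 0.40
print(round(compressed_fraction * expansion, 2))  # ~0.55 of the original
```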

Like you, I prefer to see nice neat open ASCII shars posted, but I grow
more and more willing to tolerate ever stranger formats as my own skills
for coping with them increase, especially when the alternatives, such as
damaged receipt or news spool space crunches, are worse.

Kent, the man from xanth.
<xanthian at Zorch.SF-Bay.ORG> <xanthian at well.sf.ca.us>
