Read this if you're having trouble unpacking Tcl

Kent Paul Dolan xanthian at zorch.SF-Bay.ORG
Tue Jan 1 11:08:29 AEST 1991


tneff at bfmny0.BFM.COM (Tom Neff) writes:
> xanthian at zorch.SF-Bay.ORG (Kent Paul Dolan) writes:

>> All as unarguable as motherhood and apple pie. Now you go tell Joe or
>> Suzie GreatSoftwareHacker that that spiffy 132 character wide
>> terminal s/he bought to write code is such a hazard to the net that
>> we _insist_ s/he stop using the right 52 character positions so that
>> _we_ aren't inconvenienced dealing with the free efforts of his/her
>> skullsweat.

> They can use 700 columns if it makes them feel better, but when it
> comes time to take something they've written and post it to Usenet,
> it's much easier and more considerate for THEM to reformat it
> portably, ONCE, than it is for thousands of cursing recipients to have
> to compensate for their laziness after the fact.

Sorry?  You seem to be bringing a lot of emotional baggage, and not much
logic, to this discussion.

My compiler handles code wider than 80 columns just fine, as do my text
displayers and editors. Where's the problem? It is much easier for me to
unpack, join, and download a uuencoded split zoo or lharc archive than a
shar file, since I'm going to store the source in archive form in any
case, and I have a much better chance of having it arrive intact and
prove itself to be intact. If part arrives munged, I seek it out from
another site, just as I would if part of a shar got clobbered. I don't
see any more effort on my part, and I do this several times a day.
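
The mechanics behind "prove itself to be intact" may be worth spelling
out. uuencode maps every 3 bytes of the archive to 4 printable ASCII
characters, so the exact bitstream survives the trip, and the CRC
checks that zoo and lharc build into their archives still hold good on
arrival. A minimal sketch of the core transform, in C; this is the
classic mapping, and real uuencode implementations differ in details
such as whether a zero group is sent as a space or a backquote:

    #include <stdio.h>

    /*
     * Classic uuencode core: pack 3 raw bytes into 4 characters in
     * the printable range ' ' (0x20) .. '_' (0x5F).  News gateways
     * see nothing but plain printable text, so they leave the data
     * alone; the archive's own CRCs then vouch for it on arrival.
     */
    #define ENC(c) (((c) & 077) + ' ')   /* 6 bits -> printable */

    static void encode3(unsigned char a, unsigned char b, unsigned char c)
    {
        putchar(ENC(a >> 2));                      /* a: top 6 bits     */
        putchar(ENC(((a << 4) | (b >> 4)) & 077)); /* a low 2, b high 4 */
        putchar(ENC(((b << 2) | (c >> 6)) & 077)); /* b low 4, c high 2 */
        putchar(ENC(c));                           /* c: low 6 bits     */
    }

    int main(void)
    {
        encode3('C', 'a', 't');   /* emits four printable characters */
        putchar('\n');
        return 0;
    }

Real uuencode also prefixes each line with a character count, which is
how uudecode notices truncated or joined lines before the CRCs ever
come into play.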

> Nor are UUENCODE-style subterfuges to preserve the precise original
> bitstream through the netnews channel really a solution, since unless
> the recipient's text architecture happens to match the author's, he
> has TWO decode passes to try and make it through: one to undo the
> UUENCODE and another to turn the decoded alien data into something he
> can compile. The hard work of the people behind the scenes who make
> this second transformation happen transparently in his news gateway is
> thrown away, of course, since UUENCODE deliberately denies the gateway
> access to the real source text.

Well, the only case where I really have a problem is with source shar-ed
on an MS-DOS carriage return/line feed style box and used on a newline
style box. This fails miserably in clear text, since the ever so helpful
intermediate sites change the carriage return/line feeds to newlines,
and now the shar character counts are all wrong, destroying any chance
to confirm even in a limited way the correct transmission of the data.
My most recent unpleasant experience with this is all of 16 hours old,
and I am _not_ thrilled with the prospect of eyeballing a 32 part
archive, some 800K of source code, for transit damage because the
transport mechanism has _no_ remaining sanity checks. It is exactly
uuencode's delightful habit of denying the gateway the chance to muck
about with the internals of the data that makes it most valuable. Any
fool can write a filter to convert carriage return/line feed pairs to
newlines in 20 minutes, tops, and it only needs doing once. This is a
boogeyman, not a real issue.
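
For the record, here is roughly that twenty-minute filter, in C. A
sketch only: it assumes text input, where a bare carriage return has no
business surviving anyway, and simply throws every carriage return
away, letting the line feed that follows serve as the newline.

    #include <stdio.h>

    /*
     * Strip carriage returns from MS-DOS carriage return/line feed
     * text, leaving plain newlines.  Reads stdin, writes stdout; run
     * a suspect source file through it once and be done.
     */
    int main(void)
    {
        int c;

        while ((c = getchar()) != EOF)
            if (c != '\r')        /* drop the CR ...          */
                putchar(c);       /* ... keep everything else */
        return 0;
    }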

>> If I were Kent GreatSoftwareHacker, I'd suggest you write the damn
>> code yourself, if you can't cope with my coding style, or tolerate
>> posting methods that will transport it.

> Au contraire. With all the source that's posted to the net every year,
> a user can stuff his disk many times over JUST with what appears in
> appropriate cleartext format.

Perhaps I show a bit too little paranoia; when a multipart posting shows
up, I get as far as the README file, and never bother to look at the
rest before packing it up in a binary compressed archive and downloading
it for later processing; I'm having a great deal of trouble imagining
the person who sits and reads a source posting of several hundred
kilobytes _while_ in the form of a news article, and from the news
reader. That has to be the least convenient possible way to peruse the
source, since it is not even broken out as files. In fact, I doubt any
such person exists, so why are we designing our source code transmission
paradigm for this non-existent, frankly ridiculous, strawman?

> When a huge glop of encoded gibberish shows up in a "source"
> newsgroup, many people just sigh and 'K' it.

I'm afraid you're showing a bit of parochialism there; the "source"
distributions in several groups are _always_ encoded; are you suggesting
that people reading those groups bypass all the code, or that people
reading the present clear text groups would do so if all the source were
encoded instead? I think not. I save the code that looks interesting to
me, in whatever format it chances to be forwarded, and nuke the rest,
even if all the serifs are polished by the author before transmission. I
just don't believe in the vast audience of helpless news subscribers you
evoke.

> In the Darwinian order of things, the cleartext will always tend to
> prevail.

Not even close; in the Darwinian order of things, the _intact_ text will
prevail.  As noted above, no one with sense reads all or even most of it
as a news article in any case.

> As someone else has pointed out, it is Netnews whose text format
> interoperability should be improved, transparently to the user, rather
> than promulgating egregious hacks at the expense of news readability.

As pointed out above several times, the "reader" you hold up as the
model for whom we must transmit source as clear text is unlikely to
exist. If you have no better reason than this strawman for digging in
your heels against the tide of change, why not bow out of the discussion
gracefully?

Kent, the man from xanth.
<xanthian at Zorch.SF-Bay.ORG> <xanthian at well.sf.ca.us>


