The traffic volume problems on Usenet and ARPANET

Conde.osbunorth at XEROX.ARPA
Tue Feb 26 07:59:00 AEST 1985


Lauren,

Here's an approach taken by some Xerox mailing lists which may be
adapted to your situation.

Some lists in digest form mail out only the table of contents. If a
user is interested, he retrieves the entire contents to his machine.
The file-copy command is typically embedded in the table of contents
to make this easier. In the particular mail system that I am using,
there is no equivalent of the Unix netnews command for sharing news
messages.
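The Xerox scheme above can be sketched in modern Python. The message
fields and the embedded "copy" command syntax here are hypothetical,
assumed for illustration:

```python
# Sketch of building a table-of-contents digest whose entries embed a
# retrieval command, in the style of the Xerox lists described above.
# The message format and "copy host:id" syntax are assumptions.

def build_toc_digest(messages, archive_host):
    """Return a TOC-only digest: one subject line plus an embedded
    copy command per message, instead of the full message bodies."""
    lines = ["Table of Contents", "=" * 17, ""]
    for msg in messages:
        lines.append(msg["subject"])
        # Embedded command the reader can run to fetch the body.
        lines.append("    copy %s:%s" % (archive_host, msg["id"]))
        lines.append("")
    return "\n".join(lines)

toc = build_toc_digest(
    [{"id": "msg-001", "subject": "Re: termcap woes"},
     {"id": "msg-002", "subject": "uucp routing question"}],
    "archive.example.com",
)
print(toc)
```

The point is that the digest stays small: bodies are only moved over
the network when a reader asks for them.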

I do not know if this is feasible, but here's how this may be adapted to
USENET sites. If you are willing to deal with 1-2 day delays in reading
messages:

- Each digest mails out its table of contents; a non-digest message
sends out its subject line only.
- A user uses some program to peruse the table of contents (TOC). If a
message is available locally (for some value of "local"), the user has
the option of reading it; otherwise it is simply marked for later
retrieval.
- That evening, a program tries to retrieve all messages which are
marked "interested" but are not already available locally. The
messages are retrieved from a set of hosts which may have them.
- The following day, the user may read the messages.
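The nightly retrieval step could look something like the following
sketch, where fetch_from() is a hypothetical stand-in for whatever
transfer mechanism a site actually uses:

```python
# Sketch of the nightly retrieval pass described above: fetch every
# message marked "interested" that is not already on the local spool,
# trying a list of candidate hosts that may hold a copy.

def nightly_retrieve(marked_ids, local_ids, hosts, fetch_from):
    """Return {message_id: host_it_came_from} for newly fetched messages."""
    fetched = {}
    for mid in marked_ids:
        if mid in local_ids:
            continue            # already available locally; skip it
        for host in hosts:      # try each candidate host in turn
            body = fetch_from(host, mid)
            if body is not None:
                fetched[mid] = host
                break
    return fetched

# Toy usage: host "b" holds msg-2; msg-1 is already local.
store = {("b", "msg-2"): "body of msg-2"}
result = nightly_retrieve(
    ["msg-1", "msg-2"], {"msg-1"}, ["a", "b"],
    lambda h, m: store.get((h, m)),
)
print(result)  # {'msg-2': 'b'}
```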

As an implementation issue, some kind of universal message id scheme and
a database could be used to index into message contents/subject lines.
This way, the user could ignore all messages which say: "What's the
termcap entry for a Trash-80?".
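A minimal sketch of that index, assuming the filtering is done with
kill patterns matched against subject lines (an assumption about how a
reader might use such a database):

```python
# Sketch of the universal message-id index suggested above: a mapping
# from message id to subject line lets a reader drop uninteresting
# threads before ever fetching the bodies.
import re

def filter_toc(index, kill_patterns):
    """Return ids whose subjects match none of the kill patterns."""
    keep = []
    for mid, subject in index.items():
        if not any(re.search(p, subject, re.IGNORECASE)
                   for p in kill_patterns):
            keep.append(mid)
    return keep

index = {
    "msg-1": "What's the termcap entry for a Trash-80?",
    "msg-2": "4.2BSD signal semantics",
}
print(filter_toc(index, [r"termcap.*trash-80"]))  # ['msg-2']
```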

This may be feasible for ARPA sites, but I do not know if it is even
worth considering for Usenet sites that redistribute messages to other
sites. The hard part is knowing who has the replicated copies and when
one can do cleanup operations (i.e., zapping files) without causing
hardship to others. Some kind of expiration date scheme may work
too...
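The expiration idea mentioned above could be as simple as the
following sketch: each shared copy carries a date after which anyone
may zap it, so cleanup never needs to know who else still wants it:

```python
# Sketch of an expiration-date scheme for shared message copies: a
# file may be removed once its expiry has passed.  The entry layout
# is an assumption made for illustration.
import datetime

def expired(entries, now):
    """Return ids of entries whose expiration date has passed."""
    return [mid for mid, expires in entries.items() if expires <= now]

entries = {
    "msg-1": datetime.date(1985, 3, 1),
    "msg-2": datetime.date(1985, 4, 1),
}
print(expired(entries, datetime.date(1985, 3, 15)))  # ['msg-1']
```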

Daniel Conde
conde.pa at Xerox.ARPA



More information about the Comp.unix.wizards mailing list