More on safer rm

utzoo!decvax!ucbvax!unix-wizards
Fri Oct 16 00:51:26 AEST 1981


From decvax!ittvax!swatt at Berkeley Thu Oct 15 21:59:48 1981


The USENET traffic on this subject has quieted down, so I hesitate to
start it up again...  Still, the earlier discussion got me thinking
about a good solution to the problem, and I came up with the following.
It has the advantage of not changing the way existing programs
APPEAR to work (i.e. no "Do you really want to remove this file?").
It has the disadvantage of considerable extra overhead (quite a lot
of which could be eliminated by implementing it in the kernel).

I have tried to identify all the standard programs which would have
to be changed to accommodate this scheme.  The actual time involved
to modify "ls" was < 20 minutes, so I suspect the effort involved
is reasonable.

I have felt the need for some time now for a set of a dozen or so
programs to do the obvious sorts of file operations (edit, copy,
remove, etc.) which are specifically tailored to the beginning
user (prompt for missing arguments, interactive help, and so on ...).
These programs will probably use the non-destructive primitives
described below.  I am interested in comments.

	- Alan S. Watt (decvax!ittvax!swatt)

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This directory contains an initial set of "non-destructive" file
operation primitives, specifically "ndremove" and "ndcreate".  The
calling sequences of these two are identical to the UNIX system calls
"unlink" and "creat".

When a file is removed using "ndremove" it is not actually destroyed,
but its name changed to a backup version.  When a new file is created
with "ndcreate", an existing file of that name is non-destructively
removed.  The backup file retains its position in the filesystem
hierarchy and may later be retrieved.

The scheme uses the high parity (octal 200) bit in the characters of a
filename.  Since a filename is at most 14 characters long, this allows
14 backup versions, version N being marked by N characters with the
high bit set.  A scheme could easily be adapted to allow up to 2^14
backup versions, but this seems excessive.  Detecting existing backup
versions and assigning new version numbers is done by binary search to
limit the number of "access" operations performed.

There is an internal "purge(N)" function which takes as an argument the
number of backup versions to purge.  The first N backup versions are
"unlinked" and all higher-numbered backup versions are percolated down
to the lowest available numbers.

Overhead:
	ndcreate:
		1 "access" to see if file exists ; "ndremove" if so.
		1 "creat" call in any case.
	
	ndremove:
		4 "access" calls [log2(14)] to find first available
		backup version ; 1 "purge(7)" call if none available.
		1 "link" call
		1 "unlink" call
	
	purge(N): (where N is number of backup versions to purge)
		If M is number of existing backup versions,
			(M-N) "link" calls
			M "unlink" calls

Utilities which must be modified to work with this scheme:
(Or new utilities written ...)

	ls	to ignore backup files normally and show them
		if a flag is specified (a test for backup names is
		sketched after this list).
	
	rm	to use "ndremove" instead of "unlink".

	cp	to use "ndcreate" instead of "creat".

	du	to ignore backup files normally and to have
		a flag to include them.

	find	to ignore backup files.

	tar,tp,stp,nstp,...
		to ignore backup files.

	mv	to use "ndremove" instead of "unlink".

	sh,csh	to ignore backup files on glob expansion.

New utilities needed:

	retrieve
		to retrieve a file from a backup version (one possible
		shape is sketched after this list).

	purge	to explicitly "unlink" backup files.  Needs options
		to purge all backups older than X and to recursively
		wander down a directory tree.

Unsettled issues:

  1)	How to treat "remove" operations on directories.


