Precedent for use of =

Barry Shein bzs at bu-cs.UUCP
Sun Jun 22 10:56:53 AEST 1986


I'll risk stating it even more strongly than has been said so far:

It is not even clear that assignment and equality are distinguished
in any mathematical sense; you just think they are because you are
applying a particular model of the machine you are computing on.
One can envision an environment where the distinction breaks down
even further:

A few years ago I worked for a company which had me write a
compiler with a syntax very similar to C (actually, a superset)
whose semantics created a constraint network for a directed
graph pseudo-machine.

Basically, '=' was simply an assertion that at this point in
the computation this statement was 'true' or should be made
such (or, possibly, an error if it was not.)

Full-blown expressions were allowed on either side of an
'assignment', thus:

	j = 5;
	i + 4 = j;

simply meant that by the time the second statement was executed,
if 'i' was empty it was to be given the value '1'. If it was not
empty, it either already was '1' or an error was indicated. Thus statements
could be run 'backwards' to infer missing data or verify existing
state of the data base. Essentially the whole program ran at once
in parallel (as far as the semantics were concerned, not in fact)
so the phrase above "by the time the second statement was executed..."
is simply a pedagogical convenience.
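
For concreteness, here is a minimal sketch (ordinary C, not the
actual constraint compiler, with made-up names) of how such a
bidirectional 'i + 4 = j' assertion could behave: each variable is
a cell that is either empty or holds a value, and asserting the
constraint fills in whichever side is missing, or reports a
contradiction if both sides are known and disagree.

	#include <stdio.h>

	struct cell { int has_value; int value; };

	/* Enforce the constraint  left + k = right.
	 * Returns 0 on success, -1 if the known values contradict it. */
	static int assert_plus_equals(struct cell *left, int k, struct cell *right)
	{
	    if (left->has_value && right->has_value)
	        return (left->value + k == right->value) ? 0 : -1;
	    if (right->has_value) {            /* run "backwards": infer left */
	        left->value = right->value - k;
	        left->has_value = 1;
	        return 0;
	    }
	    if (left->has_value) {             /* run "forwards": infer right */
	        right->value = left->value + k;
	        right->has_value = 1;
	        return 0;
	    }
	    return 0;                          /* neither side known yet */
	}

	int main(void)
	{
	    struct cell i = {0, 0}, j = {0, 0};

	    j.has_value = 1; j.value = 5;          /* j = 5;     */
	    if (assert_plus_equals(&i, 4, &j) < 0) /* i + 4 = j; */
	        printf("constraint violated\n");
	    else
	        printf("i = %d\n", i.value);       /* prints: i = 1 */
	    return 0;
	}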

I think such an example should sufficiently blur rigid views
towards assignment and equality.

Another point would be that I have never seen a mathematician hesitate
to introduce notational systems when convenient.

Yet another point would be that it is not clear that programming and
its languages *aren't* mathematics, rather than two things to be
judged side by side. Perhaps the paradigm has shifted?

	-Barry Shein, Boston University
