short circuit evaluation (44 lines)

mckeeman at wanginst.UUCP
Tue Feb 17 03:01:49 AEST 1987


The discussion about side effects is the tip of a very subtle iceberg.  I
would like to separate concerns as follows: visible side effects are those
which are reflected in specific C constructs (essentially all of these are
assignments of one kind or another, embedded within larger expressions or
hidden in arithmetic tautologies), and invisible side effects are
properties of the underlying hardware (a mixed bag including setting
condition codes, causing overflow, causing normalization of floating point
numbers by known hardware hacks, passing through states that cannot be
unwound by a symbolic debugger, and so on).

The compiler writer is surely allowed any optimization that will not
change the program results.  Some will work harder on this than others
but the programmer should not care.  The question arises when an unlikely
set of circumstances would change the results (like aborting on overflow).
This is a hard problem because the compiler writer cannot know what the
programmer is looking for.  If, for instance, run-time is a legitimate
"result", then no optimization can be allowed at all.

The safest approach is to warn the programmer when the compiler is about to
throw away some of his/her carefully crafted C.  A value that is computed
and not used is a significant programming error, not an optimization
opportunity.  With that approach, the following fragment would generate
a single store of 0 to x and several warning messages about discarded code
for the visible C operations  /, +, *, *p and ++.

x = 0; x /= 1; x += 0 * *p++;

I regard this approach as an indication that the compiler writer respects the
users of the product; the programmer would not have written the code unless
it was expected to matter.  A numerical analyst once complained
bitterly that my throwing away the assignment
X = X + 0.0
prevented him from normalizing floating point numbers on the IBM 7090.  I was
unable to give him advice on what he could have written that I could not (at
least in theory) have optimized away.  And he didn't want to stand on his
head to get numbers normalized, either.  Enough said: compiler writers cannot
know what is in the mind of the programmer, and in any case do not want to be
forced to make all their object code worse because of rarely occurring
special cases.

The invisible side effects are more difficult to deal with at the language
definition level.  I often hear complaints, backed by debugger output, that
some assignment never happens.  True enough, the value is held in a register
for use until context exit, and never ends up in memory.  There is no
consequence except for a bedeviled programmer trying to find a bug with
optimizer-obscured clues.  In this case the program is not being looked at as
an input/output mapping, but rather as a living thing to be examined in vivo.

A solution to both problems is some sort of construct in C that says "from
here to here, do everything by the abstract machine rules".  Associate
left-to-right, evaluate operands left-to-right, do stores immediately, and so
on.  This would make the unwelcome warning messages go away (because the
compiler would not be throwing away the code after all) and allow the rare
but necessary forcing of non-optimal code.
-- 
W. M. McKeeman            mckeeman at WangInst
Wang Institute            decvax!wanginst!mckeeman
Tyngsboro MA 01879
