volatiles

Eddie Wyatt edw at IUS1.CS.CMU.EDU
Wed Apr 13 01:12:19 AEST 1988


  I've sat back and digested some of this debate over volatile.  I've come
to the conclusion that it's not a good idea to add it to the language.

  Let's first discuss the basic premise behind its proposed addition.
For volatile, the rationale seems to be that it is needed to correctly
handle variables that may be modified by multiple threads of execution.  I
think this statement covers the problems associated with direct multi-tasking,
signal handlers, and memory-mapped I/O.
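
  By way of illustration (the flag name and handler here are mine, not
anything from the draft standard), this is the kind of code that rationale
has in mind: a flag stored from a signal handler and polled in a loop.
Without volatile, an optimizer that trusts single-threaded data-flow
analysis is free to load the flag once and spin on a register copy.

	#include <signal.h>
	#include <stdio.h>

	static volatile sig_atomic_t got_signal = 0;

	static void handler(int sig)
	{
	    (void) sig;
	    got_signal = 1;      /* store happens "behind the back" of
	                            any single-threaded data-flow analysis */
	}

	int main(void)
	{
	    signal(SIGINT, handler);
	    while (!got_signal)  /* declared volatile: the load must be
	                            redone every iteration; drop the
	                            volatile and it may be hoisted */
	        ;
	    printf("caught the signal\n");
	    return 0;
	}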

  The above rationale is not entirely correct, however.  It misses a key point
that I believe is critical to the whole argument for the addition of volatile,
which is that "heavy" optimization of the data-flow variety must also be taking
place in order to justify the addition of volatile.  It is with this last
clause that I find many problems.

	1)  Volatile is being used to make up for a deficiency in
	    data-flow algorithms (their inability to handle multiple
	    threads).  I have a couple of complaints along this line.
	    One is that it is not clear to me that volatile will
	    be sufficient to handle the deficiencies of data-flow
	    optimizations.  Is there "prior art" to suggest
	    that it will?  Do there exist better techniques for
	    handling data-flow analysis (or similar optimizations)
	    in a multi-threaded environment?

	2)  When variables are not correctly declared as volatile,
	    a program will exhibit different behavior in the
	    optimized and unoptimized versions.  I have two
	    complaints about this.  One is that this sort of
	    behavior contradicts the overall philosophy
	    behind optimization.  An optimization on a language is
	    a set of transformations that do not change the
	    behavior of programs but are beneficial by
	    some metric.  Clearly, the first clause has been
	    violated.  Conclusion: it's inappropriate to
	    try to perform standard data-flow analysis techniques
	    in a multi-threaded environment.  My second complaint
	    stems from a more pragmatic standpoint.  Namely,
	    how does one go about debugging a program that
	    works in the unoptimized version but pukes in the
	    optimized version?  All the source-language debuggers
	    I know of work only on unoptimized code.  If you
	    try the printf technique, you may find your program
	    changing behavior simply because of the presence of
	    the printf statement (a loop invariant may not migrate out
	    of the loop if it is accessed by the print statement).
	    I can only picture the horrors of trying to debug in
	    that sort of environment.
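
  To make that second complaint concrete, here is a hypothetical sketch
(the variable name and the assumption that something else eventually sets
it are mine).  Unoptimized, the flag is reloaded on every pass; optimized,
the load may be hoisted as loop-invariant and the loop spins forever.  Add
a printf that mentions the flag and the load can stay inside the loop, so
the very symptom you are chasing disappears.

	#include <stdio.h>

	int ready = 0;            /* set asynchronously by a handler or another
	                             thread; deliberately NOT declared volatile */

	void wait_for_ready(void)
	{
	    while (!ready)        /* unoptimized: ready reloaded each pass;
	                             optimized: load may be hoisted out of
	                             the loop, which then never terminates */
	        ;
	}

	void wait_for_ready_debug(void)
	{
	    while (!ready)
	        /* the diagnostic itself references ready, which can keep
	           the load inside the loop and hide the original symptom */
	        printf("still waiting, ready = %d\n", ready);
	}
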
-- 

Eddie Wyatt 				e-mail: edw at ius1.cs.cmu.edu
