A Ubiquitous C bug

Norman Diamond diamond at jit345.swstokyo.dec.com
Tue Jan 22 13:33:59 AEST 1991


In article <2831 at casbah.acns.nwu.edu> hpa at casbah.acns.nwu.edu (Peter Anvin) writes:

>Should NULL be all ones?  Performance issues aside, such a machine would
>only need to subtract one when converting an int to a pointer, and add one
>the other way.  In constant expressions, such as when using the macro NULL,
>that can of course be done at compile time.

Why would it have to subtract one and add one?  An integer constant 0
(which has to be evaluated at compile time anyway) appearing in a pointer
context (which has to be determined at compile time anyway) would simply
yield a bit-string of all 1's, with no run-time arithmetic at all.  The
library's implementation of free() would either have to be compiled with
this compiler, or else take care to compare its argument against this
compiler's representation of NULL.
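
To make the point concrete, here is a sketch (hypothetical code, on an
assumed machine whose null pointer is all 1 bits) of the check at the
top of a free() built with such a compiler.  The function name my_free
is made up for illustration; the constant 0 in it is a null pointer
constant, so the compiler folds it to the all-1's pattern at compile
time:

    /* Hypothetical sketch: the NULL check at the top of a free()
       built with a compiler whose null pointer is all 1 bits.
       The 0 below is a null pointer constant, so the compiler
       emits a compare against the all-1's pattern; no run-time
       add or subtract is involved.                              */
    void my_free(void *p)
    {
        if (p == 0)     /* true only if p holds the all-1's null */
            return;     /* free(NULL) must be a no-op            */
        /* ... return the block to the allocator ... */
    }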

Casts between pointers and integers (other than compile-time null pointers)
are not required to do anything fancy.  If I were coding a compiler for a
machine where pointers and ints had the same size, I'd just copy the bits.
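
For illustration, again assuming the hypothetical all-1's null
representation, the distinction between a compile-time null pointer
constant and a run-time cast that just copies bits looks like this:

    #include <stdio.h>

    int main(void)
    {
        char *a = (char *) 0;  /* null pointer constant: the compiler
                                  emits the all-1's null pattern       */
        int   n = 0;
        char *b = (char *) n;  /* run-time cast: the bits are copied
                                  verbatim, giving address 0, which on
                                  this machine is NOT the null pointer */

        printf("%d\n", a == 0);  /* prints 1: a is the null pointer   */
        printf("%d\n", b == 0);  /* prints 0 on this machine          */
        return 0;
    }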
--
Norman Diamond       diamond at tkov50.enet.dec.com
If this were the company's opinion, I wouldn't be allowed to post it.


