Why NULL is 0

Chris Torek  chris@mimsy.UUCP
Wed Mar 9 12:26:10 AEST 1988


(You may wish to save this, keeping it handy to show to anyone who
claims `#define NULL 0 is wrong, it should be #define NULL <xyzzy>'.
I intend to do so, at any rate.)

Let us begin by postulating the existence of a machine and a compiler
for that machine.  This machine, which I will call a `Prime', or
sometimes `PR1ME', for obscure reasons such as the fact that it
exists, has two kinds of pointers.  `Character pointers', or objects
of type (char *), are 48 bits wide.  All other pointers, such as
(int *) and (double *), are 32 bits wide.
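
To make the two widths concrete, here is a minimal sketch.  The 48 and
32 bit figures are the hypothetical machine's; on most machines you can
actually run this on, the two numbers printed will simply be equal.

	#include <stdio.h>

	int main(void)
	{
		/* On the `Prime' described above, the first size would
		   correspond to 48 bits and the second to 32 bits; on
		   most machines today the two come out the same. */
		printf("sizeof(char *)   = %lu\n", (unsigned long)sizeof(char *));
		printf("sizeof(double *) = %lu\n", (unsigned long)sizeof(double *));
		return 0;
	}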

Now suppose we have the following C code:

	main()
	{
		f1(NULL);	/* wrong */
		f2(NULL);	/* wrong */
		exit(0);
	}

	f1(cp) char *cp; { if (cp != NULL) *cp = 'a'; }
	f2(dp) double *dp; { if (dp != NULL) *dp = 2.2; }

There are two lines marked `wrong'.  Now suppose we were to define NULL
as 0.  Clearly both calls are then wrong: both pass `(int)0', when the
first should be a 48 bit (char *) nil pointer and the second a 32 bit
(double *) nil pointer.

Someone claims we can fix that by defining NULL as (char *)0.  Suppose
we do.  Then the first call is correct, but the second now passes a
48 bit (char *) nil pointer instead of a 32 bit (double *) nil pointer.
So much for that solution.
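
Spelled out as a sketch, with f1 and f2 as defined before (the widths
are, again, the hypothetical machine's):

	#define NULL ((char *)0)	/* the proposed fix */

	main()
	{
		f1(NULL);	/* OK: expands to f1((char *)0),
				   a 48 bit nil pointer */
		f2(NULL);	/* still wrong: a 48 bit (char *) value is
				   passed where a 32 bit (double *) is
				   expected */
		exit(0);
	}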

Ah, I hear another.  We should define NULL as (void *)0.  Suppose we
do.  Then at least one call is not correct, because one should pass
a 32 bit value and one a 48 bit value.  If (void *) is 48 bits, the
second is wrong; if it is 32 bits, the first is wrong.

Obviously there is no solution.  Or is there?  Suppose we change
the calls themselves, rather than the definition of NULL:

	main()
	{
		f1((char *)0);
		f2((double *)0);
		exit(0);
	}

Now both calls are correct, because the first passes a 48 bit (char *)
nil pointer, and the second a 32 bit (double *) nil pointer.  And
if we define NULL with

	#define NULL 0

we can then replace the two `0's with `NULL's:

	main()
	{
		f1((char *)NULL);
		f2((double *)NULL);
		exit(0);
	}

The preprocessor changes both NULLs to 0s, and the code remains
correct.

On a machine such as the hypothetical `Prime', there is no single
definition of NULL that will make uncast, un-prototyped arguments
correct in all cases.  The C language provides a reasonable means
of making the arguments correct, but it is not via `#define'.
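
For completeness, the `un-prototyped' qualifier matters: once
ANSI-style prototypes are in scope, the compiler knows each parameter's
type and converts a bare 0 or NULL argument itself.  A minimal sketch,
rewriting f1 and f2 in prototype form:

	#include <stddef.h>

	void f1(char *cp)   { if (cp != NULL) *cp = 'a'; }
	void f2(double *dp) { if (dp != NULL) *dp = 2.2; }

	int main(void)
	{
		f1(NULL);	/* prototype in scope: the argument is
				   converted to a (char *) nil pointer */
		f2(NULL);	/* ...and here to a (double *) nil pointer,
				   whatever widths those pointers have */
		return 0;
	}

Without prototypes, as in the examples above, no such conversion
happens, which is exactly why the casts are needed.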
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris


