Let's define our own NULL

Len Reed lbr at holos0.UUCP
Sat Jun 25 02:51:09 AEST 1988


in article <11326 at steinmetz.ge.com>, davidsen at steinmetz.ge.com (William E. Davidsen Jr) says:
> 
> In article <160 at navtech.uucp> mark at navtech.uucp (Mark Stevans) writes:
> | Back in the days when I was a contract C programmer, I once got into a
> | disagreement over the advisability of the following idea:
> | 
> | 	"Let's define NULL in our product-wide header file like so:
> | 
> | 		#ifndef NULL
> | 		#define NULL	0
> | 		#endif
> | 
> | 	That way, if someone needs NULL but doesn't need to use the standard
> | 	I/O library, he won't need to pull in <stdio.h>."
> 
>   I know I'll get flamed for disagreeing with K&R, but this is WRONG.
> The original book was written before segmented architectures were
> commonly used, and the idea of "near" and "far" pointers was not an
> issue. When defining NULL as zero, you open the possibility of errors in
> argument lists terminating with NULL, since *only* assignment will
                                               ^^^^^^^^^^^^^^^^
   What about initialization?  Surely this works:
		 char far * cfptr = 0;

> translate zero to a pointer. Look at the exec family to see where NULL
> is passed where a pointer is expected.
> 
>   Better is to define NULL:
> 	#define NULL	((void *) 0)
> and be sure it works. Some compilers have all sorts of tests to use zero
> or zero long, but this always leaves room for a problem in mixed mode
> programming.
> 
>   Obviously there are ill-written programs which expect NULL to be an
> int, even though the term "NULL pointer" is used in K&R, even in the
> index. These programs may break in some obscure ways when a true pointer
> is used, so my solution is not perfect.

Such programs also often assume sizeof(char *) == sizeof(int) and even
sizeof(long) == sizeof(int)!  Brain damage....

On the PC family I'm using Microsoft 5.x and SCO Xenix.  I use
function prototypes with typed arguments; I don't see why "0" won't work
in all cases.  The compilers cast "0" to the proper (far or near) null
value.  Do you have an example using function prototypes with type
information where this would fail?  Is there a compiler that allows
((void *)0) but not prototypes?

If you don't have function prototypes, though, assuming "func"
expects (int, pointer, int),
	x = func(2, ((void *)0), 34);
should work for all "memory models" while
	x = func(2, 0, 34);
will fail if you're using multiple data segments, e.g., "large" model.

Even ((void *)0) won't get you out of all mixed-mode pointer problems.
Try compiling func(2, ((void *)0), 34) small model where func is
actually expecting (int, char far *, int)!  You've got to have function
prototypes for mixed-mode programming unless you're careful and willing
to trust to luck.

I like the idea of
#define NULL ((void *)0)
though, since much imported code won't use function prototypes.
Also, ((void *)0) will provide better type checking than 0.  My compilers
will complain if I use a (void *) where something other than a pointer is
expected, but they allow 0 to be used (almost?) anywhere.

Some folks use NULL when they should say '\0'; ((void *)0) can
cause a warning here.  Maybe that's just as well: use 0 or '\0' for
a null character, but use NULL only for pointers.

Microsoft C 5.1's stdio.h defines NULL as 0 or 0L depending upon the model.
This can cause spurious loss-of-precision warnings when using
explicit "near" pointers in the large model, and is thus inferior
to ((void *)0).

All said, I like ((void *)0) and may adopt it.  There is one additional
philosophical reason to use it: (void *) means generic pointer, and
thus ((void *)0) means generic pointer with value zero, which is
surely closer to "null pointer" than 0.

-- 
    -    Len Reed


