Why NULL is 0

Gregg Wonderly gregg at a.cs.okstate.edu
Tue Mar 15 03:15:23 AEST 1988


From article <10576 at mimsy.UUCP>, by chris at mimsy.UUCP (Chris Torek):
> (You may wish to save this, keeping it handy to show to anyone who
> claims `#define NULL 0 is wrong, it should be #define NULL <xyzzy>'.
> I intend to do so, at any rate.)
> 
> 	[Great example of PR1ME casting problem.]
> 

Another great example is the not-so-loved Intel segmentation.  On Xenix
and most of the other pseudo-UNIXes for these pseudo-computers, the
following is true.

When you use small model, everything works great because sizeof(int) ==
sizeof((any) *).  If you move to middle model, then everything still works
pretty well, except that now sizeof(int) != sizeof((*)()), but sizeof(int)
== sizeof((anything but function) *).  Now move to large model, and
sizeof(int) != sizeof((any) *).  This can really cause problems with
routines which accept NULL as a parameter, because if you do not cast it
to (??? *)NULL, then things break, quite spectacularly.  Now for small and
large model, #define NULL ((char *)0) would work, but not for middle
model, because sizeof(char *) != sizeof((*)()).

Moral:   As always stated in the past, ``Use typecasts, they make your program
         portable, not ugly!''


