"do ... while ((NULL + 1) - 1);" -- valid C?

D. Chadwick Gibbons chad at lakesys.UUCP
Thu Aug 10 10:08:36 AEST 1989


In article <696 at ftp.COM> wjr at ftp.UUCP (Bill Rust) writes:
|In my experience, NULL is always defined using the preprocessor line
|"#define NULL 0" (or 0L). Since the while construct is relying on the
|fact NULL is, in fact, 0, doing NULL + 1 - 1 is ok. I certainly wouldn't
|recommend using it as a reference to memory. But, unless NULL is a 
|reserved word to your compiler, the compiler sees 0 + 1 - 1 and that is
|ok.

	Bad assumption.  In most decent implementations, NULL is indeed
defined as either 0 or 0L.  But this can't be, and isn't, true in all
implementations, which immediately prohibits relying on it, if for no other
reason than portability.  In many current implementations, NULL is often
defined as ((char *)0), since that is the only "safe" thing to do--many
programmers are not safe.

	As defined by K&R2, NULL is an expression "with value 0, or such
cast to type void *." (A6.6, p. 198)  This allows an implementation to define
NULL as (void *)0, which would cause your NULL + 1 - 1 to fail.

	In spite of all that, why the hell would you want to use something
designed to designate a _nil pointer_ in an integer expression?!  All of the
above is moot; NULL should not be used as an integer in an integer expression!
If using a symbolic constant is that important to you, do the ASCII thing:

#define NUL	(0)

Or perhaps something a little more readable:

#define ZERO	(0)

(Don't you feel sorry for those who don't know what a "0" means when they see
it inside code?  I know I sure do.)
-- 
D. Chadwick Gibbons, chad at lakesys.lakesys.com, ...!uunet!marque!lakesys!chad
